Depending on context, the term entropy is used for a thermodynamic quantity, a measure of available choice, a measure of information, or, in the context of statistical inference, a maximum configuration predictor. For systems in equilibrium, or for processes without memory, the mathematical expression for these different concepts of entropy is the same: the so-called Boltzmann–Gibbs–Shannon entropy, H. For processes with memory, such as driven or self-reinforcing processes, this is no longer true: the different entropy concepts lead to distinct functionals that generally differ from H. Here we focus on the maximum configuration entropy, which predicts empirical distribution functions, in the context of driven dissipative systems. We develop the corresponding framework and derive the entropy functional that describes the distribution of observable states as a function of the details of the driving process. We do this for sample space reducing (SSR) processes, which provide an analytically tractable model of driven dissipative systems with controllable driving. The fact that a consistent maximum configuration entropy framework exists for arbitrarily driven non-equilibrium systems opens the possibility of deriving a full statistical theory of driven dissipative systems of this kind, and thus provides the technical means for building a thermodynamic theory of driven processes on a statistical basis. We discuss the Legendre structure for driven systems.
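For readers unfamiliar with the mechanism: in an SSR process each step can only reach a subset of the states reachable before (a ball bouncing down a staircase is the standard picture), and driving reinjects the process into the larger sample space; in equilibrium, by contrast, all the entropy notions above reduce to H = -sum_i p_i log p_i. The sketch below is not taken from the paper; the noisy-SSR formulation with restarts at the top state, the function names, and the parameter values are our assumptions. It simulates such a process in Python and fits the exponent of the resulting visit distribution:

import numpy as np

def ssr_visits(n_states: int, lam: float, n_runs: int, seed: int = 0) -> np.ndarray:
    # Simulate a noisy (driven) sample space reducing process.
    # States are labelled 1..n_states; each run starts at the top state.
    # With probability lam the process jumps uniformly to a strictly lower
    # state (the sample-space-reducing move); with probability 1 - lam a
    # driving event jumps uniformly to any state. A run ends when state 1
    # is reached, and visit counts are accumulated over all runs.
    rng = np.random.default_rng(seed)
    visits = np.zeros(n_states + 1, dtype=np.int64)  # index 0 unused
    for _ in range(n_runs):
        state = n_states
        while state > 1:
            if rng.random() < lam:
                state = rng.integers(1, state)          # SSR move: 1 .. state-1
            else:
                state = rng.integers(1, n_states + 1)   # driving: 1 .. n_states
            visits[state] += 1
    return visits[1:]

if __name__ == "__main__":
    n, lam = 1000, 0.7  # lam = 1 would be the pure, undriven SSR limit
    counts = ssr_visits(n, lam, n_runs=50_000)
    states = np.arange(1, n + 1)
    mask = counts > 0
    # Rough log-log fit of p(i) ~ i^(-alpha); for noisy SSR processes the
    # exponent is expected to be close to the driving parameter lam.
    alpha = -np.polyfit(np.log(states[mask]), np.log(counts[mask]), 1)[0]
    print(f"fitted exponent alpha = {alpha:.2f} (driving parameter lam = {lam})")

In the undriven limit (lam = 1) the visit distribution is known to approach Zipf's law, p(i) proportional to 1/i, while under driving the exponent tracks the driving parameter. This dependence of the observed state distribution on the details of the driving is exactly the kind of behaviour the entropy functional derived in the paper is meant to predict.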