
Publication

The Statistical Foundations of Entropy

During the last few decades, the notion of entropy has become omnipresent in many scientific disciplines, ranging from traditional applications in statistical physics, chemistry, information theory, and statistical estimation to more recent applications in biology, astrophysics, geology, financial markets, and social networks. All these examples belong to the large family of complex dynamical systems, which is typified by phase transitions, scaling behavior, multiscale and emergent phenomena, and many other non-trivial effects. Frequently, it turns out that in these systems the usual Boltzmann–Gibbs–Shannon entropy and the ensuing statistical physics are not adequate concepts. This is especially the case when the sample space in question does not grow exponentially.
This Special Issue, “The Statistical Foundations of Entropy”, is dedicated to discussing solutions and delving into concepts, methods, and algorithms for improving our understanding of the statistical foundations of entropy in complex systems, with a particular focus on the so-called generalized entropies that go beyond the usual Boltzmann–Gibbs–Shannon framework.
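As a minimal illustration of what "going beyond the Boltzmann–Gibbs–Shannon framework" means, the sketch below compares the standard BGS entropy with one well-known generalized entropy, the Tsallis entropy, which recovers the BGS form in the limit q → 1. The choice of the Tsallis form and the sample distribution here are illustrative assumptions, not taken from the articles in the Special Issue.

```python
import math

def shannon_entropy(p):
    """Boltzmann-Gibbs-Shannon entropy: H = -sum_i p_i ln p_i."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis_entropy(p, q):
    """Tsallis entropy: S_q = (1 - sum_i p_i^q) / (q - 1).

    For q -> 1 this reduces to the Shannon entropy; q != 1 deforms
    the logarithm and changes how the entropy scales with sample space.
    """
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# An illustrative probability distribution (assumed, for demonstration only).
p = [0.5, 0.25, 0.25]

print(shannon_entropy(p))        # Shannon entropy in nats, approx. 1.0397
print(tsallis_entropy(p, 2.0))   # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
print(tsallis_entropy(p, 1.001)) # close to the Shannon value, as q -> 1
```

Note how the q → 1 evaluation approaches the Shannon value, which is the defining consistency check for such generalized entropies.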
The nine high-quality articles included in this Special Issue propose and discuss new tools and concepts derived from information theory, non-equilibrium statistical physics, and the theory of complex dynamical systems to investigate various non-conventional aspects of entropy with assorted applications. They illustrate the potential and pertinence of novel conceptual tools in statistical physics that, in turn, help us to shed fresh light on the statistical foundations of entropy.
P. Jizba, J. Korbel, The Statistical Foundations of Entropy, Entropy 23 (10) (2021) 1367.
