DATE / VENUE
============
Tuesday, 20/01/2015
10:00 a.m.
ISI Red Room
SPEAKER
========
Stefan Thurner
Medical University of Vienna
Webpage: http://www.complex-systems.meduniwien.ac.at/people/sthurner/

ABSTRACT
========

Shannon and Khinchin built their foundational work on four axioms that completely determine the information-theoretic entropy to be of Boltzmann-Gibbs type.

For non-ergodic systems the separation axiom (Shannon-Khinchin axiom 4) is not valid. We show that whenever this axiom is violated (as is the case in most complex systems) entropy takes a more general form in terms of the incomplete Gamma function, whose parameters express equivalence classes that precisely characterise all (!) interacting and non-interacting statistical systems in the thermodynamic limit, including those that typically exhibit power laws or stretched exponential distributions. This allows us, for example, to derive Tsallis entropy as a special case from solid first principles.
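The incomplete-Gamma form mentioned above can be sketched as follows; this is the two-parameter (c,d)-entropy of reference [1] below, written from memory and best checked against that paper:

```latex
% (c,d)-entropy (Hanel & Thurner, EPL 93, 20006), W = number of states:
S_{c,d}[p] \;=\; \frac{e}{1-c+cd}\sum_{i=1}^{W}\Gamma\!\bigl(1+d,\; 1-c\ln p_i\bigr)
\;-\;\frac{c}{1-c+cd}
```

The equivalence classes correspond to the parameter pairs (c,d): up to additive constants, (c,d) = (1,1) recovers the Boltzmann-Gibbs-Shannon entropy, \(-\sum_i p_i \ln p_i\), while (c,d) = (q,0) recovers the Tsallis form, \((1-\sum_i p_i^q)/(q-1)\).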

Further, we show how knowledge of the phase space volume of a system, together with the requirement of extensivity, allows one to uniquely determine the parameters of the generalized entropy. We ask how these entropies are related to the 'maximum entropy principle' (MEP). In particular, we show how the first Shannon-Khinchin axiom allows us to separate the expression for the most likely distribution function of a statistical system into a 'maximum entropy' term (the log of the multiplicity) and constraint terms. Remarkably, the generalized extensive entropy is not necessarily identical to the generalized maximum entropy functional. In general, for non-ergodic systems both concepts are tightly related but distinct.

We demonstrate the practical relevance of our results on path-dependent random walks (non-Markovian systems with long-term memory), where the random walker's choices (left or right) depend on the history of the trajectory. We are able to compute the time-dependent distribution functions from knowledge of the maximum entropy, which is analytically derived from the microscopic update rules. Self-organized critical systems such as sand piles, or particular types of spin systems with densifying interactions, are other examples that can be understood within the presented framework.
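A path-dependent walk of the kind described above can be simulated in a few lines. The sketch below uses a Polya-urn-style reinforcement rule (probability of stepping right grows with the fraction of past right-steps) as an illustrative stand-in; it is not the specific microscopic update rule of the cited papers.

```python
import random

def path_dependent_walk(n_steps, alpha=1.0, seed=0):
    """Toy non-Markovian random walk with long-term memory.

    The step probability depends on the whole history via the counts
    of past left/right choices (urn-style reinforcement). This is an
    illustrative model, not the update rule of refs. [1]-[3].
    """
    rng = random.Random(seed)
    rights, lefts = 1, 1  # pseudo-counts avoid division by zero at t=0
    position = 0
    for _ in range(n_steps):
        # Reinforcement: right-step probability set by past choices.
        p_right = rights**alpha / (rights**alpha + lefts**alpha)
        if rng.random() < p_right:
            position += 1
            rights += 1
        else:
            position -= 1
            lefts += 1
    return position
```

Because each step feeds back into all later step probabilities, an ensemble of such walkers spreads out differently from a memoryless walk, which is the kind of time-dependent distribution the abstract refers to.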

Related references:

[1] R. Hanel, S. Thurner, A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions, EPL 93, 20006 (2011).

[2] R. Hanel, S. Thurner, When do generalized entropies apply? How phase space volume determines entropy, EPL 96, 50003 (2011).

[3] R. Hanel, S. Thurner, M. Gell-Mann, How multiplicity determines entropy and derivation of the maximum entropy principle for complex systems, PNAS 111, 6905-6910 (2014).