Reply to C. Tsallis’ “Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems”
In a recent PRL (2013, 111, 180604), we invoked the Shore and Johnson axioms, which demonstrate that the least-biased way to infer probability distributions {pi} from data is to maximize the Boltzmann-Gibbs entropy. We then showed which biases are introduced in models obtained by maximizing nonadditive entropies. A rebuttal of our work appears in Entropy (2015, 17, 2853) and argues that the Shore and Johnson axioms are inapplicable to a wide class of complex systems. Here we highlight the errors in this reasoning.
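The inference step at issue here, maximizing Boltzmann-Gibbs entropy subject to data constraints, can be sketched numerically. The following minimal illustration uses an assumed five-level system and an assumed mean-energy constraint (both invented for this example), and confirms that the constrained maximizer is the Gibbs form p_i ∝ exp(-βE_i):

```python
import numpy as np
from scipy.optimize import minimize, brentq

# Hypothetical setup: five discrete energy levels and one measured constraint
# (the mean energy). Both values below are assumptions for illustration only.
E = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
mean_E = 1.2

# Direct numerical MaxEnt: maximize the Shannon/BG entropy subject to
# normalization and the mean-energy constraint.
def neg_entropy(p):
    p = np.clip(p, 1e-12, None)
    return np.sum(p * np.log(p))

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},
    {"type": "eq", "fun": lambda p: p @ E - mean_E},
]
res = minimize(neg_entropy, np.full(E.size, 1.0 / E.size),
               bounds=[(0.0, 1.0)] * E.size, constraints=constraints)
p_maxent = res.x

# Analytic Gibbs solution: find the Lagrange multiplier beta such that
# p_i ∝ exp(-beta * E_i) reproduces the constrained mean.
def mean_at(beta):
    w = np.exp(-beta * E)
    return (w @ E) / w.sum()

beta = brentq(lambda b: mean_at(b) - mean_E, -10.0, 10.0)
p_gibbs = np.exp(-beta * E)
p_gibbs /= p_gibbs.sum()

# The two agree: the least-biased distribution is the Gibbs form.
print(np.max(np.abs(p_maxent - p_gibbs)))
```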
Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems
It is by now well known that the Boltzmann-Gibbs-von Neumann-Shannon logarithmic entropic functional () is inadequate for wide classes of strongly correlated systems: see for instance Brukner and Zeilinger's 2001 {\it Conceptual inadequacy of the Shannon information in quantum measurements}, among many other systems exhibiting various forms of complexity. On the other hand, the Shannon and Khinchin axioms uniquely mandate the BG form; the Shore and Johnson axioms follow the same path. Many natural, artificial and social systems have been satisfactorily approached with nonadditive entropies such as (), the basis of nonextensive statistical mechanics. Consistently, the Shannon 1948 and Khinchin 1953 uniqueness theorems have already been generalized in the literature, by Santos 1997 and Abe 2000 respectively, in order to uniquely mandate the nonadditive form. We argue here that the same remains to be done with the Shore and Johnson 1980 axioms. We arrive at this conclusion by analyzing specific classes of strongly correlated complex systems that await such generalization.

Comment: This new version has been sensibly modified and updated. The title and abstract have been modified.
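The nonadditivity at issue can be made concrete. The following sketch, with toy distributions assumed for illustration, checks the standard pseudo-additivity rule of the Tsallis entropy S_q for independent subsystems, which reduces to ordinary additivity only in the Boltzmann-Gibbs limit q → 1:

```python
import numpy as np

# For independent subsystems A and B, the Tsallis entropy obeys the
# pseudo-additivity rule (with k = 1):
#   S_q(A+B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B),
# recovering ordinary additivity only as q -> 1.

def tsallis(p, q):
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):  # BG/Shannon limit
        p = p[p > 0]
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Assumed toy marginals; the joint is their product (independence).
pA = np.array([0.6, 0.3, 0.1])
pB = np.array([0.5, 0.5])
pAB = np.outer(pA, pB).ravel()

q = 1.7
lhs = tsallis(pAB, q)
rhs = (tsallis(pA, q) + tsallis(pB, q)
       + (1 - q) * tsallis(pA, q) * tsallis(pB, q))
print(abs(lhs - rhs))  # ~0: pseudo-additive, not additive

# In the q -> 1 limit the extra cross term vanishes and additivity holds.
print(abs(tsallis(pAB, 1.0) - tsallis(pA, 1.0) - tsallis(pB, 1.0)))  # ~0
```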
The foundations of statistical physics: entropy, irreversibility, and inference
Statistical physics aims to describe properties of macroscale systems in
terms of distributions of their microscale agents. Its central tool is the
maximization of entropy, a variational principle. We review the history of this
principle, first considered as a law of nature, more recently as a procedure
for inference in model-making. And while equilibria (EQ) have long been
grounded in the principle of Maximum Entropy (MaxEnt), until recently no
equally foundational generative principle has been known for non-equilibria
(NEQ). We review evidence that the variational principle for NEQ is Maximum
Caliber. It entails maximizing \textit{path entropies}, not \textit{state
entropies}. We also describe the role of entropy in characterizing
irreversibility, and describe the relationship between MaxCal and other
prominent approaches to NEQ physics, including Stochastic Thermodynamics (ST),
Large Deviations Theory (LDT), Macroscopic Fluctuation Theory (MFT), and
non-extensive entropies.

Comment: 21 pages, 3 figures
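The distinction between state entropies and path entropies can be illustrated on a toy Markov chain. The sketch below (transition matrix assumed for illustration) computes the entropy of the distribution over whole trajectories, the quantity Maximum Caliber works with, and checks the chain-rule identity relating it to the stationary-state entropy and the entropy rate:

```python
import numpy as np
from itertools import product

# Assumed two-state Markov chain, started in its stationary distribution pi.
# The path entropy of n-step trajectories satisfies the exact identity
#   H(paths) = H(pi) + (n - 1) * h,
# where h is the per-step entropy rate.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Path entropy: Shannon entropy of the distribution over all 2**n paths.
n = 6
path_probs = []
for path in product(range(2), repeat=n):
    prob = pi[path[0]]
    for a, b in zip(path, path[1:]):
        prob *= P[a, b]
    path_probs.append(prob)
path_entropy = H(np.array(path_probs))

# Entropy rate: entropy of the next step, averaged over occupation.
entropy_rate = sum(pi[i] * H(P[i]) for i in range(2))
print(path_entropy, H(pi) + (n - 1) * entropy_rate)  # identical up to rounding
```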
The Statistical Foundations of Entropy
In the last two decades, the understanding of complex dynamical systems has undergone important conceptual shifts. The catalyst was the infusion of new ideas from the theory of critical phenomena (scaling laws, renormalization group, etc.), (multi)fractals and trees, random matrix theory, network theory, and non-Shannonian information theory. The usual Boltzmann–Gibbs statistics were shown to be grossly inadequate in this context. While successful in describing stationary systems characterized by ergodicity or metric transitivity, Boltzmann–Gibbs statistics fail to reproduce the complex statistical behavior of many real-world systems in biology, astrophysics, geology, and the economic and social sciences.

The aim of this Special Issue was to extend the state of the art with original contributions to the ongoing discussion on the statistical foundations of entropy, with particular emphasis on non-conventional entropies that go significantly beyond the Boltzmann, Gibbs, and Shannon paradigms. The accepted contributions address various aspects, including information-theoretic, thermodynamic, and quantum aspects of complex systems, and find several important applications of generalized entropies in a variety of systems.
Reformulation of quantum mechanics and strong complementarity from Bayesian inference requirements
This paper provides an epistemic reformulation of quantum mechanics (QM) in terms of inference consistency requirements of objective Bayesianism, which include the principle of maximum entropy under physical constraints. Physical laws themselves are understood in terms of inference and physical consistency requirements. Strong complementarity - that different observers may "live" in separate Hilbert spaces - follows as a consequence, which resolves the firewall paradox. Other clues pointing to this reformulation are analyzed. The reformulation, with the addition of a novel transition probability arithmetic, resolves the measurement problem completely, thereby eliminating the charge of subjectivity of measurements from quantum mechanics. An illusion of collapse arises from Bayesian updates driven by the observer's continuous outcome data. Spacetime is to be understood in an epistemic sense, rather than as existing independently of an observer, in the spirit of black hole complementarity. Dark matter and dark energy emerge directly as an entropic tug-of-war in the reformulation.
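The "collapse as Bayesian update" idea can be illustrated in a deliberately simplified classical-inference form that is not the paper's formalism: an observer holds a prior over two hypothetical qubit preparations and updates it with Born-rule likelihoods as outcome data arrive, so the state assignment sharpens smoothly without any separate collapse postulate. All states and probabilities below are assumptions for the toy example:

```python
import numpy as np

# Toy illustration: Bayesian updating over two hypothetical qubit
# preparations, |0> and |+>, using Born-rule outcome probabilities as
# likelihoods for repeated computational-basis measurements on copies.
rng = np.random.default_rng(0)

# Born-rule probability of outcome "0" under each assumed hypothesis.
p0 = {"|0>": 1.0, "|+>": 0.5}
prior = {"|0>": 0.5, "|+>": 0.5}

true_state = "|+>"
# 1 encodes outcome "0", 0 encodes outcome "1".
outcomes = (rng.random(20) < p0[true_state]).astype(int)

post = dict(prior)
for o in outcomes:
    for h in post:
        likelihood = p0[h] if o == 1 else 1.0 - p0[h]
        post[h] *= likelihood
    z = sum(post.values())
    post = {h: v / z for h, v in post.items()}  # normalize after each datum

# The posterior concentrates on the preparation consistent with the data;
# nothing discontinuous happens at any single step.
print(post)
```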
Reformulation of quantum mechanics and strong complementarity from Bayesian inference requirements
This paper provides an epistemic reformulation of quantum mechanics (QM) in terms of inference consistency requirements of objective Bayesianism, which include the principle of maximum entropy under physical constraints. Physical constraints themselves are understood in terms of consistency requirements. A by-product of this approach is that QM must additionally be understood as providing a theory of theories. Strong complementarity - that different observers may "live" in separate Hilbert spaces - follows as a consequence, which resolves the firewall paradox. Other clues pointing to this reformulation are analyzed. The reformulation, with the addition of a novel transition probability arithmetic, resolves the measurement problem completely, thereby eliminating the subjectivity of measurements from quantum mechanics. An illusion of collapse arises from Bayesian updates driven by the observer's continuous outcome data. Dark matter and dark energy emerge directly as an entropic tug-of-war in the reformulation.
Reformulation of quantum mechanics and strong complementarity from Bayesian inference requirements
This paper provides an epistemic reformulation of quantum mechanics (QM) in terms of inference consistency requirements of objective Bayesianism, which include the principle of maximum entropy under physical constraints. Physical constraints themselves are understood in terms of consistency requirements. A by-product of this approach is that QM must additionally be understood as providing a theory of theories. Strong complementarity - that different observers may "live" in separate Hilbert spaces - follows as a consequence. The firewall paradox, analyzed through a parallel with Hardy's paradox, is used as an example supporting the necessity of the reformulation and its consequent results. Other clues pointing to this reformulation are analyzed. The reformulation, with the addition of a novel transition probability arithmetic, eliminates basis ambiguity and the collapse postulate, thereby eliminating the subjectivity of measurements from quantum mechanics and resolving the measurement problem completely.