14 research outputs found

    Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems

    It is by now well known that the Boltzmann-Gibbs-von Neumann-Shannon logarithmic entropic functional ($S_{BG}$) is inadequate for wide classes of strongly correlated systems: see for instance Brukner and Zeilinger's 2001 "Conceptual inadequacy of the Shannon information in quantum measurements", among many other systems exhibiting various forms of complexity. On the other hand, the Shannon and Khinchin axioms uniquely mandate the BG form $S_{BG}=-k\sum_i p_i \ln p_i$; the Shore and Johnson axioms follow the same path. Many natural, artificial and social systems have been satisfactorily approached with nonadditive entropies such as $S_q=k\,\frac{1-\sum_i p_i^q}{q-1}$ ($q \in \mathbb{R}$; $S_1=S_{BG}$), the basis of nonextensive statistical mechanics. Consistently, the Shannon (1948) and Khinchin (1953) uniqueness theorems have already been generalized in the literature, by Santos (1997) and Abe (2000) respectively, so as to uniquely mandate $S_q$. We argue here that the same remains to be done with the Shore and Johnson (1980) axioms. We arrive at this conclusion by analyzing specific classes of strongly correlated complex systems that await such a generalization.
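    The nonadditive entropic form quoted above is easy to evaluate directly. The following minimal sketch (illustrative only, not code from the paper; $k=1$ assumed) computes $S_q$ and shows that the $q \to 1$ limit recovers the Boltzmann-Gibbs-Shannon form:

    ```python
    import math

    def tsallis_entropy(p, q, k=1.0):
        """Nonadditive Tsallis entropy S_q = k * (1 - sum_i p_i^q) / (q - 1)."""
        if q == 1.0:
            # q -> 1 limit: the Boltzmann-Gibbs-Shannon form S_BG = -k sum_i p_i ln p_i
            return -k * sum(pi * math.log(pi) for pi in p if pi > 0.0)
        return k * (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

    p = [0.5, 0.25, 0.25]
    s_bg = tsallis_entropy(p, 1.0)     # Shannon/BG entropy of p
    s_q  = tsallis_entropy(p, 1.0001)  # S_q for q just above 1: nearly equal to s_bg
    ```

    For $q \neq 1$ the functional is nonadditive: for independent subsystems $A$ and $B$, $S_q(A+B) = S_q(A) + S_q(B) + \frac{(1-q)}{k} S_q(A) S_q(B)$, which is the property the abstract's "nonextensive" terminology refers to.
    
    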

    Reply to C. Tsallis’ “Conceptual Inadequacy of the Shore and Johnson Axioms for Wide Classes of Complex Systems”

    In a recent PRL (2013, 111, 180604), we invoked the Shore and Johnson axioms, which demonstrate that the least-biased way to infer probability distributions {p_i} from data is to maximize the Boltzmann-Gibbs entropy. We then showed which biases are introduced in models obtained by maximizing nonadditive entropies. A rebuttal of our work appeared in Entropy (2015, 17, 2853), arguing that the Shore and Johnson axioms are inapplicable to a wide class of complex systems. Here we highlight the errors in this reasoning.
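    The maximum-entropy inference invoked here has a standard concrete form: maximizing the Boltzmann-Gibbs entropy subject to a mean-value constraint yields the exponential (Boltzmann) family $p_i \propto e^{-\beta E_i}$. A minimal sketch (assumed discrete "energies" for illustration, not code from either paper) that finds the Lagrange multiplier $\beta$ by bisection:

    ```python
    import math

    def maxent_boltzmann(energies, mean_e, lo=-50.0, hi=50.0, iters=200):
        """Max-entropy distribution p_i ~ exp(-beta*E_i) matching a target mean.

        The mean energy is strictly decreasing in beta, so a simple bisection
        on beta over [lo, hi] converges to the constraint <E> = mean_e.
        """
        def mean_at(beta):
            w = [math.exp(-beta * e) for e in energies]
            z = sum(w)
            return sum(e * wi for e, wi in zip(energies, w)) / z

        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if mean_at(mid) > mean_e:
                lo = mid  # mean too high: need larger beta
            else:
                hi = mid
        beta = 0.5 * (lo + hi)
        w = [math.exp(-beta * e) for e in energies]
        z = sum(w)
        return [wi / z for wi in w], beta
    ```

    For example, with energies [0, 1, 2] and target mean 1.0 (the unconstrained average), the result is the uniform distribution with beta = 0; a lower target mean yields beta > 0 and weight shifted toward low-energy states. The axioms at issue in this exchange concern whether this maximization procedure (with $S_{BG}$ rather than some other functional) is the uniquely consistent inference rule.
    
    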

    Reformulation of quantum mechanics and strong complementarity from Bayesian inference requirements

    This paper provides an epistemic reformulation of quantum mechanics (QM) in terms of the inference consistency requirements of objective Bayesianism, which include the principle of maximum entropy under physical constraints. Physical laws themselves are understood in terms of inference and physical consistency requirements. Strong complementarity - that different observers may "live" in separate Hilbert spaces - follows as a consequence, which resolves the firewall paradox. Other clues pointing to this reformulation are analyzed. The reformulation, with the addition of a novel transition probability arithmetic, resolves the measurement problem completely, thereby eliminating the charge of subjectivity of measurements from quantum mechanics. An illusion of collapse arises from Bayesian updates driven by the observer's continuous outcome data. Spacetime is to be understood in an epistemic sense, rather than as existing independently of an observer, following the spirit of black hole complementarity. Dark matter and dark energy emerge directly as an entropic tug-of-war in the reformulation.
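    The "illusion of collapse" claim rests on ordinary sequential Bayesian conditioning: repeated outcome data concentrates the observer's distribution without any physical discontinuity. A toy sketch (hypothetical likelihoods, not from the paper) of that mechanism:

    ```python
    def bayes_update(prior, likelihoods):
        """One Bayesian update: posterior ∝ prior × likelihood, normalized."""
        post = [p * l for p, l in zip(prior, likelihoods)]
        z = sum(post)
        return [x / z for x in post]

    # Two hypotheses about a measurement outcome. Each datum favors
    # hypothesis 0 with likelihood ratio 0.8 : 0.2; ten such updates
    # drive the observer's distribution sharply toward hypothesis 0,
    # which resembles a "collapse" although only the observer's
    # epistemic state has changed.
    belief = [0.5, 0.5]
    for _ in range(10):
        belief = bayes_update(belief, [0.8, 0.2])
    # posterior now overwhelmingly favors hypothesis 0
    ```
    
    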

    Reformulation of quantum mechanics and strong complementarity from Bayesian inference requirements

    This paper provides an epistemic reformulation of quantum mechanics (QM) in terms of the inference consistency requirements of objective Bayesianism, which include the principle of maximum entropy under physical constraints. Physical constraints themselves are understood in terms of consistency requirements. A by-product of this approach is that QM must additionally be understood as providing a theory of theories. Strong complementarity - that different observers may "live" in separate Hilbert spaces - follows as a consequence, which resolves the firewall paradox. Other clues pointing to this reformulation are analyzed. The reformulation, with the addition of a novel transition probability arithmetic, resolves the measurement problem completely, thereby eliminating the subjectivity of measurements from quantum mechanics. An illusion of collapse arises from Bayesian updates driven by the observer's continuous outcome data. Dark matter and dark energy emerge directly as an entropic tug-of-war in the reformulation.

    Reformulation of quantum mechanics and strong complementarity from Bayesian inference requirements

    This paper provides an epistemic reformulation of quantum mechanics (QM) in terms of the inference consistency requirements of objective Bayesianism, which include the principle of maximum entropy under physical constraints. Physical constraints themselves are understood in terms of consistency requirements. A by-product of this approach is that QM must additionally be understood as providing a theory of theories. Strong complementarity - that different observers may "live" in separate Hilbert spaces - follows as a consequence. The firewall paradox, analyzed by a parallel with Hardy's paradox, is used as an example supporting the necessity of the reformulation and its consequent results. Other clues pointing to this reformulation are analyzed. The reformulation, with the addition of a novel transition probability arithmetic, eliminates basis ambiguity and the collapse postulate, thereby eliminating the subjectivity of measurements from quantum mechanics and resolving the measurement problem completely.
