
    Optimality of entropic uncertainty relations

    The entropic uncertainty relation proven by Maassen and Uffink for arbitrary pairs of observables is known to be non-optimal. Here, we call an uncertainty relation optimal if the lower bound can be attained for any value of either of the corresponding uncertainties. In this work we establish optimal uncertainty relations by characterising the optimal lower bound in scenarios similar to the Maassen-Uffink type. We disprove a conjecture by Englert et al. and generalise various previous results. However, we are still far from a complete understanding and, based on numerical investigations and analytical results in small dimensions, we present a number of conjectures.
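    For reference, the Maassen-Uffink relation discussed here has the standard form below (a textbook statement, not an equation reproduced from the paper):

```latex
% Maassen-Uffink entropic uncertainty relation (standard form).
% For observables A and B with eigenbases {|a_i>} and {|b_j>}, the Shannon
% entropies of the two measurement-outcome distributions satisfy
H(A) + H(B) \ge -2 \log c, \qquad c = \max_{i,j} \bigl|\langle a_i | b_j \rangle\bigr|.
```

    The bound depends on the bases only through the maximal overlap c, which is why it generally cannot be attained for every value of H(A) or H(B); that gap is what the paper's notion of optimality addresses.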

    Back to the Future: Economic Self-Organisation and Maximum Entropy Prediction

    This paper shows that signal restoration methodology is appropriate for predicting the equilibrium state of certain economic systems. A formal justification for this is provided by proving the existence of finite improvement paths in object allocation problems under weak assumptions on preferences, linking any initial condition to a Nash equilibrium. Because a finite improvement path is made up of a sequence of systematic best responses, backward movement from the equilibrium to the initial condition can be treated like the realisation of a noise process. This underpins the use of signal restoration to predict the equilibrium from the initial condition, and an illustration is provided through an application of maximum entropy signal restoration to the Schelling model of segregation; a toy sketch of such an improvement path is given below.
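    The backbone of the argument is the finite improvement path: a sequence of best responses linking any initial allocation to a Nash equilibrium. Here is a minimal Python sketch of such a path in a toy object allocation model; the agents, objects and preference rankings are illustrative assumptions, not the paper's setup.

```python
def improvement_path(prefs, allocation, free_objects):
    """Toy best-response dynamics for an object allocation problem.

    prefs[agent] ranks objects from most to least preferred; at each step
    one agent moves to its most preferred strictly better free object.
    Under strict preferences every such path is finite and ends at a Nash
    equilibrium (no agent can improve unilaterally).  Reversing the
    recorded path plays the role of the "noise process" that signal
    restoration then inverts.
    """
    path = [dict(allocation)]
    improved = True
    while improved:
        improved = False
        for agent, current in allocation.items():
            rank = prefs[agent].index
            better = [o for o in free_objects if rank(o) < rank(current)]
            if better:
                choice = min(better, key=rank)  # best response
                free_objects.remove(choice)
                free_objects.append(current)
                allocation[agent] = choice
                path.append(dict(allocation))
                improved = True
    return path

# Hypothetical example: three agents, four objects, one initially free.
prefs = {"a": ["x", "y", "z", "w"],
         "b": ["y", "x", "w", "z"],
         "c": ["z", "w", "x", "y"]}
alloc = {"a": "w", "b": "z", "c": "x"}
print(improvement_path(prefs, alloc, free_objects=["y"]))
```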

    Fusion rules from entanglement

    We derive some of the axioms of the algebraic theory of anyons (Kitaev, 2006) from a conjectured form of the entanglement area law for two-dimensional gapped systems. We derive the fusion rules of topological charges and show that the multiplicities of the fusion rules satisfy these axioms. Moreover, even though we make no assumption about the exact value of the constant sub-leading term of the entanglement entropy of a disk-like region, this term is shown to be equal to ln D, where D is the total quantum dimension of the underlying anyon theory. These derivations are rigorous and follow from the entanglement area law alone. More precisely, our framework starts from two local entropic constraints which are implied by the area law. From these constraints, we prove what we refer to as the “isomorphism theorem.” The existence of superselection sectors and fusion multiplicities follows from this theorem, even without assuming anything about the parent Hamiltonian. These objects and the axioms of the anyon theory are shown to emerge from the structure and the internal self-consistency relations of the information convex sets.
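    The constant sub-leading term referred to in the abstract is the topological entanglement entropy; its standard statement (textbook Kitaev-Preskill/Levin-Wen form, not the paper's notation) is:

```latex
% Entanglement entropy of a disk-like region A with boundary length l
% in a two-dimensional gapped phase:
S(A) = \alpha\, l - \gamma + \cdots, \qquad \gamma = \ln D,
% where the total quantum dimension D collects the quantum dimensions
% d_a of the topological charges a:
D = \sqrt{\sum_a d_a^2}.
```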

    Contextual advantage for state discrimination

    Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum error state discrimination. Namely, we identify quantitative limits on the success probability for minimum error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios, and demonstrate a tight connection between our minimum error state discrimination scenario and a Bell scenario.
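    The quantum benchmark in minimum error state discrimination is the standard Helstrom bound. Below is a minimal numpy sketch of that bound for two qubit states; it illustrates the task being analyzed and is not code from the paper.

```python
import numpy as np

def helstrom_success(rho0, rho1, p=0.5):
    """Optimal quantum success probability for minimum error discrimination
    of rho0 and rho1 with priors p and 1-p:
        P_opt = 1/2 * (1 + || p*rho0 - (1-p)*rho1 ||_1).
    """
    delta = p * rho0 - (1 - p) * rho1
    trace_norm = np.abs(np.linalg.eigvalsh(delta)).sum()
    return 0.5 * (1.0 + trace_norm)

def ket(theta):
    """Pure qubit state cos(theta)|0> + sin(theta)|1> as a density matrix."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

# Two nonorthogonal pure states with overlap cos(2*theta).
theta = np.pi / 8
print(helstrom_success(ket(+theta), ket(-theta)))  # ~0.854
```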

    Quantum entanglement

    All our former experience with the application of quantum theory seems to say: what is predicted by quantum formalism must occur in the laboratory. But the essence of quantum formalism, entanglement, recognized by Einstein, Podolsky, Rosen and Schrödinger, waited over 70 years to enter laboratories as a new resource as real as energy. This holistic property of compound quantum systems, which involves nonclassical correlations between subsystems, is a potential for many quantum processes, including “canonical” ones: quantum cryptography, quantum teleportation and dense coding. However, it appeared that this new resource is very complex and difficult to detect. Being usually fragile to the environment, it is robust against the conceptual and mathematical tools whose task is to decipher its rich structure. This article reviews basic aspects of entanglement, including its characterization, detection, distillation and quantification. In particular, the authors discuss various manifestations of entanglement via Bell inequalities, entropic inequalities, entanglement witnesses and quantum cryptography, and point out some interrelations. They also discuss the basic role of entanglement in quantum communication within the distant-labs paradigm and stress some peculiarities, such as the irreversibility of entanglement manipulations, including its extremal form, the bound entanglement phenomenon. The basic role of entanglement witnesses in the detection of entanglement is emphasized.
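    As a concrete illustration of the kind of entanglement detection the review surveys, here is a minimal numpy sketch of the Peres-Horodecki positive-partial-transpose test for two qubits; this is a generic textbook construction, not code taken from the article.

```python
import numpy as np

def partial_transpose(rho, d=2):
    """Partial transpose on the second subsystem of a (d*d x d*d) state."""
    r = rho.reshape(d, d, d, d)          # axes: i, k, j, l for |ik><jl|
    return r.transpose(0, 3, 2, 1).reshape(d * d, d * d)

def is_entangled_ppt(rho):
    """Peres-Horodecki criterion: for two qubits, a negative eigenvalue of
    the partial transpose is necessary and sufficient for entanglement."""
    return np.linalg.eigvalsh(partial_transpose(rho)).min() < -1e-12

# Singlet state (|01> - |10>)/sqrt(2): maximally entangled.
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)
print(is_entangled_ppt(np.outer(psi, psi)))   # True
print(is_entangled_ppt(np.eye(4) / 4))        # False (maximally mixed)
```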

    Entropic Geometry of Crowd Dynamics


    A decomposition method for global evaluation of Shannon entropy and local estimations of algorithmic complexity

    We investigate the properties of a Block Decomposition Method (BDM), which extends the power of a Coding Theorem Method (CTM) that approximates local estimations of algorithmic complexity based on Solomonoff–Levin’s theory of algorithmic probability. This provides a closer connection to algorithmic complexity than previous attempts based on statistical regularities, such as popular lossless compression schemes. The strategy behind BDM is to find small computer programs that produce the components of a larger, decomposed object. The set of short computer programs can then be artfully arranged in sequence so as to produce the original object. We show that the method provides efficient estimations of algorithmic complexity but that it performs like Shannon entropy when it loses accuracy. We estimate errors and study the behaviour of BDM for different boundary conditions, all of which are compared and assessed in detail. The measure may be adapted for use with objects more multi-dimensional than strings, such as arrays and tensors. To test the measure we demonstrate the power of CTM on low algorithmic-randomness objects that are assigned maximal entropy (e.g., π) but whose numerical approximations are closer to the theoretical low algorithmic-randomness expectation. We also test the measure on larger objects, including dual, isomorphic and cospectral graphs, for which we know that algorithmic randomness is low. We also release implementations of the methods in most major programming languages—Wolfram Language (Mathematica), Matlab, R, Perl, Python, Pascal, C++, and Haskell—and an online algorithmic complexity calculator.
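    The core of BDM is the formula BDM(X) = Σ_i [CTM(r_i) + log₂ n_i], summed over the distinct blocks r_i of X with multiplicities n_i. A minimal Python sketch follows; the block size and the stand-in CTM function are illustrative placeholders, since real CTM values come from precomputed tables of algorithmic probability.

```python
import math
from collections import Counter

def bdm(s, block=4, ctm=None):
    """Block Decomposition Method sketch:
        BDM(X) = sum over distinct blocks r_i of CTM(r_i) + log2(n_i),
    where n_i is the multiplicity of block r_i in X.  `ctm` should map a
    block to its Coding Theorem Method complexity estimate; the default
    below is a placeholder, not real CTM output.
    """
    if ctm is None:
        # Placeholder: pretend constant blocks are simple, others complex.
        ctm = lambda b: 2.0 if len(set(b)) == 1 else 8.0
    blocks = [s[i:i + block] for i in range(0, len(s), block)]
    counts = Counter(blocks)
    return sum(ctm(r) + math.log2(n) for r, n in counts.items())

print(bdm("0000" * 8))          # low estimate: one simple block, high multiplicity
print(bdm("0110100110010110"))  # higher estimate: varied blocks
```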