
    Conjugates, Filters and Quantum Mechanics

    The Jordan structure of finite-dimensional quantum theory is derived, in a conspicuously easy way, from a few simple postulates concerning abstract probabilistic models (each defined by a set of basic measurements and a convex set of states). The key assumption is that each system $A$ can be paired with an isomorphic \textit{conjugate} system $\overline{A}$ by means of a non-signaling bipartite state $\eta_A$ perfectly and uniformly correlating each basic measurement on $A$ with its counterpart on $\overline{A}$. In the case of a quantum-mechanical system associated with a complex Hilbert space $\mathcal{H}$, the conjugate system is that associated with the conjugate Hilbert space $\overline{\mathcal{H}}$, and $\eta_A$ corresponds to the standard maximally entangled EPR state on $\mathcal{H} \otimes \overline{\mathcal{H}}$. A second ingredient is the notion of a \textit{reversible filter}, that is, a probabilistically reversible process that independently attenuates the sensitivity of the detectors associated with a measurement. In addition to offering more flexibility than most existing reconstructions of finite-dimensional quantum theory, the approach taken here has the advantage of not relying on any form of the "no restriction" hypothesis. That is, it is not assumed that arbitrary effects are physically measurable, nor that arbitrary families of physically measurable effects summing to the unit effect represent physically accessible observables. An appendix shows how a version of Hardy's "subspace axiom" can replace several assumptions native to this paper, although at the cost of disallowing superselection rules. Comment: 33 pp. Minor corrections throughout; some revision of Appendix
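
    As an illustrative gloss (ours, not quoted from the paper): in the quantum case the conjugating state can be written explicitly. With $d = \dim \mathcal{H}$ and $\{|i\rangle\}$ an orthonormal basis associated with a basic measurement,
    \[
      \eta_A \;=\; \frac{1}{\sqrt{d}} \sum_{i=1}^{d} |i\rangle \otimes \overline{|i\rangle} \;\in\; \mathcal{H} \otimes \overline{\mathcal{H}},
    \]
    so that performing the basic measurement on $A$ and its counterpart on $\overline{A}$ yields the outcome pair $(i,j)$ with probability $\delta_{ij}/d$: the outcomes agree with certainty and each value occurs with probability $1/d$, exactly the perfect and uniform correlation the postulate requires.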

    Conservation of information and the foundations of quantum mechanics

    We review a recent approach to the foundations of quantum mechanics inspired by quantum information theory. The approach is based on a general framework, which allows one to address a large class of physical theories sharing basic information-theoretic features. We first illustrate two very primitive features, expressed by the axioms of causality and purity-preservation, which are satisfied by both classical and quantum theory. We then discuss the axiom of purification, which expresses a strong version of the Conservation of Information and captures the core of a vast number of protocols in quantum information. Purification is a highly non-classical feature and leads directly to the emergence of entanglement at the purely conceptual level, without any reference to the superposition principle. Supplemented by a few additional requirements, satisfied by classical and quantum theory, it provides a complete axiomatic characterization of quantum theory for finite-dimensional systems. Comment: 11 pages, contribution to the Proceedings of the 3rd International Conference on New Frontiers in Physics, July 28 - August 6, 2014, Orthodox Academy of Crete, Kolymbari, Crete
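
    As a rough gloss in quantum-mechanical notation (ours; the framework itself is theory-independent): purification says that every state $\rho$ of a system $A$ arises as the marginal of some pure state of a composite system $AB$,
    \[
      \rho \;=\; \mathrm{Tr}_B\, |\Psi\rangle\langle\Psi|_{AB},
    \]
    and that any two such purifications on the same ancilla $B$ differ only by a reversible transformation on $B$, i.e. $|\Psi'\rangle = (I_A \otimes U_B)|\Psi\rangle$. The uniqueness clause is what makes the axiom a conservation-of-information statement.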

    Discrete Time Generative-Reactive Probabilistic Processes with Different Advancing Speeds

    We present a process algebra expressing probabilistic external/internal choices, multi-way synchronizations, and processes with different advancing speeds in the context of discrete time, i.e. where time is not continuous but is represented by a sequence of discrete steps as in discrete-time Markov chains (DTMCs). To this end, we introduce a variant of CSP that employs a probabilistic asynchronous parallel operator whose synchronization mechanism is based on a mixture of the classical generative and reactive models of probability. In particular, unlike existing discrete-time process algebras, where parallel processes are executed in synchronous locksteps, the parallel operator that we adopt allows processes with different probabilistic advancing speeds (mean number of actions executed per time unit) to be modeled. Moreover, our generative-reactive synchronization mechanism makes it possible to always derive DTMCs in the case of fully specified systems. We then present a sound and complete axiomatization of probabilistic bisimulation over finite processes of our calculus, which is a smooth extension of the axiom system for a standard process algebra, thus solving the open problem of cleanly axiomatizing action restriction in the generative model. As a further result, we show that, when evaluating steady-state performance measures expressible by attaching rewards to actions, our approach provides an exact solution, even if the advancing speeds are considered not to be probabilistic, without incurring the state-space explosion problem that arises with standard synchronous approaches. We finally present a case study on multi-path routing showing the expressiveness of our calculus and how it makes it particularly easy to produce scalable specifications.
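
    For orientation (our own illustration, not drawn from the paper): once a fully specified system yields a DTMC with transition matrix $P$ and each transition carries a reward, a steady-state reward measure is obtained from the stationary distribution $\pi$,
    \[
      \pi P = \pi, \qquad \sum_{s} \pi(s) = 1, \qquad
      R \;=\; \sum_{s} \pi(s) \sum_{s'} P(s,s')\, r(s,s'),
    \]
    where $r(s,s')$ is the reward attached to the action labelling the transition from $s$ to $s'$.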

    Entropy, majorization and thermodynamics in general probabilistic theories

    In this note we lay some groundwork for the resource theory of thermodynamics in general probabilistic theories (GPTs). We consider theories satisfying a purely convex abstraction of the spectral decomposition of density matrices: that every state has a decomposition, with unique probabilities, into perfectly distinguishable pure states. The spectral entropy, and analogues using other Schur-concave functions, can be defined as the entropy of these probabilities. We describe additional conditions under which the outcome probabilities of a fine-grained measurement are majorized by those for a spectral measurement, so that the "spectral entropy" is the measurement entropy (and therefore concave). These conditions are (1) projectivity, which abstracts aspects of the Lüders-von Neumann projection postulate in quantum theory, in particular that every face of the state space is the positive part of the image of a certain kind of projection operator called a filter; and (2) symmetry of transition probabilities. The conjunction of these, as shown earlier by Araki, is equivalent to a strong geometric property of the unnormalized state cone known as perfection: that there is an inner product according to which every face of the cone, including the cone itself, is self-dual. Using some assumptions about the thermodynamic cost of certain processes that are partially motivated by our postulates, especially projectivity, we extend von Neumann's argument that the thermodynamic entropy of a quantum system is its spectral entropy to generalized probabilistic systems satisfying spectrality. Comment: In Proceedings QPL 2015, arXiv:1511.0118
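
    As a brief gloss in our own notation: spectrality asserts that every state $\omega$ has a decomposition
    \[
      \omega \;=\; \sum_i p_i\, \omega_i, \qquad p_i \ge 0,\quad \sum_i p_i = 1,
    \]
    into perfectly distinguishable pure states $\omega_i$, with the probabilities $p_i$ uniquely determined by $\omega$. The spectral entropy is then $S(\omega) = -\sum_i p_i \log p_i$ (with Schur-concave analogues obtained by replacing the Shannon entropy by another Schur-concave function of the $p_i$); for a quantum density matrix this is just the von Neumann entropy of the eigenvalue spectrum.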

    Fifty years of Hoare's Logic

    We present a history of Hoare's logic. Comment: 79 pages. To appear in Formal Aspects of Computing

    Quantum mechanics as a theory of probability

    We develop and defend the thesis that the Hilbert space formalism of quantum mechanics is a new theory of probability. The theory, like its classical counterpart, consists of an algebra of events and the probability measures defined on it. The construction proceeds in the following steps: (a) Axioms for the algebra of events are introduced following Birkhoff and von Neumann. All axioms, except the one that expresses the uncertainty principle, are shared with the classical event space. The only models for the set of axioms are lattices of subspaces of inner product spaces over a field K. (b) Another axiom, due to Soler, forces K to be the field of real numbers, the complex numbers, or the quaternions. We suggest a probabilistic reading of Soler's axiom. (c) Gleason's theorem fully characterizes the probability measures on the algebra of events, so that Born's rule is derived. (d) Gleason's theorem is equivalent to the existence of a certain finite set of rays, with a particular orthogonality graph (Wondergraph). Consequently, all aspects of quantum probability can be derived from rational probability assignments to finite "quantum gambles". We apply the approach to the analysis of entanglement, Bell inequalities, and the quantum theory of macroscopic objects. We also discuss the relation of the present approach to quantum logic, realism and truth, and the measurement problem. Comment: 37 pages, 3 figures. Forthcoming in a Festschrift for Jeffrey Bub, ed. W. Demopoulos and the author, Springer (Kluwer): University of Western Ontario Series in Philosophy of Science
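
    For orientation (our summary of the standard result invoked in step (c)): Gleason's theorem states that, for a Hilbert space of dimension at least 3, every countably additive probability measure $\mu$ on the lattice of closed subspaces (equivalently, on projections $P$) has the form
    \[
      \mu(P) \;=\; \mathrm{Tr}(\rho P)
    \]
    for a unique density operator $\rho$; this is precisely Born's rule.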

    Markovian Testing Equivalence and Exponentially Timed Internal Actions

    In the theory of testing for Markovian processes developed so far, exponentially timed internal actions are not admitted within processes. When present, these actions cannot be abstracted away, because their execution takes a nonzero amount of time and hence can be observed. On the other hand, they must be carefully taken into account, in order not to equate processes that are distinguishable from a timing viewpoint. In this paper, we recast the definition of Markovian testing equivalence in the framework of a Markovian process calculus including exponentially timed internal actions. Then, we show that the resulting behavioral equivalence is a congruence, has a sound and complete axiomatization, has a modal logic characterization, and can be decided in polynomial time.
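
    Our gloss on why such actions are observable: an exponentially timed internal action with rate $\lambda > 0$ completes within time $t$ with probability
    \[
      \Pr[\,\text{duration} \le t\,] \;=\; 1 - e^{-\lambda t},
    \]
    so its execution takes $1/\lambda$ time units on average. A tester measuring elapsed time can therefore detect its presence even though the action itself is invisible, which is why it cannot simply be abstracted away.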

    On the relation between the second law of thermodynamics and classical and quantum mechanics

    In textbooks on statistical mechanics, one often finds arguments based on classical mechanics, phase space and ergodicity in order to justify the second law of thermodynamics. However, the basic equations of motion of classical mechanics are deterministic and reversible, while the second law of thermodynamics is irreversible and not deterministic, because it states that a system forgets its past when approaching equilibrium. I argue that all "derivations" of the second law of thermodynamics from classical mechanics include additional assumptions that are not part of classical mechanics. The same holds for Boltzmann's H-theorem. Furthermore, I argue that the coarse-graining of phase space that is used when deriving the second law cannot be viewed as an expression of our ignorance of the details of the microscopic state of the system, but reflects the fact that the state of a system is fully specified by using only a finite number of bits, as implied by the concept of entropy, which is related to the number of different microstates that a closed system can have. While quantum mechanics, as described by the Schrödinger equation, puts this latter statement on a firm ground, it cannot explain the irreversibility and stochasticity inherent in the second law. Comment: Invited talk given at the 2012 "March meeting" of the German Physical Society. To appear in: B. Falkenburg and M. Morrison (eds.), Why more is different (Springer Verlag, 2014)
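
    As a reminder (our gloss of the standard relation the abstract alludes to): the Boltzmann entropy ties the thermodynamic entropy to the number $W$ of microstates compatible with the macrostate,
    \[
      S \;=\; k_B \ln W,
    \]
    so the number of bits needed to single out a microstate is $\log_2 W = S/(k_B \ln 2)$, which is finite for a closed system with finite entropy.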