65 research outputs found

    Contextuality and truth-value assignment

    In the paper, the question of whether truth values can be assigned to propositions before their verification is discussed. To answer this question, the notion of a propositionally noncontextual theory is introduced: a theory that, in order to explain the verification outcomes, provides a map linking each element of a complete lattice identified with a proposition to a truth value. The paper demonstrates that no model obeying such a theory and, at the same time, the principle of bivalence can be consistent with the occurrence of a non-vanishing "two-path" quantum interference term and the quantum collapse postulate. Comment: 7 pages. Version to appear in Quantum Studies: Mathematics and Foundations
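
    As an illustration of the conflict described above, a minimal sketch of a two-path superposition and its interference term is given below in standard notation; the states |A>, |B>, the valuation v, and the lattice L are labels assumed here for illustration, not taken from the paper itself.

        % Illustrative sketch only (notation assumed, not quoted from the paper)
        |\psi\rangle = \tfrac{1}{\sqrt{2}}\bigl(|A\rangle + |B\rangle\bigr), \qquad
        P(x) = \tfrac{1}{2}\,|\langle x|A\rangle|^{2} + \tfrac{1}{2}\,|\langle x|B\rangle|^{2}
               + \operatorname{Re}\,\langle A|x\rangle\langle x|B\rangle
        % A propositionally noncontextual, bivalent theory would have to supply a total
        % valuation v : L -> {0,1} on the complete lattice L of propositions; the paper
        % argues that no such v is consistent with a non-vanishing interference term
        % (the last summand above) together with the collapse postulate.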

    Quantum Supervaluationism

    As is known, neither the classical logical conjunction "and" nor the classical logical alternative "either...or" can replace "+" representing a linear superposition of two quantum states. Therefore, to provide a logical account of the quantum superposition, one must either reconsider the standard interpretation of quantum mechanics (making it fit for classical bivalent logic) or replace the standard logic with a deviant logic suitable for describing the superposition. In the paper, a supervaluation approach to the description of the quantum superposition is considered. In accordance with this approach, the indefinite propositions, which correspond to the superposition states, lack truth-values of any kind, even granting that their compounds (such as the logical alternative "either...or") can have truth-values. As an illustration, the supervaluationist account of the superposition of spin states is presented. Comment: Major revision accepted for publication
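
    A minimal sketch of the supervaluationist reading of a spin superposition, in standard spin-1/2 notation; the proposition labels S_up and S_down are introduced here only for illustration.

        % Illustrative sketch only (notation assumed, not quoted from the paper)
        |{+x}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|{+z}\rangle + |{-z}\rangle\bigr)
        % Let S_up stand for "S_z = +hbar/2" and S_down for "S_z = -hbar/2".
        % On the supervaluationist account, neither S_up nor S_down has a truth value
        % in the state |+x>, yet the disjunction (S_up or S_down) is supertrue, since
        % it comes out true on every admissible bivalent completion of the valuation.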

    The quantum pigeonhole principle as a violation of the principle of bivalence

    In the paper, it is argued that the phenomenon known as the quantum pigeonhole principle (namely, that three quantum particles are put in two boxes, yet no two particles are in the same box) can be explained not as a violation of Dirichlet's box principle in the case of quantum particles but as the invalidity of bivalent logic for describing not-yet-verified propositions relating to quantum mechanical experiments. Comment: This is a pre-print of an article published in Quantum Studies: Mathematics and Foundations. The final authenticated version is available online at: https://doi.org/10.1007/s40509-018-0157-
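
    For orientation, the pre- and post-selection usually behind the quantum pigeonhole effect can be sketched as follows; the particular states and the zero-amplitude calculation are assumed standard background, not quoted from this paper.

        % Sketch of the standard quantum pigeonhole setup (assumed background)
        |+\rangle = \tfrac{1}{\sqrt{2}}\bigl(|L\rangle + |R\rangle\bigr), \qquad
        |{+i}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|L\rangle + i\,|R\rangle\bigr)
        % Pre-select each of the three particles in |+>, post-select each in |+i>.
        % For any pair of particles, the "same box" projector is
        \Pi_{\mathrm{same}} = |LL\rangle\langle LL| + |RR\rangle\langle RR|
        % and the amplitude to find that pair in the same box vanishes:
        \langle {+i}\,{+i}|\,\Pi_{\mathrm{same}}\,|{+}\,{+}\rangle
          = \tfrac{1}{2}\cdot\tfrac{1}{2} + \Bigl(\tfrac{-i}{\sqrt{2}}\Bigr)^{2}\cdot\tfrac{1}{2}
          = \tfrac{1}{4} - \tfrac{1}{4} = 0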

    No-cloning implies unalterability of the past

    A common way of stating the no-cloning theorem -- one of the distinguishing characteristics of quantum theory -- is that one cannot make a copy of an arbitrary unknown quantum state. Even though this theorem is an important part of the ongoing discussion of the nature of a quantum state, the role of the theorem in the logical-algebraic approach to quantum theory has not yet been systematically studied. According to the standard point of view (which is in line with the logical tradition), quantum cloning amounts to two classical rules of inference, namely, monotonicity and idempotency of entailment. One can conclude then that the whole of quantum theory should be described through a logic in which these rules do not hold, namely, linear logic. However, in accordance with a supervaluational semantics (which allows one to retain all the theorems of classical logic while admitting `truth-value gaps'), quantum cloning necessitates the permanent loss of the truth values of experimental quantum propositions, which violates the unalterability of the past. The present paper demonstrates this. Comment: 9 pages
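
    For reference, the two structural rules mentioned above, written in sequent form (a standard presentation, not quoted from the paper); linear logic is obtained by rejecting them.

        % Monotonicity of entailment (weakening) and idempotency of entailment (contraction):
        \frac{\Gamma \vdash B}{\Gamma,\, A \vdash B}\ (\text{weakening})
        \qquad
        \frac{\Gamma,\, A,\, A \vdash B}{\Gamma,\, A \vdash B}\ (\text{contraction})
        % Linear logic drops both rules, so that premises behave like resources that
        % cannot be freely duplicated or discarded.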

    Algebraic structures identified with bivalent and non-bivalent semantics of experimental quantum propositions

    The failure of distributivity in quantum logic is motivated by the principle of quantum superposition. However, this principle can be encoded differently, i.e., in different logico-algebraic objects. As a result, the logic of experimental quantum propositions might have various semantics: a total semantics, a partial semantics (in which the valuation relation -- i.e., a mapping from the set of atomic propositions to the set of two objects, 1 and 0 -- is not total), or a many-valued semantics (in which the gap between 1 and 0 is filled with truth degrees). Consequently, the closed linear subspaces of the Hilbert space representing experimental quantum propositions may be organized differently. For instance, they could be organized in the structure of a Hilbert lattice (or its generalizations) identified with the bivalent semantics of quantum logic, or in a structure identified with a non-bivalent semantics. On the other hand, one can only verify -- at the same time -- propositions represented by closed linear subspaces corresponding to mutually commuting projection operators. This implies that it is not possible to decide experimentally which semantics is proper, bivalent or non-bivalent. Nevertheless, the latter allows simplification of certain no-go theorems in the foundations of quantum mechanics. In the present paper, the Kochen-Specker theorem, which asserts the impossibility of interpreting, within the orthodox quantum formalism, projection operators as definite {0,1}-valued (pre-existent) properties, is taken as an example. The paper demonstrates that within the algebraic structure identified with supervaluationism (a form of partial, non-bivalent semantics), the statement of this theorem is deduced trivially. Comment: This is a pre-print of an article published in Quantum Studies: Mathematics and Foundations. The final authenticated version is available online at: https://doi.org/10.1007/s40509-019-00212-
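
    For reference, the kind of valuation that the Kochen-Specker theorem rules out can be stated as follows (a standard formulation, assumed here rather than quoted from the paper).

        % A definite {0,1}-valued assignment on projection operators would have to satisfy,
        % for every resolution of the identity by mutually orthogonal projectors:
        v(\hat{P}_k) \in \{0,1\}, \qquad
        \sum_{k} v(\hat{P}_k) = 1
        \quad\text{whenever}\quad
        \sum_{k} \hat{P}_k = \hat{1},\ \ \hat{P}_j \hat{P}_k = 0\ (j \neq k)
        % The Kochen-Specker theorem: no such total assignment exists when dim H >= 3.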

    An empiric logic approach to Einstein's version of the double-slit experiment

    As per Einstein's design, particles are introduced into the double-slit experiment through a small hole in a plate which can either move up and down (so that its momentum can be measured) or be stopped (so that its position can be measured). Suppose one measures the position of the plate and this act verifies the statement that the interference pattern is observed in the experiment. However, if it is possible to think about the outcome that one would have obtained had one measured the plate's momentum instead of its position, then it is possible to consider, together with the aforesaid statement, another statement, namely, that each particle passes through one or the other slit of the double-slit screen. Hence, the proposition affirming the wave-like behavior and the proposition affirming the particle-like behavior might be true together, which would imply that Bohr's complementarity principle is incorrect. This paper presents an analysis of Einstein's design, and of ways to refute it, based on an approach that uses exclusively assignments of truth values to experimental propositions. Comment: 12 pages, 1 figure
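
    The tension described above can be compressed into two propositions; the labels W and P are introduced here only for illustration.

        % W : "an interference pattern is observed"                 (wave-like behavior)
        % P : "each particle passes through one slit or the other"  (particle-like behavior)
        % Counterfactual reasoning about the unperformed momentum measurement would
        % license asserting the conjunction
        W \wedge P
        % whereas Bohr's complementarity requires that W and P never be jointly true.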

    Any realistic model of a physical system must be computationally realistic

    It is argued that any possible definition of a realistic physics theory -- i.e., a mathematical model representing the real world -- cannot be considered comprehensive unless it is supplemented with the requirement of being computationally realistic. That is, the mathematical structure of a realistic model of a physical system must allow all possible measurement outcomes to be computed, from the collection of the system's physical quantities, on some computational device, not only in an unambiguous way but also in a reasonable amount of time. In the paper, it is shown that a deterministic quantum model of a microscopic system evolving in isolation should be regarded as realistic: even though finding the exact solution to the Schrodinger equation for an arbitrary physical system is an NP-hard problem, it can certainly be solved in a reasonable amount of time when the system has just a small number of degrees of freedom. In contrast, the deterministic quantum model of a truly macroscopic object ought to be considered non-realistic, since in a world of limited computational resources an intractable problem involving such an enormous number of degrees of freedom would amount to an unsolvable one. Comment: 5 pages, replaces the earlier attempt arXiv:1401.1747 and answers some critiques of arXiv:1403.768
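
    A rough illustration of why the exact quantum model stops being computationally realistic as the number of degrees of freedom grows: the Python sketch below only estimates the memory needed to store an exact state vector, under the assumptions (ours, not the paper's) that each degree of freedom is a two-level system and each complex amplitude occupies 16 bytes.

        # Illustrative sketch only; the two-level and 16-bytes-per-amplitude assumptions are ours.
        def state_vector_bytes(num_degrees_of_freedom: int) -> float:
            """Memory (in bytes) for an exact state vector of n two-level degrees of freedom."""
            dimension = 2.0 ** num_degrees_of_freedom   # Hilbert-space dimension grows exponentially
            return 16.0 * dimension                     # 16 bytes per complex amplitude

        for n in (10, 50, 300):                         # microscopic vs. truly macroscopic
            print(f"n = {n:>3}: ~{state_vector_bytes(n):.3e} bytes")
        # n =  10 -> ~1.6e+04 bytes  (trivial)
        # n =  50 -> ~1.8e+16 bytes  (tens of petabytes)
        # n = 300 -> ~3.3e+91 bytes  (beyond any physically available resources)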

    Can category-theoretic semantics resolve the problem of the interpretation of the quantum state vector?

    Do the correctness and completeness of quantum mechanics jointly imply that quantum state vectors are necessarily in one-to-one correspondence with elements of the physical reality? In terms of category theory, such a correspondence would stand for an isomorphism, so the problem of the status of the quantum state vector can be turned into the question of whether state vectors are necessarily isomorphic to elements of reality. As argued in the present paper, in order to tackle this question one needs to complement the category-theoretic approach to quantum mechanics with computational-complexity-theoretic considerations. Based on such considerations, it is demonstrated in the paper that the hypothesis of an isomorphism existing between state vectors and elements of reality is expected to be unsuitable for a generic quantum system. Comment: 10 pages
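
    For reference, the categorical notion of isomorphism invoked above (a textbook definition, not a quotation from the paper).

        % An arrow f : A -> B in a category is an isomorphism if there is an arrow g : B -> A with
        g \circ f = \mathrm{id}_{A}, \qquad f \circ g = \mathrm{id}_{B},
        % in which case the objects A and B are isomorphic, written A \cong B.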

    Constructibility of the universal wave function

    This paper focuses on a constructive treatment of the mathematical formalism of quantum theory and on a possible role of constructivist philosophy in resolving the foundational problems of quantum mechanics, particularly the controversy over the meaning of the wave function of the universe. As is demonstrated in the paper, unless the number of the universe's degrees of freedom is fundamentally bounded from above (owing to some unknown physical laws) or hypercomputation is physically realizable, the universal wave function is a non-constructive entity in the sense of constructive recursive mathematics. This means that even if such a function were to exist, basic mathematical operations on it would be undefinable, and consequently the only content one would be able to deduce from this function would be purely symbolic. Comment: 17 pages
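
    As background for the sense of 'constructive' used above, the standard gloss from constructive recursive mathematics can be put as follows (our paraphrase, not a quotation from the paper).

        % A real number x is constructive if some algorithm produces rational
        % approximations together with a modulus of convergence:
        \exists\ \text{algorithm } n \mapsto q_n \in \mathbb{Q}
        \quad\text{such that}\quad
        |x - q_n| \le 2^{-n}\ \text{for all } n
        % A constructive universal wave function would likewise require an algorithm
        % computing its amplitudes to any prescribed accuracy, which is what the paper
        % argues cannot exist without a fundamental bound on the degrees of freedom
        % or physically realizable hypercomputation.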

    Contextuality and the fundamental theorem of noncommutative algebra

    In the paper, it is shown that the Kochen-Specker theorem follows from Burnside's theorem on noncommutative algebras. Accordingly, contextuality (understood as the impossibility of assigning binary values to projection operators independently of their contexts) is merely an inference from Burnside's fundamental theorem of the algebra of linear transformations on a Hilbert space of finite dimension. Comment: 10 pages
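
    For reference, a standard statement of Burnside's theorem alluded to above (not quoted from the paper).

        % Burnside's theorem: let V be a finite-dimensional complex vector space and
        % \mathcal{A} a nonzero subalgebra of \mathcal{L}(V) whose only invariant
        % subspaces are \{0\} and V (i.e., \mathcal{A} acts irreducibly). Then
        \mathcal{A} = \mathcal{L}(V)
        % i.e., the algebra must be the full algebra of linear transformations on V.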