
    Pre- and Post-selection paradoxes and contextuality in quantum mechanics

    Many seemingly paradoxical effects are known in the predictions for outcomes of intermediate measurements made on pre- and post-selected quantum systems. Despite appearances, these effects do not demonstrate the impossibility of a noncontextual hidden variable theory, since an explanation in terms of measurement-disturbance is possible. Nonetheless, we show that for every paradoxical effect wherein all the pre- and post-selected probabilities are 0 or 1 and the pre- and post-selected states are nonorthogonal, there is an associated proof of contextuality. This proof is obtained by considering all the measurements involved in the paradoxical effect -- the pre-selection, the post-selection, and the alternative possible intermediate measurements -- as alternative possible measurements at a single time.
    Comment: 5 pages, 1 figure. Submitted to Phys. Rev. Lett. v2.0 revised in the light of referee comments, results unchanged
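    A minimal numerical sketch of the kind of pre- and post-selection effect discussed here, using the well-known three-box paradox and the Aharonov-Bernstein-Lebowitz (ABL) rule; the choice of states and intermediate measurements below is ours, for illustration, and is not drawn from the paper itself.

```python
import numpy as np

# Three-box paradox: pre- and post-selected states (our illustrative choice).
psi = np.array([1, 1, 1]) / np.sqrt(3)   # pre-selection
phi = np.array([1, 1, -1]) / np.sqrt(3)  # post-selection, nonorthogonal to psi

def abl_probability(projector, pre, post):
    """ABL probability of the outcome `projector` for an intermediate projective
    measurement {projector, 1 - projector}, given pre- and post-selection."""
    complement = np.eye(len(pre)) - projector
    num = abs(post.conj() @ projector @ pre) ** 2
    den = num + abs(post.conj() @ complement @ pre) ** 2
    return num / den

P1 = np.diag([1, 0, 0])  # "ball is in box 1"
P2 = np.diag([0, 1, 0])  # "ball is in box 2"

print(abl_probability(P1, psi, phi))  # 1.0: certain to be found in box 1 if box 1 is opened
print(abl_probability(P2, psi, phi))  # 1.0: certain to be found in box 2 if box 2 is opened
```

    Both intermediate probabilities come out as 1 while the pre- and post-selected states have overlap 1/3, so this is an instance of the "probabilities all 0 or 1 with nonorthogonal pre- and post-selection" structure to which the abstract associates a proof of contextuality.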

    Why interference phenomena do not capture the essence of quantum theory

    Quantum interference phenomena are widely viewed as posing a challenge to the classical worldview. Feynman even went so far as to proclaim that they are the only mystery and the basic peculiarity of quantum mechanics. Many have also argued that such phenomena force us to accept a number of radical interpretational conclusions, including: that a photon is neither a particle nor a wave but rather a Jekyll-and-Hyde sort of entity that toggles between the two possibilities, that reality is observer-dependent, and that systems either do not have properties prior to measurements or else have properties that are subject to nonlocal or backwards-in-time causal influences. In this work, we show that such conclusions are not, in fact, forced on us by the phenomena. We do so by describing an alternative to quantum theory, a statistical theory of a classical discrete field (the 'toy field theory') that reproduces the relevant phenomenology of quantum interference while rejecting these radical interpretational claims. It also reproduces a number of related interference experiments that are thought to support these interpretational claims, such as the Elitzur-Vaidman bomb tester, Wheeler's delayed-choice experiment, and the quantum eraser experiment. The systems in the toy field theory are field modes, each of which possesses, at all times, both a particle-like property (a discrete occupation number) and a wave-like property (a discrete phase). Although these two properties are jointly possessed, the theory stipulates that they cannot be jointly known. The phenomenology that is generally cited in favour of nonlocal or backwards-in-time causal influences ends up being explained in terms of inferences about distant or past systems, and all that is observer-dependent is the observer's knowledge of reality, not reality itself.
    Comment: In this updated version we have added appendices elaborating on a few points. Primarily, we discuss how one can describe our toy field theory in terms of spatially localized modes, in a manner analogous to a cellular automaton. Comments welcome. 47 pages, 11 figures
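    A schematic rendering, under our own simplifying assumptions, of the abstract's central ontological claim: each mode always possesses both a discrete occupation number and a discrete phase, but an agent's maximal knowledge never fixes both. The four-valued phase and the "know one property, be ignorant of the other" rule below are illustrative placeholders, not the actual axioms of the toy field theory.

```python
import random
from dataclasses import dataclass

@dataclass
class ModeOnticState:
    occupation: int  # particle-like property, e.g. 0 or 1
    phase: int       # wave-like property, e.g. 0..3 in units of pi/2

@dataclass
class ModeEpistemicState:
    """Maximal-knowledge state: exactly one property is known sharply,
    the other is completely unknown (our stand-in for the epistemic restriction)."""
    known: str   # 'occupation' or 'phase'
    value: int

    def sample_ontic(self) -> ModeOnticState:
        occ = self.value if self.known == 'occupation' else random.randint(0, 1)
        ph = self.value if self.known == 'phase' else random.randint(0, 3)
        return ModeOnticState(occ, ph)

# An agent who knows the occupation number is maximally ignorant of the phase:
state = ModeEpistemicState(known='occupation', value=1)
print([state.sample_ontic() for _ in range(5)])  # occupation always 1; phase varies
```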

    What aspects of the phenomenology of interference witness nonclassicality?

    Interference phenomena are often claimed to resist classical explanation. However, such claims are undermined by the fact that the specific aspects of the phenomenology upon which they are based can in fact be reproduced in a noncontextual ontological model [Catani et al. arXiv:2111.13727]. This raises the question of what other aspects of the phenomenology of interference do in fact resist classical explanation. We answer this question by demonstrating that the most basic quantum wave-particle duality relation, which expresses the precise trade-off between path distinguishability and fringe visibility, cannot be reproduced in any noncontextual model. We do this by showing that it is a specific type of uncertainty relation, and then leveraging a recent result establishing that noncontextuality restricts the functional form of this uncertainty relation [Catani et al. arXiv:2207.11779]. Finally, we discuss what sorts of interferometric experiment can demonstrate contextuality via the wave-particle duality relation.
    Comment: 12 pages, 5 figures

    What is Nonclassical about Uncertainty Relations?

    Uncertainty relations express limits on the extent to which the outcomes of distinct measurements on a single state can be made jointly predictable. The existence of nontrivial uncertainty relations in quantum theory is generally considered to be a way in which it entails a departure from the classical worldview. However, this perspective is undermined by the fact that there exist operational theories which exhibit nontrivial uncertainty relations but which are consistent with the classical worldview insofar as they admit of a generalized-noncontextual ontological model. This prompts the question of what aspects of uncertainty relations, if any, cannot be realized in this way and so constitute evidence of genuine nonclassicality. We here consider uncertainty relations describing the tradeoff between the predictability of a pair of binary-outcome measurements (e.g., measurements of Pauli X and Pauli Z observables in quantum theory). We show that, for a class of theories satisfying a particular symmetry property, the functional form of this predictability tradeoff is constrained by noncontextuality to be below a linear curve. Because qubit quantum theory has the relevant symmetry property, the fact that its predictability tradeoff describes a section of a circle is a violation of this noncontextual bound, and therefore constitutes an example of how the functional form of an uncertainty relation can witness contextuality. We also deduce the implications for a selected group of operational foils to quantum theory and consider the generalization to three measurements.
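    A small numerical check, under the usual qubit conventions, of the tradeoff this abstract describes: for pure qubit states in the X-Z plane, the pair of predictabilities of the Pauli Z and Pauli X measurements traces a quarter circle, which lies above a linear tradeoff. The linear curve used below is our paraphrase of the noncontextual bound, offered only for illustration; read with Z-predictability as path distinguishability and X-predictability as fringe visibility, the same numbers also illustrate the wave-particle duality relation of the preceding entry.

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def predictability(obs, state):
    """Predictability of a binary-outcome measurement of `obs` on `state`:
    2*max(p_plus, p_minus) - 1, which equals |<obs>| for a +/-1-valued observable."""
    return abs(np.real(state.conj() @ obs @ state))

for theta in np.linspace(0, np.pi / 2, 7):
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)
    pz, px = predictability(Z, state), predictability(X, state)
    print(f"theta={theta:4.2f}  P_Z={pz:.3f}  P_X={px:.3f}  "
          f"P_Z^2+P_X^2={pz**2 + px**2:.3f}  P_Z+P_X={pz + px:.3f}")
# P_Z^2 + P_X^2 stays at 1 (a section of a circle), while P_Z + P_X exceeds 1
# for intermediate theta, i.e. it lies above the linear tradeoff.
```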

    Picturing classical and quantum Bayesian inference

    We introduce a graphical framework for Bayesian inference that is sufficiently general to accommodate not just the standard case but also recent proposals for a theory of quantum Bayesian inference wherein one considers density operators rather than probability distributions as representative of degrees of belief. The diagrammatic framework is stated in the graphical language of symmetric monoidal categories and of compact structures and Frobenius structures therein, in which Bayesian inversion boils down to transposition with respect to an appropriate compact structure. We characterize classical Bayesian inference in terms of a graphical property and demonstrate that our approach eliminates some purely conventional elements that appear in common representations thereof, such as whether degrees of belief are represented by probabilities or entropic quantities. We also introduce a quantum-like calculus wherein the Frobenius structure is noncommutative and show that it can accommodate Leifer's calculus of 'conditional density operators'. The notion of conditional independence is also generalized to our graphical setting and we make some preliminary connections to the theory of Bayesian networks. Finally, we demonstrate how to construct a graphical Bayesian calculus within any dagger compact category.
    Comment: 38 pages, lots of pictures
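    In concrete matrix terms, the "Bayesian inversion as transposition" idea can be illustrated for the classical case as follows; the specific numbers are arbitrary, and the matrix manipulation is our plain-probability rendering of the construction, not the categorical formalism itself.

```python
import numpy as np

# Prior over X and a channel (likelihood) P(Y|X), columns indexed by x.
prior = np.array([0.3, 0.7])                       # p(x)
channel = np.array([[0.9, 0.2],                    # p(y|x), rows y, cols x
                    [0.1, 0.8]])

joint = channel * prior                            # p(y, x) = p(y|x) p(x)
marginal_y = joint.sum(axis=1)                     # p(y)

# Bayesian inversion: transpose the joint and renormalise by p(y).
inverse = joint.T / marginal_y                     # p(x|y), rows x, cols y

print(inverse)
print(inverse.sum(axis=0))                         # each column sums to 1
# Consistency check: the inverse channel applied to p(y) recovers the prior p(x).
print(inverse @ marginal_y, prior)
```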

    Entropy and Information Causality in General Probabilistic Theories

    We investigate the concept of entropy in probabilistic theories more general than quantum mechanics, with particular reference to the notion of information causality (IC) recently proposed by Pawlowski et al (2009 arXiv:0905.2292). We consider two entropic quantities, which we term measurement and mixing entropy. In the context of classical and quantum theory, these coincide, being given by the Shannon and von Neumann entropies, respectively; in general, however, they are very different. In particular, while measurement entropy is easily seen to be concave, mixing entropy need not be. In fact, as we show, mixing entropy is not concave whenever the state space is a non-simplicial polytope. Thus, the condition that measurement and mixing entropies coincide is a strong constraint on possible theories. We call theories with this property monoentropic. Measurement entropy is subadditive, but not in general strongly subadditive. Equivalently, if we define the mutual information between two systems A and B by the usual formula I(A:B) = H(A) + H(B) − H(AB), where H denotes the measurement entropy and AB is a non-signaling composite of A and B, then it can happen that I(A:BC) < I(A:B). This is relevant to IC in the sense of Pawlowski et al: we show that any monoentropic non-signaling theory in which measurement entropy is strongly subadditive, and also satisfies a version of the Holevo bound, is informationally causal, and on the other hand we observe that Popescu–Rohrlich boxes, which violate IC, also violate strong subadditivity. We also explore the interplay between measurement and mixing entropy and various natural conditions on theories that arise in quantum axiomatics.
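    To make the non-concavity claim concrete, here is a small numerical sketch for the "square bit" (gbit), the simplest non-simplicial polytopic state space; mixing entropy is computed by minimizing the Shannon entropy of the weights over all decompositions into the four pure states. The example and the numeric search are ours, offered only to illustrate the quantity the abstract defines.

```python
import numpy as np

VERTICES = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)  # pure states

def shannon(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

def mixing_entropy(omega, grid=2001):
    """Minimum Shannon entropy over convex decompositions of `omega`
    into the square bit's pure states (the vertices)."""
    # Equality constraints: weights sum to 1 and average to omega.
    A = np.vstack([np.ones(4), VERTICES.T])                # 3 x 4
    b = np.concatenate([[1.0], omega])
    p_particular = np.linalg.lstsq(A, b, rcond=None)[0]    # one exact solution
    null = np.linalg.svd(A)[2][-1]                         # 1-dim null space of A
    best = np.inf
    for t in np.linspace(-1.0, 1.0, grid):                 # scan the feasible line
        p = p_particular + t * null
        if np.all(p >= -1e-9):
            best = min(best, shannon(np.clip(p, 0, 1)))
    return best

edge_a, edge_b = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # two edge midpoints
mixture = 0.5 * edge_a + 0.5 * edge_b
print(mixing_entropy(edge_a), mixing_entropy(edge_b))   # both 1.000 bit
print(mixing_entropy(mixture))                          # ~0.811 bit < 1
```

    The mixing entropy of the equal mixture of the two edge midpoints (about 0.81 bits) falls below the average of their mixing entropies (1 bit), exhibiting the failure of concavity for a non-simplicial polytope.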

    The lesson of causal discovery algorithms for quantum correlations: Causal explanations of Bell-inequality violations require fine-tuning

    An active area of research in the fields of machine learning and statistics is the development of causal discovery algorithms, the purpose of which is to infer the causal relations that hold among a set of variables from the correlations that these exhibit. We apply some of these algorithms to the correlations that arise for entangled quantum systems. We show that they cannot distinguish correlations that satisfy Bell inequalities from correlations that violate Bell inequalities, and consequently that they cannot do justice to the challenges of explaining certain quantum correlations causally. Nonetheless, by adapting the conceptual tools of causal inference, we can show that any attempt to provide a causal explanation of nonsignalling correlations that violate a Bell inequality must contradict a core principle of these algorithms, namely, that an observed statistical independence between variables should not be explained by fine-tuning of the causal parameters. In particular, we demonstrate the need for such fine-tuning for most of the causal mechanisms that have been proposed to underlie Bell correlations, including superluminal causal influences, superdeterminism (that is, a denial of freedom of choice of settings), and retrocausal influences which do not introduce causal cycles.
    Comment: 29 pages, 28 figs. New in v2: a section presenting in detail our characterization of Bell's theorem as a contradiction arising from (i) the framework of causal models, (ii) the principle of no fine-tuning, and (iii) certain operational features of quantum theory; a section explaining why a denial of hidden variables affords even fewer opportunities for causal explanations of quantum correlations.
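    As a toy illustration of why correlation data alone cannot distinguish the two cases, here is a sketch that samples a Bell-violating PR-box-style distribution and a Bell-satisfying shared-randomness distribution and estimates the independences a constraint-based discovery algorithm would test; the sampling models and the use of empirical mutual information as the independence measure are our own illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

def sample_local(n):
    """Shared coin: both outcomes equal a common hidden bit, independent of settings."""
    x, y = rng.integers(0, 2, n), rng.integers(0, 2, n)
    lam = rng.integers(0, 2, n)
    return x, y, lam.copy(), lam.copy()

def sample_pr_box(n):
    """PR-box statistics: a XOR b = x AND y, with uniform marginals."""
    x, y = rng.integers(0, 2, n), rng.integers(0, 2, n)
    a = rng.integers(0, 2, n)
    return x, y, a, a ^ (x & y)

def mutual_information(u, v):
    """Empirical mutual information (bits) between two binary arrays."""
    mi = 0.0
    for i in (0, 1):
        for j in (0, 1):
            pij = np.mean((u == i) & (v == j))
            pi, pj = np.mean(u == i), np.mean(v == j)
            if pij > 0:
                mi += pij * np.log2(pij / (pi * pj))
    return mi

for name, sampler in [("local model", sample_local), ("PR box", sample_pr_box)]:
    x, y, a, b = sampler(N)
    # No-signalling independences that hold (to sampling error) in both datasets:
    print(name, "I(A;Y) =", round(mutual_information(a, y), 4),
                "I(B;X) =", round(mutual_information(b, x), 4),
                "I(A;B) =", round(mutual_information(a, b), 4))
    def corr(sx, sy):
        m = (x == sx) & (y == sy)
        return np.mean((-1.0) ** (a[m] + b[m]))
    chsh = corr(0, 0) + corr(0, 1) + corr(1, 0) - corr(1, 1)
    print("   CHSH =", round(chsh, 3))   # ~2 for the local model, ~4 for the PR box
```

    Both datasets display the same qualitative pattern of dependences and independences among the observed variables, which is all a constraint-based discovery algorithm sees, yet only the second violates the CHSH inequality; explaining that violation causally without fine-tuning is precisely what the paper argues is impossible.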

    Statins and Exercise Training Response in Heart Failure Patients: Insights From HF-ACTION.

    OBJECTIVES: The aim of this study was to assess for a treatment interaction between statin use and exercise training (ET) response. BACKGROUND: Recent data suggest that statins may attenuate ET response, but limited data exist in patients with heart failure (HF). METHODS: HF-ACTION (Heart Failure: A Controlled Trial Investigating Outcomes of Exercise Training) was a randomized trial of 2,331 patients with chronic HF with ejection fraction ≤35% who were randomized to usual care with or without ET. We evaluated whether there was a treatment interaction between statins and ET response for the change in quality of life and aerobic capacity (peak oxygen consumption and 6-min walk distance) from baseline to 3 months. We also assessed for a treatment interaction among atorvastatin, simvastatin, and pravastatin and change in these endpoints with ET. Multiple linear regression analyses were performed for each endpoint, adjusting for baseline covariates. RESULTS: Of 2,331 patients in the HF-ACTION trial, 1,353 (58%) were prescribed statins at baseline. Patients treated with statins were more likely to be older men with ischemic HF etiology but had similar use of renin-angiotensin system blockers and beta-blockers. There was no evidence of a treatment interaction between statin use and ET on changes in quality of life or exercise capacity, nor was there evidence of differential association between statin type and ET response for these endpoints (all p values > 0.05). CONCLUSIONS: In a large chronic HF cohort, there was no evidence of a treatment interaction between statin use and short-term change in aerobic capacity and quality of life with ET. These findings contrast with recent reports of an attenuation in ET response with statins in a different population, highlighting the need for future prospective studies. (Exercise Training Program to Improve Clinical Outcomes in Individuals With Congestive Heart Failure; NCT00047437)
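    For readers unfamiliar with the statistical test described in the methods, here is a generic sketch of how a treatment interaction between statin use and exercise training can be assessed in a multiple linear regression; the data below are simulated and purely hypothetical, not HF-ACTION data, and the covariates are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500

# Hypothetical cohort: change in peak VO2 (ml/kg/min) from baseline to 3 months.
df = pd.DataFrame({
    "statin": rng.integers(0, 2, n),
    "et": rng.integers(0, 2, n),          # randomized to exercise training vs usual care
    "age": rng.normal(60, 10, n),
})
df["delta_peak_vo2"] = (0.8 * df["et"] - 0.02 * (df["age"] - 60)
                        + rng.normal(0, 1.5, n))   # no true statin-by-ET interaction

# The 'statin:et' term tests whether the ET effect differs by statin use.
model = smf.ols("delta_peak_vo2 ~ statin * et + age", data=df).fit()
print(model.summary().tables[1])
print("interaction p-value:", round(model.pvalues["statin:et"], 3))
```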

    Liquefaction Susceptibility: Proposed New York City Building Code Revision

    A simplified procedure is presented for evaluating liquefaction susceptibility of cohesionless saturated soils based on available technology. In 2001, a Committee of engineers working in the New York City (NYC) area was formed under the direction of the first author to review the liquefaction aspects of the 1995 New York City Building Code. The purpose was to gain consensus on a possible revision and augmentation of the existing regulations as part of the ongoing Code review by the Structural Engineers Association of New York (SEAoNY). This article summarizes the recommendations of the Committee, as compiled in 2002. The following topics are reviewed: (a) history of the current code; (b) seismicity and design motions in NYC; (c) updated screening criteria for liquefaction susceptibility. With reference to the topic in (c), recommendations are developed for Code language pertaining to: (1) method of analysis; (2) site classification schemes; (3) design considerations for bearing capacity and displacements of foundations in liquefied soil; (4) maximum depth of liquefaction; (5) field methods to evaluate soil resistance; (6) parameters to be considered in analyses; (7) treatment of sloped strata. Analytical results for typical NYC profiles subjected to 500-year rock motions are presented. Based on these results, the Committee proposed a revised liquefaction screening diagram.
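    To indicate the general shape of a "simplified procedure" of this kind, here is a schematic cyclic-stress-ratio screening calculation in the spirit of the Seed-Idriss method; the soil profile, peak ground acceleration, stress-reduction coefficient, and cyclic resistance values are illustrative assumptions only and are not the Committee's recommended parameters or design motions.

```python
# Schematic screening: CSR = 0.65 * (a_max/g) * (sigma_v / sigma'_v) * r_d
GAMMA_SOIL = 19.0     # kN/m^3, assumed total unit weight
GAMMA_WATER = 9.81    # kN/m^3
WATER_TABLE = 2.0     # m, assumed depth to groundwater
A_MAX_G = 0.30        # assumed peak ground acceleration / g

def stress_reduction(z):
    """Depth-dependent stress reduction coefficient r_d (Liao-Whitman form)."""
    return 1.0 - 0.00765 * z if z <= 9.15 else 1.174 - 0.0267 * z

def cyclic_stress_ratio(z):
    sigma_v = GAMMA_SOIL * z                                   # total vertical stress
    pore_pressure = GAMMA_WATER * max(z - WATER_TABLE, 0.0)
    sigma_v_eff = sigma_v - pore_pressure                      # effective vertical stress
    return 0.65 * A_MAX_G * (sigma_v / sigma_v_eff) * stress_reduction(z)

for depth, crr in [(4.0, 0.18), (8.0, 0.22), (12.0, 0.30)]:    # illustrative CRR values
    csr = cyclic_stress_ratio(depth)
    print(f"z = {depth:4.1f} m  CSR = {csr:.3f}  FS = CRR/CSR = {crr / csr:.2f}")
```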

    User, Use & Utility Research - The Digital User as New Design Perspective in Business and Information Systems Engineering

    Business and Information Systems Engineering (BISE) is at a turning point. Planning, designing, developing and operating IT used to be a management task of a few elites in public administrations and corporations. But the continuous digitization of nearly all areas of life changes the IT landscape fundamentally. Success in this new era requires putting the human perspective – the digital user – at the very heart of the new digitized service-led economy. BISE faces not just a temporary trend but a complex socio-technical phenomenon with far-reaching implications. The challenges are manifold and have major consequences for all stakeholders, both in information systems and management research as well as in practice. Corporate processes have to be re-designed from the ground up, starting with the user's perspective, thus putting usage experience and utility of the individual center stage. The digital service economy leads to highly personalized application systems while organizational functions are being fragmented. Entirely new ways of interacting with information systems, in particular beyond desktop IT, are being invented and established. These fundamental challenges require novel approaches with regards to innovation and development methods as well as adequate concepts for enterprise or service system architectures. Gigantic amounts of data are being generated at an accelerating rate by an increasing number of devices – data that need to be managed. In order to tackle these extraordinary challenges we introduce 'user, use & utility' as a new field of BISE that focuses primarily on the digital user, his or her usage behavior and the utility associated with system usage in the digitized service-led economy. The research objectives encompass the development of theories, methods and tools for systematic requirement elicitation, systems design, and business development for successful Business and Information Systems Engineering in a digitized economy – information systems that digital users enjoy using. This challenge calls for leveraging insights from various scientific disciplines such as Design, Engineering, Computer Science, Psychology and Sociology. BISE can provide an integrated perspective, thereby assuming a pivotal role within the digitized service-led economy.