    Contextuality with vanishing coherence and maximal robustness to dephasing

    Generalized contextuality is a resource for a wide range of communication and information-processing protocols. However, contextuality is impossible without coherence and can therefore be destroyed by dephasing noise. Here we explore the robustness of contextuality to partially dephasing noise in a scenario related to state discrimination (for which contextuality is a resource). We find that a vanishing amount of coherence suffices to demonstrate the failure of noncontextuality in this scenario, and we give a proof of contextuality that is robust to arbitrary amounts of partially dephasing noise. This is in stark contrast to partially depolarizing noise, which always suffices to destroy contextuality.
    Comment: 13 pages, 7 figures. Comments are welcome.
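    To make the role of partial dephasing concrete, below is a minimal numpy sketch of a partially dephasing channel acting on a density matrix, in which off-diagonal (coherence) entries are damped by a factor (1 - p). The channel form and the example state are illustrative assumptions, not constructions from the paper.

```python
import numpy as np

def partially_dephase(rho: np.ndarray, p: float) -> np.ndarray:
    """Partially dephasing channel in the computational basis:
    off-diagonal (coherence) entries are damped by a factor (1 - p).
    At p = 1 this is full dephasing; for any p < 1 some coherence survives."""
    out = rho.astype(complex).copy()
    mask = ~np.eye(rho.shape[0], dtype=bool)
    out[mask] *= (1.0 - p)
    return out

# A |+> state keeps a small but nonzero coherence for any p < 1.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
print(partially_dephase(plus, 0.99))  # off-diagonals shrink to 0.005
```

    In the abstract's terms, the surviving off-diagonal entries are the vanishing amount of coherence that can still suffice to witness contextuality.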

    Contextual advantage for state discrimination

    Finding quantitative aspects of quantum phenomena that cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information-processing tasks in which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.
    Comment: 18 pages, 9 figures.
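    For context on the quantum side of this comparison, the optimal minimum-error success probability for discriminating two known states is given by the textbook Helstrom bound. The sketch below computes it with numpy for two non-orthogonal qubit states; it illustrates the standard bound only, not the paper's noncontextuality inequalities.

```python
import numpy as np

def helstrom_success(rho0: np.ndarray, rho1: np.ndarray, q0: float = 0.5) -> float:
    """Optimal (minimum-error) success probability for discriminating two
    density matrices rho0, rho1 with priors q0 and 1 - q0 (Helstrom bound):
    P = 1/2 * (1 + || q0*rho0 - q1*rho1 ||_1)."""
    gamma = q0 * rho0 - (1.0 - q0) * rho1
    trace_norm = np.sum(np.abs(np.linalg.eigvalsh(gamma)))
    return 0.5 * (1.0 + trace_norm)

# Two non-orthogonal pure qubit states separated by angle theta.
theta = np.pi / 8
psi0 = np.array([1.0, 0.0])
psi1 = np.array([np.cos(theta), np.sin(theta)])
print(helstrom_success(np.outer(psi0, psi0), np.outer(psi1, psi1)))  # ~0.691
```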

    Certified Quantum Measurement of Majorana Fermions

    We present a quantum self-testing protocol to certify measurements of fermion parity involving Majorana fermion modes. We show that observing a set of ideal measurement statistics implies anticommutativity of the implemented Majorana fermion parity operators, a necessary prerequisite for Majorana detection. Our protocol is robust to experimental errors: we obtain lower bounds on the fidelities of the state and measurement operators that are linear in the errors. We propose to analyze experimental outcomes in terms of a contextuality witness W, which satisfies ⟨W⟩ ≤ 3 for any classical probabilistic model of the data. A violation of this inequality witnesses quantum contextuality, and closeness to the maximal ideal value ⟨W⟩ = 5 indicates the degree of confidence in the detection of Majorana fermions.
    Comment: 13 pages, 3 figures.
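    The abstract specifies only the classical bound ⟨W⟩ ≤ 3 and the ideal quantum value ⟨W⟩ = 5, not the individual terms of W. Purely as an illustration, the sketch below assumes W is a sum of five two-outcome correlators, each bounded in [-1, 1], and checks an observed value against the classical bound.

```python
import numpy as np

CLASSICAL_BOUND = 3.0  # <W> <= 3 for any classical probabilistic model
QUANTUM_IDEAL = 5.0    # ideal value for anticommuting Majorana parity operators

def witness_value(correlators) -> float:
    """Sum observed correlators (each in [-1, 1]) into <W>.
    The five-term structure here is an assumption for illustration;
    the abstract gives only the bounds 3 (classical) and 5 (ideal)."""
    c = np.asarray(correlators, dtype=float)
    assert c.size == 5 and np.all(np.abs(c) <= 1.0)
    return float(np.sum(c))

w = witness_value([0.95, 0.92, 0.97, 0.90, 0.93])
print(w, "violates the classical bound:", w > CLASSICAL_BOUND)
```

    Under this assumption, how far the observed value lies above 3, and how close it comes to 5, tracks the confidence language used in the abstract.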

    Constraints on Macroscopic Realism Without Assuming Non-Invasive Measurability

    Macroscopic realism is the thesis that macroscopically observable properties must always have definite values. The idea was introduced by Leggett and Garg (1985), who wished to show a conflict with the predictions of quantum theory. However, their analysis required not just the assumption of macroscopic realism per se but also that the observable properties could be measured non-invasively. In recent years there has been increasing interest in experimental tests of the violation of the Leggett-Garg inequality, but it has remained a matter of controversy whether this second assumption is a reasonable requirement for a macroscopic realist view of quantum theory. In a recent critical assessment, Maroney and Timpson (2017) identified three different categories of macroscopic realism and argued that only the simplest category could be ruled out by Leggett-Garg inequality violations. Allen, Maroney, and Gogioso (2016) then showed that the second of these approaches is also incompatible with quantum theory in Hilbert spaces of dimension 4 or higher. However, we show that the distinction Maroney and Timpson draw between the second and third approaches is not noise tolerant, so the result of Allen, Maroney, and Gogioso, as given, is unfortunately not directly empirically testable. In this paper we replace Maroney and Timpson's three categories with a parameterization of macroscopic realist models that can be related to experimental observations in a noise-tolerant way and that recovers the original definitions in the noise-free limit. We show how this parameterization can be used to experimentally rule out classes of macroscopic realism in Hilbert spaces of dimension 3 or higher, including the category tested by the Leggett-Garg inequality, without any use of the non-invasive measurability assumption.
    Comment: 20 pages, 10 figures.
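    As background for the Leggett-Garg discussion, the simplest three-time Leggett-Garg combination for a coherently precessing two-level system can be evaluated in a few lines. This is the standard textbook example, not the noise-tolerant parameterization of macroscopic realist models introduced in the paper.

```python
import numpy as np

def lg_parameter(omega: float, dt: float) -> float:
    """Three-time Leggett-Garg combination K = C12 + C23 - C13 for a qubit
    precessing at angular frequency omega, measured at equal spacings dt,
    using the two-time correlator C(t, t + dt) = cos(omega * dt).
    Macrorealism plus non-invasive measurability requires K <= 1."""
    c12 = c23 = np.cos(omega * dt)
    c13 = np.cos(2.0 * omega * dt)
    return float(c12 + c23 - c13)

print(lg_parameter(1.0, np.pi / 3))  # 1.5, the maximal qubit violation of K <= 1
```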