
    Energy-time entanglement, Elements of Reality, and Local Realism

    The Franson interferometer, proposed in 1989 [J. D. Franson, Phys. Rev. Lett. 62:2205-2208 (1989)], beautifully shows the counter-intuitive nature of light. The quantum description predicts sinusoidal interference for specific outcomes of the experiment, and these predictions can be verified experimentally. In the spirit of Einstein, Podolsky, and Rosen it is possible to ask whether the quantum-mechanical description of this setup can be considered complete. This question is answered in detail in this paper, by delineating the quite complicated relation between energy-time entanglement experiments and Einstein-Podolsky-Rosen (EPR) elements of reality. The mentioned sinusoidal interference pattern is the same as that giving a violation in the usual Bell experiment. Even so, depending on the precise requirements made on the local realist model, this can imply a) no violation, b) a smaller violation than usual, or c) full violation of the appropriate statistical bound. Alternatives include a) using only the measurement outcomes as EPR elements of reality, b) using the emission time as an EPR element of reality, c) using path realism, or d) using a modified setup. This paper discusses the nature of these alternatives and how to choose between them. The subtleties of this discussion need to be taken into account when designing and setting up experiments intended to test local realism. Furthermore, these considerations are also important for quantum communication, for example in Bell-inequality-based quantum cryptography, especially when aiming for device independence. Comment: 18 pages, 7 figures, v2 rewritten and extended
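
    For orientation, the sinusoidal pattern referred to is the standard post-selected Franson coincidence probability; in one common textbook form (notation assumed here, with local interferometer phases phi_A, phi_B and visibility V, not taken from the paper):

        \[
          P_{\mathrm{coinc}}(\phi_A,\phi_B) \;\propto\; 1 + V\cos(\phi_A+\phi_B),
        \]

    which at V = 1 has the same form as the correlations that violate a CHSH-type Bell inequality, hence the question of what the observed violation actually certifies.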

    Strict detector-efficiency bounds for n-site Clauser-Horne inequalities

    An analysis of detector efficiency in many-site Clauser-Horne inequalities is presented, for the case of perfect visibility. It is shown that the presented n-site Clauser-Horne inequalities are violated if and only if the efficiency is greater than n/(2n-1). Thus, for a two-site two-setting experiment there are no quantum-mechanical predictions that violate local realism unless the efficiency is greater than 2/3. Moreover, there are n-site experiments for which the quantum-mechanical predictions violate local realism whenever the efficiency exceeds 1/2. Comment: revtex, 5 pages, 1 figure (typesetting changes only)
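
    As a quick numerical illustration of these bounds (a minimal plain-Python sketch; function names chosen here, not taken from the paper):

        # Critical detector efficiency for the n-site Clauser-Horne
        # inequalities discussed above: eta_crit(n) = n / (2n - 1).
        def eta_crit_ch(n: int) -> float:
            return n / (2 * n - 1)

        for n in range(2, 7):
            print(f"n = {n}: eta_crit = {eta_crit_ch(n):.4f}")
        # n = 2 gives 2/3 ~ 0.6667, and the bound decreases toward 1/2
        # as the number of sites grows, matching the two claims above.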

    Necessary and sufficient detection efficiency for the Mermin inequalities

    We prove that the threshold detection efficiency for a loophole-free Bell experiment using an n-qubit Greenberger-Horne-Zeilinger state and the correlations appearing in the n-partite Mermin inequality is n/(2n-2). If the detection efficiency is equal to or lower than this value, there are local hidden variable models that can simulate all the quantum predictions. If the detection efficiency is above this value, there is no local hidden variable model that can simulate all the quantum predictions. Comment: REVTeX4, 5 pages, 1 figure
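
    Tabulating this threshold next to the n-site Clauser-Horne bound from the previous abstract makes the comparison explicit (again an illustrative Python sketch, not from the paper):

        # Mermin threshold n/(2n-2) vs. Clauser-Horne threshold n/(2n-1).
        for n in range(3, 8):
            mermin = n / (2 * n - 2)
            ch = n / (2 * n - 1)
            print(f"n = {n}: Mermin {mermin:.4f}  CH {ch:.4f}")
        # Both tend to 1/2 for large n; for each fixed n the Mermin
        # threshold is slightly higher than the Clauser-Horne one.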

    Minimum detection efficiency for a loophole-free atom-photon Bell experiment

    In Bell experiments, one problem is to achieve a photodetection efficiency high enough to exclude any description of the results by a local hidden-variable model. Using the Clauser-Horne inequality and a two-photon non-maximally entangled state, a photodetection efficiency higher than 0.67 is necessary. Here we discuss atom-photon Bell experiments. We show that, assuming perfect detection efficiency of the atom, it is possible to perform a loophole-free atom-photon Bell experiment whenever the photodetection efficiency exceeds 0.50. Comment: REVTeX4, 4 pages, 1 figure
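
    For reference, the Clauser-Horne inequality used here can be written in its standard form (notation assumed; P(a,b) is a coincidence probability, P(a') and P(b) are singles probabilities):

        \[
          -1 \;\le\; P(a,b) - P(a,b') + P(a',b) + P(a',b') - P(a') - P(b) \;\le\; 0,
        \]

    which every local hidden-variable model must satisfy; limited detection efficiency weakens the measured probabilities and thereby opens the loophole that the thresholds above quantify.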

    Optimal measurement bases for Bell-tests based on the CH-inequality

    The Hardy test of nonlocality can be seen as a particular case of the Bell tests based on the Clauser-Horne (CH) inequality. Here we stress this connection when analyzing the relation between the CH-inequality violation, its threshold detection efficiency, and the measurement settings adopted in the test. It is well known that the threshold efficiencies decrease when one considers partially entangled states, but the use of these states unfortunately reduces the CH violation. Nevertheless, both quantities depend on the measurement settings considered, and in this paper we show that there are measurement bases which allow for an optimal situation in this trade-off. These bases are given as a generalization of the Hardy measurement bases, and they will be relevant for future Bell tests relying on pairs of entangled qubits. Comment: 8 pages, 6 figures
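
    The trade-off can be probed numerically. A self-contained sketch (numpy; state, settings, and names chosen here for illustration rather than taken from the paper) evaluates the CH value P(a1,b1) - P(a1,b2) + P(a2,b1) + P(a2,b2) - P(a2) - P(b1) for a state cos(theta)|00> + sin(theta)|11>:

        import numpy as np

        E0, E1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

        def mket(t):
            # Measurement direction cos(t)|0> + sin(t)|1> on one qubit.
            return np.cos(t) * E0 + np.sin(t) * E1

        def joint(psi, a, b):
            # P(a,b): both sides project onto their chosen directions.
            return np.kron(mket(a), mket(b)).dot(psi) ** 2

        def marg_a(psi, a):
            # Singles probability P(a), summing over Bob's basis.
            return sum(np.kron(mket(a), e).dot(psi) ** 2 for e in (E0, E1))

        def marg_b(psi, b):
            return sum(np.kron(e, mket(b)).dot(psi) ** 2 for e in (E0, E1))

        def ch(psi, a1, a2, b1, b2):
            return (joint(psi, a1, b1) - joint(psi, a1, b2)
                    + joint(psi, a2, b1) + joint(psi, a2, b2)
                    - marg_a(psi, a2) - marg_b(psi, b1))

        for theta in (np.pi / 4, np.pi / 6):  # maximally / partially entangled
            psi = (np.cos(theta) * np.kron(E0, E0)
                   + np.sin(theta) * np.kron(E1, E1))
            # Settings optimal for theta = pi/4; CH > 0 signals nonlocality.
            print(theta, ch(psi, 0.0, -np.pi / 4, -np.pi / 8, -3 * np.pi / 8))
        # theta = pi/4 yields (sqrt(2)-1)/2 ~ 0.207; re-optimizing the settings
        # for each theta reproduces the trade-off discussed above.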

    Qubits from Number States and Bell Inequalities for Number Measurements

    Bell inequalities for number measurements are derived via the observation that the bits of the number indexing a number state are proper qubits. Violations of these inequalities are obtained from the output state of the nondegenerate optical parametric amplifier. Comment: revtex4, 7 pages, v2: results identical but extended presentation, v3: published version
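
    The central observation can be made concrete with a toy sketch (illustrative Python only; the paper's operator construction is not reproduced): the Fock index n is relabelled by its binary digits, and each digit behaves as the outcome of a dichotomic, qubit-like observable.

        def number_to_qubits(n: int, num_bits: int) -> list:
            """Bits of the Fock index n, most significant first.
            E.g. |6> = |110> -> joint qubit outcomes [1, 1, 0]."""
            return [(n >> k) & 1 for k in reversed(range(num_bits))]

        # A photon-number measurement that yields n = 6 is reinterpreted
        # as the joint outcome (1, 1, 0) of three binary observables,
        # on which Bell inequalities can then be formulated.
        print(number_to_qubits(6, 3))   # [1, 1, 0]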

    RNA sequencing-based single sample predictors of molecular subtype and risk of recurrence for clinical assessment of early-stage breast cancer

    Background: Multigene expression assays for molecular subtypes and biomarkers can aid clinical management of early invasive breast cancer. Based on RNA-sequencing, we aimed to develop single-sample predictor (SSP) models for conventional clinical markers, molecular intrinsic subtype, and risk of recurrence (ROR).

    Methods: A uniformly accrued breast cancer cohort of 7743 patients with RNA-sequencing data from fresh tissue was divided into a training set and a reserved test set. We trained SSPs for PAM50 molecular subtypes and ROR assigned by nearest-centroid (NC) classification, and SSPs for conventional clinical markers from histopathology data. Additionally, SSP classifications were compared with Prosigna® in two external cohorts. Prognostic value was assessed using distant recurrence-free interval.

    Results: In the test set, agreement between SSP and NC classifications for PAM50 (five subtypes) and Subtype (four subtypes) was high (85%, Kappa=0.78) and very high (90%, Kappa=0.84), respectively. Accuracy for ROR risk category was high (84%, Kappa=0.75, weighted Kappa=0.90). The prognostic value of SSP and NC was assessed as equivalent. Agreement between SSP and histopathology was very high or high for receptor status, moderate for Ki67 status, and poor for Nottingham histological grade. SSP concordance with Prosigna® was high for subtype, and moderate to high for ROR risk category. In pooled analysis, concordance between SSP and Prosigna® for emulated treatment recommendation for chemotherapy (yes vs. no) was high (85%, Kappa=0.66). In postmenopausal ER+/HER2-/N0 patients, SSP application suggested changed treatment recommendations for up to 17% of patients, with nearly balanced escalation and de-escalation of chemotherapy.

    Conclusions: SSP models for histopathological variables, PAM50, and ROR classifications can be derived from RNA-sequencing and closely match clinical tests. Agreement and outcome analyses suggest that NC and SSP models are interchangeable on a group level and nearly so on a patient level. Retrospective evaluation in postmenopausal ER+/HER2-/N0 patients suggested that molecular testing could lead to a changed therapy recommendation for almost one-fifth of patients.
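
    For readers less familiar with the agreement statistics quoted above, Cohen's kappa corrects raw agreement for chance agreement, and a weighted kappa additionally grades how far apart ordered categories are. A minimal sketch with scikit-learn (hypothetical labels, not study data):

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical subtype calls from two classifiers on six cases.
        ssp = ["LumA", "LumB", "Her2", "Basal", "LumA", "LumB"]
        nc  = ["LumA", "LumB", "LumB", "Basal", "LumA", "LumA"]
        print(cohen_kappa_score(ssp, nc))              # unweighted kappa

        # For ordered categories such as ROR risk (low < intermediate < high),
        # a weighted kappa penalises distant disagreements more heavily.
        ror_ssp = [0, 1, 2, 2, 0, 1]
        ror_nc  = [0, 1, 2, 1, 0, 2]
        print(cohen_kappa_score(ror_ssp, ror_nc, weights="linear"))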

    O18O and C18O observations of rho Oph A

    Observations of the (N_J=1_1-1_0) ground-state transition of O_2 with the Odin satellite resulted in an approximately 5 sigma detection toward the dense core rho Oph A. At the frequency of the line, 119 GHz, the Odin telescope has a beam width of 10', larger than the size of the dense core, so that the precise nature of the emitting source and its exact location and extent are unknown. The current investigation is intended to remedy this. Telluric absorption makes ground-based O_2 observations essentially impossible, so observations had to be done from space. Mm-wave telescopes on space platforms were necessarily small, which resulted in large beam patterns, several arcminutes wide. Although the Earth's atmosphere is entirely opaque to low-lying O_2 transitions, it allows ground-based observations of the much rarer O18O in favourable conditions, and at much higher angular resolution with larger telescopes. In addition, rho Oph A exhibits both multiple radial-velocity systems and considerable velocity gradients. Extensive mapping of the region in the proxy C18O (J=3-2) line can be expected to help identify the O_2 source on the basis of its line shape and Doppler velocity. Line opacities were determined from observations of optically thin 13C18O (J=3-2) at selected positions. During several observing periods, two C18O intensity maxima in rho Oph A were searched for in the 16O18O (2_1-0_1) line at 234 GHz with the 12 m APEX telescope. Our observations resulted in an upper limit on the integrated O18O intensity of < 0.01 K km/s (3 sigma) in the 26.5" beam. We conclude that the source of the observed O_2 emission is most likely confined to the central regions of the rho Oph A cloud. In this limited area, the implied O_2 abundances could thus be higher than previously reported, by up to two orders of magnitude. Comment: 7 pages, 6 figures (5 colour), Astronomy & Astrophysics

    Contextuality-by-Default: A Brief Overview of Ideas, Concepts, and Terminology

    This paper is a brief overview of the concepts involved in measuring the degree of contextuality and detecting contextuality in systems of binary measurements of a finite number of objects. We discuss and clarify the main concepts and terminology of the theory called "contextuality-by-default," and then discuss a possible generalization of the theory from binary to arbitrary measurements. Comment: Lecture Notes in Computer Science 9535 (with the corrected list of authors) (2016)