Contextuality under weak assumptions
The presence of contextuality in quantum theory was first highlighted by Bell, Kochen and Specker, who discovered that for quantum systems of three or more dimensions, measurements could not be viewed as deterministically revealing pre-existing properties of the system. More precisely, no model can assign deterministic outcomes to the projectors of a quantum measurement in a way that depends only on the projector and not the context (the full set of projectors) in which it appeared, despite the fact that the Born rule probabilities associated with projectors are independent of the context. A more general, operational definition of contextuality introduced by Spekkens, which we will term "probabilistic contextuality", drops the assumption of determinism and allows for operations other than measurements to be considered contextual. Even two-dimensional quantum mechanics can be shown to be contextual under this generalised notion. Probabilistic noncontextuality represents the postulate that elements of an operational theory that cannot be distinguished from each other based on the statistics of arbitrarily many repeated experiments (they give rise to the same operational probabilities) are ontologically identical. In this paper, we introduce a framework that enables us to distinguish between different noncontextuality assumptions in terms of the relationships between the ontological representations of objects in the theory given a certain relation between their operational representations. This framework can be used to motivate and define a "possibilistic" analogue, encapsulating the idea that elements of an operational theory that cannot be unambiguously distinguished operationally can also not be unambiguously distinguished ontologically. We then prove that possibilistic noncontextuality is equivalent to an alternative notion of noncontextuality proposed by Hardy. 
Finally, we demonstrate that these weaker noncontextuality assumptions are sufficient to prove alternative versions of known "no-go" theorems that constrain ψ-epistemic models for quantum mechanics
The multifocal pattern electroretinogram in chloroquine retinopathy
Purpose: Optimal screening for ocular toxicity caused by chloroquine and hydroxychloroquine is still controversial. With the multifocal pattern electroretinogram (mfPERG), a new electrophysiological technique has recently become available to detect early changes of ganglion cells. In this study, this new technique was applied to a consecutive series of 10 patients receiving long-term chloroquine medication. Methods: In 10 patients receiving chloroquine medication, clinical examination, Amsler visual field testing and computerized color vision testing were performed. If toxicity was suspected, automated perimetry was carried out. In addition, conventional pattern electroretinogram (PERG) and mfPERG testing were performed in all patients. Results: On clinical examination, 8 patients showed no chloroquine-associated maculopathy, while 2 patients did. Of these 2, only 1 reported abnormalities when viewing the Amsler chart, while automated perimetry showed typical, ring-like paracentral scotomas in both affected patients and color vision was significantly abnormal. Of the 8 unaffected patients, 4 had a mild color vision disturbance, which correlated with age-related macular changes. The amplitudes of the PERG and the central (approximately 10 degrees) responses of the mfPERG were markedly reduced in chloroquine maculopathy, while the latencies were unchanged. The peripheral rings of the mfPERG (extending to 48 degrees) were not affected by chloroquine toxicity. Both PERG and mfPERG were less affected by age-related macular changes. Conclusions: The reduction of PERG and central mfPERG responses in chloroquine maculopathy may help with the early detection of toxicity. Copyright (C) 2004 S. Karger AG, Basel
Quantum lost property: a possible operational meaning for the Hilbert-Schmidt product
Minimum error state discrimination between two mixed states \rho and \sigma
can be aided by the receipt of "classical side information" specifying which
states from some convex decompositions of \rho and \sigma apply in each run. We
quantify this phenomenon by the average trace distance, and give lower and upper
bounds on this quantity as functions of \rho and \sigma. The lower bound is
simply the trace distance between \rho and \sigma, trivially seen to be tight.
The upper bound is \sqrt{1 - tr(\rho\sigma)}, and we conjecture that this is
also tight. We reformulate this conjecture in terms of the existence of a pair
of "unbiased decompositions", which may be of independent interest, and prove
it for a few special cases. Finally, we point towards a link with a notion of
non-classicality known as preparation contextuality.
Comment: 3 pages, 1 figure. v2: Fewer typos in text and less punctuation in title
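The bounds quoted in the abstract are easy to check numerically. The sketch below (my own illustration, not code from the paper; the helper names are mine) draws random mixed states with NumPy and verifies that the trace distance between \rho and \sigma, the paper's tight lower bound, never exceeds the conjectured upper bound \sqrt{1 - tr(\rho\sigma)}:

```python
import numpy as np

def random_density_matrix(d, rank, rng):
    """Draw a random rank-`rank` density matrix via the Ginibre ensemble."""
    g = rng.standard_normal((d, rank)) + 1j * rng.standard_normal((d, rank))
    m = g @ g.conj().T
    return m / np.trace(m).real

def trace_distance(rho, sigma):
    """D(rho, sigma) = (1/2) tr|rho - sigma|, via eigenvalues of the Hermitian difference."""
    return 0.5 * np.abs(np.linalg.eigvalsh(rho - sigma)).sum()

rng = np.random.default_rng(7)
for _ in range(200):
    rho = random_density_matrix(4, 2, rng)
    sigma = random_density_matrix(4, 3, rng)
    lower = trace_distance(rho, sigma)                  # the paper's lower bound
    upper = np.sqrt(1.0 - np.trace(rho @ sigma).real)   # the conjectured upper bound
    assert lower <= upper + 1e-12
```

Random sampling of course cannot prove the conjectured tightness of the upper bound, but it gives a quick sanity check that the ordering of the two bounds holds.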
Qubit-Initialisation and Readout with Finite Coherent Amplitudes in Cavity QED
We consider a unitary transfer of an arbitrary state of a two-level atomic
qubit in a cavity to the finite amplitude coherent state cavity field. Such
transfer can be used to either provide an effective readout measurement on the
atom by a subsequent measurement on the light field or as a method for
initializing a fixed atomic state - a so-called "attractor state", studied
previously for the case of an infinitely strong cavity field. We show that with
a suitable adjustment of the coherent amplitude and evolution time the qubit
transfers all its information to the field, attaining a selected state of high
purity irrespective of the initial state.
Comment: 6 pages, 4 figures
Boson Sampling from Gaussian States
We pose a generalized Boson Sampling problem. Strong evidence exists that
such a problem becomes intractable on a classical computer as a function of the
number of Bosons. We describe a quantum optical processor that can solve this
problem efficiently based on Gaussian input states, a linear optical network
and non-adaptive photon counting measurements. All the elements required to
build such a processor currently exist. The demonstration of such a device
would provide the first empirical evidence that quantum computers can indeed
outperform classical computers and could lead to applications.
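The intractability claim above rests, in the original boson sampling argument of Aaronson and Arkhipov, on the #P-hardness of the matrix permanent (the Gaussian-state generalization involves related matrix functions such as the hafnian). As a rough illustration of why exact classical simulation scales badly, here is a Ryser-style exact permanent evaluator, already exponential in the matrix size; this is my sketch for illustration, not code from the paper:

```python
import itertools
import numpy as np

def permanent(a):
    """Exact permanent via Ryser's inclusion-exclusion formula, O(2^n * n^2)."""
    n = a.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            rowsums = a[:, cols].sum(axis=1)   # per-row sum over the chosen columns
            total += (-1) ** r * np.prod(rowsums)
    return (-1) ** n * total
```

Even this, the best-known style of exact algorithm, visits all 2^n column subsets; a linear-optical device sampling from the corresponding output distribution would sidestep that cost.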
Optimal measurements for relative quantum information
We provide optimal measurement schemes for estimating relative parameters of
the quantum state of a pair of spin systems. We prove that the optimal
measurements are joint measurements on the pair of systems, meaning that they
cannot be achieved by local operations and classical communication. We also
demonstrate that in the limit where one of the spins becomes macroscopic, our
results reproduce those that are obtained by treating that spin as a classical
reference direction.
Comment: 6 pages, 1 figure, published version
Pituophis ruthveni
Number of Pages: 16. Geological Sciences; Integrative Biology
The Politics of Post-Qualitative Inquiry: History and Power
In this article, we offer a critical reading of the increasingly popular “post-qualitative” approach to research. We draw on insights from postcolonial theory to offer some provocations about the methodological and conceptual claims made by post-qualitative inquiry. The article considers how post-qualitative inquiry opens up possibilities for post-humanist social research. But our critical reading of these “new” approaches argues that such research needs to attend to political and historical relations of social power, both in the worlds it constitutes and in the processes of its knowledge production. Without explicit attention to power and history, the (non)representational logics of post-qualitative inquiry risk operating less as “new” mechanisms for generative and subversive post-humanist research and more as processes of closure and erasure: closed off from the worlds and people being researched.
Dynamics of a quantum reference frame undergoing selective measurements and coherent interactions
We consider the dynamics of a quantum directional reference frame undergoing
repeated interactions. We first describe how a precise sequence of measurement
outcomes affects the reference frame, looking at both the case that the
measurement record is averaged over and the case wherein it is retained. We
find, in particular, that there is interesting dynamics in the latter situation
which cannot be revealed by considering the averaged case. We then consider in
detail how a sequence of rotationally invariant unitary interactions affects
the reference frame, a situation which leads to quite different dynamics than
the case of repeated measurements. We then consider strategies for correcting
reference frame drift if we are given a set of particles with polarization
opposite to the direction of drift. In particular, we find that by implementing
a suitably chosen unitary interaction after every two measurements we can
eliminate the rotational drift of the reference frame.
Comment: 9 pages, 5 figures
The Quantum State of an Ideal Propagating Laser Field
We give a quantum information-theoretic description of an ideal propagating
CW laser field and reinterpret typical quantum-optical experiments in light of
this. In particular we show that contrary to recent claims [T. Rudolph and B.
C. Sanders, Phys. Rev. Lett. 87, 077903 (2001)], a conventional laser can be
used for quantum teleportation with continuous variables and for generating
continuous-variable entanglement. Optical coherence is not required, but phase
coherence is. We also show that coherent states play a privileged role in the
description of laser light.
Comment: 4 pages RevTeX, to appear in PRL. For an extended version see
quant-ph/011115