Robust Randomness Amplifiers: Upper and Lower Bounds
A recent sequence of works, initially motivated by the study of the nonlocal
properties of entanglement, demonstrates that a source of
information-theoretically certified randomness can be constructed based only on
two simple assumptions: the prior existence of a short random seed and the
ability to ensure that two black-box devices do not communicate (i.e. are
non-signaling). We call protocols achieving such certified amplification of a
short random seed randomness amplifiers.
We introduce a simple framework in which we initiate the systematic study of
the possibilities and limitations of randomness amplifiers. Our main results
include a new, improved analysis of a robust randomness amplifier with
exponential expansion, as well as the first upper bounds on the maximum
expansion achievable by a broad class of randomness amplifiers. In particular,
we show that non-adaptive randomness amplifiers that are robust to noise cannot
achieve more than doubly exponential expansion. Finally, we show that a wide
class of protocols based on the use of the CHSH game can only lead to (singly)
exponential expansion if adversarial devices are allowed the full power of
non-signaling strategies. Our upper bound results apply to all known
non-adaptive randomness amplifier constructions to date.
Comment: 28 pages. Comments welcome.
Free randomness can be amplified
Are there fundamentally random processes in nature? Theoretical predictions,
confirmed experimentally, such as the violation of Bell inequalities, point to
an affirmative answer. However, these results are based on the assumption that
measurement settings can be chosen freely at random, so assume the existence of
perfectly free random processes from the outset. Here we consider a scenario in
which this assumption is weakened and show that partially free random bits can
be amplified to make arbitrarily free ones. More precisely, given a source of
random bits whose correlation with other variables is below a certain
threshold, we propose a procedure for generating fresh random bits that are
virtually uncorrelated with all other variables. We also conjecture that such
procedures exist for any non-trivial threshold. Our result is based solely on
the no-signalling principle, which is necessary for the existence of free
randomness.
Comment: 5+7 pages, 2 figures. Updated to match published version.
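The "partially free" random bits referred to above are commonly modeled as Santha–Vazirani sources. A minimal sketch of that condition, where the bias parameter ε (not named in the abstract) quantifies the allowed correlation with other variables:

```latex
% A Santha-Vazirani (SV) source with bias \varepsilon: each bit x_i may be
% correlated with all earlier bits and any side information e, but only boundedly so.
\left|\, P(x_i = 0 \mid x_1, \ldots, x_{i-1}, e) - \tfrac{1}{2} \,\right| \;\le\; \varepsilon,
\qquad 0 \le \varepsilon < \tfrac{1}{2}.
```

Amplification, in this language, means producing fresh bits whose effective bias is arbitrarily close to zero.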
The Impact of a Fundamentals of Speech Course on Public Speaking Anxiety
Thirty to forty percent of Americans suffer from Communication Apprehension (CA) to a degree that impairs their ability and willingness to speak publicly (McCroskey, 1984). McCroskey (1984) defines CA as "an individual's level of fear or anxiety associated with either real or anticipated communication with another person(s)" (p. 13). There are many forms of CA, but "the most common [form] is Public Speaking Anxiety" (McCourt, 2007, p. 6), which can be defined as the fear of speaking in front of a group of people. Because research has shown that such fears may hinder career aspirations, personal relationships and self-image, scholarly examination of means to reduce CA are merited. Therefore, overcoming CA is a fundamental goal of introductory speech classes. To test the impact of a basic-level speech course on students' CA, 324 students at a large, Midwestern university took McCroskey's Personal Report of Public Speaking Anxiety (PRPSA) questionnaire via Questionpro as a pre- and post-test during the first two weeks, and again during the last two weeks of the course, which served as the treatment. Results show a significant decrease in CA after completion of the speech course.
Arbitrarily many independent observers can share the nonlocality of a single maximally entangled qubit pair
Alice and Bob each have half of a pair of entangled qubits. Bob measures his
half and then passes his qubit to a second Bob who measures again and so on.
The goal is to maximize the number of Bobs that can have an expected violation
of the Clauser-Horne-Shimony-Holt (CHSH) Bell inequality with the single Alice.
This scenario was introduced in [Phys. Rev. Lett. 114, 250401 (2015)] where the
authors mentioned evidence that when the Bobs act independently and with
unbiased inputs then at most two of them can expect to violate the CHSH
inequality with Alice. Here we show that, contrary to this evidence,
arbitrarily many independent Bobs can have an expected CHSH violation with the
single Alice. Our proof is constructive and our measurement strategies can be
generalized to work with a larger class of two-qubit states that includes all
pure entangled two-qubit states. Since violation of a Bell inequality is
necessary for device-independent tasks, our work represents a step towards an
eventual understanding of the limitations on how much device-independent
randomness can be robustly generated from a single pair of qubits.
Comment: 4+7 pages, 2 figures. v2: minor updates to match published version.
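To illustrate the CHSH violation at the center of the abstract above (a minimal generic sketch with the standard optimal measurement settings, not the sequential multi-Bob strategy of the paper; numpy and the observable names are assumptions of this sketch):

```python
import numpy as np

# Pauli matrices
Z = np.array([[1, 0], [0, -1]], dtype=float)
X = np.array([[0, 1], [1, 0]], dtype=float)

# Maximally entangled state |Phi+> = (|00> + |11>)/sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)

def corr(A, B):
    """Expectation value <Phi+| A (x) B |Phi+> of a joint +/-1-valued measurement."""
    return phi @ np.kron(A, B) @ phi

# CHSH-optimal settings: Alice measures Z or X; Bob measures the
# rotated observables (Z + X)/sqrt(2) and (Z - X)/sqrt(2).
A0, A1 = Z, X
B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

S = corr(A0, B0) + corr(A0, B1) + corr(A1, B0) - corr(A1, B1)
print(S)  # ~2.828, i.e. 2*sqrt(2)
```

The local (classical) bound on S is 2, while this quantum strategy attains the Tsirelson bound 2√2 ≈ 2.828.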
An experimental test of all theories with predictive power beyond quantum theory
According to quantum theory, the outcomes of future measurements cannot (in
general) be predicted with certainty. In some cases, even with a complete
physical description of the system to be measured and the measurement
apparatus, the outcomes of certain measurements are completely random. This
raises the question, originating in the paper by Einstein, Podolsky and Rosen,
of whether quantum mechanics is the optimal way to predict measurement
outcomes. Established arguments and experimental tests exclude a few specific
alternative models. Here, we provide a complete answer to the above question,
refuting any alternative theory with significantly more predictive power than
quantum theory. More precisely, we perform various measurements on distant
entangled photons, and, under the assumption that these measurements are chosen
freely, we give an upper bound on how well any alternative theory could predict
their outcomes. In particular, in the case where quantum mechanics predicts two
equally likely outcomes, our results are incompatible with any theory in which
the probability of a prediction is increased by more than ~0.19. Hence, we can
immediately refute any already considered or yet-to-be-proposed alternative
model with more predictive power than this.
Comment: 13 pages, 4 figures.
No extension of quantum theory can have improved predictive power
According to quantum theory, measurements generate random outcomes, in stark
contrast with classical mechanics. This raises the question of whether there
could exist an extension of the theory which removes this indeterminism, as
suspected by Einstein, Podolsky and Rosen (EPR). Although this has been shown
to be impossible, existing results do not imply that the current theory is
maximally informative. Here we ask the more general question of whether any
improved predictions can be achieved by any extension of quantum theory. Under
the assumption that measurements can be chosen freely, we answer this question
in the negative: no extension of quantum theory can give more information about
the outcomes of future measurements than quantum theory itself. Our result has
significance for the foundations of quantum mechanics, as well as applications
to tasks that exploit the inherent randomness in quantum theory, such as
quantum cryptography.
Comment: 6 pages plus 7 of supplementary material, 3 figures. Title changed. Added discussion on Bell's notion of locality. FAQ answered at http://perimeterinstitute.ca/personal/rcolbeck/FAQ.htm
Universal topological phase of 2D stabilizer codes
Two topological phases are equivalent if they are connected by a local
unitary transformation. In this sense, classifying topological phases amounts
to classifying long-range entanglement patterns. We show that all 2D
topological stabilizer codes are equivalent to several copies of one universal
phase: Kitaev's topological code. Error correction benefits from the
corresponding local mappings.
Comment: 4 pages, 3 figures.
Causality - Complexity - Consistency: Can Space-Time Be Based on Logic and Computation?
The difficulty of explaining non-local correlations in a fixed causal
structure sheds new light on the old debate on whether space and time are to be
seen as fundamental. Refraining from assuming space-time as given a priori has
a number of consequences. First, the usual definitions of randomness depend on
a causal structure and become meaningless. Thus motivated, we propose an intrinsic,
physically motivated measure for the randomness of a string of bits: its length
minus its normalized work value, a quantity we closely relate to its Kolmogorov
complexity (the length of the shortest program making a universal Turing
machine output this string). We test this alternative concept of randomness for
the example of non-local correlations, and we arrive at reasoning that leads to
conclusions similar to those of the probabilistic view but is conceptually more
direct, since only the outcomes of measurements that can actually all be carried
out together are related to each other. In the same
context-free spirit, we connect the logical reversibility of an evolution to
the second law of thermodynamics and the arrow of time. Refining this, we end
up with a speculation on the emergence of a space-time structure on bit strings
in terms of data-compressibility relations. Finally, we show that logical
consistency, by which we replace the abandoned causality, is a strictly weaker
constraint than the latter in the multi-party case.
Comment: 17 pages, 16 figures, small corrections.
Size resolved mass concentration and elemental composition of atmospheric aerosols over the Eastern Mediterranean area
A Berner low pressure impactor was used to collect size-segregated aerosol samples at Finokalia, located on the north-eastern coast of Crete, Greece during July 2000 and January 2001. Several samples were also collected during the summer campaign aboard the research vessel "AEGAIEO" in the Aegean Sea. Gravimetric analysis and inversion techniques yielded daily PM1 and PM10 mass concentrations. The samples were also analysed by PIXE giving the elemental size distributions of Al, Si, K, Ca, Ti, Mn, Fe, Sr, S, Cl, Ni, V, Cu, Cr, Zn, and Pb. The crustal elements and sea-salt had a unimodal supermicron size distribution. Sulphur was found predominantly in submicron fractions. K, V, and Ni exhibited a bimodal distribution with a submicron mode produced by forest fires and oil combustion. The anthropogenic elements had broad and not well-defined distributions. The time series for PM1 and PM10 mass and elemental concentrations showed both daily and seasonal variation. Higher mass concentrations were observed during two incursions of Saharan dust, whilst higher concentrations of S, Cu, Zn, and Pb were encountered in samples collected in air masses arriving from northern Greece or the western coast of Turkey. Elevated concentrations of chlorine were found in samples with air masses either originating above the Atlantic Ocean and arriving at Finokalia via western Europe or recirculating over the western coast of the Black Sea.
Keyring models: an approach to steerability
If a measurement is made on one half of a bipartite system, then, conditioned
on the outcome, the other half has a new reduced state. If these reduced states
defy classical explanation -- that is, if shared randomness cannot produce
these reduced states for all possible measurements -- the bipartite state is
said to be steerable. Determining which states are steerable is a challenging
problem even for low dimensions. In the case of two-qubit systems a criterion
is known for T-states (that is, those with maximally mixed marginals) under
projective measurements. In the current work we introduce the concept of
keyring models -- a special class of local hidden state models. When the
measurements made correspond to real projectors, these allow us to study
steerability beyond T-states.
Using keyring models, we completely solve the steering problem for real
projective measurements when the state arises from mixing a pure two-qubit
state with uniform noise. We also give a partial solution in the case when the
uniform noise is replaced by independent depolarizing channels.
Comment: 15(+4) pages, 5 figures. v2: references added. v3: minor changes.
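The family of states considered in the last paragraph, a pure two-qubit state mixed with uniform noise, can be written as follows (a sketch; the visibility parameter p is notation introduced here, not taken from the abstract):

```latex
% Pure entangled state |\psi\rangle mixed with the maximally mixed state I/4.
\rho(p) \;=\; p\,|\psi\rangle\langle\psi| \;+\; (1-p)\,\frac{I}{4},
\qquad 0 \le p \le 1.
```

The steering problem for this family asks for which values of p a local hidden state model can reproduce the conditioned reduced states for all (real) projective measurements on the first qubit.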