
    Unifying classical and quantum key distillation

    Assume that two distant parties, Alice and Bob, as well as an adversary, Eve, have access to (quantum) systems prepared jointly according to a tripartite state. In addition, Alice and Bob can use local operations and authenticated public classical communication. Their goal is to establish a key which is unknown to Eve. We initiate the study of this scenario as a unification of two standard scenarios: (i) key distillation (agreement) from classical correlations and (ii) key distillation from pure tripartite quantum states. Firstly, we obtain generalisations of fundamental results related to scenarios (i) and (ii), including upper bounds on the key rate. Moreover, based on an embedding of classical distributions into quantum states, we find new connections between protocols and quantities in the standard scenarios (i) and (ii). Secondly, we study specific properties of key distillation protocols. In particular, we show that every protocol that makes use of pre-shared key can be transformed into an equally efficient protocol which needs no pre-shared key. This result is of practical significance as it applies to quantum key distribution (QKD) protocols, but it also implies that the key rate cannot be locked with information on Eve's side. Finally, we exhibit an arbitrarily large separation between the key rate in the standard setting, where Eve is equipped with quantum memory, and the key rate in a setting where Eve is only given classical memory. This shows that assumptions on the nature of Eve's memory are important for determining the correct security threshold in QKD.
    Comment: full version
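
    The scenario can be made concrete with two standard ingredients from this literature (the notation below is illustrative, not quoted from the paper): the embedding of a classical tripartite distribution into a quantum state, and the intrinsic-information upper bound on the classical key rate due to Maurer and Wolf.

        % Embedding a classical distribution P_{XYZ} into a tripartite state:
        \rho_{ABE} \;=\; \sum_{x,y,z} P_{XYZ}(x,y,z)\,
            |x\rangle\langle x|_A \otimes |y\rangle\langle y|_B \otimes |z\rangle\langle z|_E

        % Intrinsic information: an upper bound on the distillable key rate,
        % minimising over classical channels that map Eve's Z to \bar{Z}:
        K(P_{XYZ}) \;\le\; I(X;Y\!\downarrow\! Z)
            \;=\; \inf_{P_{\bar{Z}|Z}} I(X;Y\,|\,\bar{Z})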

    Quantum Reverse Shannon Theorem

    Dual to the usual noisy channel coding problem, where a noisy (classical or quantum) channel is used to simulate a noiseless one, reverse Shannon theorems concern the use of noiseless channels to simulate noisy ones, and more generally the use of one noisy channel to simulate another. For channels of nonzero capacity, this simulation is always possible, but for it to be efficient, auxiliary resources of the proper kind and amount are generally required. In the classical case, shared randomness between sender and receiver is a sufficient auxiliary resource, regardless of the nature of the source, but in the quantum case the requisite auxiliary resources for efficient simulation depend on both the channel being simulated and the source from which the channel inputs are coming. For tensor power sources (the quantum generalization of classical IID sources), entanglement in the form of standard ebits (maximally entangled pairs of qubits) is sufficient, but for general sources, which may be arbitrarily correlated or entangled across channel inputs, additional resources, such as entanglement-embezzling states or backward communication, are generally needed. Combining existing and new results, we establish the amounts of communication and auxiliary resources needed in both the classical and quantum cases, the tradeoffs among them, and the loss of simulation efficiency when auxiliary resources are absent or insufficient. In particular, we find a new single-letter expression for the excess forward communication cost of coherent feedback simulations of quantum channels (i.e. simulations in which the sender retains what would escape into the environment in an ordinary simulation), on non-tensor-power sources in the presence of unlimited ebits but no other auxiliary resource. Our results on tensor power sources establish a strong converse to the entanglement-assisted capacity theorem.
    Comment: 35 pages, to appear in IEEE-IT. v2 has a fixed proof of the Clueless Eve result, a new single-letter formula for the "spread deficit", better error scaling, and an improved strong converse. v3 and v4 each make small improvements to the presentation and add references. v5 fixes broken reference
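
    In the IID setting, the communication cost is governed by the entanglement-assisted capacity; in standard notation (ours, summarising well-known results rather than quoting the paper):

        % Entanglement-assisted classical capacity of a channel \mathcal{N}
        % (Bennett-Shor-Smolin-Thapliyal), with \phi_\rho a purification of \rho:
        C_E(\mathcal{N}) \;=\; \max_{\rho}\, \Big[ S(\rho) + S(\mathcal{N}(\rho))
            - S\big((\mathcal{N}\otimes \mathrm{id})(\phi_\rho)\big) \Big]

        % Reverse direction for tensor-power sources: with free shared ebits,
        % simulating \mathcal{N} requires C_E(\mathcal{N}) forward classical
        % bits per channel use, and the strong converse rules out doing better.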

    Effect of nonstationarities on detrended fluctuation analysis

    Detrended fluctuation analysis (DFA) is a scaling analysis method used to quantify long-range power-law correlations in signals. Many physical and biological signals are "noisy", heterogeneous and exhibit different types of nonstationarities, which can affect the correlation properties of these signals. We systematically study the effects of three types of nonstationarities often encountered in real data. Specifically, we consider nonstationary sequences formed in three ways: (i) stitching together segments of data obtained from discontinuous experimental recordings, or removing some noisy and unreliable parts from continuous recordings and stitching together the remaining parts -- a "cutting" procedure commonly used in preparing data prior to signal analysis; (ii) adding to a signal with known correlations a tunable concentration of random outliers or spikes with different amplitudes; and (iii) generating a signal comprised of segments with different properties -- e.g. different standard deviations or different correlation exponents. We compare the scaling results obtained for stationary correlated signals with those for correlated signals exhibiting these three types of nonstationarities.
    Comment: 17 pages, 10 figures, corrected some typos, added one reference
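
    For readers who want to reproduce this kind of analysis, here is a minimal sketch of standard DFA-1 (a generic implementation under the usual definition of the method, not the authors' code; all names are ours):

        # Minimal DFA sketch: integrate, detrend in windows, measure RMS fluctuation.
        import numpy as np

        def dfa(x, scales, order=1):
            """Return the DFA fluctuation function F(n) for each window size n.

            The scaling exponent alpha is the slope of log F(n) vs log n.
            """
            x = np.asarray(x, dtype=float)
            y = np.cumsum(x - x.mean())               # integrated profile
            F = []
            for n in scales:
                n_windows = len(y) // n               # non-overlapping windows
                segments = y[: n_windows * n].reshape(n_windows, n)
                t = np.arange(n)
                # Detrend each window with a polynomial fit of the given order
                # and accumulate the mean squared residual.
                ms = [np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
                      for seg in segments]
                F.append(np.sqrt(np.mean(ms)))
            return np.array(F)

        # Usage: uncorrelated white noise should give alpha close to 0.5.
        rng = np.random.default_rng(0)
        signal = rng.standard_normal(2 ** 14)
        scales = np.unique(np.logspace(1.5, 3, 12).astype(int))
        F = dfa(signal, scales)
        alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
        print(f"estimated DFA exponent: {alpha:.2f}")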

    More Randomness from the Same Data

    Correlations that cannot be reproduced with local variables certify the generation of private randomness. Usually, the violation of a Bell inequality is used to quantify the amount of randomness produced. Here, we show how the private randomness generated during a Bell test can be directly quantified from the observed correlations, without the need to process these data into an inequality. The frequency with which the different measurement settings are used during the Bell test can also be taken into account. This improved analysis turns out to be very relevant for Bell tests performed with a finite collection efficiency. In particular, applying our technique to the data of a recent experiment [Christensen et al., Phys. Rev. Lett. 111, 130406 (2013)], we show that about twice as much randomness as previously reported can potentially be extracted from this setup.
    Comment: 6 pages + appendices, 4 figures, v3: version close to the published one. See also the related work arXiv:1309.393
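
    The quantity optimised in this kind of analysis is, in standard device-independent notation (ours, not quoted from the paper), the adversary's guessing probability, whose negative logarithm gives the extractable min-entropy:

        % Guessing probability of outcome a for setting x, optimised over all
        % adversarial strategies {p(e), P_e} reproducing the observed statistics:
        p_g(x) \;=\; \max_{\{p(e),\,P_e\}} \sum_e p(e)\, \max_a P_e(a|x)
        \quad \text{s.t.} \quad \sum_e p(e)\, P_e(ab|xy) \;=\; P_{\mathrm{obs}}(ab|xy)

        % Extractable randomness per run, in bits:
        H_{\min}(A \,|\, X{=}x, E) \;=\; -\log_2 p_g(x)

        % In practice the optimisation over quantum strategies is relaxed to a
        % semidefinite programme (e.g. the NPA hierarchy), making it computable.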

    EPR Paradox, Locality and Completeness of Quantum Theory

    Quantum theory (QT) and new stochastic approaches have no deterministic prediction for a single measurement or for a single time-series of events observed for a trapped ion, electron or any other individual physical system. The predictions of QT, being probabilistic in character, apply to the statistical distribution of the results obtained in various experiments. The probability distribution is not an attribute of a die but a characteristic of a whole random experiment: "rolling a die". Likewise, statistical long-range correlations between two random variables X and Y are not a proof of any causal relation between these variables. Moreover, any probabilistic model used to describe a random experiment is consistent only with a specific protocol telling how the random experiment has to be performed. In this sense quantum theory is a statistical and contextual theory of phenomena. In this paper we discuss these important topics in some detail. We also discuss, in historical perspective, various assumptions used in the proofs of the Bell and CHSH inequalities, concluding that the violation of these inequalities in spin polarization correlation experiments is neither a proof of the completeness of QT nor of its nonlocality. The question whether QT is predictably complete is still open, and it should be answered by a careful and unconventional analysis of the experimental data. It suffices to analyze the existing experimental data in more detail, using various non-parametric purity tests and other statistical tools designed to study the fine structure of time-series. A correct understanding of the statistical and contextual character of QT has far-reaching consequences for quantum information and quantum computing.
    Comment: 16 pages, 59 references; contribution to the conference QTRF-4 held in Vaxjo, Sweden, 11-16 June 2007. To be published in the Proceedings
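
    For reference, the CHSH inequality at issue (standard form, not quoted from the paper) bounds a combination of correlation functions E(a,b) under any local hidden-variable model:

        % CHSH: for any local hidden-variable model,
        |S| \;=\; |E(a,b) + E(a,b') + E(a',b) - E(a',b')| \;\le\; 2

        % Quantum mechanics allows violations up to Tsirelson's bound:
        |S| \;\le\; 2\sqrt{2}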

    Intrinsic randomness in non-local theories: quantification and amplification

    Quantum mechanics was developed as a response to the inadequacy of classical physics in explaining certain physical phenomena. While it has proved immensely successful, it also presents several features that severely challenge our classicality-based intuition. Randomness in quantum theory is one such feature and is the central theme of this dissertation. Randomness is a notion we have an intuitive grasp on, since it appears to abound in nature. It afflicts weather systems and financial markets and is explicitly used in sport and gambling. It is used in a wide range of scientific applications such as the simulation of genetic drift, population dynamics and molecular motion in fluids. Randomness (or the lack of it) is also central to philosophical concerns such as the existence of free will and anthropocentric notions of ethics and morality. The conception of randomness has evolved dramatically along with physical theory. While all randomness in classical theory can be fully attributed to a lack of knowledge of the observer, quantum theory qualitatively departs by allowing the existence of objective or intrinsic randomness. It is now known that intrinsic randomness is a generic feature of hypothetical theories larger than quantum theory, called the non-signalling theories. These are usually studied with regard to a potential future completion of quantum mechanics or from the perspective of recognizing new physical principles describing nature. While several aspects have been studied to date, there has been little work on globally characterizing and quantifying randomness in quantum and non-signalling theories and the relationship between them. This dissertation is an attempt to fill this gap.

    Beginning with the unavoidable assumption of a weak source of randomness in the universe, we characterize upper bounds on quantum and non-signalling randomness. We develop a simple symmetry argument that helps identify maximal randomness in quantum theory and demonstrate its use in several explicit examples. Furthermore, we show that maximal randomness is forbidden within general non-signalling theories and constitutes a quantitative departure from quantum theory. We next address (what was) an open question about randomness amplification. It is known that a single source of randomness cannot be amplified using classical resources alone. We show that using quantum resources, on the other hand, allows a full amplification of the weakest sources of randomness to maximal randomness, even in the presence of supra-quantum adversaries. The significance of this result spans practical cryptographic scenarios as well as foundational concerns: it demonstrates that, conditional on the smallest set of assumptions, the existence of the weakest randomness in the universe guarantees the existence of maximal randomness. We then address the quantification of intrinsic randomness in non-signalling correlations. While this is intractable in general, we identify cases where it can be quantified, and find that in these cases all observed randomness is intrinsic, even when relaxing the measurement independence assumption. We finally turn to the study of the only known resource that allows generating certifiable intrinsic randomness in the laboratory, namely entanglement. We address noisy quantum systems and calculate their entanglement dynamics under decoherence, identifying exact results for several realistic noise models and providing tight bounds in some other cases.

    We conclude by putting our results into perspective, pointing out some drawbacks and future avenues of work in addressing these concerns.
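
    The non-signalling theories referred to throughout are defined by linear constraints on the observed behaviour P(ab|xy); in the standard bipartite notation (ours, for illustration):

        % Non-signalling: each party's marginal is independent of the other
        % party's measurement setting.
        \sum_b P(ab|xy) \;=\; \sum_b P(ab|xy') \;\equiv\; P(a|x)
            \qquad \forall\, a, x, y, y'
        \sum_a P(ab|xy) \;=\; \sum_a P(ab|x'y) \;\equiv\; P(b|y)
            \qquad \forall\, b, y, x, x'

        % Quantum behaviours, P(ab|xy) = \mathrm{tr}[\rho\,(M_{a|x} \otimes N_{b|y})],
        % form a strict subset of the non-signalling set.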

    Distributed Channel Synthesis

    Two familiar notions of correlation are rediscovered as the extreme operating points for distributed synthesis of a discrete memoryless channel, in which a stochastic channel output is generated based on a compressed description of the channel input. Wyner's common information is the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal trade-off between the amount of common randomness used and the required rate of description. We also include a number of related derivations, including the effect of limited local randomness, rate requirements for secrecy, applications to game theory, and new insights into common information duality. Our proof makes use of a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel. The direct proof (achievability) constructs a feasible joint distribution over all parts of the system using a soft covering, from which the behavior of the encoder and decoder is inferred, with no explicit reference to joint typicality or binning. Of auxiliary interest, this work also generalizes and strengthens this soft covering tool.
    Comment: To appear in IEEE Trans. on Information Theory (submitted Aug. 2012, accepted July 2013), 26 pages, using IEEEtran.cls
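
    The two extreme points mentioned above sit on a single rate region. In the notation standard for this problem (a paraphrase in our notation, not quoted from the paper), a description rate R together with a common-randomness rate R_0 suffices to synthesize the channel when there is an auxiliary variable W forming a Markov chain X -- W -- Y such that:

        R \;\ge\; I(X;W), \qquad R + R_0 \;\ge\; I(X,Y;W)

        % Extremes: with no common randomness (R_0 = 0) the minimal description
        % rate is Wyner's common information C(X;Y) = \min_{X-W-Y} I(X,Y;W);
        % with unlimited common randomness it drops to Shannon's I(X;Y).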

    How Quantum Computers Fail: Quantum Codes, Correlations in Physical Systems, and Noise Accumulation

    The feasibility of computationally superior quantum computers is one of the most exciting and clear-cut scientific questions of our time. The question touches on fundamental issues regarding probability, physics, and computability, as well as on exciting problems in experimental physics, engineering, computer science, and mathematics. We propose three related directions towards a negative answer. The first is a conjecture about physical realizations of quantum codes, the second has to do with correlations in stochastic physical systems, and the third proposes a model for quantum evolutions when noise accumulates. The paper is dedicated to the memory of Itamar Pitowsky.
    Comment: 16 pages