
    Entropic Steering Criteria: Applications to Bipartite and Tripartite Systems

    The effect of quantum steering describes a possible action at a distance via local measurements. Although many attempts at characterizing steerability have been pursued, answering the question as to whether a given state is steerable or not remains a difficult task. Here, we investigate the applicability of a recently proposed method for building steering criteria from generalized entropic uncertainty relations. This method works for any entropy that satisfies the properties of (i) (pseudo-)additivity for independent distributions; (ii) a state-independent entropic uncertainty relation (EUR); and (iii) joint convexity of a corresponding relative entropy. Our study extends the former analysis to Tsallis and Rényi entropies on bipartite and tripartite systems. As examples, we investigate the steerability of the three-qubit GHZ and W states. Comment: 27 pages, 8 figures. Published version. Title changed.
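    To make property (i) concrete: the Tsallis entropy of order q is pseudo-additive for independent distributions. A minimal illustration (the standard textbook identity, stated here for orientation rather than quoted from the paper):

        \[
          S_q(A,B) \;=\; S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B),
          \qquad
          S_q(P) \;=\; \frac{1-\sum_k p_k^{\,q}}{q-1},
        \]
        for independent distributions A and B; ordinary (Shannon) additivity is recovered as q → 1.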

    Estimating Mixture Entropy with Pairwise Distances

    Mixture distributions arise in many parametric and non-parametric settings -- for example, in Gaussian mixture models and in non-parametric estimation. It is often necessary to compute the entropy of a mixture, but, in most cases, this quantity has no closed-form expression, making some form of approximation necessary. We propose a family of estimators based on a pairwise distance function between mixture components, and show that this estimator class has many attractive properties. For many distributions of interest, the proposed estimators are efficient to compute, differentiable in the mixture parameters, and become exact when the mixture components are clustered. We prove this family includes lower and upper bounds on the mixture entropy. The Chernoff α-divergence gives a lower bound when chosen as the distance function, with the Bhattacharyya distance providing the tightest lower bound for components that are symmetric and members of a location family. The Kullback-Leibler divergence gives an upper bound when used as the distance function. We provide closed-form expressions of these bounds for mixtures of Gaussians, and discuss their applications to the estimation of mutual information. We then demonstrate that our bounds are significantly tighter than well-known existing bounds using numeric simulations. This estimator class is very useful in optimization problems involving maximization/minimization of entropy and mutual information, such as MaxEnt and rate distortion problems. Comment: Corrects several errata in published version, in particular in Section V (bounds on mutual information)
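    A minimal numerical sketch of the pairwise-distance construction for a one-dimensional Gaussian mixture. The estimator form used below, Ĥ_D = Σ_i w_i H(p_i) − Σ_i w_i ln Σ_j w_j exp(−D(p_i, p_j)), is my reading of the construction described above rather than code from the paper; the function names and example parameters are illustrative, and the Gaussian KL and Bhattacharyya distances are the standard closed forms:

        import numpy as np

        def kl_gauss(m0, s0, m1, s1):
            # Closed-form KL divergence between N(m0, s0^2) and N(m1, s1^2)
            return np.log(s1 / s0) + (s0**2 + (m0 - m1)**2) / (2 * s1**2) - 0.5

        def bhattacharyya_gauss(m0, s0, m1, s1):
            # Closed-form Bhattacharyya distance between two univariate Gaussians
            return (m0 - m1)**2 / (4 * (s0**2 + s1**2)) \
                   + 0.5 * np.log((s0**2 + s1**2) / (2 * s0 * s1))

        def pairwise_estimate(w, mu, s, dist):
            # H_hat = sum_i w_i H(p_i) - sum_i w_i log sum_j w_j exp(-dist(p_i, p_j))
            w, mu, s = map(np.asarray, (w, mu, s))
            comp_H = 0.5 * np.log(2 * np.pi * np.e * s**2)      # entropy of each Gaussian component
            D = np.array([[dist(mu[i], s[i], mu[j], s[j]) for j in range(len(w))]
                          for i in range(len(w))])
            return float(np.sum(w * comp_H) - np.sum(w * np.log(np.exp(-D) @ w)))

        w, mu, s = [0.5, 0.5], [0.0, 3.0], [1.0, 1.0]
        lower = pairwise_estimate(w, mu, s, bhattacharyya_gauss)  # Chernoff/Bhattacharyya: lower bound
        upper = pairwise_estimate(w, mu, s, kl_gauss)             # Kullback-Leibler: upper bound
        print(lower, upper)

    The true mixture entropy then lies between the two printed values, and, as the abstract notes, the estimators become exact when the mixture components are clustered.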

    Unified entropic measures of quantum correlations induced by local measurements

    We introduce measures of quantum correlations based on the minimal change in unified entropies induced by local rank-one projective measurements, divided by a factor that depends on the generalized purity of the system in the case of non-additive entropies. In this way, we overcome the issue of the artificial increase in the value of quantum correlation measures based on non-additive entropies when an uncorrelated ancilla is appended to the system, without changing the computability of our entropic correlation measures with respect to the previous ones. Moreover, we recover as limiting cases the quantum correlation measures based on von Neumann and Rényi entropies (i.e., additive entropies), for which the adjustment factor becomes trivial. In addition, we distinguish between total and semiquantum correlations and obtain some relations between them. Finally, we obtain analytical expressions of the entropic correlation measures for typical quantum bipartite systems. Comment: 10 pages, 1 figure
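    Schematically, the construction described here takes the form below; this is a paraphrase of the abstract rather than the paper's exact definition, and the normalization factor is only indicated generically:

        \[
          D(\rho_{AB}) \;=\; \min_{\Pi^{A}} \,
          \frac{E\big(\Pi^{A}(\rho_{AB})\big) - E(\rho_{AB})}{N(\rho_{AB})},
        \]
        where E is a unified entropy, the minimum runs over local rank-one projective measurements Π^A, and N is a factor depending on the generalized purity, taken to be trivial (N = 1) for additive entropies such as von Neumann and Rényi.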

    State-Dependent Approach to Entropic Measurement-Disturbance Relations

    Heisenberg's intuition was that there should be a tradeoff between measuring a particle's position with greater precision and disturbing its momentum. Recent formulations of this idea have focused on the question of how well two complementary observables can be jointly measured. Here, we provide an alternative approach based on how enhancing the predictability of one observable necessarily disturbs a complementary one. Our measurement-disturbance relation refers to a clear operational scenario and is expressed by entropic quantities with clear statistical meaning. We show that our relation is perfectly tight for all measurement strengths in an existing experimental setup involving qubit measurements. Comment: 9 pages, 2 figures. v4: published version

    Entropy of quantum channel in the theory of quantum information

    Quantum channels, also called quantum operations, are linear, trace-preserving and completely positive transformations on the space of quantum states. Such operations describe the discrete-time evolution of an open quantum system interacting with an environment. The thesis contains an analysis of properties of quantum channels and of the different entropies used to quantify the decoherence introduced into the system by a given operation. Part I of the thesis provides a general introduction to the subject. In Part II, the action of a quantum channel is treated as a process of preparation of a quantum ensemble. The Holevo information associated with this ensemble is shown to be bounded by the entropy exchanged during the preparation process between the initial state and the environment. A relation between the Holevo information and the entropy of an auxiliary matrix consisting of square-root fidelities between the elements of the ensemble is proved in some special cases. Weaker bounds on the Holevo information are also established. The entropy of a channel, also called the map entropy, is defined as the entropy of the state corresponding to the channel under the Jamiolkowski isomorphism. In Part III of the thesis, the additivity of the entropy of a channel is proved. The minimal output entropy, which is difficult to compute, is estimated by the entropy of a channel, which is much easier to obtain. A class of quantum channels is specified for which additivity of the channel capacity is conjectured. The last part of the thesis contains a characterization of Davies channels, which correspond to the interaction of a state with a thermal reservoir in the weak coupling limit, under the conditions of quantum detailed balance and independence of rotational and dissipative evolutions. The Davies channels are characterized for one-qubit and one-qutrit systems.
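    As a concrete illustration of the map entropy mentioned above, the sketch below numerically builds the Jamiolkowski (Choi) state of a single-qubit depolarizing channel and takes its von Neumann entropy; the depolarizing channel and the helper names are illustrative choices, not taken from the thesis:

        import numpy as np

        def depolarizing(rho, p):
            # Single-qubit depolarizing channel: rho -> (1 - p) rho + p Tr(rho) I/2
            return (1 - p) * rho + p * np.trace(rho) * np.eye(2) / 2

        def choi_state(channel, d=2):
            # Jamiolkowski state: (1/d) * sum_{i,j} channel(|i><j|) (x) |i><j|
            J = np.zeros((d * d, d * d), dtype=complex)
            for i in range(d):
                for j in range(d):
                    E_ij = np.zeros((d, d), dtype=complex)
                    E_ij[i, j] = 1.0
                    J += np.kron(channel(E_ij), E_ij)
            return J / d

        def von_neumann_entropy(rho):
            # S(rho) = -Tr[rho log2 rho], computed from the eigenvalues of rho
            evals = np.linalg.eigvalsh(rho)
            evals = evals[evals > 1e-12]
            return float(-np.sum(evals * np.log2(evals)))

        p = 0.5
        rho_choi = choi_state(lambda r: depolarizing(r, p))
        print("map entropy:", von_neumann_entropy(rho_choi))

    For p = 0 the channel is unitary, its Choi state is pure, and the map entropy is 0; the entropy grows with p as the channel becomes noisier.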

    On the quantum Renyi relative entropies and related capacity formulas

    We show that the quantum α-relative entropies with parameter α ∈ (0,1) can be represented as generalized cutoff rates in the sense of [I. Csiszár, IEEE Trans. Inf. Theory 41, 26-34 (1995)], which provides a direct operational interpretation of the quantum α-relative entropies. We also show that various generalizations of the Holevo capacity, defined in terms of the α-relative entropies, coincide for the parameter range α ∈ (0,2], and show an upper bound on the one-shot ε-capacity of a classical-quantum channel in terms of these capacities. Comment: v4: Cutoff rates are treated for correlated hypotheses, some proofs are given in greater detail
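    For reference, the quantum α-relative entropy is conventionally defined (in the Petz-type convention) as follows; attributing exactly this convention to the paper is an assumption on my part:

        \[
          D_{\alpha}(\rho\|\sigma) \;=\; \frac{1}{\alpha-1}\,
          \log \operatorname{Tr}\!\left(\rho^{\alpha}\sigma^{1-\alpha}\right),
          \qquad \alpha \in (0,1),
        \]
        which recovers the Umegaki relative entropy D(ρ‖σ) = Tr[ρ(log ρ − log σ)] in the limit α → 1.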

    A family of generalized quantum entropies: definition and properties

    We present a quantum version of the generalized (h, φ)-entropies, introduced by Salicrú et al. for the study of classical probability distributions. We establish their basic properties and show that already known quantum entropies, such as the von Neumann entropy and the quantum versions of the Rényi, Tsallis, and unified entropies, constitute particular classes of the present general quantum Salicrú form. We exhibit that majorization plays a key role in explaining most of their common features. We give a characterization of the quantum (h, φ)-entropies under the action of quantum operations and study their properties for composite systems. We apply these generalized entropies to the problem of detection of quantum entanglement, and introduce a discussion on possible generalized conditional entropies as well. Facultad de Ciencias Exactas; Instituto de Física La Plata
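    The classical Salicrú form is H_{(h,φ)}(P) = h(Σ_i φ(p_i)); a natural quantum reading, which the sketch below assumes, is h(Tr φ(ρ)) evaluated on the spectrum of the density matrix. The (h, φ) pairings used for the special cases below are the standard classical ones:

        import numpy as np

        def quantum_h_phi_entropy(rho, h, phi):
            # Quantum (h, phi)-entropy: h(Tr[phi(rho)]), computed from the spectrum of rho
            lam = np.linalg.eigvalsh(rho)
            lam = lam[lam > 1e-12]
            return float(h(np.sum(phi(lam))))

        q = 2.0
        von_neumann = lambda rho: quantum_h_phi_entropy(rho, h=lambda x: x, phi=lambda p: -p * np.log(p))
        renyi = lambda rho: quantum_h_phi_entropy(rho, h=lambda x: np.log(x) / (1 - q), phi=lambda p: p**q)
        tsallis = lambda rho: quantum_h_phi_entropy(rho, h=lambda x: (x - 1) / (1 - q), phi=lambda p: p**q)

        rho = np.diag([0.7, 0.3])   # a diagonal qubit state, for illustration
        print(von_neumann(rho), renyi(rho), tsallis(rho))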

    Generalised entropy accumulation

    Consider a sequential process in which each step outputs a system A_i and updates a side information register E. We prove that if this process satisfies a natural "non-signalling" condition between past outputs and future side information, the min-entropy of the outputs A_1, ..., A_n conditioned on the side information E at the end of the process can be bounded from below by a sum of von Neumann entropies associated with the individual steps. This is a generalisation of the entropy accumulation theorem (EAT), which deals with a more restrictive model of side information: there, past side information cannot be updated in subsequent rounds, and newly generated side information has to satisfy a Markov condition. Due to its more general model of side information, our generalised EAT can be applied more easily and to a broader range of cryptographic protocols. As examples, we give the first multi-round security proof for blind randomness expansion and a simplified analysis of the E91 QKD protocol. The proof of our generalised EAT relies on a new variant of Uhlmann's theorem and new chain rules for the Rényi divergence and entropy, which might be of independent interest. Comment: 42 pages; v2 expands introduction but does not change any results; in FOCS 2022
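    Schematically, the bound described above has the following shape; this is a paraphrase for orientation only, suppressing the smoothing parameter and second-order terms (the precise statement in the paper is phrased via min-tradeoff functions):

        \[
          H_{\min}^{\varepsilon}(A_1 \dots A_n \mid E)
          \;\gtrsim\;
          \sum_{i=1}^{n} \inf H(A_i \mid E)_i \;-\; O(\sqrt{n}),
        \]
        where each infimum runs over the states compatible with step i of the process.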