
    Entanglement properties of bound and resonant few-body states

    Studying the physics of quantum correlations has gained new interest after it has become possible to measure entanglement entropies of few-body systems in experiments with ultracold atomic gases. Apart from investigating trapped-atom systems, research on correlation effects in other artificially fabricated few-body systems, such as quantum dots or electromagnetically trapped ions, is currently underway or in planning. Generally, the systems studied in these experiments may be considered as composed of a small number of interacting elements with controllable and highly tunable parameters, effectively described by the Schr\"odinger equation. In this way, parallel theoretical and experimental studies of few-body models become possible, which may provide a deeper understanding of correlation effects and give hints for designing and controlling new experiments. Of particular interest is exploring the physics in the strongly correlated regime and in the neighborhood of critical points. Particle correlations in nanostructures may be characterized by their entanglement spectrum, i.e. the eigenvalues of the reduced density matrix of the system partitioned into two subsystems. We will discuss how to determine the entanglement spectrum and its entropy for few-body systems in bound and resonant states within the same formalism. The linear entropy will be calculated for a model of a quasi-one-dimensional Gaussian quantum dot in the lowest energy states. We will study how the entanglement depends on the parameters of the system, paying particular attention to its behavior on the border between the regimes of bound and resonant states. Comment: 22 pages, 3 figures
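
    As a minimal sketch of the quantities involved (not the paper's actual quantum-dot calculation), the snippet below builds a toy correlated two-particle wavefunction on a grid, obtains the entanglement spectrum from a Schmidt (singular-value) decomposition, and evaluates the linear and von Neumann entropies from it; the grid size and the correlation parameter alpha are illustrative assumptions.

```python
import numpy as np

# Toy two-particle wavefunction psi(x1, x2) on a 1D grid; a correlated
# Gaussian is used here as a stand-in for the paper's quantum-dot model.
n = 200
x = np.linspace(-5.0, 5.0, n)
dx = x[1] - x[0]
x1, x2 = np.meshgrid(x, x, indexing="ij")
alpha = 0.8  # hypothetical correlation parameter
psi = np.exp(-0.5 * (x1**2 + x2**2) - alpha * x1 * x2)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx * dx)  # normalize

# Schmidt decomposition: the singular values of psi(x1, x2) give the
# entanglement spectrum lambda_k (eigenvalues of the reduced density matrix).
svals = np.linalg.svd(psi * dx, compute_uv=False)
lam = svals**2
lam /= lam.sum()  # guard against discretization error

linear_entropy = 1.0 - np.sum(lam**2)              # L = 1 - Tr(rho_red^2)
von_neumann = -np.sum(lam * np.log(lam + 1e-300))  # for comparison

print(f"linear entropy = {linear_entropy:.6f}")
print(f"von Neumann S  = {von_neumann:.6f}")
```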

    Formal Groups and Z-Entropies

    We shall prove that the celebrated R\'enyi entropy is the first example of a new family of infinitely many multi-parametric entropies. We shall call them the Z-entropies. Each of them, under suitable hypotheses, generalizes the celebrated entropies of Boltzmann and R\'enyi. A crucial aspect is that every Z-entropy is composable [1]. This property means that the entropy of a system composed of two or more independent systems depends, over the whole associated probability space, on the choice of the two systems only. Further properties are also required to describe the composition process in terms of a group law. The composability axiom, introduced as a generalization of the fourth Shannon-Khinchin axiom (postulating additivity), is a highly non-trivial requirement. Indeed, in the trace-form class, the Boltzmann entropy and the Tsallis entropy are the only known composable cases. However, in the non-trace-form class, the Z-entropies arise as new entropic functions possessing the mathematical properties necessary for information-theoretical applications, in both classical and quantum contexts. From a mathematical point of view, composability is intimately related to the formal group theory of algebraic topology. The underlying group-theoretical structure crucially determines the statistical properties of the corresponding entropies. Comment: 20 pages, no figures
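
    For a concrete feel of the composability property invoked here, the following sketch numerically checks the standard pseudo-additive composition law of the Tsallis entropy (one of the trace-form composable cases mentioned above) for two independent toy distributions; the distributions and the entropic index q are arbitrary choices, and the snippet does not implement the Z-entropies themselves.

```python
import numpy as np

def tsallis(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

# Two statistically independent subsystems A and B (toy distributions).
pA = np.array([0.5, 0.3, 0.2])
pB = np.array([0.6, 0.4])
pAB = np.outer(pA, pB).ravel()  # joint distribution of the composed system

q = 1.7
sA, sB, sAB = tsallis(pA, q), tsallis(pB, q), tsallis(pAB, q)

# Composition (group) law for Tsallis: Phi(x, y) = x + y + (1 - q) * x * y
phi = sA + sB + (1.0 - q) * sA * sB
print(sAB, phi)  # the two numbers agree
assert np.isclose(sAB, phi)
```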

    Generalized (c,d)-entropy and aging random walks

    Complex systems are often inherently non-ergodic and non-Markovian, and for these Shannon entropy loses its applicability. In particular, accelerating, path-dependent, and aging random walks offer an intuitive picture of such non-ergodic and non-Markovian systems. It was shown that the entropy of non-ergodic systems can still be derived from three of the Shannon-Khinchin axioms by violating the fourth -- the so-called composition axiom. The corresponding entropy is of the form $S_{c,d} \sim \sum_i \Gamma(1+d, 1-c\ln p_i)$ and depends on two system-specific scaling exponents, $c$ and $d$. This entropy contains many recently proposed entropy functionals as special cases, including the Shannon and Tsallis entropies. It was shown that this entropy is relevant for a special class of non-Markovian random walks. In this work we generalize these walks to a much wider class of stochastic systems that can be characterized as `aging' systems. These are systems whose transition rates between states are path- and time-dependent. We show that for particular aging walks $S_{c,d}$ is again the correct extensive entropy. Before the central part of the paper we review the concept of the $(c,d)$-entropy in a self-contained way. Comment: 8 pages, 5 eps figures. arXiv admin note: substantial text overlap with arXiv:1104.207
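
    The following sketch evaluates the quoted functional $S_{c,d} \sim \sum_i \Gamma(1+d, 1-c\ln p_i)$ with SciPy's incomplete gamma functions and checks that for $c = d = 1$ it recovers the Shannon entropy up to affine constants; the overall $(c,d)$-dependent normalization used by the authors is omitted, and the test distribution is arbitrary.

```python
import numpy as np
from scipy.special import gamma, gammaincc

def entropy_cd(p, c, d):
    """S_{c,d} ~ sum_i Gamma(1+d, 1 - c*ln p_i), up to (c,d)-dependent
    normalization constants that are omitted in this sketch."""
    p = np.asarray(p, dtype=float)
    x = 1.0 - c * np.log(p)
    # Upper incomplete gamma: Gamma(a, x) = gamma(a) * gammaincc(a, x)
    return np.sum(gamma(1.0 + d) * gammaincc(1.0 + d, x))

p = np.array([0.4, 0.3, 0.2, 0.1])
shannon = -np.sum(p * np.log(p))

# For c = d = 1 the sum equals (2 + Shannon)/e, i.e. Shannon up to constants.
s11 = entropy_cd(p, c=1.0, d=1.0)
assert np.isclose(np.e * s11 - 2.0, shannon)
print(f"S_(1,1) = {s11:.6f},  recovered Shannon = {np.e * s11 - 2.0:.6f}")
```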

    Methods for calculating nonconcave entropies

    Five different methods that can be used to analytically calculate entropies which are nonconcave as functions of the energy in the thermodynamic limit are discussed and compared. The five methods are based on the following ideas and techniques: i) microcanonical contraction, ii) metastable branches of the free energy, iii) generalized canonical ensembles, with specific illustrations involving the so-called Gaussian and Betrag ensembles, iv) the restricted canonical ensemble, and v) the inverse Laplace transform. A simple long-range spin model having a nonconcave entropy is used to illustrate each method. Comment: v1: 22 pages, IOP style, 7 color figures, contribution to the JSTAT special issue on long-range interacting systems. v2: Open problem and references added, minor typos corrected, close to published version
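
    To illustrate the basic difficulty these methods address, the toy calculation below takes a hypothetical nonconcave entropy $s(e)$ (not the paper's spin model), computes the canonical free energy by a Legendre-Fenchel transform, and transforms back: the result is only the concave envelope of $s(e)$, so the nonconcave part is lost and must be recovered by methods such as those listed above.

```python
import numpy as np

# Toy nonconcave microcanonical entropy s(e); hypothetical profile,
# not the long-range spin model used in the paper.
e = np.linspace(-1.0, 1.0, 2001)
s = -e**4 + 0.5 * e**2  # locally convex near e = 0, hence nonconcave

# Canonical free energy from the Legendre-Fenchel transform of s(e):
#   phi(beta) = inf_e [ beta*e - s(e) ]
beta = np.linspace(-3.0, 3.0, 1201)
phi = np.min(beta[:, None] * e[None, :] - s[None, :], axis=1)

# Transforming back yields only the concave envelope of s(e):
#   s**(e) = inf_beta [ beta*e - phi(beta) ]
s_env = np.min(beta[:, None] * e[None, :] - phi[:, None], axis=0)

gap = np.max(s_env - s)  # strictly positive where s is nonconcave
print(f"max(s** - s) = {gap:.4f}  (> 0: the nonconcave dip is lost)")
```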

    Beyond the Shannon-Khinchin Formulation: The Composability Axiom and the Universal Group Entropy

    The notion of entropy is ubiquitous in both the natural and the social sciences. In the last two decades, a considerable effort has been devoted to the study of new entropic forms, which generalize the standard Boltzmann-Gibbs (BG) entropy and are widely applicable in thermodynamics, quantum mechanics and information theory. In [23], by extending previous ideas of Shannon [38], [39], Khinchin proposed an axiomatic definition of the BG entropy, based on four requirements, nowadays known as the Shannon-Khinchin (SK) axioms. The purpose of this paper is twofold. First, we show that there exists an intrinsic group-theoretical structure behind the notion of entropy. It comes from the requirement of composability of an entropy with respect to the union of two statistically independent subsystems, which we propose in an axiomatic formulation. Second, we show that there exists a simple universal class of admissible entropies. This class contains many well-known examples of entropies and infinitely many new ones, a priori multi-parametric. Due to its specific relation with the universal formal group, the new family of entropies introduced in this work will be called the universal-group entropy. A new example of a multi-parametric entropy is explicitly constructed. Comment: Extended version; 25 pages
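
    As a minimal numerical illustration of the composability requirement $S(A \cup B) = \Phi(S(A), S(B))$ for statistically independent subsystems, the sketch below checks that the R\'enyi entropy composes with the additive group law $\Phi(x, y) = x + y$; the distributions and the order $\alpha$ are arbitrary, and the code does not construct the universal-group entropy itself.

```python
import numpy as np

def renyi(p, alpha):
    """Renyi entropy S_alpha(p) = ln(sum_i p_i^alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p**alpha)) / (1.0 - alpha)

pA = np.array([0.7, 0.2, 0.1])
pB = np.array([0.5, 0.25, 0.25])
pAB = np.outer(pA, pB).ravel()  # union of two independent subsystems

alpha = 2.5
# Composability with the additive group law Phi(x, y) = x + y:
assert np.isclose(renyi(pAB, alpha), renyi(pA, alpha) + renyi(pB, alpha))
print("Renyi entropy composes additively for independent subsystems")
```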

    Entropic Uncertainty Relations in Quantum Physics

    Uncertainty relations have become the trademark of quantum theory since they were formulated by Bohr and Heisenberg. This review covers various generalizations and extensions of the uncertainty relations in quantum theory that involve the R\'enyi and the Shannon entropies. The advantages of these entropic uncertainty relations are pointed out and their more direct connection to the observed phenomena is emphasized. Several remaining open problems are mentioned. Comment: 35 pages, review paper
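
    A small numerical check of one relation of the kind reviewed, the Maassen-Uffink bound $H(A) + H(B) \ge -\ln \max_{j,k} |\langle a_j|b_k\rangle|^2$ for Shannon entropies, is sketched below for Pauli Z and X measurements on a random qubit state; the state and the random seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def shannon(p):
    """Shannon entropy of a probability vector (natural log)."""
    p = p[p > 1e-15]
    return -np.sum(p * np.log(p))

# Random pure qubit state.
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

# Measurement bases: Pauli Z eigenbasis and Pauli X eigenbasis (columns).
Z_basis = np.eye(2)
X_basis = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

pZ = np.abs(Z_basis.conj().T @ psi) ** 2
pX = np.abs(X_basis.conj().T @ psi) ** 2

# Maassen-Uffink bound: H(Z) + H(X) >= -ln max_jk |<z_j|x_k>|^2 = ln 2 here.
overlap = np.max(np.abs(Z_basis.conj().T @ X_basis) ** 2)
lhs = shannon(pZ) + shannon(pX)
rhs = -np.log(overlap)
print(f"H(Z) + H(X) = {lhs:.4f} >= {rhs:.4f}")
assert lhs >= rhs - 1e-12
```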

    Group entropies, correlation laws and zeta functions

    The notion of group entropy is proposed. It makes it possible to unify and generalize many different definitions of entropy known in the literature, such as those of Boltzmann-Gibbs, Tsallis, Abe and Kaniadakis. Other new entropic functionals are presented, related to nontrivial correlation laws characterizing universality classes of systems out of equilibrium, when the dynamics is weakly chaotic. The associated thermostatistics are discussed. The mathematical structure underlying our construction is that of formal group theory, which provides the general structure of the correlations among particles and dictates the associated entropic functionals. As an example of application, the role of group entropies in information theory is illustrated and generalizations of the Kullback-Leibler divergence are proposed. A new connection between statistical mechanics and zeta functions is established. In particular, the Tsallis entropy is related to the classical Riemann zeta function. Comment: to appear in Physical Review
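
    As one concrete instance of a generalized Kullback-Leibler divergence (the standard Tsallis relative entropy built from the q-logarithm, which is not necessarily the exact generalization proposed in the paper), the sketch below computes $D_q(p\|r) = -\sum_i p_i \ln_q(r_i/p_i)$ for $q = 1$ (the ordinary KL divergence) and $q = 2$, and checks that it vanishes for identical distributions.

```python
import numpy as np

def log_q(x, q):
    """Tsallis q-logarithm; reduces to ln(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (x**(1.0 - q) - 1.0) / (1.0 - q)

def kl_q(p, r, q):
    """Tsallis relative entropy D_q(p||r) = -sum_i p_i ln_q(r_i / p_i)."""
    p, r = np.asarray(p, dtype=float), np.asarray(r, dtype=float)
    return -np.sum(p * log_q(r / p, q))

p = np.array([0.5, 0.3, 0.2])
r = np.array([0.4, 0.4, 0.2])

print(kl_q(p, r, q=1.0))                    # ordinary Kullback-Leibler divergence
print(kl_q(p, r, q=2.0))                    # a q-deformed generalization
assert np.isclose(kl_q(p, p, q=2.0), 0.0)   # vanishes when p == r
```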