    Using quantum theory to reduce the complexity of input-output processes

    All natural things process and transform information. They receive environmental information as input, and transform it into appropriate output responses. Much of science is dedicated to building models of such systems -- algorithmic abstractions of their input-output behavior that allow us to simulate how such systems can behave in the future, conditioned on what has transpired in the past. Here, we show that classical models cannot avoid inefficiency -- storing past information that is unnecessary for correct future simulation. We construct quantum models that mitigate this waste, whenever it is physically possible to do so. This suggests that the complexity of general input-output processes depends fundamentally on what sort of information theory we use to describe them.
    Comment: 10 pages, 5 figures
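
    The memory comparison at stake here can be written compactly for the simpler, input-free processes that this framework generalises; the notation below is my summary of that standard setting, not an excerpt from the paper. The minimal classical model stores the causal state σ occurring with stationary probability π_σ, while a quantum model may encode causal states into generally non-orthogonal vectors |σ⟩, so its memory is measured by a von Neumann entropy that can only be smaller:

```latex
\[
  C_\mu \;=\; H(\pi) \;=\; -\sum_{\sigma}\pi_\sigma\log_2\pi_\sigma ,
  \qquad
  C_q \;=\; S(\rho) \;=\; -\operatorname{Tr}\bigl(\rho\log_2\rho\bigr),
  \quad
  \rho \;=\; \sum_{\sigma}\pi_\sigma\,\lvert\sigma\rangle\!\langle\sigma\rvert ,
  \qquad
  C_q \;\le\; C_\mu .
\]
```

    Equality holds only when the quantum memory states are mutually orthogonal; the gap between the two entropies is the waste the abstract refers to.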

    Guaranteed energy-efficient bit reset in finite time

    Landauer's principle states that it costs at least kT ln 2 of work to reset one bit in the presence of a heat bath at temperature T. The bound of kT ln 2 is achieved in the unphysical infinite-time limit. Here we ask what is possible if one is restricted to finite-time protocols. We prove analytically that it is possible to reset a bit with a work cost close to kT ln 2 in a finite time. We construct an explicit protocol that achieves this, which involves changing the system's Hamiltonian while avoiding quantum coherences, and thermalising. Using concepts and techniques pertaining to single-shot statistical mechanics, we further develop the limit on the work cost, proving that the heat dissipated is close to the minimum possible not just on average, but with high confidence in every single run. Moreover, we exploit the protocol to design a quantum heat engine that works near the Carnot efficiency in finite time.
    Comment: 5 pages + 5-page technical appendix. 5 figures. Author accepted version
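
    To put numbers on the bound being discussed, a few lines of arithmetic suffice; the temperatures below are illustrative choices, not values taken from the paper.

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K (exact SI value)
T = 300.0               # illustrative bath temperature in kelvin

# Landauer bound: minimum average work to reset one bit at temperature T.
landauer_J = k_B * T * math.log(2)
landauer_eV = landauer_J / 1.602176634e-19

print(f"kT ln 2 at {T:.0f} K: {landauer_J:.3e} J  (~{landauer_eV * 1000:.1f} meV)")

# Carnot efficiency for a hypothetical engine running between two illustrative
# reservoirs, the benchmark the finite-time engine in the paper is compared against.
T_hot, T_cold = 400.0, 300.0
print(f"Carnot efficiency between {T_hot:.0f} K and {T_cold:.0f} K: {1 - T_cold / T_hot:.2f}")
```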

    The classical-quantum divergence of complexity in modelling spin chains

    The minimal memory required to model a given stochastic process - known as the statistical complexity - is a widely adopted quantifier of structure in complexity science. Here, we ask if quantum mechanics can fundamentally change the qualitative behaviour of this measure. We study this question in the context of the classical Ising spin chain. In this system, the statistical complexity is known to grow monotonically with temperature. We evaluate the spin chain's quantum mechanical statistical complexity by explicitly constructing its provably simplest quantum model, and demonstrate that this measure exhibits drastically different behaviour: it rises to a maximum at some finite temperature, then tends back towards zero for higher temperatures. This demonstrates how complexity, as captured by the amount of memory required to model a process, can exhibit radically different behaviour when quantum processing is allowed.
    Comment: 9 pages, 3 figures; comments are welcome
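
    A minimal numerical sketch of this comparison, assuming the textbook transfer-matrix treatment of a one-dimensional Ising chain with coupling J and field B, the two spin values as causal states, and the usual square-root ("q-machine") encoding of those states into a qubit memory. The parameters and the encoding are my assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

def complexities(J=1.0, B=0.3, T=1.0):
    """Classical (C_mu) and quantum (C_q) statistical complexity, in bits, of
    reading out a 1D Ising chain spin by spin (transfer-matrix Markov model)."""
    beta = 1.0 / T
    s = np.array([+1.0, -1.0])
    # Transfer matrix: TM[a, b] = exp(beta * (J*s_a*s_b + B*(s_a + s_b)/2))
    TM = np.exp(beta * (J * np.outer(s, s) + B * (s[:, None] + s[None, :]) / 2))
    lam, vecs = np.linalg.eigh(TM)
    lam_max, phi = lam[-1], np.abs(vecs[:, -1])        # leading eigenvalue / eigenvector
    P = TM * phi[None, :] / (lam_max * phi[:, None])   # P[a, b] = Prob(next spin b | spin a)
    pi = phi**2 / np.sum(phi**2)                       # stationary distribution of spins

    H = lambda p: -np.sum(p[p > 0] * np.log2(p[p > 0]))
    C_mu = H(pi)                                       # entropy of the classical causal states

    # q-machine-style memory states |sigma_a> = sum_b sqrt(P[a, b]) |b>; they are
    # non-orthogonal whenever the two conditional futures overlap.
    kets = np.sqrt(P)
    rho = sum(pi[a] * np.outer(kets[a], kets[a]) for a in range(2))
    C_q = H(np.linalg.eigvalsh(rho))                   # von Neumann entropy of the memory
    return C_mu, C_q

for T in (0.5, 1.0, 2.0, 5.0, 20.0):
    C_mu, C_q = complexities(T=T)
    print(f"T = {T:5.1f}:  C_mu = {C_mu:.3f} bits,  C_q = {C_q:.3f} bits")
```

    Under these toy assumptions the classical memory climbs towards one bit with temperature while the quantum memory rises and then decays, the qualitative divergence described above.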

    Maximum one-shot dissipated work from Rényi divergences

    Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics.
    Comment: 8 pages. Close to published version
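
    The order-infinity Rényi divergence singled out here has a simple closed form, D_∞(P‖Q) = log max_i p_i/q_i. The sketch below evaluates D_α for a toy pair of distributions and shows it increasing towards that worst-case value as α grows; the distributions are illustrative, and the identification with dissipated work follows the abstract's statement rather than any formula reproduced from the paper.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Order-alpha Renyi divergence D_alpha(p || q), in bits, for finite alpha != 1."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log2(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    """Order-1 limit: the ordinary relative entropy, in bits."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def max_divergence(p, q):
    """Order-infinity limit: D_inf(p || q) = log2 max_i p_i / q_i."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return np.log2(np.max(p[p > 0] / q[p > 0]))

# Illustrative forward and reference distributions over four possible work values.
p = np.array([0.50, 0.30, 0.15, 0.05])
q = np.array([0.25, 0.25, 0.25, 0.25])

for alpha in (0.5, 2.0, 10.0, 100.0):
    print(f"D_{alpha:g}   = {renyi_divergence(p, q, alpha):.4f} bits")
print(f"D_1     = {kl_divergence(p, q):.4f} bits (average-case quantity)")
print(f"D_inf   = {max_divergence(p, q):.4f} bits (worst-case / one-shot quantity)")
```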

    Characterization of the probabilistic models that can be embedded in quantum theory

    Quantum bits can be isolated to perform useful information-theoretic tasks, even though physical systems are fundamentally described by very high-dimensional operator algebras. This is because qubits can be consistently embedded into higher-dimensional Hilbert spaces. A similar embedding of classical probability distributions into quantum theory enables the emergence of classical physics via decoherence. Here, we ask which other probabilistic models can similarly be embedded into finite-dimensional quantum theory. We show that the embeddable models are exactly those that correspond to the Euclidean special Jordan algebras: quantum theory over the reals, the complex numbers, or the quaternions, and "spin factors" (qubits with more than three degrees of freedom), and direct sums thereof. Among those, only classical and standard quantum theory with superselection rules can arise from a physical decoherence map. Our results have significant consequences for some experimental tests of quantum theory, by clarifying how they could (or could not) falsify it. Furthermore, they imply that all unrestricted non-classical models must be contextual.
    Comment: 6 pages, 0 figures
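
    The classical-into-quantum embedding used as a reference point here (classical physics emerging via decoherence) can be made concrete in a few lines: a probability distribution embeds as a diagonal density matrix, and the completely dephasing map projects any state onto that classical subspace while preserving the embedded statistics. This is only the textbook special case, not the Jordan-algebraic characterisation derived in the paper.

```python
import numpy as np

def embed_classical(p):
    """Embed a classical probability vector as a diagonal density matrix."""
    return np.diag(np.asarray(p, float))

def dephase(rho):
    """Completely dephasing (decoherence) map in the computational basis:
    D(rho) = sum_i |i><i| rho |i><i|, i.e. keep only the diagonal entries."""
    return np.diag(np.diag(rho))

# A classical three-outcome distribution, embedded into a qutrit.
p = np.array([0.6, 0.3, 0.1])
rho = embed_classical(p)

# Any quantum state decoheres to a valid classical embedding...
psi = np.array([1, 1j, 1]) / np.sqrt(3)
rho_coherent = np.outer(psi, psi.conj())
print(np.real(np.diag(dephase(rho_coherent))))   # -> classical distribution [1/3, 1/3, 1/3]

# ...and embedded classical states keep their statistics and are fixed points of the map.
print(np.allclose(np.diag(rho), p))              # True
print(np.allclose(dephase(rho), rho))            # True
```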

    Introducing one-shot work into fluctuation relations

    Two approaches to small-scale and quantum thermodynamics are fluctuation relations and one-shot statistical mechanics. Fluctuation relations (such as Crooks' Theorem and Jarzynski's Equality) relate nonequilibrium behaviors to equilibrium quantities such as free energy. One-shot statistical mechanics involves statements about every run of an experiment, not just about averages over trials. We investigate the relation between the two approaches. We show that both approaches feature the same notions of work and the same notions of probability distributions over possible work values. The two approaches are alternative toolkits with which to analyze these distributions. To combine the toolkits, we show how one-shot work quantities can be defined and bounded in contexts governed by Crooks' Theorem. These bounds provide a new bridge from one-shot theory to experiments originally designed for testing fluctuation theorems.
    Comment: 37 pages, 6 figures
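
    A toy numerical check of the fluctuation-relation side of this combination, assuming a sudden quench of a single two-level system prepared in thermal equilibrium; the energies, temperature, and protocol are my illustrative choices, used only to make the work distribution, Jarzynski's equality ⟨e^{-βW}⟩ = e^{-βΔF}, and a worst-case (one-shot style) work value explicit.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0                      # inverse temperature (k_B = 1)

# Two-level system: energies before and after an instantaneous quench.
E_initial = np.array([0.0, 1.0])
E_final   = np.array([0.0, 2.0])

def free_energy(E):
    return -np.log(np.sum(np.exp(-beta * E))) / beta

# Sample the initial microstate from the thermal distribution; for a sudden
# quench the work in each run is just the energy change of that microstate.
p_init = np.exp(-beta * E_initial)
p_init /= p_init.sum()
states = rng.choice(2, size=200_000, p=p_init)
work = E_final[states] - E_initial[states]

delta_F = free_energy(E_final) - free_energy(E_initial)
print(f"<W>            = {work.mean():.4f}   (dissipation <W> - dF = {work.mean() - delta_F:.4f})")
print(f"<exp(-beta W)> = {np.mean(np.exp(-beta * work)):.4f}")
print(f"exp(-beta dF)  = {np.exp(-beta * delta_F):.4f}   (Jarzynski: the two should agree)")
print(f"largest work in any single run = {work.max():.4f}  (a one-shot, worst-case quantity)")
```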

    Increased plasticity of the bodily self in eating disorders

    Background: The rubber hand illusion (RHI) has been widely used to investigate the bodily self in healthy individuals. The aim of the present study was to extend the use of the RHI to examine the bodily self in eating disorders. Methods: The RHI and self-report measures of eating disorder psychopathology (EDI-3 subscales of Drive for Thinness, Bulimia, Body Dissatisfaction, Interoceptive Deficits, and Emotional Dysregulation; DASS-21; and the Self-Objectification Questionnaire) were administered to 78 individuals with an eating disorder and 61 healthy controls. Results: Individuals with an eating disorder experienced the RHI significantly more strongly than healthy controls on both perceptual (i.e., proprioceptive drift) and subjective (self-report questionnaire) measures. Furthermore, both the subjective experience of the RHI and associated proprioceptive biases were correlated with eating disorder psychopathology. Approximately 20% of the variance for embodiment of the fake hand was accounted for by eating disorder psychopathology, with interoceptive deficits and self-objectification emerging as significant predictors of embodiment. Conclusions: These results indicate that the bodily self is more plastic in people with an eating disorder. These findings may shed light on both aetiological and maintenance factors involved in eating disorders, particularly visual processing of the body, interoceptive deficits, and self-objectification.

    Surveying structural complexity in quantum many-body systems

    Quantum many-body systems exhibit a rich and diverse range of exotic behaviours, owing to their underlying non-classical structure. These systems present deep structure beyond what can be captured by measures of correlation and entanglement alone. Using tools from complexity science, we characterise such structure. We investigate the structural complexities that can be found within the patterns that manifest from the observational data of these systems. In particular, using two prototypical quantum many-body systems as test cases - the one-dimensional quantum Ising and Bose-Hubbard models - we explore how different information-theoretic measures of complexity are able to identify different features of such patterns. This work furthers the understanding of fully-quantum notions of structure and complexity in quantum systems and dynamics.
    Comment: 9 pages, 5 figures
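
    One simple instance of an information-theoretic structure measure estimated from observational data is the excess entropy obtained from block entropies of a measurement record. The sketch below applies it to a synthetic binary sequence; the generator, block lengths, and the choice of measure are placeholders for illustration, not the models or complexity measures used in the paper.

```python
import numpy as np
from collections import Counter

def block_entropy(seq, L):
    """Shannon entropy (bits) of length-L blocks in a symbol sequence."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    p = np.array(list(counts.values()), float)
    p /= p.sum()
    return -np.sum(p * np.log2(p))

# Synthetic measurement record: a two-state Markov chain as a stand-in for
# single-site readout of a many-body system (purely illustrative).
rng = np.random.default_rng(1)
p_flip = 0.2
seq, s = [], 0
for _ in range(100_000):
    s = 1 - s if rng.random() < p_flip else s
    seq.append(s)

H = [block_entropy(seq, L) for L in range(1, 9)]
h_mu = H[-1] - H[-2]                              # entropy-rate estimate from the longest blocks
E = [H[L - 1] - L * h_mu for L in range(1, 9)]    # excess-entropy estimates E(L) = H(L) - L*h_mu
print(f"entropy rate   ~ {h_mu:.3f} bits/symbol")
print(f"excess entropy ~ {E[-1]:.3f} bits (converges as the block length grows)")
```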