
    Using quantum theory to reduce the complexity of input-output processes

    All natural things process and transform information. They receive environmental information as input, and transform it into appropriate output responses. Much of science is dedicated to building models of such systems -- algorithmic abstractions of their input-output behavior that allow us to simulate how such systems can behave in the future, conditioned on what has transpired in the past. Here, we show that classical models cannot avoid inefficiency -- storing past information that is unnecessary for correct future simulation. We construct quantum models that mitigate this waste, whenever it is physically possible to do so. This suggests that the complexity of general input-output processes depends fundamentally on what sort of information theory we use to describe them. Comment: 10 pages, 5 figures
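    As background (standard definitions from computational mechanics, not restated in the abstract): for the simpler case of an autonomous stochastic process with causal states \sigma_j occurring with stationary probabilities \pi_j, the classical and quantum memory costs compared in this line of work are

        C_\mu = -\sum_j \pi_j \log_2 \pi_j , \qquad C_q = S(\rho) = -\mathrm{Tr}\,\rho \log_2 \rho , \quad \rho = \sum_j \pi_j \, |\sigma_j\rangle\langle\sigma_j| ,

    where the quantum memory states |\sigma_j\rangle need not be orthogonal, which is why C_q can fall below C_\mu; this paper extends such compression from autonomous processes to processes driven by inputs.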

    Characterization of the probabilistic models that can be embedded in quantum theory

    Quantum bits can be isolated to perform useful information-theoretic tasks, even though physical systems are fundamentally described by very high-dimensional operator algebras. This is because qubits can be consistently embedded into higher-dimensional Hilbert spaces. A similar embedding of classical probability distributions into quantum theory enables the emergence of classical physics via decoherence. Here, we ask which other probabilistic models can similarly be embedded into finite-dimensional quantum theory. We show that the embeddable models are exactly those that correspond to the Euclidean special Jordan algebras: quantum theory over the reals, the complex numbers, or the quaternions, and "spin factors" (qubits with more than three degrees of freedom), and direct sums thereof. Among those, only classical and standard quantum theory with superselection rules can arise from a physical decoherence map. Our results have significant consequences for some experimental tests of quantum theory, by clarifying how they could (or could not) falsify it. Furthermore, they imply that all unrestricted non-classical models must be contextual. Comment: 6 pages, 0 figures
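    The classical-into-quantum embedding alluded to above is the textbook one: a probability distribution (p_1, ..., p_n) is represented by a diagonal density matrix, and a decoherence map projects arbitrary states back onto that diagonal,

        \rho_{\mathrm{cl}} = \sum_i p_i \, |i\rangle\langle i| , \qquad \mathcal{D}(\rho) = \sum_i |i\rangle\langle i| \, \rho \, |i\rangle\langle i| .

    The question addressed in the paper is which other probabilistic theories admit an analogous pair of embedding and decoherence maps into finite-dimensional quantum theory.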

    Guaranteed energy-efficient bit reset in finite time

    Landauer's principle states that it costs at least kT ln 2 of work to reset one bit in the presence of a heat bath at temperature T. The bound of kT ln 2 is achieved in the unphysical infinite-time limit. Here we ask what is possible if one is restricted to finite-time protocols. We prove analytically that it is possible to reset a bit with a work cost close to kT ln 2 in a finite time. We construct an explicit protocol that achieves this, which involves changing the system's Hamiltonian while avoiding quantum coherences, and thermalising. Using concepts and techniques pertaining to single-shot statistical mechanics, we further develop the limit on the work cost, proving that the heat dissipated is close to the minimal possible not just on average, but guaranteed with high confidence in every run. Moreover we exploit the protocol to design a quantum heat engine that works near the Carnot efficiency in finite time. Comment: 5 pages + 5 page technical appendix. 5 figures. Author accepted version
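    As a numerical aside on why kT ln 2 is only reached quasistatically, the sketch below implements the textbook staged-erasure protocol for a two-level system: raise the energy of the unwanted level in small increments, fully rethermalising after each one, and add up the work. It illustrates the Landauer bound only; it is not the finite-time protocol of the paper, nor its single-shot guarantees, and the function name and parameter values are ours.

        import numpy as np

        def reset_work(n_steps, E_max, kT=1.0):
            """Average work to reset a symmetric bit by raising the energy of
            level 1 from 0 to E_max in n_steps equal increments, fully
            rethermalising after each increment (textbook staged erasure)."""
            energies = np.linspace(0.0, E_max, n_steps + 1)
            p1 = 0.5                                   # start maximally mixed
            work = 0.0
            for E_old, E_new in zip(energies[:-1], energies[1:]):
                work += p1 * (E_new - E_old)           # work to lift the occupied level
                p1 = 1.0 / (1.0 + np.exp(E_new / kT))  # Gibbs occupation of level 1
            return work

        kT = 1.0
        print("kT ln 2 :", kT * np.log(2))
        for n in (10, 100, 1000):
            print(n, "steps:", reset_work(n, 20.0 * kT, kT))

    The work overhead above kT ln 2 shrinks as the number of steps grows, recovering the bound only in the quasistatic limit; the paper's point is that a carefully designed finite-time protocol can get close to the bound with a guarantee in every run, not just on average.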

    The classical-quantum divergence of complexity in modelling spin chains

    The minimal memory required to model a given stochastic process - known as the statistical complexity - is a widely adopted quantifier of structure in complexity science. Here, we ask if quantum mechanics can fundamentally change the qualitative behaviour of this measure. We study this question in the context of the classical Ising spin chain. In this system, the statistical complexity is known to grow monotonically with temperature. We evaluate the spin chain's quantum mechanical statistical complexity by explicitly constructing its provably simplest quantum model, and demonstrate that this measure exhibits drastically different behaviour: it rises to a maximum at some finite temperature, then tends back towards zero for higher temperatures. This demonstrates how complexity, as captured by the amount of memory required to model a process, can exhibit radically different behaviour when quantum processing is allowed. Comment: 9 pages, 3 figures, comments are welcome
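    The qualitative rise-and-fall can be reproduced with a short calculation on the spatial Markov chain induced by the transfer matrix of a 1D Ising chain with nearest-neighbour coupling J and external field B (illustrative parameter values and a simple non-orthogonal causal-state encoding, both our choices; the paper instead constructs its provably minimal quantum model):

        import numpy as np

        def ising_complexities(J, B, beta):
            """Classical (C_mu) vs quantum (C_q) memory, in bits, for the spatial
            Markov chain of a 1D Ising chain with coupling J and field B at
            inverse temperature beta.  Illustrative encoding only."""
            s = np.array([1.0, -1.0])
            # Symmetric transfer matrix  T[s, s'] = exp(beta*(J*s*s' + B*(s + s')/2))
            T = np.exp(beta * (J * np.outer(s, s) + B * (s[:, None] + s[None, :]) / 2))
            vals, vecs = np.linalg.eigh(T)
            lam, phi = vals[-1], np.abs(vecs[:, -1])
            P = T * phi[None, :] / (lam * phi[:, None])   # P[s, s'] = P(next = s' | last = s)
            pi = phi**2 / np.sum(phi**2)                  # stationary spin distribution

            def entropy(p):                               # Shannon / von Neumann entropy in bits
                p = p[p > 1e-12]
                return float(-np.sum(p * np.log2(p)))

            sigma = np.sqrt(P)                            # |sigma_s> = sum_s' sqrt(P(s'|s)) |s'>
            rho = sum(pi[i] * np.outer(sigma[i], sigma[i]) for i in range(2))
            return entropy(pi), entropy(np.linalg.eigvalsh(rho))

        for kT in (0.2, 0.5, 1.0, 2.0, 5.0, 20.0):
            C_mu, C_q = ising_complexities(J=1.0, B=0.3, beta=1.0 / kT)
            print(f"kT = {kT:5.1f}   C_mu = {C_mu:.3f}   C_q = {C_q:.3f}")

    Here C_mu is the entropy of the stationary distribution over causal states (the last spin), while C_q is the von Neumann entropy of the quantum memory built from the non-orthogonal states |sigma_s>; the overlap of those states grows both at low temperature (ordered chain) and at high temperature (random chain), which is what pulls C_q back down.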

    Maximum one-shot dissipated work from Rényi divergences

    Thermodynamics describes large-scale, slowly evolving systems. Two modern approaches generalize thermodynamics: fluctuation theorems, which concern finite-time nonequilibrium processes, and one-shot statistical mechanics, which concerns small scales and finite numbers of trials. Combining these approaches, we calculate a one-shot analog of the average dissipated work defined in fluctuation contexts: the cost of performing a protocol in finite time instead of quasistatically. The average dissipated work has been shown to be proportional to a relative entropy between phase-space densities, to a relative entropy between quantum states, and to a relative entropy between probability distributions over possible values of work. We derive one-shot analogs of all three equations, demonstrating that the order-infinity Rényi divergence is proportional to the maximum possible dissipated work in each case. These one-shot analogs of fluctuation-theorem results contribute to the unification of these two toolkits for small-scale, nonequilibrium statistical physics. Comment: 8 pages. Close to published version
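    Schematically, in the work-distribution setting (our paraphrase; analogous statements hold for phase-space densities and for quantum states): with P the forward work distribution, Q the distribution of -W in the reverse process, beta = 1/kT and W_diss = W - Delta F,

        \beta \langle W_{\mathrm{diss}} \rangle = D(P\|Q) = \sum_W P(W) \ln\frac{P(W)}{Q(W)} , \qquad \beta\, W_{\mathrm{diss}}^{\max} = D_\infty(P\|Q) = \ln \max_W \frac{P(W)}{Q(W)} ,

    where the maximum runs over work values that occur with nonzero forward probability.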

    Introducing one-shot work into fluctuation relations

    Two approaches to small-scale and quantum thermodynamics are fluctuation relations and one-shot statistical mechanics. Fluctuation relations (such as Crooks' Theorem and Jarzynski's Equality) relate nonequilibrium behaviors to equilibrium quantities such as free energy. One-shot statistical mechanics involves statements about every run of an experiment, not just about averages over trials. We investigate the relation between the two approaches. We show that both approaches feature the same notions of work and the same notions of probability distributions over possible work values. The two approaches are alternative toolkits with which to analyze these distributions. To combine the toolkits, we show how one-shot work quantities can be defined and bounded in contexts governed by Crooks' Theorem. These bounds provide a new bridge from one-shot theory to experiments originally designed for testing fluctuation theorems. Comment: 37 pages, 6 figures
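    For reference, the two named fluctuation relations, with W the work in a single run, Delta F the equilibrium free-energy difference and beta = 1/kT:

        \frac{P_{\mathrm{fwd}}(W)}{P_{\mathrm{rev}}(-W)} = e^{\beta (W - \Delta F)} \quad \text{(Crooks)} , \qquad \big\langle e^{-\beta W} \big\rangle = e^{-\beta \Delta F} \quad \text{(Jarzynski)} .

    One-shot quantities, by contrast, bound the work in (almost) every run, up to a chosen failure probability, rather than its average or its exponential average.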

    Memory-efficient tracking of complex temporal and symbolic dynamics with quantum simulators

    Tracking the behaviour of stochastic systems is a crucial task in the statistical sciences. It has recently been shown that quantum models can faithfully simulate such processes whilst retaining less information about the past behaviour of the system than the optimal classical models. We extend these results to general temporal and symbolic dynamics. Our systematic protocol for quantum model construction relies only on an elementary description of the dynamics of the process. This circumvents restrictions on corresponding classical construction protocols, and allows for a broader range of processes to be modelled efficiently. We illustrate our method with an example exhibiting an apparent unbounded memory advantage of the quantum model compared to its optimal classical counterpart. Comment: 15 pages, 5 figures
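    As background for readers new to this line of work (one standard discrete-time construction, not a restatement of the paper's more general protocol): given a unifilar classical model with internal states j that emit symbol x with probability P(x|j) and then update deterministically to state \lambda(j, x), quantum memory states |\sigma_j\rangle are fixed by requiring a single unitary U with

        U\, |\sigma_j\rangle |0\rangle = \sum_x \sqrt{P(x|j)}\; |\sigma_{\lambda(j,x)}\rangle |x\rangle ,

    so that measuring the output register reproduces the process statistics step by step, while the steady-state memory register has entropy S(\sum_j \pi_j |\sigma_j\rangle\langle\sigma_j|), generally below the classical statistical complexity.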