
    Avoiding Irreversibility: Engineering Resonant Conversions of Quantum Resources

    © 2019 American Physical Society. We identify and explore the intriguing property of resource resonance arising within resource theories of entanglement, coherence, and thermodynamics. While the theories considered are reversible asymptotically, the same is generally not true in realistic scenarios where the available resources are bounded. The finite-size effects responsible for this irreversibility could potentially prohibit small quantum information processors or thermal machines from achieving their full potential. Nevertheless, we show here that by carefully engineering the resource interconversion process any such losses can be greatly suppressed. Our results are predicted by higher-order expansions of the trade-off between the rate of resource interconversion and the achieved fidelity, and are verified by exact numerical optimizations of the appropriate underlying approximate majorization conditions.
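    The majorization conditions invoked here can be stated concretely for probability vectors: p majorizes q when every partial sum of p's entries, sorted in decreasing order, dominates the corresponding partial sum of q's. The following is a minimal NumPy sketch of such a check; the function names and the crude "deficit" measure are illustrative assumptions, not the exact optimization performed in the paper.

```python
import numpy as np

def majorizes(p, q, tol=1e-12):
    """True if distribution p majorizes q: the partial sums of p,
    sorted in decreasing order, dominate those of q."""
    cp = np.cumsum(np.sort(p)[::-1])
    cq = np.cumsum(np.sort(q)[::-1])
    return bool(np.all(cp >= cq - tol))

def majorization_deficit(p, q):
    """Largest shortfall of p's partial sums below q's; zero exactly
    when p majorizes q, giving a crude measure of approximate failure."""
    cp = np.cumsum(np.sort(p)[::-1])
    cq = np.cumsum(np.sort(q)[::-1])
    return float(max(np.max(cq - cp), 0.0))

# Every distribution majorizes the uniform one, but not vice versa.
p = np.array([0.5, 0.3, 0.2])
u = np.ones(3) / 3
print(majorizes(p, u), majorization_deficit(u, p))
```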

    Quantum Thermodynamics

    Quantum thermodynamics is an emerging research field aiming to extend standard thermodynamics and non-equilibrium statistical physics to ensembles of sizes well below the thermodynamic limit, in non-equilibrium situations, and with the full inclusion of quantum effects. Fuelled by experimental advances and the potential of future nanoscale applications, this research effort is pursued by scientists with different backgrounds, including statistical physics, many-body theory, mesoscopic physics and quantum information theory, who bring various tools and methods to the field. A multitude of theoretical questions are being addressed, ranging from issues of thermalisation of quantum systems and various definitions of "work", to the efficiency and power of quantum engines. This overview provides a perspective on a selection of these current trends accessible to postgraduate students and researchers alike.
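    One of the competing definitions of "work" referred to above is the two-point-measurement scheme: the system's energy is projectively measured before and after the driving protocol, and work is the difference of the two outcomes. In standard notation (shown here only to illustrate the kind of definition being debated),

        P(W) = \sum_{n,m} p_n \, p_{m|n} \, \delta\big(W - (E'_m - E_n)\big),
        \qquad
        \langle e^{-\beta W} \rangle = e^{-\beta \Delta F},

    where p_n is the probability of obtaining the initial energy E_n and p_{m|n} the transition probability to the final eigenstate with energy E'_m; for an initially thermal state the second relation is the Jarzynski equality that this definition reproduces.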

    Noise in Quantum Information Processing

    Quantum phenomena such as superposition and entanglement imbue quantum systems with information processing power in excess of their classical counterparts. These properties of quantum states are, however, highly fragile. As we enter the era of noisy intermediate-scale quantum (NISQ) devices, this vulnerability to noise is a major hurdle to the experimental realisation of quantum technologies. In this thesis we explore the role of noise in quantum information processing from two different perspectives. In Part I we consider noise from the perspective of quantum error correcting codes. Error correcting codes are often analysed with respect to simplified toy models of noise, such as iid depolarising noise. We consider generalising these techniques for analysing codes under more realistic noise models, including features such as biased or correlated errors. We also consider designing customised codes which not only take into account but also exploit features of the underlying physical noise. Considering such tailored codes will be of particular importance for NISQ applications in which finite-size effects can be significant. In Part II we apply tools from information theory to study the finite-resource effects which arise in the trade-offs between resource costs and error rates for certain quantum information processing tasks. We start by considering classical communication over quantum channels, providing a refined analysis of the trade-off between communication rate and error in the regime of a finite number of channel uses. We then extend these techniques to the problem of resource interconversion in theories such as quantum entanglement and quantum thermodynamics, studying finite-size effects which arise in resource-error trade-offs. By studying this effect in detail, we also show how detrimental finite-size effects in devices such as thermal engines may be greatly suppressed by carefully engineering the underlying resource interconversion processes.
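    The iid depolarising noise used as a toy model above acts independently on each qubit. A minimal sketch of the single-qubit channel is given below; conventions differ, and here p is taken to be the total error probability, split evenly over the three Pauli errors.

```python
import numpy as np

# Single-qubit Pauli operators
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarize(rho, p):
    """Depolarizing channel: leave the state alone with probability 1 - p,
    otherwise apply X, Y or Z, each with probability p / 3."""
    return (1 - p) * rho + (p / 3) * sum(P @ rho @ P.conj().T for P in (X, Y, Z))

# Example: |0><0| after 10% depolarizing noise
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
print(np.round(depolarize(rho0, 0.1), 3))
```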

    Irreversible entropy production, from quantum to classical

    Entropy production is a key quantity in any finite-time thermodynamic process. It is intimately tied to the fundamental laws of thermodynamics, embodying a tool to extend thermodynamic considerations all the way to non-equilibrium processes. It is also often used in attempts to provide a quantitative characterization of logical and thermodynamic irreversibility, stemming from processes in physics, chemistry and biology. Notwithstanding its fundamental character, a unifying theory of entropy production valid for general processes, both classical and quantum, has not yet been formulated. Developments pivoting around the frameworks of stochastic thermodynamics, open quantum systems, and quantum information theory have led to substantial progress in this endeavour. This has culminated in the unlocking of a new generation of experiments able to address stochastic thermodynamic processes and the impact of entropy production on them. This paper aims to provide a compendium on the current framework for the description, assessment and manipulation of entropy production. We present both formal aspects of its formulation and the implications stemming from the potential quantum nature of a given process, including a detailed survey of recent experiments.
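    One standard way of making entropy production quantitative, drawn from the frameworks the abstract refers to, is the following: for a process in which the system absorbs heat Q from a bath at inverse temperature β, the entropy production is

        \Sigma = \Delta S_{\mathrm{sys}} - \beta Q \;\ge\; 0,

    and when system and bath start uncorrelated, with the bath in a Gibbs state \gamma_B, and evolve jointly and unitarily, this quantity decomposes exactly as

        \Sigma = I(S' : B') + D(\rho'_B \,\|\, \gamma_B),

    the sum of the final system–bath mutual information and the relative entropy of the perturbed bath state to its initial thermal state, which makes the non-negativity manifest.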

    The role of reversibility in quantum thermodynamics and the foundations of quantum theory

    The first half of this thesis is dedicated to the study of generalizations of probability theory which include quantum theory and many of its alternatives. This research program seeks to understand quantum theory by exploring it ‘from the outside’, where it sits within a landscape of all operationally defined theories. We approach this problem with a new perspective: generalized decoherence. Quantum theory recovers classical theory via decoherence, where a ‘classical limit’ is achieved by a quantum system interacting with its environment, losing its quantum coherence. Typically it is the emergence of the classical world from the quantum that is studied. We reverse this paradigm and ask: how does the existence of a classical limit define the structure of quantum theory, or any other post-classical theory? We derive the existence of entangled states as a necessary feature of any theory that allows for the emergence of the classical world in this way. We then use our framework for generalized decoherence to explore the properties of theories that can ‘hyperdecohere’ to quantum theory. We find that any such theory must be very exotic compared to quantum theory, violating at least one of the postulates of tomographic locality, reversibility, causality or purity. In the second half of this thesis we turn our attention to the second law of thermodynamics. We present a framework for deriving the second law under general constraints and explore two pertinent examples: constraining the fluctuations in work, and constraining the size of the thermal bath, deriving the second law in each case. Bounding fluctuations results in a unified free energy that contains the single-shot and Helmholtz free energies as limiting cases. Bounding the size of the thermal bath results in a finite-bath second law that is more general than previous attempts and draws connections between non-asymptotic thermodynamics and second-order information theory.
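    For context, the single-shot and Helmholtz free energies are commonly unified through a one-parameter family of free energies built from Rényi divergences; whether this is precisely the unified free energy constructed in the thesis is an assumption here, but it illustrates the kind of interpolation meant:

        F_\alpha(\rho) = F(\gamma) + k_B T \, D_\alpha(\rho \,\|\, \gamma),

    where \gamma is the thermal state and D_\alpha denotes a quantum Rényi divergence. The limit \alpha \to 1 recovers the ordinary relative entropy and hence the Helmholtz-like non-equilibrium free energy, while the \alpha \to 0 and \alpha \to \infty members of the closely related Rényi families reproduce the single-shot free energies governing work extraction and state formation.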

    Finite-Time Thermodynamics

    The theory around the concept of finite time describes how processes of any nature can be optimized in situations where their rate is required to be non-negligible, i.e., they must come to completion in a finite time. What the theory makes explicit is “the cost of haste”. Intuitively, it is quite obvious that you drive your car differently if you want to reach your destination as quickly as possible than if you are running out of gas. Finite-time thermodynamics quantifies such opposing requirements and may provide the optimal control to achieve the best compromise. The theory was initially developed for heat engines (steam, Otto, Stirling, among others) and for refrigerators, but it has by now evolved into essentially all areas of dynamic systems, from the most abstract to the most practical. The present collection shows some fascinating current examples.
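    The “cost of haste” has a textbook quantitative expression for heat engines (a classic result of the field rather than of any one paper in this collection): an endoreversible engine operated at maximum power output between reservoirs at temperatures T_h > T_c reaches only the Curzon–Ahlborn efficiency,

        \eta_{\mathrm{CA}} = 1 - \sqrt{T_c / T_h} \;\le\; \eta_{\mathrm{C}} = 1 - T_c / T_h,

    falling short of the reversible Carnot bound \eta_{\mathrm{C}} precisely because the cycle is forced to complete in finite time.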

    Advancing catchment hydrology to deal with predictions under change

    Throughout its historical development, hydrology as an earth science, but especially as a problem-centred engineering discipline, has largely relied (quite successfully) on the assumption of stationarity. This includes assuming time invariance of boundary conditions such as climate, system configurations such as land use, topography and morphology, and dynamics such as flow regimes and flood recurrence at different spatio-temporal aggregation scales. The justification for this assumption was often that when compared with the temporal, spatial, or topical extent of the questions posed to hydrology, such conditions could indeed be considered stationary, and therefore the neglect of certain long-term non-stationarities or feedback effects (even if they were known) would not introduce a large error. However, over time two closely related phenomena emerged that have increasingly reduced the general applicability of the stationarity concept: the first comprises the rapid and extensive global changes in many parts of the hydrological cycle, turning formerly stationary systems into transient ones. The second is that the questions posed to hydrology have become increasingly complex, requiring the joint consideration of ever more (sub-)systems and their interactions across more and longer timescales, which limits the applicability of stationarity assumptions. Therefore, the applicability of hydrological concepts based on stationarity has diminished as the complexity of the hydrological problems we are confronted with, and the transient nature of the hydrological systems we are dealing with, have increased. The aim of this paper is to present and discuss potentially helpful paradigms and theories that should be considered as we seek to better understand complex hydrological systems under change. For the sake of brevity we focus on catchment hydrology. We begin with a discussion of the general nature of explanation in hydrology and briefly review the history of catchment hydrology. We then propose and discuss several perspectives on catchments: as complex dynamical systems, self-organizing systems, co-evolving systems and open dissipative thermodynamic systems. We discuss the benefits of comparative hydrology and of taking an information-theoretic view of catchments, including the flow of information from data to models to predictions. In summary, we suggest that these perspectives deserve closer attention and that their synergistic combination can advance catchment hydrology to address questions of change.