
    Von Neumann's information engine without the spectral theorem

    Von Neumann obtained the formula for the entropy of a quantum state by assuming the validity of the second law of thermodynamics in a thought experiment involving semipermeable membranes and an ideal gas of quantum-labeled particles. Despite being operational for the most part, von Neumann's argument crucially departs from an operational narrative in its use of the spectral theorem. In this work we show that the role of the spectral theorem in von Neumann's argument can be taken over by the operational assumptions of repeatability and reversibility, and using these we are able to explore the consequences of the second law also in theories that do not possess a unique spectral decomposition. As a byproduct, we obtain the Groenewold--Ozawa information gain as a natural monotone for a suitable ordering of instruments, providing it with an operational interpretation valid in quantum theory and beyond.
    Comment: 10 pages, 4 figures. [v2]: Accepted version for the publication on PR
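    For orientation, the quantity at stake is the von Neumann entropy, and the spectral theorem enters the standard derivation as follows (textbook material reproduced here for context, not taken from the paper itself):

        S(\rho) = -\mathrm{Tr}[\rho \log \rho], \qquad
        \rho = \sum_i \lambda_i \, |i\rangle\langle i|
        \;\Longrightarrow\;
        S(\rho) = -\sum_i \lambda_i \log \lambda_i .

    Dispensing with the spectral theorem thus means recovering the entropy without assuming that every state admits such a unique eigendecomposition.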

    A complete and operational resource theory of measurement sharpness

    We construct a resource theory of sharpness for finite-dimensional positive operator-valued measures (POVMs), where the sharpness-non-increasing operations are given by quantum preprocessing channels and convex mixtures with POVMs whose elements are all proportional to the identity operator. As required for a sound resource theory of sharpness, we show that our theory has greatest (i.e., sharpest) elements, which are all equivalent and coincide with the set of POVMs that admit a repeatable measurement. Among the greatest elements, conventional non-degenerate observables are characterized as the minimal ones. More generally, we quantify sharpness in terms of a class of monotones, expressed as the EPR--Ozawa correlations between the given POVM and an arbitrary reference POVM. We show that one POVM can be transformed into another by means of a sharpness-non-increasing operation if and only if the former is sharper than the latter with respect to all monotones. Thus, our resource theory of sharpness is complete, in the sense that the comparison of all monotones provides a necessary and sufficient condition for the existence of a sharpness-non-increasing operation between two POVMs, and operational, in the sense that all monotones are in principle experimentally accessible.
    Comment: 23 pages, 1 figure
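    To make the sharpness-non-increasing operations concrete, here is a minimal numpy sketch of our own (not code from the paper): it applies a depolarizing preprocessing channel, in the Heisenberg picture, to a sharp qubit POVM and checks that the output is still a valid, but unsharp, POVM.

        import numpy as np

        I2 = np.eye(2)

        # A sharp (projective) qubit POVM: the computational-basis measurement.
        M = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

        def preprocess_depolarizing(povm, p):
            """Heisenberg-picture action of a depolarizing channel on POVM
            elements, E -> (1-p) E + (p/d) Tr[E] I: a preprocessing, hence a
            sharpness-non-increasing operation."""
            d = povm[0].shape[0]
            return [(1 - p) * E + (p / d) * np.trace(E) * np.eye(d) for E in povm]

        N = preprocess_depolarizing(M, p=0.3)

        # Still a POVM: elements are positive semidefinite and sum to the identity...
        assert np.allclose(sum(N), I2)
        assert all(np.linalg.eigvalsh(E).min() >= -1e-12 for E in N)

        # ...but no longer sharp: the elements are not projections (E @ E != E),
        # so the measurement no longer admits a repeatable implementation.
        print([np.allclose(E @ E, E) for E in M])  # [True, True]
        print([np.allclose(E @ E, E) for E in N])  # [False, False]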

    Universal validity of the second law of information thermodynamics

    Feedback control and erasure protocols have often been considered as a model to embody Maxwell's Demon paradox and to study the interplay between thermodynamics and information processing. Such studies have led to the conclusion, now widely accepted in the community, that Maxwell's Demon and the second law of thermodynamics can peacefully coexist because any gain provided by the demon must be offset by the cost of performing measurement and resetting the demon's memory to its initial state. Statements of this kind are collectively referred to as second laws of information thermodynamics and have recently been extended to include quantum theoretical scenarios. However, previous studies in this direction have made several assumptions, in particular about the feedback process and the measurement performed on the demon's memory, and thus arrived at statements that are not universally applicable and whose range of validity is not clear. In this work, we fill this gap by precisely characterizing the full range of quantum feedback control and erasure protocols that are overall consistent with the second law of thermodynamics. This leads us to conclude that the second law of information thermodynamics is indeed universal: it must hold for any quantum feedback control and erasure protocol, regardless of the measurement process involved, as long as the protocol is overall compatible with thermodynamics. Our comprehensive analysis not only encompasses new scenarios but also retrieves previous ones, doing so with fewer assumptions. This simplification contributes to a clearer understanding of the theory. Additionally, our work identifies the Groenewold--Ozawa information gain as the correct information measure characterizing the work extractable by feedback control.
    Comment: 30 pages, 1 figure. The title is changed from the previous version and one author is added. The contents are significantly updated.
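    For reference, for an instrument with operations {E_k}, outcome probabilities p_k, and post-measurement states rho_k, the Groenewold--Ozawa information gain is the standard quantity (definition from the literature; the schematic work bound is a Sagawa--Ueda-type paraphrase, not the paper's exact statement):

        I_{\mathrm{GO}} = S(\rho) - \sum_k p_k \, S(\rho_k), \qquad
        p_k = \mathrm{Tr}[\mathcal{E}_k(\rho)], \quad
        \rho_k = \mathcal{E}_k(\rho)/p_k ,

        W_{\mathrm{ext}} \;\lesssim\; -\Delta F + k_{\mathrm{B}} T \, I_{\mathrm{GO}} .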

    Fundamental limits of quantum error mitigation

    The inevitable accumulation of errors in near-future quantum devices represents a key obstacle in delivering practical quantum advantages, motivating the development of various quantum error-mitigation methods. Here, we derive fundamental bounds concerning how error-mitigation algorithms can reduce the computation error as a function of their sampling overhead. Our bounds place universal performance limits on a general error-mitigation protocol class. We use them to show (1) that the sampling overhead that ensures a certain computational accuracy for mitigating local depolarizing noise in layered circuits scales exponentially with the circuit depth for general error-mitigation protocols and (2) the optimality of probabilistic error cancellation among a wide class of strategies in mitigating the local dephasing noise on an arbitrary number of qubits. Our results provide a means to identify when a given quantum error-mitigation strategy is optimal and when there is potential room for improvement.
    This work is supported by the Singapore Ministry of Education Tier 1 Grants RG162/19 and RG146/20, the National Research Foundation under its Quantum Engineering Program NRF2021-QEP2-02-P06, the Singapore Ministry of Education Tier 2 Project MOE-T2EP50221-0005 and the FQXi-RFP-IPW-1903 project, “Are quantum agents more energetically efficient at making predictions?” from the Foundational Questions Institute, Fetzer Franklin Fund, a donor advised fund of Silicon Valley Community Foundation, and the Lee Kuan Yew Postdoctoral Fellowship at Nanyang Technological University Singapore.
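    As a concrete illustration of claim (1), consider probabilistic error cancellation of single-qubit depolarizing noise Lambda(rho) = (1 - 3p/4) rho + (p/4)(X rho X + Y rho Y + Z rho Z): the quasi-probability inverse has one-norm gamma(p) = (1 + p/2)/(1 - p), a standard PEC result, and the sampling overhead multiplies across gates, hence grows exponentially with depth. A minimal sketch with illustrative numbers of our choosing, not the paper's bounds:

        import numpy as np

        def gamma_depolarizing(p):
            """One-norm of the quasi-probability decomposition that inverts
            single-qubit depolarizing noise of strength p (standard PEC result)."""
            return (1 + p / 2) / (1 - p)

        p = 0.01             # per-gate depolarizing strength (assumed for illustration)
        gates_per_layer = 10
        for depth in (10, 50, 100):
            n_gates = depth * gates_per_layer
            # Estimator variance blows up by ~gamma^2 per inverted gate.
            overhead = gamma_depolarizing(p) ** (2 * n_gates)
            print(f"depth {depth:4d}: sampling overhead ~ {overhead:.3e}")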

    Compressive strength and chloride ion permeation resistance of mortar containing clinker with different mineral composition as an aggregate

    The purpose of this study is to clarify the performance of hardened cement composites containing clinker used as aggregate by evaluating the compressive strength and chloride ion penetration resistance of the mortars. The fine aggregates used in this study were ordinary Portland cement clinker and a clinker produced with a larger share of waste as an alternative raw material. In addition, we discuss the effect of the clinkers on the compressive strength and chloride ion penetration resistance of the mortar based on the weight loss on ignition and the void structure. The results showed that the compressive strength of the mortars containing clinker aggregate was equal to or greater than that of mortar using ISO standard sand, and that the apparent diffusion coefficient of chloride ions decreased when clinker was used. The improvement in compressive strength and chloride ion penetration resistance might be attributed to the densification of voids in the range of 50 nm to 2 μm, the void diameter considered representative of the interfacial transition zone, through hydration of the clinker itself.
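    The apparent chloride diffusion coefficient mentioned above is commonly obtained by fitting the error-function solution of Fick's second law to a measured chloride profile; a minimal sketch of that general method (our illustration with synthetic numbers, since the abstract does not specify the paper's exact procedure):

        import numpy as np
        from scipy.special import erf
        from scipy.optimize import curve_fit

        def chloride_profile(x_mm, Cs, Da_mm2_per_yr, t_yr=1.0):
            """Fick's second law solution for a semi-infinite medium:
            C(x, t) = Cs * (1 - erf(x / (2 * sqrt(Da * t))))."""
            return Cs * (1 - erf(x_mm / (2 * np.sqrt(Da_mm2_per_yr * t_yr))))

        # Synthetic measured profile (illustrative numbers only).
        depth = np.array([2.5, 7.5, 12.5, 17.5, 22.5])   # mm below surface
        conc = chloride_profile(depth, Cs=0.8, Da_mm2_per_yr=30.0)
        conc += np.random.default_rng(0).normal(0, 0.01, depth.size)

        # Fit surface concentration Cs and apparent diffusion coefficient Da.
        (Cs_fit, Da_fit), _ = curve_fit(chloride_profile, depth, conc, p0=(1.0, 10.0))
        print(f"surface concentration ~ {Cs_fit:.2f}, apparent D ~ {Da_fit:.1f} mm^2/yr")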

    Impact of delivery time factor on treatment time and plan quality in tomotherapy

    Delivery time factor (DTF) is a new parameter introduced by the RayStation treatment planning system for tomotherapy treatment planning. This study investigated the effects of this factor on various tomotherapy plans. Twenty-five patients with cancer (head and neck, 6; lung, 9; prostate, 10) were enrolled in this study. Helical tomotherapy plans with a field width of 2.5 cm, pitch of 0.287, and DTF of 2.0 were created. All the initial plans were recalculated by changing the DTF parameter from 1.0 to 3.0 in increments of 0.1. Then, the impact of DTF on delivery efficiency and plan quality was evaluated. Treatment time and modulation factor increased monotonically with increasing DTF; increasing the DTF by 0.1 increased both by almost 10%. This relationship was similar for all treatment sites. The conformity index (CI), homogeneity index, and organ-at-risk doses were improved compared to plans with a DTF of 1.0, except for the CI in the lung cancer cases. However, the improvement in most indices ceased at a certain DTF, whereas treatment time continued to increase as the DTF increased. DTF is a critical parameter for improving the quality of tomotherapy plans.
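    The plan-quality indices referenced above can be computed directly from a dose distribution; a minimal sketch using the ICRU 83 homogeneity index and a simple coverage-based conformity index (common definitions, assumed here because the abstract does not state which variants the authors used):

        import numpy as np

        def homogeneity_index(target_dose):
            """ICRU 83 homogeneity index: (D2% - D98%) / D50%; lower is better."""
            d2, d50, d98 = np.percentile(target_dose, [98, 50, 2])
            return (d2 - d98) / d50

        def conformity_index(dose, target_mask, prescription):
            """Simple RTOG-style conformity index: V_prescription / V_target."""
            v_presc = np.count_nonzero(dose >= prescription)
            return v_presc / np.count_nonzero(target_mask)

        # Toy dose grid: low background dose, near-uniform dose in a cubic target.
        rng = np.random.default_rng(1)
        dose = rng.normal(20.0, 2.0, size=(50, 50, 50))          # Gy, background
        target = np.zeros(dose.shape, dtype=bool)
        target[20:30, 20:30, 20:30] = True
        dose[target] = rng.normal(60.0, 1.0, size=np.count_nonzero(target))

        print("HI:", homogeneity_index(dose[target]))
        print("CI:", conformity_index(dose, target, prescription=58.0))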