5 research outputs found

    Generalised entropy accumulation

    Consider a sequential process in which each step outputs a system A_i and updates a side information register E. We prove that if this process satisfies a natural "non-signalling" condition between past outputs and future side information, the min-entropy of the outputs A_1, …, A_n conditioned on the side information E at the end of the process can be bounded from below by a sum of von Neumann entropies associated with the individual steps. This is a generalisation of the entropy accumulation theorem (EAT), which deals with a more restrictive model of side information: there, past side information cannot be updated in subsequent rounds, and newly generated side information has to satisfy a Markov condition. Due to its more general model of side information, our generalised EAT can be applied more easily and to a broader range of cryptographic protocols. As examples, we give the first multi-round security proof for blind randomness expansion and a simplified analysis of the E91 QKD protocol. The proof of our generalised EAT relies on a new variant of Uhlmann's theorem and new chain rules for the Rényi divergence and entropy, which might be of independent interest.
    Comment: 42 pages; v2 expands introduction but does not change any results; in FOCS 202
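    Schematically, the accumulated bound has the following shape (a paraphrase for orientation, not the paper's exact statement; the maps \mathcal{M}_i modelling the individual steps, the infimum over input states \omega, and the correction term stand in for the precise objects defined there):
    \[
      H_{\min}^{\varepsilon}(A_1 \dots A_n \mid E)
      \;\geq\; \sum_{i=1}^{n} \inf_{\omega} H(A_i \mid E)_{\mathcal{M}_i(\omega)}
      \;-\; O\!\left(\sqrt{n}\right).
    \]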

    Fundamental limitations on distillation of quantum channel resources

    Quantum channels underlie the dynamics of quantum systems, but in many practical settings it is the channels themselves that require processing. We establish universal limitations on the processing of both quantum states and channels, expressed in the form of no-go theorems and quantitative bounds for the manipulation of general quantum channel resources under the most general transformation protocols. Focusing on the class of distillation tasks -- which can be understood either as the purification of noisy channels into unitary ones, or the extraction of state-based resources from channels -- we develop fundamental restrictions on the error incurred in such transformations and comprehensive lower bounds for the overhead of any distillation protocol. In the asymptotic setting, our results yield broadly applicable bounds for rates of distillation. We demonstrate our results through applications to fault-tolerant quantum computation, where we obtain state-of-the-art lower bounds for the overhead cost of magic state distillation, as well as to quantum communication, where we recover a number of strong converse bounds for quantum channel capacity.
    Comment: 15+25 pages, 4 figures. v3: close to published version (changes in presentation, title modified; main results unaffected). See also related work by Fang and Liu at arXiv:2010.1182
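    To give a flavour of how such overhead bounds arise (an illustrative textbook-style argument, not the paper's actual derivation), suppose R is any resource measure of channels that does not increase under the free transformation protocols. Converting n uses of a noisy channel \mathcal{E} into m copies of a target unitary \mathcal{U} then forces
    \[
      n\,R(\mathcal{E}) \;\geq\; m\,R(\mathcal{U})
      \qquad\Longrightarrow\qquad
      \frac{n}{m} \;\geq\; \frac{R(\mathcal{U})}{R(\mathcal{E})},
    \]
    so the overhead n/m is lower-bounded by a ratio of resource measures.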

    Better predictions when models are wrong or underspecified

    Many statistical methods rely on models of reality in order to learn from data and to make predictions about future data. By necessity, these models usually do not match reality exactly, but are either wrong (none of the hypotheses in the model provides an accurate description of reality) or underspecified (the hypotheses in the model describe only part of the data). In this thesis, we discuss three scenarios involving models that are wrong or underspecified. In each case, we find that standard statistical methods may fail, sometimes dramatically, and present different methods that continue to perform well even if the models are wrong or underspecified. The first two of these scenarios involve regression problems and investigate AIC (Akaike's Information Criterion) and Bayesian statistics. The third scenario has the famous Monty Hall problem as a special case, and considers the question of how we can update our belief about an unknown outcome given new evidence when the precise relation between outcome and evidence is unknown.
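    The Monty Hall connection can be made concrete with a small simulation. The sketch below (hypothetical code, not from the thesis) estimates the probability that switching doors wins under two different host strategies, illustrating how the correct belief update depends on the unknown relation between the outcome (the car's location) and the evidence (the opened door):

    import random
    from typing import Optional

    def monty_trial(host_knows: bool) -> Optional[bool]:
        """One Monty Hall round; True if switching wins, None if the
        round is invalid (an ignorant host accidentally reveals the car)."""
        doors = [0, 1, 2]
        car = random.choice(doors)
        pick = random.choice(doors)
        if host_knows:
            # Classic host: always opens a non-picked door hiding a goat.
            opened = random.choice([d for d in doors if d != pick and d != car])
        else:
            # Ignorant host: opens a random non-picked door; discard the
            # round if it shows the car (we condition on seeing a goat).
            opened = random.choice([d for d in doors if d != pick])
            if opened == car:
                return None
        switched = next(d for d in doors if d not in (pick, opened))
        return switched == car

    def estimate(host_knows: bool, n: int = 100_000) -> float:
        results = [monty_trial(host_knows) for _ in range(n)]
        valid = [r for r in results if r is not None]
        return sum(valid) / len(valid)

    if __name__ == "__main__":
        print(f"switch wins (knowing host):  {estimate(True):.3f}")   # ~2/3
        print(f"switch wins (ignorant host): {estimate(False):.3f}")  # ~1/2

    The same goat-revealing evidence supports two different posteriors depending on how the host chooses, which is exactly the kind of underspecification the thesis addresses.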

    Achievable secrecy enhancement through joint encryption and privacy amplification

    In this dissertation we try to achieve secrecy enhancement in communications by resorting to both cryptographic and information-theoretic secrecy tools and metrics. Our objective is to unify tools and measures from the cryptography community with techniques and metrics from the information theory community that are utilized to provide privacy and confidentiality in communication systems. For this purpose we adopt encryption techniques accompanied by privacy amplification tools in order to achieve secrecy goals that are determined based on information-theoretic and cryptographic metrics.

    Every secrecy scheme relies on a certain advantage for legitimate users over adversaries, viewed as an asymmetry in the system, to deliver the required security for data transmission. In all of the schemes proposed in this dissertation, we resort to either an inherently existing asymmetry in the system or a proactively created advantage for legitimate users over a passive eavesdropper to further enhance the secrecy of the communications. This advantage is manipulated by means of privacy amplification and encryption tools to achieve secrecy goals for the system, evaluated based on information-theoretic and cryptographic metrics.

    In our first work, discussed in Chapter 2, and the third work, explained in Chapter 4, we rely on a proactively established advantage for legitimate users based on the eavesdropper's lack of knowledge about a shared source of data. Unlike these works, which assume an error-free physical channel, the second work, discussed in Chapter 3, considers a correlated erasure wiretap channel model. This work relies on a passive and internally existing advantage for legitimate users that is built upon the statistical and partial independence of the eavesdropper's channel errors from the errors in the main channel. We arrive at this secrecy advantage for legitimate users by exploiting an authenticated but insecure feedback channel.

    From the perspective of the utilized tools, the first work (Chapter 2) considers a specific scenario in which the secrecy enhancement of a particular block cipher, the Data Encryption Standard (DES) operating in cipher feedback (CFB) mode, is studied. This secrecy enhancement is achieved by means of deliberate noise injection and wiretap channel encoding as a technique for privacy amplification against a resource-constrained eavesdropper. Compared to the first work, the third work considers a more general framework in terms of both metrics and secrecy tools: it studies the secrecy enhancement of a general cipher based on universal hashing as a privacy amplification technique against an unbounded adversary. In this work, we achieve the goal of exponential secrecy, where the information leakage to the adversary, assessed in terms of mutual information as an information-theoretic measure and Eve's distinguishability as a cryptographic metric, decays at an exponential rate. In the second work, encrypted data frames are transmitted through an Automatic Repeat reQuest (ARQ) protocol to generate a common random source between legitimate users that is later transformed into information-theoretically secure encryption keys by means of privacy amplification based on universal hashing. A toy sketch of this privacy-amplification step follows below.

    Towards the end, future works extending the research accomplished in this dissertation are outlined. Proofs of the major theorems and lemmas are presented in the Appendix.
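    As a minimal illustration of privacy amplification via universal hashing (a sketch, not the dissertation's construction; the Toeplitz family, the key lengths, and the compression ratio here are illustrative assumptions), the snippet below compresses a partially secret bit string with a randomly seeded Toeplitz matrix, a standard 2-universal hash family. By the leftover hash lemma, the output length must be chosen below the adversary's min-entropy about the input, minus a security margin:

    import secrets

    def toeplitz_hash(bits: list[int], out_len: int, seed_bits: list[int]) -> list[int]:
        """Hash `bits` down to `out_len` bits over GF(2) using a Toeplitz
        matrix defined by `seed_bits` (needs len(bits) + out_len - 1 bits)."""
        n = len(bits)
        assert len(seed_bits) >= n + out_len - 1
        out = []
        for i in range(out_len):
            # Toeplitz entry T[i][j] = seed_bits[i - j + n - 1].
            acc = 0
            for j in range(n):
                acc ^= seed_bits[i - j + n - 1] & bits[j]
            out.append(acc)
        return out

    # Example: distil 32 near-uniform bits from 128 partially secret bits.
    n, m = 128, 32
    raw = [secrets.randbits(1) for _ in range(n)]           # shared raw key
    seed = [secrets.randbits(1) for _ in range(n + m - 1)]  # public hash seed
    key = toeplitz_hash(raw, m, seed)
    print("distilled key:", "".join(map(str, key)))

    The seed can be public: 2-universality guarantees that, averaged over seeds, the output is close to uniform from the adversary's point of view whenever the input retains enough min-entropy.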

    A definition of quantum mechanical work

    Advisor: Prof. Dr. Renato Moreira Angelo. Master's dissertation, Universidade Federal do Paraná, Setor de Ciências Exatas, Programa de Pós-Graduação em Física. Defense: Curitiba, 10/07/2018. Includes references: pp. 92-97.
    Abstract: Quantum Mechanics can be seen as a mathematical framework that describes experimental results associated with microscopic systems. On the other hand, the theory of Classical Thermodynamics (along with Statistical Physics) has been used to characterize macroscopic systems in a general way, whereby mean quantities are considered and the connections among them are formally described by state equations. In order to relate these theories and build up a more general one, recent works have been developed to connect the fundamental ideas of Thermodynamics with quantum principles. This theoretical framework is sometimes called Quantum Thermodynamics. Some key concepts associated with this recent theory are work and heat, which are very well established in the scope of Classical Thermodynamics and compose the energy conservation principle expressed by the first law of Thermodynamics. Widely accepted definitions of heat and work within the context of Quantum Thermodynamics were introduced by Alicki in his 1979 seminal work. Although such definitions can be shown to directly satisfy the first law of Thermodynamics and have been successfully applied to many contexts, there seems to be no deep foundational justification for them. In fact, alternative definitions have been proposed based on analogies with Classical Thermodynamics. In the present dissertation, a definition of quantum mechanical work is introduced which preserves the mathematical structure of the classical concept of work without, however, in any way invoking the notion of trajectory. Using Gaussian states and the Caldirola-Kanai model, a case study is conducted through which the proposed quantum work is compared with Alicki's definition, both in the quantum and semiclassical regimes, showing promising results. Conceptual inadequacies of Alicki's model are found in the classical limit, and possible interpretations are discussed for the presently introduced notion of work. Finally, the new definition is investigated in comparison with a classical-statistical approach for superposition and mixed states. Keywords: Work. Quantum Thermodynamics. Caldirola-Kanai model.
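    For orientation, Alicki's widely used split of the mean energy U(t) = Tr[ρ(t)H(t)] into work and heat, against which the new definition is compared, takes the standard form (τ denotes the duration of the process):
    \[
      W = \int_0^{\tau} \operatorname{Tr}\!\big[\rho(t)\,\dot{H}(t)\big]\,dt,
      \qquad
      Q = \int_0^{\tau} \operatorname{Tr}\!\big[\dot{\rho}(t)\,H(t)\big]\,dt,
      \qquad
      \Delta U = W + Q.
    \]
    The Caldirola-Kanai Hamiltonian used in the case study is the standard damped-oscillator model (γ is the damping rate):
    \[
      H_{\mathrm{CK}}(t) \;=\; e^{-\gamma t}\,\frac{p^{2}}{2m} \;+\; e^{\gamma t}\,\tfrac{1}{2}\,m\,\omega^{2} x^{2}.
    \]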