
    Using quantum key distribution for cryptographic purposes: a survey

    The appealing feature of quantum key distribution (QKD), from a cryptographic viewpoint, is the ability to prove the information-theoretic security (ITS) of the established keys. As a key establishment primitive, however, QKD does not provide a standalone security service on its own: the secret keys established by QKD are in general used by subsequent cryptographic applications whose requirements, context of use and security properties can vary. It is therefore important, with a view to integrating QKD into security infrastructures, to analyze how QKD can be combined with other cryptographic primitives. The purpose of this survey article, which is mostly centered on European research results, is to contribute to such an analysis. We first review and compare the properties of the existing key establishment techniques, QKD being one of them. We then study more specifically two generic scenarios related to the practical use of QKD in cryptographic infrastructures: 1) using QKD as a key renewal technique for a symmetric cipher over a point-to-point link; 2) using QKD in a network containing many users with the objective of offering an any-to-any key establishment service. We discuss the constraints as well as the potential interest of using QKD in these contexts. We finally give an overview of challenges relative to the development of QKD technology that also constitute potential avenues for cryptographic research.
    Comment: Revised version of the SECOQC White Paper. Published in the special issue on QKD of TCS, Theoretical Computer Science (2014), pp. 62-8
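
    As a plain illustration of scenario (1), the sketch below periodically re-keys AES-256-GCM on a point-to-point link with fresh QKD key material. It is a minimal Python sketch, not the survey's construction; qkd_get_key() is a hypothetical stand-in for a real QKD key-delivery interface and is simulated here with os.urandom.

        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        def qkd_get_key(num_bytes: int = 32) -> bytes:
            # Hypothetical placeholder for fetching fresh ITS key material
            # from the QKD layer; simulated here with random bytes.
            return os.urandom(num_bytes)

        class RekeyedChannel:
            # Encrypts with AES-256-GCM, pulling a fresh QKD key after
            # every REKEY_INTERVAL messages (key renewal over the link).
            REKEY_INTERVAL = 1000

            def __init__(self):
                self._renew()

            def _renew(self):
                self._aead = AESGCM(qkd_get_key(32))
                self._count = 0

            def encrypt(self, plaintext: bytes) -> tuple[bytes, bytes]:
                if self._count >= self.REKEY_INTERVAL:
                    self._renew()
                nonce = os.urandom(12)  # 96-bit GCM nonce, unique per message
                self._count += 1
                return nonce, self._aead.encrypt(nonce, plaintext, None)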

    Practical cryptographic strategies in the post-quantum era

    We review new frontiers in information security technologies for communications and distributed storage, using classical, quantum, hybrid classical-quantum, and post-quantum cryptography. We analyze the current state of the art, critical characteristics, development trends, and limitations of these techniques for application in enterprise information protection systems. An approach to the selection of practical encryption technologies for enterprises with branched communication networks is introduced.
    Comment: 5 pages, 2 figures; review paper
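
    One concrete hybrid classical-quantum technique of the kind surveyed here is to derive a session key from both a post-quantum KEM secret and a QKD key, so the result stays secret as long as either input does. The Python sketch below is only illustrative; both input secrets are simulated with random bytes rather than obtained from a real KEM or QKD link.

        import os
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.kdf.hkdf import HKDF

        pqc_secret = os.urandom(32)  # stand-in for a post-quantum KEM shared secret
        qkd_secret = os.urandom(32)  # stand-in for QKD-delivered key material

        # Combine both secrets through a KDF: compromising only one of the
        # two inputs does not reveal the derived session key.
        session_key = HKDF(
            algorithm=hashes.SHA256(),
            length=32,
            salt=None,
            info=b"hybrid pqc+qkd session key",
        ).derive(pqc_secret + qkd_secret)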

    Deep Active Learning for Named Entity Recognition

    Deep learning has yielded state-of-the-art performance on many natural language processing tasks, including named entity recognition (NER). However, this typically requires large amounts of labeled data. In this work, we demonstrate that the amount of labeled training data can be drastically reduced when deep learning is combined with active learning. While active learning is sample-efficient, it can be computationally expensive since it requires iterative retraining. To speed this up, we introduce a lightweight architecture for NER, viz., the CNN-CNN-LSTM model, consisting of convolutional character and word encoders and a long short-term memory (LSTM) tag decoder. The model achieves nearly state-of-the-art performance on standard datasets for the task while being computationally much more efficient than the best performing models. We carry out incremental active learning during the training process and are able to nearly match state-of-the-art performance with just 25% of the original training data.
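
    The PyTorch sketch below shows the overall shape of such a CNN-CNN-LSTM tagger: a convolutional character encoder, a convolutional word encoder, and an LSTM tag decoder. All dimensions and vocabulary sizes are placeholder values, not the paper's settings.

        import torch
        import torch.nn as nn

        class CnnCnnLstmNER(nn.Module):
            def __init__(self, n_chars=100, n_words=10000, n_tags=9,
                         char_dim=25, word_dim=100, hidden=128):
                super().__init__()
                self.char_emb = nn.Embedding(n_chars, char_dim)
                # character-level CNN, max-pooled into one vector per word
                self.char_cnn = nn.Conv1d(char_dim, char_dim, kernel_size=3, padding=1)
                self.word_emb = nn.Embedding(n_words, word_dim)
                # word-level CNN over concatenated word + character features
                self.word_cnn = nn.Conv1d(word_dim + char_dim, hidden,
                                          kernel_size=3, padding=1)
                # LSTM decoder emits one tag score vector per word
                self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
                self.out = nn.Linear(hidden, n_tags)

            def forward(self, words, chars):
                # words: (batch, seq); chars: (batch, seq, max_word_len)
                b, s, L = chars.shape
                c = self.char_emb(chars).view(b * s, L, -1).transpose(1, 2)
                c = torch.relu(self.char_cnn(c)).max(dim=2).values.view(b, s, -1)
                w = torch.cat([self.word_emb(words), c], dim=-1).transpose(1, 2)
                h = torch.relu(self.word_cnn(w)).transpose(1, 2)
                h, _ = self.decoder(h)
                return self.out(h)  # (batch, seq, n_tags)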

    Classical Homomorphic Encryption for Quantum Circuits

    We present the first leveled fully homomorphic encryption scheme for quantum circuits with classical keys. The scheme allows a classical client to blindly delegate a quantum computation to a quantum server: an honest server is able to run the computation while a malicious server is unable to learn any information about the computation. We show that it is possible to construct such a scheme directly from a quantum secure classical homomorphic encryption scheme with certain properties. Finally, we show that a classical homomorphic encryption scheme with the required properties can be constructed from the learning with errors problem.
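
    To make the underlying hardness assumption concrete, the toy Python sketch below encrypts a single bit under the learning with errors (LWE) problem: the message is hidden in a noisy inner product with a secret vector. The parameters are far too small to be secure, and this shows only the LWE mechanics, not the paper's homomorphic scheme.

        import numpy as np

        q, n = 257, 16                      # toy modulus and dimension
        rng = np.random.default_rng(0)
        s = rng.integers(0, q, n)           # secret key

        def encrypt(bit: int):
            a = rng.integers(0, q, n)       # public random vector
            e = rng.integers(-2, 3)         # small noise term
            b = (a @ s + e + bit * (q // 2)) % q
            return a, b

        def decrypt(a, b) -> int:
            # b - <a, s> is close to 0 for bit=0 and close to q/2 for bit=1
            m = (b - a @ s) % q
            return int(q // 4 < m < 3 * q // 4)

        a, b = encrypt(1)
        assert decrypt(a, b) == 1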

    A quantum key distribution protocol for rapid denial of service detection

    We introduce a quantum key distribution protocol designed to expose fake users that connect to Alice or Bob for the purpose of monopolising the link and denying service. It inherently resists attempts to exhaust Alice and Bob's initial shared secret, and is 100% efficient, regardless of the number of qubits exchanged above the finite key limit. Additionally, secure key can be generated from two-photon pulses, without having to make any extra modifications. This is made possible by relaxing the security of BB84 to that of the quantum-safe block cipher used for day-to-day encryption, meaning the overall security remains unaffected for useful real-world cryptosystems such as AES-GCM being keyed with quantum devices.
    Comment: 13 pages, 3 figures. v2: Shifted focus of paper towards DoS and added protocol 4. v1: Accepted to QCrypt 201
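
    A generic version of the anti-DoS idea, vetting a connecting party against the pre-shared secret before committing quantum-channel resources, can be sketched as an HMAC challenge-response. This is an assumed simplification for illustration, not the specific protocol of the paper.

        import hashlib, hmac, os

        PRESHARED = os.urandom(32)  # Alice and Bob's initial shared secret

        def respond(secret: bytes, challenge: bytes) -> bytes:
            return hmac.new(secret, challenge, hashlib.sha256).digest()

        def verify(challenge: bytes, response: bytes) -> bool:
            # Constant-time check: reject fake users before they can
            # monopolise the link or drain key material.
            return hmac.compare_digest(respond(PRESHARED, challenge), response)

        chal = os.urandom(16)
        assert verify(chal, respond(PRESHARED, chal))           # legitimate user
        assert not verify(chal, respond(os.urandom(32), chal))  # fake user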

    Preventing False Discovery in Interactive Data Analysis is Hard

    We show that, under a standard hardness assumption, there is no computationally efficient algorithm that, given n samples from an unknown distribution, can give valid answers to n^{3+o(1)} adaptively chosen statistical queries. A statistical query asks for the expectation of a predicate over the underlying distribution, and an answer to a statistical query is valid if it is "close" to the correct expectation over the distribution. Our result stands in stark contrast to the well-known fact that exponentially many statistical queries can be answered validly and efficiently if the queries are chosen non-adaptively (no query may depend on the answers to previous queries). Moreover, a recent work by Dwork et al. shows how to accurately answer exponentially many adaptively chosen statistical queries via a computationally inefficient algorithm, and how to answer a quadratic number of adaptive queries via a computationally efficient algorithm. The latter result implies that our result is tight up to a linear factor in n. Conceptually, our result demonstrates that achieving statistical validity alone can be a source of computational intractability in adaptive settings. For example, in the modern large collaborative research environment, data analysts typically choose a particular approach based on previous findings. False discovery occurs if a research finding is supported by the data but not by the underlying distribution. While the study of preventing false discovery in statistics is decades old, to the best of our knowledge our result is the first to demonstrate a computational barrier. In particular, our result suggests that the perceived difficulty of preventing false discovery in today's collaborative research environment may be inherent.
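
    The small simulation below illustrates why adaptivity is the crux: answering statistical queries with empirical means is fine when the queries are fixed in advance, but an analyst who builds one new query from the signs of earlier answers can make the empirical answer diverge wildly from the true expectation. This is a standard illustrative construction, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        n, d = 100, 1000                          # sample size, features
        X = rng.choice([-1.0, 1.0], size=(n, d))  # every column has true mean 0

        answers = X.mean(axis=0)   # d non-adaptive queries: empirical means
        signs = np.sign(answers)   # adaptive step: depends on those answers

        # Final adaptive query: mean of <signs, x>. Its true expectation is 0,
        # but the empirical answer concentrates around d / sqrt(n) >> 0.
        empirical = (X @ signs).mean()
        print(f"empirical answer {empirical:.1f} vs true expectation 0.0")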