
    Unconditionally verifiable blind computation

    Blind Quantum Computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol or whether there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system, leading to consequences for complexity theory. The authors, together with Broadbent, previously proposed a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with new functionality allowing blind computational basis measurements, which we use to construct a new verifiable BQC protocol based on a new class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. The new resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required all computations to first be put into a nearest-neighbour form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds. Comment: 46 pages, 10 figures. Additional protocol added which allows arbitrary circuits to be verified with polynomial security.
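    As a rough illustration of the client's only quantum requirement, the sketch below (ours, not the paper's code) prepares a single qubit in a state |+_theta> = (|0> + e^{i theta}|1>)/sqrt(2) with theta drawn at random from multiples of pi/4; that this is the finite set used here, as in the earlier Broadbent-Fitzsimons-Kashefi scheme, is an assumption for the purpose of the example.

```python
# Hedged sketch: client-side preparation of random single-qubit states of the
# kind used in BFK-style blind computation. The choice of the eight |+_theta>
# states with theta a multiple of pi/4 is an assumption, not taken verbatim
# from this paper.
import numpy as np

def random_plus_theta_state(rng):
    """Return (theta, state vector) for a randomly chosen |+_theta>."""
    theta = rng.integers(0, 8) * np.pi / 4            # theta in {0, pi/4, ..., 7pi/4}
    ket0 = np.array([1.0, 0.0], dtype=complex)
    ket1 = np.array([0.0, 1.0], dtype=complex)
    state = (ket0 + np.exp(1j * theta) * ket1) / np.sqrt(2)
    return theta, state

rng = np.random.default_rng(seed=42)
theta, psi = random_plus_theta_state(rng)
print(f"theta = {theta:.3f}", psi)                    # the client keeps theta secret
```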

    From Low-Distortion Norm Embeddings to Explicit Uncertainty Relations and Efficient Information Locking

    The existence of quantum uncertainty relations is the essential reason that some classically impossible cryptographic primitives become possible when quantum communication is allowed. One direct operational manifestation of these uncertainty relations is a purely quantum effect referred to as information locking. A locking scheme can be viewed as a cryptographic protocol in which a uniformly random n-bit message is encoded in a quantum system using a classical key of size much smaller than n. Without the key, no measurement of this quantum state can extract more than a negligible amount of information about the message, in which case the message is said to be "locked". Furthermore, knowing the key, it is possible to recover, that is "unlock", the message. In this paper, we make the following contributions by exploiting a connection between uncertainty relations and low-distortion embeddings of L2 into L1. We introduce the notion of metric uncertainty relations and connect it to low-distortion embeddings of L2 into L1. A metric uncertainty relation also implies an entropic uncertainty relation. We prove that random bases satisfy uncertainty relations with a stronger definition and better parameters than previously known. Our proof is also considerably simpler than earlier proofs. We apply this result to show the existence of locking schemes with key size independent of the message length. We give efficient constructions of metric uncertainty relations. The bases defining these metric uncertainty relations are computable by quantum circuits of almost linear size. This leads to the first explicit construction of a strong information locking scheme. Moreover, we present a locking scheme that is close to being implementable with current technology. We apply our metric uncertainty relations to exhibit communication protocols that perform quantum equality testing. Comment: 60 pages, 5 figures. v4: published version.
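    To make the lock/unlock interface concrete, here is a toy sketch (ours, drawn from the simplest locking-style encoding in the earlier literature, not this paper's construction): each message bit is stored in either the computational or the Hadamard basis, selected by a single secret key bit. The paper's schemes achieve far stronger locking; this only illustrates the interface.

```python
# Toy sketch of the locking interface, not the paper's scheme: an n-bit
# message is encoded qubit-by-qubit in a basis selected by one secret key bit.
# With the key the message is recovered exactly; without it, measurements
# recover only part of the information.
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
basis_states = {0: np.eye(2, dtype=complex), 1: H}    # columns are basis vectors

def lock(message_bits, key_bit):
    """Encode each message bit as a qubit in the basis selected by key_bit."""
    return [basis_states[key_bit][:, b] for b in message_bits]

def unlock(qubits, key_bit):
    """Measure each qubit in the correct basis to recover the message."""
    B = basis_states[key_bit]
    return [int(abs(np.vdot(B[:, 1], q)) ** 2 > 0.5) for q in qubits]

msg = [1, 0, 1, 1]
key = 1
assert unlock(lock(msg, key), key) == msg
```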

    Reference frames, superselection rules, and quantum information

    Recently, there has been much interest in a new kind of "unspeakable" quantum information that stands to regular quantum information in the same way that a direction in space or a moment in time stands to a classical bit string: the former can only be encoded using particular degrees of freedom while the latter are indifferent to the physical nature of the information carriers. The problem of correlating distant reference frames, of which aligning Cartesian axes and synchronizing clocks are important instances, is an example of a task that requires the exchange of unspeakable information and for which it is interesting to determine the fundamental quantum limit of efficiency. There have also been many investigations into the information theory that is appropriate for parties that lack reference frames or that lack correlation between their reference frames, restrictions that result in global and local superselection rules. In the presence of these, quantum unspeakable information becomes a new kind of resource that can be manipulated, depleted, quantified, etcetera. Methods have also been developed to contend with these restrictions using relational encodings, particularly in the context of computation, cryptography, communication, and the manipulation of entanglement. This article reviews the role of reference frames and superselection rules in the theory of quantum information processing. Comment: 55 pages, published version.
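    As a concrete instance of a relational encoding of the kind surveyed here (the example is ours, taken from the standard decoherence-free-subspace literature rather than from this article), a logical qubit stored in the two-qubit states |01> and |10> is insensitive to the lack of a shared phase reference, because collective phase rotations act identically on both logical basis states.

```python
# Illustrative sketch (not specific to this review): a relational encoding
# |0_L> = |01>, |1_L> = |10> is unaffected, up to a global phase, by a
# collective phase rotation U(theta) x U(theta) applied to both qubits.
import numpy as np

ket01 = np.kron([1, 0], [0, 1]).astype(complex)   # |0_L>
ket10 = np.kron([0, 1], [1, 0]).astype(complex)   # |1_L>
psi_logical = (ket01 + ket10) / np.sqrt(2)        # logical |+>

def collective_phase(theta):
    """U(theta) tensor U(theta) with U = diag(1, e^{i theta})."""
    U = np.diag([1.0, np.exp(1j * theta)])
    return np.kron(U, U)

rotated = collective_phase(0.73) @ psi_logical
overlap = abs(np.vdot(psi_logical, rotated))       # magnitude 1: state unchanged
print(f"|<psi|U psi>| = {overlap:.6f}")
```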

    Classical Cryptographic Protocols in a Quantum World

    Cryptographic protocols, such as protocols for secure function evaluation (SFE), have played a crucial role in the development of modern cryptography. The extensive theory of these protocols, however, deals almost exclusively with classical attackers. If we accept that quantum information processing is the most realistic model of physically feasible computation, then we must ask: what classical protocols remain secure against quantum attackers? Our main contribution is showing the existence of classical two-party protocols for the secure evaluation of any polynomial-time function under reasonable computational assumptions (for example, it suffices that the learning with errors problem be hard for quantum polynomial time). Our result shows that the basic two-party feasibility picture from classical cryptography remains unchanged in a quantum world. Comment: Full version of an old paper in Crypto'11. Invited to IJQI. This is the authors' copy with different formatting.
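    For readers unfamiliar with the assumption mentioned above, the sketch below (our toy parameters, not a secure instantiation) generates learning-with-errors samples (A, As + e mod q); the conjecture underlying such results is that these samples are hard to distinguish from uniform, and the secret hard to recover, even for quantum polynomial-time attackers.

```python
# Hedged sketch of the learning-with-errors (LWE) problem used as the
# computational assumption. Parameters are toy values for illustration only.
import numpy as np

def lwe_samples(n=8, m=16, q=97, error_bound=2, rng=None):
    """Return (A, b, s) with b = A s + e (mod q) for a random secret s."""
    rng = rng if rng is not None else np.random.default_rng(0)
    A = rng.integers(0, q, size=(m, n))
    s = rng.integers(0, q, size=n)                            # secret vector
    e = rng.integers(-error_bound, error_bound + 1, size=m)   # small noise
    b = (A @ s + e) % q
    return A, b, s

A, b, s = lwe_samples()
print(A.shape, b.shape)   # hardness: recovering s from (A, b), or even telling
                          # b apart from uniform, is conjectured to be hard
```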

    Recursive quantum repeater networks

    Internet-scale quantum repeater networks will be heterogeneous in physical technology, repeater functionality, and management. The classical control necessary to use the network will therefore face similar issues as Internet data transmission. Many scalability and management problems that arose during the development of the Internet might have been solved in a more uniform fashion, improving flexibility and reducing redundant engineering effort. Quantum repeater network development is currently at the stage where we risk similar duplication when separate systems are combined. We propose a unifying framework that can be used with all existing repeater designs. We introduce the notion of a Quantum Recursive Network Architecture, developed from the emerging classical concept of 'recursive networks', extending recursive mechanisms from a focus on data forwarding to a more general distributed computing request framework. Recursion abstracts independent transit networks as single relay nodes, unifies software layering, and virtualizes the addresses of resources to improve information hiding and resource management. Our architecture is useful for building arbitrary distributed states, including fundamental distributed states such as Bell pairs and GHZ, W, and cluster states. Comment: 14 pages.
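    A minimal sketch of the recursion idea (the class names and addressing scheme here are illustrative, not taken from the paper): an entire transit network is wrapped so that, to the layer above, it appears as a single relay node with a virtualized address.

```python
# Illustrative data-structure sketch of recursive abstraction: a subnetwork is
# presented to the layer above as one relay node. Names are ours, not the
# paper's API.
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class RelayNode:
    address: str

@dataclass
class Network:
    name: str
    members: List[Union["RelayNode", "Network"]] = field(default_factory=list)

    def as_relay(self) -> RelayNode:
        """Abstract this whole network as one relay node with a virtual address."""
        return RelayNode(address=f"net:{self.name}")

# Two independent transit networks, each seen as a single node by the backbone.
campus = Network("campus", [RelayNode("c1"), RelayNode("c2")])
metro = Network("metro", [RelayNode("m1"), campus.as_relay()])
backbone = Network("backbone", [metro.as_relay(), RelayNode("b1")])
print([n.address for n in backbone.members])      # ['net:metro', 'b1']
```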

    Systematizing Genome Privacy Research: A Privacy-Enhancing Technologies Perspective

    Rapid advances in human genomics are enabling researchers to gain a better understanding of the role of the genome in our health and well-being, stimulating hope for more effective and cost-efficient healthcare. However, this also prompts a number of security and privacy concerns stemming from the distinctive characteristics of genomic data. To address them, a new research community has emerged and produced a large number of publications and initiatives. In this paper, we rely on a structured methodology to contextualize and provide a critical analysis of the current knowledge on privacy-enhancing technologies used for testing, storing, and sharing genomic data, using a representative sample of the work published in the past decade. We identify and discuss limitations, technical challenges, and issues faced by the community, focusing in particular on those that are inherently tied to the nature of the problem and are harder for the community alone to address. Finally, we report on the importance and difficulty of the identified challenges based on an online survey of genome data privacy experts. Comment: To appear in the Proceedings on Privacy Enhancing Technologies (PoPETs), Vol. 2019, Issue