    Computational indistinguishability and boson sampling

    We introduce the computational problem of distinguishing between the output of an ideal coarse-grained boson sampler and the output of a true random number generator, as a resource for cryptographic schemes that are secure against computationally unbounded adversaries. Moreover, we define a cryptographic setting for the implementation of such schemes, including message encryption and authentication, as well as entity authentication.
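
    As a purely classical illustration of the statistical flavour of this distinguishing task, the sketch below runs a chi-squared frequency test over coarse-grained outcome bins; the biased toy "sampler", the bin number K and the shot count are invented stand-ins, not the construction of the paper.

        import numpy as np
        from scipy.stats import chisquare

        rng = np.random.default_rng(7)
        K, shots = 16, 5000
        biased_p = np.linspace(1.0, 2.0, K)
        biased_p /= biased_p.sum()                 # toy stand-in for a coarse-grained sampler

        def distinguish(samples, K):
            """Chi-squared test of the empirical bin counts against uniformity;
            a small p-value is evidence the samples did not come from the RNG."""
            counts = np.bincount(samples, minlength=K)
            return chisquare(counts).pvalue

        sampler_out = rng.choice(K, size=shots, p=biased_p)   # "boson sampler" output
        rng_out = rng.integers(0, K, size=shots)              # true random number generator
        print(distinguish(sampler_out, K), distinguish(rng_out, K))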

    Decision and function problems based on boson sampling

    Boson sampling is a mathematical problem that is strongly believed to be intractable for classical computers, whereas passive linear interferometers can produce samples efficiently. So far, the problem remains a computational curiosity, and the possible usefulness of boson-sampling devices is mainly limited to the proof of quantum supremacy. The purpose of this work is to investigate whether boson sampling can be used as a resource for decision and function problems that are computationally hard, and may thus have cryptographic applications. After defining a rather general theoretical framework for the design of such problems, we discuss their solution by means of a brute-force numerical approach, as well as by means of non-boson samplers. Moreover, we estimate the sample sizes required for their solution by passive linear interferometers, and show that they are independent of the size of the Hilbert space.
    Comment: Close to the version published in PR
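
    The brute-force numerical approach mentioned above ultimately comes down to evaluating matrix permanents, since the probability of a collision-free outcome S for input T through an interferometer U is proportional to |Perm(U_{S,T})|^2. A minimal sketch using Ryser's O(2^n n^2) formula, the standard classical workhorse for such brute-force evaluations (the function names are ours):

        from itertools import combinations
        import numpy as np
        from scipy.stats import unitary_group

        def permanent_ryser(A):
            """Matrix permanent via Ryser's formula, O(2^n * n^2) time."""
            A = np.asarray(A)
            n = A.shape[0]
            total = 0.0
            for k in range(1, n + 1):                      # non-empty column subsets
                for cols in combinations(range(n), k):
                    total += (-1) ** k * np.prod(A[:, cols].sum(axis=1))
            return (-1) ** n * total

        # Probability (up to normalisation) of one collision-free outcome:
        # take the submatrix of U picked out by the occupied output/input modes.
        U = unitary_group.rvs(6)
        sub = U[np.ix_([0, 2, 5], [1, 3, 4])]
        print(abs(permanent_ryser(sub)) ** 2)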

    Experimental Demonstration of Quantum Fully Homomorphic Encryption with Application in a Two-Party Secure Protocol

    A fully homomorphic encryption system hides data from unauthorized parties while still allowing them to perform computations on the encrypted data. Aside from the straightforward benefit of allowing users to delegate computations to a more powerful server without revealing their inputs, a fully homomorphic cryptosystem can be used as a building block in the construction of a number of cryptographic functionalities. Designing such a scheme remained an open problem until 2009, decades after the idea was first conceived, and the past few years have seen the generalization of this functionality to the world of quantum machines. Quantum schemes prior to the one implemented here were able to replicate, in particular use cases, some features often associated with homomorphic encryption, but lacked other crucial properties; for example, they relied on continual interaction to perform a computation or leaked information about the encrypted data. We present the first experimental realization of a quantum fully homomorphic encryption scheme. To demonstrate the versatility of a quantum fully homomorphic encryption scheme, we further present a toy two-party secure computation task enabled by our scheme.
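
    As a purely classical toy illustration of the homomorphic property described above (computing on data that is never decrypted), the XOR one-time pad is homomorphic with respect to XOR. The quantum scheme realized in this work is of course far more general; the snippet below is unrelated to its actual construction.

        import secrets

        def encrypt(bits, key):
            """One-time-pad encryption of a list of bits."""
            return [b ^ k for b, k in zip(bits, key)]

        m1, m2 = [1, 0, 1, 1], [0, 1, 1, 0]
        k1 = [secrets.randbits(1) for _ in range(4)]
        k2 = [secrets.randbits(1) for _ in range(4)]
        c1, c2 = encrypt(m1, k1), encrypt(m2, k2)

        # The server XORs the ciphertexts without ever seeing m1 or m2 ...
        c_sum = [a ^ b for a, b in zip(c1, c2)]
        # ... and the client decrypts the result with the XOR of the keys.
        k_sum = [a ^ b for a, b in zip(k1, k2)]
        assert encrypt(c_sum, k_sum) == [a ^ b for a, b in zip(m1, m2)]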

    Proof-of-work consensus by quantum sampling

    Since its advent in 2011, boson sampling has been a preferred candidate for demonstrating quantum advantage because of its simplicity and near-term requirements compared to other quantum algorithms. We propose to use a variant, called coarse-grained boson sampling (CGBS), as a quantum Proof-of-Work (PoW) scheme for blockchain consensus. The users perform boson sampling using input states that depend on the current block information, and commit their samples to the network. Afterward, CGBS strategies are determined that can be used both to validate samples and to reward successful miners. By combining rewards to miners committing honest samples with penalties to miners committing dishonest samples, a Nash equilibrium is found that incentivizes honest nodes. The scheme works for both Fock state boson sampling and Gaussian boson sampling and provides dramatic speedup and energy savings relative to computation by classical hardware.
    Comment: 21 pages, 6 figures (v2 fixes typos, adds references)
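
    A hedged sketch of the commit-and-validate flow described above, with the boson sampler replaced by a classical mock and with invented helper names; the actual CGBS validation and reward rules are specified in the paper, not here.

        import hashlib
        import random

        def derive_input_state(block_header: bytes, modes: int = 8):
            """Toy stand-in: derive a photon input pattern from the current block header."""
            digest = hashlib.sha256(block_header).digest()
            return [digest[i] % 2 for i in range(modes)]      # which modes hold a photon

        def mock_boson_sampler(input_state, shots=64, seed=0):
            """Placeholder for real quantum hardware: returns random output patterns."""
            rng = random.Random(seed)
            n = len(input_state)
            return [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(shots)]

        def commit(samples):
            """Miners first publish a binding hash of their samples ..."""
            return hashlib.sha256(repr(sorted(samples)).encode()).hexdigest()

        # ... and only reveal them later, so the coarse-graining strategy used for
        # validation and rewards can be chosen after all commitments are in.
        header = b"block 1234: prev_hash=..., txs=..."
        samples = mock_boson_sampler(derive_input_state(header), seed=42)
        print(commit(samples)[:16])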

    Using an imperfect photonic network to implement random unitaries

    We numerically investigate the implementation of Haar-random unitary transformations and Fourier transformations in photonic devices consisting of beam splitters and phase shifters, which are used for integrated photonics implementations of boson sampling. The distribution of reflectivities required to implement an arbitrary unitary transformation is skewed towards low values, and this skew becomes stronger as the number of modes increases. A realistic implementation using Mach-Zehnder interferometers is incapable of achieving the low values required and thus has limited fidelity. We show that numerical optimisation and adding extra beam splitters to the network can help to restore fidelity.
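
    The skew of the required reflectivities towards low values can be reproduced with a quick numerical experiment. The sketch below uses a plain Givens-rotation (QR-style) elimination rather than the exact Reck or Clements mesh of the paper, but each 2x2 rotation plays the role of one beam splitter and the qualitative trend with mode number is the same; the function names are ours.

        import numpy as np
        from scipy.stats import unitary_group

        def reflectivities(U):
            """Reduce a unitary to diagonal form with complex Givens rotations,
            nulling the below-diagonal entries column by column; each rotation
            stands in for one beam splitter with power reflectivity sin(theta)**2."""
            U = np.array(U, dtype=complex)
            n = U.shape[0]
            refl = []
            for col in range(n - 1):
                for row in range(n - 1, col, -1):
                    a, b = U[row - 1, col], U[row, col]
                    r = np.hypot(abs(a), abs(b))
                    if r < 1e-12:                    # nothing to mix on this pair of modes
                        refl.append(0.0)
                        continue
                    G = np.eye(n, dtype=complex)
                    G[row - 1, row - 1] = np.conj(a) / r
                    G[row - 1, row] = np.conj(b) / r
                    G[row, row - 1] = -b / r
                    G[row, row] = a / r
                    U = G @ U
                    refl.append((abs(b) / r) ** 2)
            return np.array(refl)

        for n in (4, 8, 16, 32):                     # the skew grows with the mode number
            samples = np.concatenate([reflectivities(unitary_group.rvs(n))
                                      for _ in range(50)])
            print(f"n={n:2d}  median reflectivity = {np.median(samples):.3f}")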

    The state of quantum computing applications in health and medicine

    Quantum computing hardware and software have made enormous strides over the last few years. Questions around quantum computing's impact on research and society have changed from "if" to "when/how". The 2020s have been described as the "quantum decade", and the first production solutions that drive scientific and business value are expected to become available over the next few years. Medicine, including fields in healthcare and life sciences, has seen a flurry of quantum-related activities and experiments in the last few years (although medicine and quantum theory have arguably been entangled ever since Schrödinger's cat). The initial focus was on biochemical and computational biology problems; recently, however, clinical and medical quantum solutions have drawn increasing interest. The rapid emergence of quantum computing in health and medicine necessitates a mapping of the landscape. In this review, clinical and medical proof-of-concept quantum computing applications are outlined and put into perspective. These consist of over 40 experimental and theoretical studies from the last few years. The use case areas span genomics, clinical research and discovery, diagnostics, and treatments and interventions. Quantum machine learning (QML) in particular has rapidly evolved and has been shown to be competitive with classical benchmarks in recent medical research. Near-term QML algorithms, for instance quantum support vector classifiers and quantum neural networks, have been trained with diverse clinical and real-world data sets. This includes studies in generating new molecular entities as drug candidates, diagnosing based on medical image classification, predicting patient persistence, forecasting treatment effectiveness, and tailoring radiotherapy. The use cases and algorithms are summarized and an outlook on medicine in the quantum era, including technical and ethical challenges, is provided.
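
    As one concrete example of the near-term QML algorithms mentioned above, the sketch below sets up a fidelity-kernel quantum support vector classifier with PennyLane and scikit-learn on random toy data standing in for a clinical data set; it is a generic illustration of the technique, not a reproduction of any of the reviewed studies.

        import numpy as np
        import pennylane as qml
        from sklearn.svm import SVC

        n_qubits = 4
        dev = qml.device("default.qubit", wires=n_qubits)

        @qml.qnode(dev)
        def kernel_circuit(x1, x2):
            """Fidelity kernel: embed x1, un-embed x2, read the overlap with |0...0>."""
            qml.AngleEmbedding(x1, wires=range(n_qubits))
            qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
            return qml.probs(wires=range(n_qubits))

        def kernel_matrix(A, B):
            return np.array([[kernel_circuit(a, b)[0] for b in B] for a in A])

        # Random toy features and labels in place of real clinical data.
        rng = np.random.default_rng(0)
        X = rng.uniform(0, np.pi, size=(20, n_qubits))
        y = (X.sum(axis=1) > 2 * np.pi).astype(int)

        svc = SVC(kernel="precomputed").fit(kernel_matrix(X, X), y)
        print("training accuracy:", svc.score(kernel_matrix(X, X), y))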

    Hierarchical axioms for quantum mechanics

    The origin of nonclassicality in quantum mechanics (QM) has been investigated recently by a number of authors with a view to identifying axioms that would single out quantum mechanics as a special theory within a broader framework such as convex operational theories. In these studies, the axioms tend to be logically independent in the sense that no specific ordering of the axioms is implied. Here, we identify a hierarchy of five nonclassical features that separate QM from a classical theory: (Q1) Incompatibility and indeterminism; (Q2) Contextuality; (Q3) Entanglement; (Q4) Nonlocality; and (Q5) Indistinguishability of identical particles. Such a hierarchy is not obvious when viewed from within the quantum mechanical framework, but, from the perspective of generalized probability theories (GPTs), the later axioms can be regarded as further structure introduced on top of the earlier axioms. Relevant toy GPTs are introduced at each layer when useful to illustrate the action of the nonclassical features associated with that layer.
    Comment: 6 pages. arXiv admin note: substantial text overlap with arXiv:1607.0176
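
    As a small numerical illustration of one layer of this hierarchy, nonlocality (Q4), the script below evaluates the CHSH expression for the two-qubit singlet state at the standard optimal measurement angles and recovers the Tsirelson value 2*sqrt(2), above the classical bound of 2; it checks a single nonclassical feature, not the GPT constructions of the paper.

        import numpy as np

        # Pauli matrices and the singlet state |psi-> = (|01> - |10>)/sqrt(2)
        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Z = np.array([[1, 0], [0, -1]], dtype=complex)
        psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

        def obs(angle):
            """Spin observable along an axis in the x-z plane."""
            return np.cos(angle) * Z + np.sin(angle) * X

        def corr(a, b):
            """Correlator <psi| A(a) (x) B(b) |psi>."""
            return np.real(psi.conj() @ np.kron(obs(a), obs(b)) @ psi)

        a0, a1 = 0.0, np.pi / 2                  # Alice's measurement settings
        b0, b1 = np.pi / 4, -np.pi / 4           # Bob's measurement settings
        S = corr(a0, b0) + corr(a0, b1) + corr(a1, b0) - corr(a1, b1)
        print(abs(S), "vs classical bound 2")    # about 2.828 = 2*sqrt(2)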

    Assessing, testing, and challenging the computational power of quantum devices

    Randomness is an intrinsic feature of quantum theory. The outcome of any measurement will be random, sampled from a probability distribution that is defined by the measured quantum state. The task of sampling from a prescribed probability distribution therefore seems to be a natural technological application of quantum devices. And indeed, certain random sampling tasks have been proposed to experimentally demonstrate the speedup of quantum over classical computation, so-called "quantum computational supremacy". In the research presented in this thesis, I investigate the complexity-theoretic and physical foundations of quantum sampling algorithms.

    Using the theory of computational complexity, I assess the computational power of natural quantum simulators and close loopholes in the complexity-theoretic argument for the classical intractability of quantum samplers (Part I). In particular, I prove anticoncentration for quantum circuit families that give rise to a 2-design and review methods for proving average-case hardness. I present quantum random sampling schemes that are tailored to large-scale quantum simulation hardware but at the same time rise to the highest standard in terms of their complexity-theoretic underpinning.

    Using methods from property testing and quantum system identification, I shed light on the question of how, and under which conditions, quantum sampling devices can be tested or verified in regimes that are not simulable on classical computers (Part II). I present a no-go result that prevents efficient verification of quantum random sampling schemes, as well as approaches by which this no-go result can be circumvented. In particular, I develop fully efficient verification protocols in what I call the measurement-device-dependent scenario, in which single-qubit measurements are assumed to function with high accuracy.

    Finally, I try to understand the physical mechanisms governing the computational boundary between classical and quantum computing devices by challenging their computational power using tools from computational physics and the theory of computational complexity (Part III). I develop efficiently computable measures of the infamous Monte Carlo sign problem and assess those measures both in terms of their practicability as a tool for alleviating or easing the sign problem and in terms of the computational complexity of that task.

    An overarching theme of the thesis is the quantum sign problem, which arises due to destructive interference between paths, an intrinsically quantum effect. The (non-)existence of a sign problem takes on the role of a criterion that delineates the boundary between classical and quantum computing devices. I begin the thesis by identifying the quantum sign problem as a root of the computational intractability of quantum output probabilities. It turns out that the intricate structure of the probability distributions to which the sign problem gives rise prohibits their verification from few samples. In an ironic twist, I show that assessing the intrinsic sign problem of a quantum system is again an intractable problem.
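
    One concrete handle on the sign problem mentioned above is the textbook average-sign diagnostic, the ratio of the signed to the absolute-value partition sum, which decays towards zero when the sign problem is severe; the toy below computes it for random signed weights and is not one of the specific measures developed in the thesis.

        import numpy as np

        def average_sign(weights):
            """<sign> = sum(w) / sum(|w|): close to 1 means the weights can be
            sampled like probabilities, close to 0 signals a severe sign problem."""
            weights = np.asarray(weights, dtype=float)
            return weights.sum() / np.abs(weights).sum()

        rng = np.random.default_rng(1)
        benign = np.abs(rng.normal(size=10_000))        # all-positive weights
        frustrated = rng.normal(size=10_000)            # strongly cancelling weights
        print(average_sign(benign), average_sign(frustrated))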