The role of Quantum Interference in Quantum Computing
Quantum interference is proposed as a tool to augment quantum computation. Comment: 3 pages, no figures (citation added)
Demonstration of Controllable Temporal Distinguishability in a Three-Photon State
Multi-photon interference is at the heart of the recently proposed linear
optical quantum computing scheme and plays an essential role in many protocols
in quantum information. Indistinguishability is what leads to the effect of
quantum interference. Optical interferometers such as the Michelson
interferometer provide a measure of first-order coherence at the one-photon
level, and the Hong-Ou-Mandel interferometer has been widely employed to
characterize two-photon entanglement and indistinguishability. However, no
effective method exists for systems of more than two photons. Recently, a new
interferometric scheme was proposed to quantify the degree of multi-photon
distinguishability. Here we report an experiment implementing this scheme for
the three-photon case. We are able
to generate three photons with different degrees of temporal distinguishability
and demonstrate how to characterize them by the visibility of three-photon
interference. This method of quantitatively describing multi-photon
indistinguishability will have practical implications for the implementation of
quantum information protocols.
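The core idea, that interference visibility quantifies indistinguishability, can be sketched for the simpler two-photon case. The Gaussian-wavepacket model and function names below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def temporal_overlap(dt, sigma=1.0):
    """Overlap of two Gaussian single-photon wavepackets offset in time by dt."""
    return np.exp(-dt**2 / (4 * sigma**2))

def hom_coincidence(dt, sigma=1.0):
    """Coincidence probability after a 50:50 beam splitter.

    P = (1 - |overlap|^2) / 2: identical photons (dt = 0) bunch and never
    produce a coincidence; fully distinguishable photons give P = 1/2.
    """
    return 0.5 * (1 - temporal_overlap(dt, sigma) ** 2)

def visibility(dt, sigma=1.0):
    """Dip visibility (P_dist - P) / P_dist = |overlap|^2 measures indistinguishability."""
    return temporal_overlap(dt, sigma) ** 2

print(hom_coincidence(0.0))   # 0.0 -> perfectly indistinguishable
print(hom_coincidence(10.0))  # ~0.5 -> fully distinguishable
```

The three-photon scheme of the paper generalizes this: pairwise temporal delays set the mutual overlaps, and the visibility of the three-photon interference signal characterizes the joint distinguishability.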
Quantum interferometers: principles and applications
Interference, which refers to the phenomenon associated with the
superposition of waves, has played a crucial role in the advancement of physics
and finds a wide range of applications in physical and engineering
measurements. Interferometers are experimental setups designed to observe and
manipulate interference. With the development of technology, many quantum
interferometers have been discovered and have become cornerstone tools in the
field of quantum physics. Quantum interferometers not only explore the nature
of the quantum world but also have extensive applications in quantum
information technology, such as quantum communication, quantum computing, and
quantum measurement. In this review, we analyze and summarize three typical
quantum interferometers: the Hong-Ou-Mandel (HOM) interferometer, the N00N
state interferometer, and the Franson interferometer. We focus on the
principles and applications of these three interferometers. In the principles
section, we present the theoretical models for these interferometers, including
single-mode theory and multi-mode theory. In the applications section, we
review the applications of these interferometers in quantum communication,
computation, and measurement. We hope that this review article will promote the
development of quantum interference in both fundamental science and practical
engineering applications. Comment: 64 pages, 40 figures. Comments are welcome.
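The metrological appeal of one of the reviewed devices, the N00N-state interferometer, is its N-fold phase super-resolution: the N-photon coincidence signal oscillates as cos(N*phi) rather than cos(phi). A minimal sketch (illustrative functions, not from the review):

```python
import numpy as np

def single_photon_fringe(phi):
    """Ordinary interference fringe: period 2*pi in the phase phi."""
    return 0.5 * (1 + np.cos(phi))

def noon_fringe(phi, N):
    """N00N-state fringe: the N-photon coincidence signal oscillates as
    cos(N*phi), so the fringe period shrinks to 2*pi/N (super-resolution)."""
    return 0.5 * (1 + np.cos(N * phi))

# At phi = pi/4, an N = 4 N00N state already sits at a fringe minimum,
# where a single photon would need phi = pi.
print(noon_fringe(np.pi / 4, 4))
print(single_photon_fringe(np.pi))
```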
Sequent Calculus Representations for Quantum Circuits
When considering a sequent-style proof system for quantum programs, there are certain elements of quantum mechanics that we may wish to capture, such as phase, the dynamics of unitary transformations, and measurement probabilities. Traditional quantum logics, which focus primarily on abstract orthomodular lattice theory and the structure of Hilbert spaces, have not satisfactorily captured some of these elements. We can start from 'scratch' in an attempt to conceptually characterize the types of proof rules which should be in a system that represents the elements necessary for quantum algorithms. The present work attempts to do this from the perspective of the quantum circuit model of quantum computation. A sequent calculus based on single quantum circuits is suggested, and its ability to incorporate important conceptual and dynamic aspects of quantum computing is discussed. In particular, preserving the representation of phase helps illustrate the role of interference as a resource in quantum computation. Interference also provides an intuitive basis for a non-monotonic calculus.
Role of interference and entanglement in quantum neural processing
The role of interference and entanglement in quantum neural processing is
discussed. It is argued that, in contrast to quantum computing, the problem of
requiring exponential resources as the price for the absence of entanglement
does not arise in quantum neural processing. This is because the corresponding
systems, like any modern classical artificial neural system, do not realize
functions precisely but approximate them by training on small sets of examples.
This permits an optical implementation of quantum neural systems, because in
this case no exponential resources of optical devices (beam splitters, etc.)
are needed. On the other hand, the role of entanglement in quantum neural
processing remains very important, because it is what associates qubit states:
this is a necessary feature of quantum neural memory models. Comment: 15 pages,
PDF
Assessing, testing, and challenging the computational power of quantum devices
Randomness is an intrinsic feature of quantum theory. The outcome of any measurement will be random, sampled from a probability distribution that is defined by the measured quantum state. The task of sampling from a prescribed probability distribution therefore seems to be a natural technological application of quantum devices. And indeed, certain random sampling tasks have been proposed to experimentally demonstrate the speedup of quantum over classical computation, so-called “quantum computational supremacy”.
In the research presented in this thesis, I investigate the complexity-theoretic and physical foundations of quantum sampling algorithms. Using the theory of computational complexity, I assess the computational power of natural quantum simulators and close loopholes in the complexity-theoretic argument for the classical intractability of quantum samplers (Part I). In particular, I prove anticoncentration for quantum circuit families that give rise to a 2-design and review methods for proving average-case hardness. I present quantum random sampling schemes that are tailored to large-scale quantum simulation hardware but at the same time rise to the highest standard in terms of their complexity-theoretic underpinning. Using methods from property testing and quantum system identification, I shed light on the question of how and under which conditions quantum sampling devices can be tested or verified in regimes that are not simulable on classical computers (Part II). I present a no-go result that prevents efficient verification of quantum random sampling schemes, as well as approaches by which this no-go result can be circumvented. In particular, I develop fully efficient verification protocols in what I call the measurement-device-dependent scenario, in which single-qubit measurements are assumed to function with high accuracy. Finally, I try to understand the physical mechanisms governing the computational boundary between classical and quantum computing devices by challenging their computational power using tools from computational physics and the theory of computational complexity (Part III). I develop efficiently computable measures of the infamous Monte Carlo sign problem and assess these measures both in terms of their practicability as tools for alleviating or easing the sign problem and in terms of the computational complexity of this task.
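The anticoncentration property mentioned above can be illustrated numerically: for Haar-random (Porter-Thomas) output statistics, a constant fraction of outcomes carry probability above half the uniform weight, namely Pr[p > 1/(2*dim)] = exp(-1/2) ≈ 0.61. The sketch below draws a Haar-random pure state directly rather than building a circuit; this shortcut and the function names are illustrative assumptions, not the thesis's construction:

```python
import numpy as np

def haar_random_state(dim, rng):
    """Draw a Haar-random pure state by normalizing a complex Gaussian vector."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

rng = np.random.default_rng(0)
n_qubits = 10
dim = 2 ** n_qubits
probs = np.abs(haar_random_state(dim, rng)) ** 2

# Anticoncentration check: the fraction of outcomes with probability above
# 1/(2*dim) should be close to exp(-1/2) ~ 0.61 for Porter-Thomas statistics.
frac = np.mean(probs > 1 / (2 * dim))
print(frac)
```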
An overarching theme of the thesis is the quantum sign problem, which arises from destructive interference between paths, an intrinsically quantum effect. The (non-)existence of a sign problem takes on the role of a criterion that delineates the boundary between classical and quantum computing devices. I begin the thesis by identifying the quantum sign problem as a root of the computational intractability of quantum output probabilities. It turns out that the intricate structure of the probability distributions to which the sign problem gives rise prohibits their verification from few samples. In an ironic twist, I show that assessing the intrinsic sign problem of a quantum system is again an intractable problem.
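A standard proxy for the severity of a sign problem is the average sign of the sampling weights, ⟨sgn⟩ = Σw / Σ|w|, which equals 1 when all weights are positive and decays toward 0 as cancellation dominates (driving up the variance of Monte Carlo estimates). The following sketch is a generic illustration of this quantity, not one of the specific measures developed in the thesis:

```python
import numpy as np

def average_sign(weights):
    """Average sign <sgn> = sum(w) / sum(|w|) for real sampling weights.

    <sgn> = 1 means no sign problem; <sgn> -> 0 means the signed sum is
    dominated by cancellation and signed Monte Carlo estimates blow up
    in variance.
    """
    w = np.asarray(weights, dtype=float)
    return w.sum() / np.abs(w).sum()

print(average_sign([1.0, 2.0, 3.0]))         # 1.0  -> sign-problem free
print(average_sign([1.0, -0.9, 1.0, -0.9]))  # ~0.053 -> severe cancellation
```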