On the experimental verification of quantum complexity in linear optics
The first quantum technologies to solve computational problems that are
beyond the capabilities of classical computers are likely to be devices that
exploit characteristics inherent to a particular physical system, to tackle a
bespoke problem suited to those characteristics. Evidence implies that the
detection of ensembles of photons, which have propagated through a linear
optical circuit, is equivalent to sampling from a probability distribution that
is intractable to classical simulation. However, it is probable that the
complexity of this type of sampling problem means that its solution is
classically unverifiable within a feasible number of trials, and the task of
establishing correct operation becomes one of gathering sufficiently convincing
circumstantial evidence. Here, we develop scalable methods to experimentally
establish correct operation for this class of sampling algorithm, which we
implement with two different types of optical circuits for 3, 4, and 5 photons,
on Hilbert spaces of up to 50,000 dimensions. With only a small number of
trials, we establish a confidence >99% that we are not sampling from a uniform
distribution or a classical distribution, and we demonstrate a unitary-specific
witness that functions robustly for small amounts of data. Like the algorithmic
operations they endorse, our methods exploit the characteristics native to the
quantum system in question. Here we observe and exploit a
"bosonic clouding" phenomenon, interesting in its own right, where photons are
found in local groups of modes superposed across two locations. Our broad
approach is likely to be practical for all architectures for quantum
technologies where formal verification methods for quantum algorithms are
either intractable or unknown.
Comment: Comments welcome
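The verification idea above can be illustrated with a minimal sketch (not the paper's actual protocol): for collision-free outcomes, the boson sampling probability of an output pattern is proportional to the squared permanent of a submatrix of the interferometer unitary, so a simple log-likelihood ratio distinguishes quantum samples from a uniform sampler. The circuit size, sample count, and restriction to collision-free outcomes below are illustrative assumptions.

```python
import itertools
import numpy as np

def permanent(M):
    # Ryser's formula: exact permanent in O(2^n * n) time.
    n = M.shape[0]
    total = 0j
    for subset in range(1, 1 << n):
        cols = [j for j in range(n) if subset >> j & 1]
        rowsum = M[:, cols].sum(axis=1)
        total += (-1) ** len(cols) * np.prod(rowsum)
    return (-1) ** n * total

def boson_probs(U, n):
    # Collision-free output probabilities for n photons injected into
    # modes 0..n-1, renormalised over the collision-free set (a
    # simplification for this sketch).
    m = U.shape[0]
    outcomes = list(itertools.combinations(range(m), n))
    p = np.array([abs(permanent(U[np.ix_(list(S), range(n))])) ** 2
                  for S in outcomes])
    return outcomes, p / p.sum()

rng = np.random.default_rng(7)
# Haar-random unitary via QR decomposition with phase correction.
Z = rng.normal(size=(7, 7)) + 1j * rng.normal(size=(7, 7))
Q, R = np.linalg.qr(Z)
U = Q * (np.diag(R) / abs(np.diag(R)))

outcomes, p = boson_probs(U, 3)
samples = rng.choice(len(outcomes), size=200, p=p)
# Log-likelihood ratio: quantum model vs uniform over the same outcomes.
llr = np.log(p[samples]).sum() - 200 * np.log(1.0 / len(outcomes))
print(llr)  # positive and growing with sample count: evidence against uniform
```

Because the expected per-sample gain is the KL divergence between the quantum and uniform distributions, high confidence accrues after relatively few trials, consistent with the small sample counts quoted in the abstract.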
Composite CDMA - A statistical mechanics analysis
Code Division Multiple Access (CDMA) in which the spreading code assignment
to users contains a random element has recently become a cornerstone of CDMA
research. The random element in the construction is particularly attractive as it
provides robustness and flexibility in utilising multi-access channels, whilst
not making significant sacrifices in terms of transmission power. Random codes
are generated from some ensemble; here we consider the possibility of combining
two standard paradigms, sparsely and densely spread codes, in a single
composite code ensemble. The composite code analysis includes a replica
symmetric calculation of performance in the large system limit, and
investigation of finite systems through a composite belief propagation
algorithm. A variety of codes are examined with a focus on the high
multi-access interference regime. In both the large size limit and finite
systems we demonstrate scenarios in which the composite code has typical
performance exceeding sparse and dense codes at equivalent signal to noise
ratio.
Comment: 23 pages, 11 figures, Sigma Phi 2008 conference submission - submitted to J. Stat. Mech.
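A composite code ensemble of the kind described can be sketched as follows. This is an illustrative construction, not the paper's ensemble: each user's spreading sequence has a dense ±1 block over some chips and a regular-sparse block (a fixed small number of nonzero chips) over the rest, and detection here is a plain matched filter rather than the composite belief propagation algorithm of the paper. All parameter values are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

def composite_codes(n_chips, n_users, n_dense, sparse_weight):
    """Composite spreading matrix: a dense +-1 block over the first
    n_dense chips and a regular-sparse block (exactly sparse_weight
    nonzero chips per user) over the remaining chips."""
    dense = rng.choice([-1.0, 1.0], size=(n_dense, n_users))
    sparse = np.zeros((n_chips - n_dense, n_users))
    for k in range(n_users):
        rows = rng.choice(n_chips - n_dense, size=sparse_weight, replace=False)
        sparse[rows, k] = rng.choice([-1.0, 1.0], size=sparse_weight)
    S = np.vstack([dense, sparse])
    return S / np.sqrt((S ** 2).sum(axis=0))   # unit transmit power per user

K, N = 12, 16                              # users, chips: load K/N = 0.75
S = composite_codes(N, K, n_dense=8, sparse_weight=3)
b = rng.choice([-1.0, 1.0], size=K)        # BPSK symbol, one per user
sigma = 0.125
y = S @ b + sigma * rng.normal(size=N)     # noisy multi-access channel output

b_hat = np.sign(S.T @ y)                   # matched-filter baseline detector
errors = int((b_hat != b).sum())
```

At this load the matched filter suffers from multi-access interference; the point of the composite belief propagation algorithm analysed in the paper is to do substantially better on exactly such channels.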
Generation and sampling of quantum states of light in a silicon chip
Implementing large instances of quantum algorithms requires the processing of
many quantum information carriers in a hardware platform that supports the
integration of different components. While established semiconductor
fabrication processes can integrate many photonic components, the generation
and algorithmic processing of many photons has been a bottleneck in integrated
photonics. Here we report the on-chip generation and processing of quantum
states of light with up to eight photons in quantum sampling algorithms.
Switching between different optical pumping regimes, we implement the
Scattershot, Gaussian and standard boson sampling protocols in the same silicon
chip, which integrates linear and nonlinear photonic circuitry. We use these
results to benchmark a quantum algorithm for calculating molecular vibronic
spectra. Our techniques can be readily scaled for the on-chip implementation of
specialised quantum algorithms with tens of photons, pointing the way to
efficiency advantages over conventional computers.
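The benefit of switching pumping regimes can be seen in a back-of-envelope comparison of standard versus Scattershot boson sampling rates. In the Scattershot protocol, any n of m heralded photon-pair sources firing yields a valid, known input, so the n-photon event rate is boosted combinatorially over n dedicated sources. The heralding probability and source count below are assumed values for illustration.

```python
import math

p = 0.05      # per-pulse heralding probability of one photon-pair source (assumed)
n, m = 3, 12  # photons required, heralded sources available (assumed)

# Standard boson sampling: n dedicated sources must all herald in one pulse.
p_standard = p ** n

# Scattershot: any n of the m sources heralding gives a valid, known input,
# so the combinatorics of source patterns boost the n-photon event rate.
p_scattershot = math.comb(m, n) * p ** n * (1 - p) ** (m - n)

enhancement = p_scattershot / p_standard   # approx. C(m, n) for small p
```

For these numbers the enhancement is over two orders of magnitude, which is why multiplexing many on-chip sources makes larger photon-number experiments feasible.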
Generalized multi-photon quantum interference
Non-classical interference of photons lies at the heart of optical quantum
information processing. This effect is exploited in universal quantum gates as
well as in purpose-built quantum computers that solve the BosonSampling
problem. Although non-classical interference is often associated with perfectly
indistinguishable photons, this only represents the degenerate case, hard to
achieve under realistic experimental conditions. Here we exploit tunable
distinguishability to reveal the full spectrum of multi-photon non-classical
interference. This we investigate in theory and experiment by controlling the
delay times of three photons injected into an integrated interferometric
network. We derive the entire coincidence landscape and identify transition
matrix immanants as ideally suited functions to describe the generalized case
of input photons with arbitrary distinguishability. We introduce a compact
description by utilizing a natural basis which decouples the input state from
the interferometric network, thereby providing a useful tool for even larger
photon numbers.
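The matrix immanants mentioned above generalize the two familiar extremes: the permanent (trivial character) governs perfectly indistinguishable photons and the determinant (sign character) fully distinguishable behaviour of fermions, while intermediate characters describe partial distinguishability. A minimal sketch for three photons, where the intermediate immanant uses the character of the two-dimensional irreducible representation of S3 (2 on the identity, 0 on transpositions, -1 on 3-cycles); the example matrix is arbitrary:

```python
import itertools
import numpy as np

def perm_sign(p):
    # Parity of a permutation (tuple mapping i -> p[i]) via cycle lengths.
    sign, seen = 1, [False] * len(p)
    for i in range(len(p)):
        if not seen[i]:
            j, length = i, 0
            while not seen[j]:
                seen[j] = True
                j = p[j]
                length += 1
            if length % 2 == 0:
                sign = -sign
    return sign

def immanant(M, chi):
    # Imm_chi(M) = sum over permutations s of chi(s) * prod_i M[i, s[i]].
    n = M.shape[0]
    return sum(chi(s) * np.prod([M[i, s[i]] for i in range(n)])
               for s in itertools.permutations(range(n)))

def chi_mixed(s):
    # Character of the 2-dim irrep of S3, read off the fixed-point count:
    # identity -> 2, transpositions -> 0, 3-cycles -> -1.
    fixed = sum(i == s[i] for i in range(3))
    return {3: 2, 1: 0, 0: -1}[fixed]

M = np.arange(1.0, 10.0).reshape(3, 3)
per = immanant(M, lambda s: 1)   # permanent (bosonic limit)
det = immanant(M, perm_sign)     # determinant (sign character)
mix = immanant(M, chi_mixed)     # intermediate immanant
```

A quick consistency check: weighting each immanant by its irrep dimension, 1*per + 1*det + 2*mix equals 3! times the diagonal product, by column orthogonality of characters.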