High-Fidelity Control, Detection, and Entanglement of Alkaline-Earth Rydberg Atoms
Trapped neutral atoms have become a prominent platform for quantum science, where entanglement fidelity records have been set using highly excited Rydberg states. However, controlled two-qubit entanglement generation has so far been limited to alkali species, leaving the exploitation of more complex electronic structures as an open frontier that could lead to improved fidelities and fundamentally different applications such as quantum-enhanced optical clocks. Here, we demonstrate a novel approach utilizing the two-valence-electron structure of individual alkaline-earth Rydberg atoms. We find fidelities for Rydberg state detection, single-atom Rabi operations, and two-atom entanglement that surpass previously published values. Our results pave the way for novel applications, including programmable quantum metrology and hybrid atom–ion systems, and set the stage for alkaline-earth based quantum computing architectures.
Adiabatic evolution on a spatial-photonic Ising machine
Combinatorial optimization problems are crucial for widespread applications
but remain difficult to solve on a large scale with conventional hardware.
Novel optical platforms, known as coherent or photonic Ising machines, are
attracting considerable attention as accelerators on optimization tasks
formulable as Ising models. Annealing is a well-known technique based on
adiabatic evolution for finding optimal solutions in classical and quantum
systems made by atoms, electrons, or photons. Although various Ising machines
employ annealing in some form, adiabatic computing in optical settings has been
only partially investigated. Here, we realize the adiabatic evolution of
frustrated Ising models with 100 spins programmed by spatial light modulation.
We use holographic and optical control to change the spin couplings
adiabatically, and exploit experimental noise to explore the energy landscape.
Annealing enhances the convergence to the Ising ground state and allows us to find
the problem solution with probability close to unity. Our results demonstrate a
photonic scheme for combinatorial optimization in analogy with adiabatic
quantum algorithms, enabled by optical vector-matrix multiplications and
scalable photonic technology.
Comment: 9 pages, 4 figures
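The annealing strategy this abstract describes has a simple classical analogue. The sketch below runs plain simulated annealing on a frustrated 100-spin Ising model with random ±J couplings; it is an illustrative stand-in under assumed parameters (coupling matrix, cooling schedule, step count), not the paper's holographic optical implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frustrated Ising model: random +/-1 couplings between 100 spins
# (an assumed stand-in for the hologram-programmed couplings).
N = 100
J = rng.choice([-1.0, 1.0], size=(N, N))
J = np.triu(J, 1)
J = J + J.T                                 # symmetric, zero diagonal

def energy(s):
    # Ising energy E = -1/2 * s^T J s  (each pair counted once)
    return -0.5 * s @ J @ s

def anneal(steps=20000, T0=5.0, Tf=0.05):
    s = rng.choice([-1.0, 1.0], size=N)
    for t in range(steps):
        T = T0 * (Tf / T0) ** (t / steps)   # slow exponential cooling
        i = rng.integers(N)
        dE = 2.0 * s[i] * (J[i] @ s)        # cost of flipping spin i
        # Metropolis rule: accept downhill moves, and uphill moves
        # with Boltzmann probability (noise explores the landscape).
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i] = -s[i]
    return s

s_final = anneal()
print(energy(s_final))
```

A slower schedule (more steps, gentler cooling) raises the probability of reaching the true ground state, mirroring the adiabatic limit the abstract invokes.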
From retrodiction to Bayesian quantum imaging
We employ quantum retrodiction to develop a robust Bayesian algorithm for reconstructing the intensity values of an image from sparse photocount data, while also accounting for detector noise in the form of dark counts. This method yields not only a reconstructed image but also the full probability distribution function for the intensity at each pixel. We use simulated as well as real data to illustrate both the applications of the algorithm and the analysis options that are only available when the full probability distribution functions are known. These include calculating Bayesian credible regions for each pixel intensity, allowing an objective assessment of the reliability of the reconstructed image intensity values.
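As a rough illustration of the per-pixel Bayesian idea (not the paper's retrodiction algorithm), the sketch below computes a posterior over a single pixel's intensity from one photocount, assuming counts are Poisson-distributed about intensity plus a known dark-count rate; the grid, flat prior, and dark rate are all assumptions.

```python
import math
import numpy as np

# Hypothetical single-pixel model (illustration only): observed counts
# k ~ Poisson(intensity + dark_rate), with the dark-count rate known.
dark_rate = 0.5
grid = np.linspace(0.0, 30.0, 3001)        # candidate pixel intensities
dx = grid[1] - grid[0]

def posterior(k):
    mu = grid + dark_rate
    log_like = k * np.log(mu) - mu - math.lgamma(k + 1)
    post = np.exp(log_like)                # flat prior over the grid
    return post / (post.sum() * dx)        # normalise on the grid

def credible_interval(k, mass=0.95):
    # Equal-tailed Bayesian credible interval from the posterior CDF.
    cdf = np.cumsum(posterior(k)) * dx
    lo = grid[np.searchsorted(cdf, (1 - mass) / 2)]
    hi = grid[np.searchsorted(cdf, 1 - (1 - mass) / 2)]
    return lo, hi

lo, hi = credible_interval(k=4)
```

Repeating this per pixel yields both a point estimate and a credible region for each intensity, which is the kind of per-pixel reliability assessment the abstract highlights.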
Biophotonic Tools in Cell and Tissue Diagnostics.
In order to maintain the rapid advance of biophotonics in the U.S. and enhance our competitiveness worldwide, key measurement tools must be in place. As part of a wide-reaching effort to improve the U.S. technology base, the National Institute of Standards and Technology sponsored a workshop titled "Biophotonic tools for cell and tissue diagnostics." The workshop focused on diagnostic techniques involving the interaction between biological systems and photons. Through invited presentations by industry representatives and panel discussion, near- and far-term measurement needs were evaluated. As a result of this workshop, this document has been prepared on the measurement tools needed for biophotonic cell and tissue diagnostics. This will become a part of the larger measurement road-mapping effort to be presented to the Nation as an assessment of the U.S. Measurement System. The information will be used to highlight measurement needs to the community and to facilitate solutions.
Postselection threshold against biased noise
The highest current estimates for the amount of noise a quantum computer can
tolerate are based on fault-tolerance schemes relying heavily on postselecting
on no detected errors. However, there has been no proof that these schemes give
even a positive tolerable noise threshold. A technique to prove a positive
threshold, for probabilistic noise models, is presented. The main idea is to
maintain strong control over the distribution of errors in the quantum state at
all times. This distribution has correlations which conceivably could grow out
of control with postselection. But in fact, the error distribution can be
written as a mixture of nearby distributions each satisfying strong
independence properties, so there are no correlations for postselection to
amplify.
Comment: 13 pages, FOCS 2006; conference version
Observation of the Mott Insulator to Superfluid Crossover of a Driven-Dissipative Bose-Hubbard System
Dissipation is ubiquitous in nature and plays a crucial role in quantum
systems such as causing decoherence of quantum states. Recently, much attention
has been paid to the intriguing possibility of using dissipation as an efficient tool
for preparation and manipulation of quantum states. Here we report the
successful demonstration of a novel role of dissipation in a
quantum phase transition using cold atoms. We realize an engineered dissipative
Bose-Hubbard system by introducing a controllable strength of two-body
inelastic collision via photo-association for ultracold bosons in a
three-dimensional optical lattice. In the dynamics subjected to a slow
ramp-down of the optical lattice, we find that strong on-site dissipation
favors the Mott insulating state: the melting of the Mott insulator is delayed
and the growth of the phase coherence is suppressed. The controllability of the
dissipation is highlighted by quenching the dissipation, providing a novel
method for investigating a quantum many-body state and its non-equilibrium
dynamics.
Comment: 26 pages, 17 figures
Multicolour correlative imaging using phosphor probes
Correlative light and electron microscopy exploits the advantages of optical methods, such as multicolour probes and their use in hydrated live biological samples, to locate functional units, which are then correlated with structural details that can be revealed by the superior resolution of electron microscopes. One difficulty is locating the area imaged by the electron beam in the much larger optical field of view. Multifunctional probes that can be imaged in both modalities and thus register the two images are required. Phosphor materials give cathodoluminescence (CL) optical emissions under electron excitation. Lanthanum phosphate containing thulium or terbium or europium emits narrow bands in the blue, green and red regions of the CL spectrum; they may be synthesised with very uniform-sized crystals in the 10- to 50-nm range. Such crystals can be imaged by CL in the electron microscope, at resolutions limited by the particle size, and with colour discrimination to identify different probes. These materials also give emissions in the optical microscope, by
multiphoton excitation. They have been deposited on the surface of glioblastoma cells and imaged by CL. Gadolinium oxysulphide doped with terbium emits green photons by either ultraviolet or electron excitation. Sixty-nanometre crystals of this phosphor have been imaged in the atmospheric scanning electron microscope (JEOL ClairScope). This probe and microscope combination allows correlative imaging in hydrated samples. Phosphor probes should prove to be very useful in correlative light and electron microscopy, as fiducial
markers to assist in image registration, and in high/super-resolution imaging studies.
Contextual advantage for state discrimination
Finding quantitative aspects of quantum phenomena which cannot be explained
by any classical model has foundational importance for understanding the
boundary between classical and quantum theory. It also has practical
significance for identifying information processing tasks for which those
phenomena provide a quantum advantage. Using the framework of generalized
noncontextuality as our notion of classicality, we find one such nonclassical
feature within the phenomenology of quantum minimum error state discrimination.
Namely, we identify quantitative limits on the success probability for minimum
error state discrimination in any experiment described by a noncontextual
ontological model. These constraints constitute noncontextuality inequalities
that are violated by quantum theory, and this violation implies a quantum
advantage for state discrimination relative to noncontextual models.
Furthermore, our noncontextuality inequalities are robust to noise and are
operationally formulated, so that any experimental violation of the
inequalities is a witness of contextuality, independently of the validity of
quantum theory. Along the way, we introduce new methods for analyzing
noncontextuality scenarios, and demonstrate a tight connection between our
minimum error state discrimination scenario and a Bell scenario.
Comment: 18 pages, 9 figures
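For background, the textbook benchmark for minimum error state discrimination is the Helstrom bound, P_opt = (1 + ||p0 rho0 - p1 rho1||_1)/2. The sketch below evaluates it for two example qubit states; the states and priors are assumptions chosen for illustration and are not taken from the paper, which bounds this quantity under noncontextuality rather than computing it.

```python
import numpy as np

def dm(psi):
    """Density matrix of a pure state (normalised automatically)."""
    psi = np.asarray(psi, dtype=complex)
    psi = psi / np.linalg.norm(psi)
    return np.outer(psi, psi.conj())

def helstrom(rho0, rho1, p0=0.5):
    # Optimal minimum-error success probability for discriminating
    # rho0 (prior p0) from rho1 (prior 1 - p0):
    #   P_opt = 1/2 * (1 + || p0*rho0 - (1-p0)*rho1 ||_1)
    gamma = p0 * rho0 - (1 - p0) * rho1
    trace_norm = np.abs(np.linalg.eigvalsh(gamma)).sum()
    return 0.5 * (1.0 + trace_norm)

# Assumed example: |0> vs |+> with equal priors.
rho0 = dm([1, 0])
rho1 = dm([1, 1])
p_opt = helstrom(rho0, rho1)
# For equal priors and pure states with overlap c = |<psi0|psi1>|,
# this reduces to P_opt = (1 + sqrt(1 - c^2)) / 2; here c = 1/sqrt(2).
```

The quantum value from this bound is what exceeds the noncontextual limits derived in the paper, which is the operational sense of the contextual advantage.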