On the conversion efficiency of ultracold fermionic atoms to bosonic molecules via Feshbach resonances
We explain why the experimental efficiency observed in the conversion of
ultracold Fermi gases of K and Li atoms into diatomic Bose gases
is limited to 0.5 when the Feshbach resonance sweep rate is sufficiently slow
to pass adiabatically through the Landau-Zener transition but faster than "the
collision rate" in the gas, and increases beyond 0.5 when it is slower. The
0.5 efficiency limit is due to the preparation of a statistical mixture of two
spin-states, required to enable s-wave scattering. By constructing the
many-body state of the system we show that this preparation yields a mixture of
even and odd parity pair-states, where only even parity can produce molecules.
The odd parity spin-symmetric states must decorrelate before the constituent
atoms can further Feshbach scatter, thereby increasing the conversion
efficiency; "the collision rate" is the pair decorrelation rate.
Comment: 4 pages, 3 figures, final version accepted to Phys. Rev. Lett.
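The parity argument above can be made concrete with the standard singlet-triplet decomposition of a spin mixture (a textbook identity, not notation taken from the paper):

```latex
\lvert\uparrow\downarrow\rangle
  = \tfrac{1}{\sqrt{2}}\bigl(\lvert S\rangle + \lvert T_0\rangle\bigr),
\qquad
\lvert S\rangle
  = \tfrac{1}{\sqrt{2}}\bigl(\lvert\uparrow\downarrow\rangle
      - \lvert\downarrow\uparrow\rangle\bigr).
```

Since only the spin-antisymmetric (even spatial parity) component supports s-wave scattering, a pair drawn from the incoherent mixture is convertible with probability 1/2, which is the 0.5 efficiency limit.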
Many-body effects on adiabatic passage through Feshbach resonances
We theoretically study the dynamics of an adiabatic sweep through a Feshbach
resonance, thereby converting a degenerate quantum gas of fermionic atoms into
a degenerate quantum gas of bosonic dimers. Our analysis relies on a zero
temperature mean-field theory which accurately accounts for initial molecular
quantum fluctuations, triggering the association process. The structure of the
resulting semiclassical phase space is investigated, highlighting the dynamical
instability of the system towards association, for sufficiently small detuning
from resonance. It is shown that this instability significantly modifies the
finite-rate efficiency of the sweep, transforming the single-pair exponential
Landau-Zener behavior of the remnant fraction of atoms Gamma on sweep rate
alpha, into a power-law dependence as the number of atoms increases. The
obtained nonadiabaticity is determined from the interplay of characteristic
time scales for the motion of adiabatic eigenstates and for fast periodic
motion around them. Critical slowing-down of these precessions near the
instability leads to the power-law dependence. A linear power law is obtained when the initial molecular fraction is smaller than the 1/N
quantum fluctuations, and a cubic-root power law is
attained when it is larger. Our mean-field analysis is confirmed by exact
calculations, using Fock-space expansions. Finally, we fit experimental low
temperature Feshbach sweep data with a power-law dependence. While the
agreement with the experimental data is well within experimental error bars,
similar accuracy can be obtained with an exponential fit, making additional
data highly desirable.
Comment: 9 pages, 9 figures
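The competing functional forms discussed above can be compared numerically. A minimal sketch, assuming illustrative placeholder coefficients (the prefactors a and c below are not values from the paper):

```python
import math

def gamma_lz(alpha, a=1.0):
    """Single-pair Landau-Zener remnant atomic fraction: Gamma = exp(-a/alpha)."""
    return math.exp(-a / alpha)

def gamma_linear(alpha, c=1.0):
    """Many-body linear power law, Gamma = c * alpha."""
    return c * alpha

def gamma_cubic_root(alpha, c=1.0):
    """Many-body cubic-root power law, Gamma = c * alpha**(1/3)."""
    return c * alpha ** (1.0 / 3.0)

# In the slow-sweep (small-alpha) limit the exponential form is suppressed
# far more strongly than either power law, which is why the two behaviors
# are distinguishable in principle:
for alpha in (0.2, 0.1, 0.05):
    print(alpha, gamma_lz(alpha), gamma_linear(alpha), gamma_cubic_root(alpha))
```

The abstract's caveat applies here too: over a limited range of sweep rates, an exponential and a power law can fit the same data comparably well.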
Nonlinear adiabatic passage from fermion atoms to boson molecules
We study the dynamics of an adiabatic sweep through a Feshbach resonance in a
quantum gas of fermionic atoms. Analysis of the dynamical equations, supported
by mean-field and many-body numerical results, shows that the dependence of the
remaining atomic fraction Gamma on the sweep rate alpha varies from
exponential Landau-Zener behavior for a single pair of particles to a power-law
dependence for large particle number N. The power law is linear, Gamma ∝ alpha,
when the initial molecular fraction is smaller than the 1/N
quantum fluctuations, and cubic-root, Gamma ∝ alpha^{1/3}, when it is larger.
Experimental data agree better with a linear dependence than with an
exponential Landau-Zener fit, indicating that many-body effects are significant
in the atom-molecule conversion process.
Comment: 5 pages, 4 figures
Confinement effects on the stimulated dissociation of molecular BECs
We show that a molecular BEC in a trap is stabilized against stimulated
dissociation if the trap size is smaller than the resonance healing length.
The condensate shape determines the critical
atom-molecule coupling frequency. We discuss an experiment for triggering
dissociation by a sudden change of coupling or trap parameters. This effect
demonstrates one of the unique collective features of 'superchemistry' in that
the yield of a chemical reaction depends critically on the size and shape of
the reaction vessel.
Comment: 4 pages, 4 figures
Biased tomography schemes: an objective approach
We report on an intrinsic relationship between the maximum-likelihood
quantum-state estimation and the representation of the signal. A quantum
analogy of the transfer function determines the space where the reconstruction
should be done without the need for any ad hoc truncations of the Hilbert
space. An illustration of this method is provided by a simple yet practically
important tomography of an optical signal registered by realistic binary
detectors.
Comment: 4 pages, 3 figures, accepted in PR
Sequential Relational Decomposition
The concept of decomposition in computer science and engineering is
considered a fundamental component of computational thinking and is prevalent
in design of algorithms, software construction, hardware design, and more. We
propose a simple and natural formalization of sequential decomposition, in
which a task is decomposed into two sequential sub-tasks, with the first
sub-task to be executed before the second sub-task is executed. These tasks are
specified by means of input/output relations. We define and study decomposition
problems, which ask whether a given specification can be sequentially
decomposed. Our main result is that decomposition itself is a difficult
computational problem. More specifically, we study decomposition problems in
three settings: where the input task is specified explicitly, by means of
Boolean circuits, and by means of automatic relations. We show that in the
first setting decomposition is NP-complete, in the second setting it is
NEXPTIME-complete, and in the third setting there is evidence to suggest that
it is undecidable. Our results indicate that the intuitive idea of
decomposition as a system-design approach requires further investigation. In
particular, we show that adding a human to the loop by asking for a
decomposition hint lowers the complexity of decomposition problems
considerably.
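In the explicit setting a task is just a finite set of input/output pairs, and verifying that a candidate pair of sub-tasks composes back to the original relation is easy even though deciding whether such a pair exists is NP-complete. A minimal sketch (the "add 2" task and its split into two "add 1" steps are hypothetical examples, not from the paper):

```python
def compose(r1, r2):
    """Sequential composition of input/output relations given as sets of
    pairs: (x, y) is in the composition iff some intermediate z satisfies
    (x, z) in r1 and (z, y) in r2."""
    return {(x, y) for (x, z) in r1 for (w, y) in r2 if z == w}

# Hypothetical task "add 2", and a candidate split into two "add 1" sub-tasks.
task = {(0, 2), (1, 3), (2, 4)}
step1 = {(0, 1), (1, 2), (2, 3)}
step2 = {(1, 2), (2, 3), (3, 4)}

print(compose(step1, step2) == task)  # True
```

The hard direction, searching over all possible intermediate relations, is what drives the NP-completeness result.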
Verification of Hierarchical Artifact Systems
Data-driven workflows, of which IBM's Business Artifacts are a prime
exponent, have been successfully deployed in practice, adopted in industrial
standards, and have spawned a rich body of research in academia, focused
primarily on static analysis. The present work represents a significant advance
on the problem of artifact verification, by considering a much richer and more
realistic model than in previous work, incorporating core elements of IBM's
successful Guard-Stage-Milestone model. In particular, the model features task
hierarchy, concurrency, and richer artifact data. It also allows database key
and foreign key dependencies, as well as arithmetic constraints. The results
show decidability of verification and establish its complexity, making use of
novel techniques including a hierarchy of Vector Addition Systems and a variant
of quantifier elimination tailored to our context.
Comment: Full version of the accepted PODS paper
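Vector Addition Systems, one of the tools mentioned above, can be explored with a bounded-depth search. This sketch is illustrative only (full VAS reachability is decidable but requires far more sophisticated algorithms), and the two-counter example is hypothetical:

```python
from collections import deque

def vas_reachable(start, target, transitions, max_steps=10):
    """Bounded-depth reachability in a Vector Addition System:
    configurations are nonnegative integer vectors, and a transition
    vector may fire only if the resulting vector stays nonnegative."""
    start, target = tuple(start), tuple(target)
    frontier, seen = deque([(start, 0)]), {start}
    while frontier:
        conf, depth = frontier.popleft()
        if conf == target:
            return True
        if depth == max_steps:
            continue
        for t in transitions:
            nxt = tuple(c + d for c, d in zip(conf, t))
            if all(c >= 0 for c in nxt) and nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, depth + 1))
    return False

# A two-counter example: move two tokens from counter 0 to counter 1.
print(vas_reachable([2, 0], [0, 2], [(-1, 1)]))  # True
```

The nonnegativity guard is what distinguishes a VAS from plain integer vector addition: a transition is blocked when it would drive a counter below zero.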
Formation of Two Component Bose Condensate During the Chemical Potential Curve Crossing
In this article we study the formation of a two-mode Bose-Einstein
condensate and the correlation between the modes. We show that, beyond the
mean-field approximation, the dissociation of a molecular condensate due to the
chemical potential curve crossing leads to the formation of a two-mode
condensate. We also show that the two modes are correlated in a two-mode
squeezed state.
Comment: 10 pages
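For reference, the two-mode squeezed vacuum has the standard textbook form (the squeezing parameter r here is generic, not a quantity computed in the paper):

```latex
\lvert\psi\rangle
  = \frac{1}{\cosh r}\sum_{n=0}^{\infty}(\tanh r)^{n}\,
    \lvert n\rangle_{a}\lvert n\rangle_{b},
```

whose modes show perfect photon-number correlation: measuring n quanta in one mode guarantees n quanta in the other.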
Verifying Temporal Heap Properties Specified via Evolution Logic
This paper addresses the problem of establishing temporal properties of programs written in languages, such as Java, that make extensive use of the heap to allocate---and deallocate---new objects and threads. Establishing liveness properties is a particularly hard challenge. One of the crucial obstacles is that heap locations have no static names and the number of heap locations is unbounded. The paper presents a framework for the verification of Java-like programs. Unlike classical model checking, which uses propositional temporal logic, we use first-order temporal logic to specify temporal properties of heap evolutions; this logic allows domain changes to be expressed, which permits allocation and deallocation to be modelled naturally. The paper also presents an abstract-interpretation algorithm that automatically verifies temporal properties expressed using the logic.
Reconstruction of photon statistics using low performance photon counters
The output of a photodetector consists of a current pulse whose charge has
the statistical distribution of the actual photon numbers convolved with a
Bernoulli distribution. Photodetectors are characterized by a nonunit quantum
efficiency, i.e. not all the photons lead to a charge, and by a finite
resolution, i.e. different numbers of detected photons lead to
discriminable values of the charge only up to a maximum value. We present a
detailed comparison, based on Monte Carlo simulated experiments and real data,
among the performances of detectors with different upper limits of counting
capability. In our scheme the inversion of Bernoulli convolution is performed
by maximum-likelihood methods assisted by measurements taken at different
quantum efficiencies. We show that detectors that are only able to discriminate
between zero, one and more than one detected photons are generally enough to
provide a reliable reconstruction of the photon statistics for single-peaked
distributions, while detectors with higher resolution limits do not lead to
further improvements. In addition, we demonstrate that, for semiclassical
states, even on/off detectors are enough to provide a good reconstruction.
Finally, we show that a reliable reconstruction of multi-peaked distributions
requires either higher quantum efficiency or better capability in
discriminating high numbers of detected photons.
Comment: 8 pages, 3 figures
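The forward direction of the Bernoulli convolution described above is easy to state in code. The sketch below implements only this forward model, mapping a true photon-number distribution to the detected-count distribution at a given quantum efficiency (the maximum-likelihood inversion used in the paper is not reproduced here):

```python
from math import comb

def bernoulli_convolve(p, eta):
    """Detected-count distribution at quantum efficiency eta:
    p_m = sum_{n >= m} C(n, m) * eta**m * (1 - eta)**(n - m) * p_n,
    i.e. each of n photons is independently detected with probability eta."""
    n_max = len(p) - 1
    return [sum(comb(n, m) * eta**m * (1 - eta)**(n - m) * p[n]
                for n in range(m, n_max + 1))
            for m in range(n_max + 1)]

# A single photon detected with efficiency 0.6 is lost with probability 0.4.
print(bernoulli_convolve([0.0, 1.0], 0.6))
```

Because detection only loses photons, normalization is preserved and the detected distribution is always shifted toward lower counts, which is why the inversion becomes ill-conditioned at low efficiency.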