Aspects of Two-Level Systems under External Time Dependent Fields
The dynamics of two-level systems in time-dependent backgrounds is under
consideration. We present some new exact solutions in special backgrounds
decaying in time. On the other hand, following ideas of Feynman, Vernon and
Hellwarth, we discuss in detail the possibility to reduce the quantum dynamics
to a classical Hamiltonian system. This, in particular, opens the possibility
to directly apply powerful methods of classical mechanics (e.g. KAM methods) to
study the quantum system. Following such an approach, we draw conclusions of
relevance for ``quantum chaos'' when the external background is periodic or
quasi-periodic in time. Comment: To appear in J. Phys. A: Mathematical and General
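The Feynman–Vernon–Hellwarth reduction mentioned above maps the two-level Schrödinger equation to classical precession of a real three-vector, dr/dt = Ω(t) × r. A minimal numerical sketch of this classical picture (the periodic field Ω(t) below is an arbitrary illustrative choice, not one of the paper's exact solutions):

```python
import numpy as np

def bloch_rhs(r, omega):
    """Feynman-Vernon-Hellwarth form: two-level quantum dynamics
    reduces to classical precession dr/dt = omega x r."""
    return np.cross(omega, r)

def evolve(r0, omega_of_t, dt, n_steps):
    """Integrate the precession equation with a standard RK4 step."""
    r = np.array(r0, dtype=float)
    for k in range(n_steps):
        t = k * dt
        k1 = bloch_rhs(r, omega_of_t(t))
        k2 = bloch_rhs(r + 0.5 * dt * k1, omega_of_t(t + 0.5 * dt))
        k3 = bloch_rhs(r + 0.5 * dt * k2, omega_of_t(t + 0.5 * dt))
        k4 = bloch_rhs(r + dt * k3, omega_of_t(t + dt))
        r += (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return r

# A periodic driving field, as in the "quantum chaos" setting above.
omega = lambda t: np.array([np.cos(2 * np.pi * t), 0.0, 1.0])
r = evolve([0.0, 0.0, 1.0], omega, dt=1e-3, n_steps=5000)
print(np.linalg.norm(r))  # |r| = 1 is conserved by the exact flow
```

Because the flow is a rotation, the conserved norm of r gives an easy numerical sanity check on the integrator.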
Quantum Mechanics: Harbinger of a Non-Commutative Probability Theory?
In this paper we discuss the relevance of the algebraic approach to quantum
phenomena first introduced by von Neumann before he confessed to Birkhoff that
he no longer believed in Hilbert space. This approach is more general and
allows us to see the structure of quantum processes in terms of non-commutative
probability theory: a non-Boolean structure of the implicate order which
contains Boolean sub-structures that accommodate the explicate classical
world. We move away from mechanical `waves' and `particles' and take as basic
what Bohm called a {\em structure process}. This enables us to learn new
lessons that can have a wider application in the way we think of structures in
language and thought itself. Comment: 20 pages, one figure. Invited paper
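The non-commutativity at the heart of this structure can be made concrete with projection operators: in a Boolean (classical) probability theory events commute, while quantum "events" generally do not. A minimal sketch using an illustrative pair of spin projectors:

```python
import numpy as np

# Two "quantum events": projectors onto the +1 eigenstates of the
# Pauli matrices sigma_x and sigma_z (an illustrative choice).
P_x = 0.5 * np.array([[1, 1], [1, 1]], dtype=float)  # x-basis projector
P_z = np.array([[1, 0], [0, 0]], dtype=float)        # z-basis projector

# In Boolean probability the conjunction of events is order-independent;
# here the operator product depends on the order -- the hallmark of a
# non-commutative probability theory.
print(np.allclose(P_x @ P_z, P_z @ P_x))  # False
```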
Towards resource theory of coherence in distributed scenarios
The search for a simple description of fundamental physical processes is an
important part of quantum theory. One example for such an abstraction can be
found in the distance lab paradigm: if two separated parties are connected via
a classical channel, it is notoriously difficult to characterize all possible
operations these parties can perform. This class of operations is widely known
as local operations and classical communication (LOCC). Surprisingly, the
situation becomes comparatively simple if the more general class of separable
operations is considered, a finding which has been extensively used in quantum
information theory for many years. Here, we propose a related approach for the
resource theory of quantum coherence, where two distant parties can only
perform measurements which do not create coherence and can communicate their
outcomes via a classical channel. We call this class local incoherent
operations and classical communication (LICC). While the characterization of
this class is also difficult in general, we show that the larger class of
separable incoherent operations (SI) has a simple mathematical form while
still preserving the main features of LICC. We demonstrate the relevance of our
approach by applying it to three different tasks: assisted coherence
distillation, quantum teleportation, and single-shot quantum state merging. We
expect that the results obtained in this work also transfer to other concepts
of coherence which are discussed in recent literature. The approach presented
here opens new ways to study the resource theory of coherence in distributed
scenarios. Comment: 11 pages, 1 figure, accepted for publication in Physical Review
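A small worked example of the kind of quantity such a resource theory tracks: the widely used l1-norm of coherence, which vanishes exactly on incoherent (diagonal) states. This is an illustrative measure from the coherence literature, not a construction specific to LICC or SI:

```python
import numpy as np

def l1_coherence(rho):
    """l1-norm of coherence: sum of absolute off-diagonal elements of
    the density matrix in the fixed (incoherent) reference basis."""
    return np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho)))

# Maximally coherent qubit state |+> = (|0> + |1>)/sqrt(2)
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_plus = np.outer(plus, plus.conj())

# An incoherent state: diagonal in the reference basis
rho_mixed = np.diag([0.5, 0.5])

print(l1_coherence(rho_plus))   # 1.0
print(l1_coherence(rho_mixed))  # 0.0
```

Operations that cannot create coherence can never increase this quantity, which is what makes it a resource monotone.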
Quantum origin of the primordial fluctuation spectrum and its statistics
The usual account for the origin of cosmic structure during inflation is not
fully satisfactory, as it lacks a physical mechanism capable of generating the
inhomogeneity and anisotropy of our Universe, from an exactly homogeneous and
isotropic initial state associated with the early inflationary regime. The
proposal in [A. Perez, H. Sahlmann, and D. Sudarsky, Classical Quantum Gravity,
23, 2317, (2006)] considers the spontaneous dynamical collapse of the wave
function, as a possible answer to that problem. In this work, we review briefly
the difficulties facing the standard approach, as well as the answers provided
by the above proposal and explore their relevance to the investigations
concerning the characterization of the primordial spectrum and other
statistical aspects of the cosmic microwave background and large-scale matter
distribution. We will see that the new approach leads to novel ways of
considering some of the relevant questions, and, in particular, to distinct
characterizations of the non-Gaussianities that might have left imprints on the
available data. Comment: 27 pages. Revision to match the published version
CoBRA: A cooperative coevolutionary algorithm for bi-level optimization
This article presents CoBRA, a new evolutionary algorithm, based on a coevolutionary scheme, to solve bi-level optimization problems. It runs a population-based algorithm on each level, each one cooperating with the other to provide solutions for the overall problem. Moreover, in order to evaluate the relevance of CoBRA against more classical approaches, a new performance assessment methodology, based on rationality, is introduced. An experimental analysis is conducted on a bi-level distribution planning problem, where multiple manufacturing plants deliver items to depots, and where a distribution company controls several depots and distributes items from depots to retailers. The experimental results reveal significant enhancements, particularly on the lower level, with respect to a more classical approach based on a hierarchical scheme.
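The cooperative coevolutionary idea can be sketched on a toy bi-level problem: one population per level, each evaluated against the best individual of the other. This is a generic illustration of the scheme, not the CoBRA algorithm or the paper's distribution-planning benchmark:

```python
import random

# Toy bi-level problem (illustrative only): the upper level picks x to
# minimise F(x, y) = (x - 2)^2 + y^2, while the lower level reacts
# with the y minimising f(x, y) = (y - x)^2.
def F(x, y): return (x - 2) ** 2 + y ** 2
def f(x, y): return (y - x) ** 2

def coevolve(generations=200, pop=20, seed=0):
    """Minimal cooperative coevolution: one population per level, each
    evaluated against the best individual of the other level."""
    rng = random.Random(seed)
    xs = [rng.uniform(-5, 5) for _ in range(pop)]
    ys = [rng.uniform(-5, 5) for _ in range(pop)]
    best_x, best_y = xs[0], ys[0]
    for _ in range(generations):
        # Lower level: optimise f for the current best upper decision.
        ys = sorted(ys + [best_y + rng.gauss(0, 0.3) for _ in range(pop)],
                    key=lambda y: f(best_x, y))[:pop]
        best_y = ys[0]
        # Upper level: optimise F given the follower's current best reply.
        xs = sorted(xs + [best_x + rng.gauss(0, 0.3) for _ in range(pop)],
                    key=lambda x: F(x, best_y))[:pop]
        best_x = xs[0]
    return best_x, best_y

x, y = coevolve()
print(x, y)
```

Note that this naive alternation converges to a point where each level is rational with respect to the other's current best individual, which need not coincide with the true bi-level optimum; that gap is exactly the kind of behaviour a rationality-based assessment methodology is designed to measure.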
Internal links and pairs as a new tool for the analysis of bipartite complex networks
Many real-world complex networks are best modeled as bipartite (or 2-mode)
graphs, where nodes are divided into two sets with links connecting one side to
the other. However, there is currently a lack of methods to analyze properly
such graphs as most existing measures and methods are suited to classical
graphs. A usual but limited approach consists in deriving 1-mode graphs (called
projections) from the underlying bipartite structure, though it causes
important loss of information and data storage issues. We introduce here
internal links and pairs as new notions useful for such analysis: they give
insight into the information lost by projecting the bipartite graph. We
illustrate the relevance of these concepts on several real-world instances,
showing how they enable us to discriminate behaviors among various cases when
compared to a benchmark of random networks. Then, we show that these concepts
benefit both the modeling of complex networks and their storage in a compact
format.
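The information loss caused by projection, which internal links and pairs are designed to quantify, can be seen directly: distinct bipartite graphs can share the same projection. A self-contained sketch (this illustrates the motivation only; the paper's precise definitions are not reproduced here):

```python
from itertools import combinations

def project(top_to_bottoms):
    """1-mode projection onto the bottom nodes: two bottom nodes are
    linked when they share at least one top neighbour."""
    edges = set()
    for bottoms in top_to_bottoms.values():
        for u, v in combinations(sorted(bottoms), 2):
            edges.add((u, v))
    return edges

# Two different bipartite graphs (top nodes T1..T3 vs a single top node)
g1 = {"T1": {"a", "b"}, "T2": {"b", "c"}, "T3": {"a", "c"}}
g2 = {"T1": {"a", "b", "c"}}

# The projection cannot tell them apart: both yield the triangle a-b-c.
print(project(g1) == project(g2))  # True
```

Any measure computed on the projection alone is therefore blind to the difference between these two underlying structures.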
Cosmological Dark Energy: Prospects for a Dynamical Theory
We present an approach to the problem of vacuum energy in cosmology, based on
dynamical screening of Lambda on the horizon scale. We review first the
physical basis of vacuum energy as a phenomenon connected with macroscopic
boundary conditions, and the origin of the idea of its screening by particle
creation and vacuum polarization effects. We discuss next the relevance of the
quantum trace anomaly to this issue. The trace anomaly implies additional terms
in the low energy effective theory of gravity, which amounts to a non-trivial
modification of the classical Einstein theory, fully consistent with the
Equivalence Principle. We show that the new dynamical degrees of freedom the
anomaly contains provide a natural mechanism for relaxing Lambda to zero on
cosmological scales. We consider possible signatures of the restoration of
conformal invariance predicted by the fluctuations of these new scalar degrees
of freedom on the spectrum and statistics of the CMB, in light of the latest
bounds from WMAP. Finally we assess the prospects for a new cosmological model
in which the dark energy adjusts itself dynamically to the cosmological horizon
boundary, and therefore remains naturally of order H^2 at all times without
fine tuning. Comment: 50 pages, Invited Contribution to New Journal of Physics
Focus Issue on Dark Energy
EigenGP: Gaussian Process Models with Adaptive Eigenfunctions
Gaussian processes (GPs) provide a nonparametric representation of functions.
However, classical GP inference suffers from high computational cost for big
data. In this paper, we propose a new Bayesian approach, EigenGP, that learns
both basis dictionary elements--eigenfunctions of a GP prior--and prior
precisions in a sparse finite model. It is well known that, among all
orthogonal basis functions, eigenfunctions can provide the most compact
representation. Unlike other sparse Bayesian finite models where the basis
function has a fixed form, our eigenfunctions live in a reproducing kernel
Hilbert space as a finite linear combination of kernel functions. We learn the
dictionary elements--eigenfunctions--and the prior precisions over these
elements as well as all the other hyperparameters from data by maximizing the
model marginal likelihood. We explore computational linear algebra to simplify
the gradient computation significantly. Our experimental results demonstrate
improved predictive performance of EigenGP over alternative sparse GP methods
as well as the relevance vector machine. Comment: Accepted by IJCAI 201
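The construction can be sketched in a Nyström-style form: eigen-decomposing the kernel matrix at a set of dictionary points yields finite features that are linear combinations of kernel functions, followed by Bayesian linear regression on those features. This is an illustrative sketch with fixed hyperparameters, not the authors' implementation, which additionally learns the dictionary and prior precisions by marginal-likelihood maximisation:

```python
import numpy as np

def rbf(X, Z, ell=1.0):
    """Squared-exponential kernel between row-wise point sets."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

# Dictionary points define Nystrom eigenfunction features:
# phi(x) = k(x, Z) U diag(1/sqrt(lam)), a finite linear combination
# of kernel functions, as described in the abstract.
Z = np.linspace(-3, 3, 15)[:, None]
lam, U = np.linalg.eigh(rbf(Z, Z))
keep = lam > 1e-10                        # drop numerically null directions
Phi = rbf(X, Z) @ U[:, keep] / np.sqrt(lam[keep])

# Bayesian linear model on the eigenfunction features (noise fixed here).
noise = 0.1 ** 2
A = Phi.T @ Phi / noise + np.eye(Phi.shape[1])
w = np.linalg.solve(A, Phi.T @ y / noise)
pred = Phi @ w
print(np.sqrt(np.mean((pred - y) ** 2)))  # training RMSE
```

The gain over a full GP is that training and prediction scale with the number of dictionary elements rather than with the full data size.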
Quantum Computing for high school: an approach to interdisciplinary in STEM for teaching
The paper focuses on a Quantum Computing teaching module for high school students that was designed and implemented within the I SEE Erasmus+ project (https://iseeproject.eu/).
The module is discussed as an example of how the S-T-E-M disciplines can be integrated to stress the conceptual, epistemological, and social relevance of quantum computing. We implemented a three-level approach to introduce quantum technologies without getting lost in the technicalities. The approach has allowed us to highlight the difference between classical and quantum computers and to bring out the interdisciplinary character of the new technologies.
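The classical/quantum contrast such a module builds on can be shown in a few lines of state-vector arithmetic (an illustrative sketch, not material from the module itself):

```python
import numpy as np

# A classical bit is 0 or 1; a qubit's state is a complex vector.
zero = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ zero            # equal superposition (|0> + |1>)/sqrt(2)
probs = np.abs(state) ** 2  # Born rule: measurement probabilities
print(probs)                # [0.5 0.5] -- outcome decided only at measurement
```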