An Introduction to Superconducting Qubits and Circuit Quantum Electrodynamics
A subset of the concepts of circuit quantum electrodynamics is reviewed as a
reference for the Axion Dark Matter Experiment (ADMX) community, as part of the
proceedings of the 2nd Workshop on Microwave Cavities and Detectors for Axion
Research. The classical Lagrangians and Hamiltonians for an LC circuit are
discussed, along with black-box circuit quantization methods for a weakly
anharmonic qubit coupled to a resonator or cavity.
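For orientation, a standard form of the LC-circuit Lagrangian and Hamiltonian the review refers to, written in terms of the node flux Φ and its conjugate charge Q (notation assumed here, not taken from the paper):

```latex
\mathcal{L} = \tfrac{1}{2} C \dot{\Phi}^2 - \frac{\Phi^2}{2L}, \qquad
Q = \frac{\partial \mathcal{L}}{\partial \dot{\Phi}} = C \dot{\Phi}, \qquad
H = Q\dot{\Phi} - \mathcal{L} = \frac{Q^2}{2C} + \frac{\Phi^2}{2L}.
```

Canonical quantization promotes [Φ̂, Q̂] = iħ, giving a harmonic oscillator of frequency ω = 1/√(LC); a weakly anharmonic qubit corresponds to replacing the linear inductor with a Josephson junction.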
Thermodynamic equilibrium and its stability for Microcanonical systems described by the Sharma-Taneja-Mittal entropy
It is generally assumed that the thermodynamic stability of an equilibrium state
is reflected by the concavity of the entropy. We inquire, in the microcanonical
picture, into the validity of this statement for systems described by the
bi-parametric entropy of Sharma-Taneja-Mittal. We analyze
the "composability" rule for two statistically independent systems, A and B,
described by this entropy with the same set of deformed
parameters. It is shown that, in spite of the concavity of the entropy, the
"composability" rule modifies the thermodynamic stability conditions of the
equilibrium state. Depending on the values assumed by the deformed parameters,
when the relation S(A ∪ B) ≥ S(A) + S(B) holds (super-additive systems), the concavity
conditions do imply thermodynamic stability. Otherwise, when the
relation S(A ∪ B) ≤ S(A) + S(B) holds (sub-additive systems), the concavity
conditions do not imply the thermodynamic stability of the equilibrium
state.
Comment: 13 pages, two columns, 1 figure, RevTeX4, version accepted on PR
Consistency of the Shannon entropy in quantum experiments
The consistency of the Shannon entropy, when applied to the outcomes of quantum
experiments, is analysed. It is shown that the Shannon entropy is fully
consistent and its properties are never violated in quantum settings, but
attention must be paid to logical and experimental contexts. This last remark
is shown to apply regardless of the quantum or classical nature of the
experiments.
Comment: 12 pages, LaTeX2e/REVTeX4. V5: slightly different from the published version
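For reference, the quantity at issue, written here in its standard form (assumed, not quoted from the paper): the Shannon entropy of the outcome distribution of an experiment performed in a fixed context,

```latex
H(p) = -\sum_k p_k \log p_k ,
```

where p_k is the probability of outcome k; the abstract's caveat about logical and experimental contexts concerns which distribution p this entropy is evaluated on.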
Quasiclassical Coarse Graining and Thermodynamic Entropy
Our everyday descriptions of the universe are highly coarse-grained,
following only a tiny fraction of the variables necessary for a perfectly
fine-grained description. Coarse graining in classical physics is made natural
by our limited powers of observation and computation. But in the modern quantum
mechanics of closed systems, some measure of coarse graining is inescapable
because there are no non-trivial, probabilistic, fine-grained descriptions.
This essay explores the consequences of that fact. Quantum theory allows for
various coarse-grained descriptions some of which are mutually incompatible.
For most purposes, however, we are interested in the small subset of
"quasiclassical descriptions" defined by ranges of values of averages over
small volumes of densities of conserved quantities such as energy and momentum
and approximately conserved quantities such as baryon number. The
near-conservation of these quasiclassical quantities results in approximate
decoherence, predictability, and local equilibrium, leading to closed sets of
equations of motion. In any description, information is sacrificed through the
coarse graining that yields decoherence and gives rise to probabilities for
histories. In quasiclassical descriptions, further information is sacrificed in
exhibiting the emergent regularities summarized by classical equations of
motion. An appropriate entropy measures the loss of information. For a
"quasiclassical realm" this is connected with the usual thermodynamic entropy
as obtained from statistical mechanics. It was low for the initial state of our
universe and has been increasing since.
Comment: 17 pages, 0 figures, RevTeX4, dedicated to Rafael Sorkin on his 60th birthday, minor corrections
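A compact way to state the entropy notion invoked at the end (a standard maximum-entropy construction, assumed here rather than quoted from the essay): given the coarse-grained data, i.e. the values ⟨Ô_i⟩ of the followed quasiclassical variables, the associated entropy is

```latex
S = \max_{\tilde{\rho}} \left[ -\mathrm{Tr}\!\left( \tilde{\rho} \log \tilde{\rho} \right) \right]
\quad \text{subject to} \quad
\mathrm{Tr}\!\left( \tilde{\rho}\, \hat{O}_i \right) = \langle \hat{O}_i \rangle .
```

Following fewer variables imposes fewer constraints and so can only raise S, which is the sense in which this entropy measures the information sacrificed by coarse graining.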
Coherent Bayesian inference on compact binary inspirals using a network of interferometric gravitational wave detectors
Presented in this paper is a Markov chain Monte Carlo (MCMC) routine for
conducting coherent parameter estimation for interferometric gravitational wave
observations of an inspiral of binary compact objects using data from multiple
detectors. The MCMC technique uses data from several interferometers and infers
all nine of the parameters (ignoring spin) associated with the binary system,
including the distance to the source, the masses, and the location on the sky.
The Metropolis algorithm utilises advanced MCMC techniques, such as importance
resampling and parallel tempering. The data are compared with time-domain
inspiral templates that are 2.5 post-Newtonian (PN) in phase and 2.0 PN in
amplitude. Our routine could be implemented as part of an inspiral detection
pipeline for a worldwide network of detectors. Examples are given for
simulated signals and data as seen by the LIGO and Virgo detectors operating at
their design sensitivity.
Comment: 10 pages, 4 figures
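A schematic illustration of the elementary Metropolis step on which such a routine is built (a toy Python sketch over a made-up two-parameter Gaussian posterior; the authors' nine-parameter, multi-detector likelihood, importance resampling, and parallel tempering are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_posterior(theta):
    # Toy stand-in for the coherent network log-likelihood: an uncorrelated Gaussian.
    return -0.5 * np.sum((theta - np.array([1.4, 100.0])) ** 2 / np.array([0.1, 25.0]))

def metropolis(log_post, theta0, step, n_samples):
    """Plain Metropolis sampler with a symmetric Gaussian proposal."""
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_samples, theta.size))
    for i in range(n_samples):
        proposal = theta + step * rng.standard_normal(theta.size)
        lp_new = log_post(proposal)
        if np.log(rng.uniform()) < lp_new - lp:   # accept with probability min(1, ratio)
            theta, lp = proposal, lp_new
        chain[i] = theta
    return chain

chain = metropolis(log_posterior, theta0=[1.0, 90.0],
                   step=np.array([0.2, 3.0]), n_samples=20000)
print("posterior means:", chain[5000:].mean(axis=0))   # discard burn-in
```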
The statistical mechanics of networks
We study the family of network models derived by requiring the expected
properties of a graph ensemble to match a given set of measurements of a
real-world network, while maximizing the entropy of the ensemble. Models of
this type play the same role in the study of networks as is played by the
Boltzmann distribution in classical statistical mechanics; they offer the best
prediction of network properties subject to the constraints imposed by a given
set of observations. We give exact solutions of models within this class that
incorporate arbitrary degree distributions and arbitrary but independent edge
probabilities. We also discuss some more complex examples with correlated edges
that can be solved approximately or exactly by adapting various familiar
methods, including mean-field theory, perturbation theory, and saddle-point
expansions.
Comment: 15 pages, 4 figures
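The maximum-entropy ensembles described here take the familiar exponential (Gibbs) form; writing it out schematically, with x_i(G) the measured graph observables and θ_i the conjugate parameters:

```latex
P(G) = \frac{e^{-H(G)}}{Z}, \qquad
H(G) = \sum_i \theta_i\, x_i(G), \qquad
Z = \sum_G e^{-H(G)},
```

so that the θ_i are fixed by requiring the ensemble averages ⟨x_i⟩ to match the observed values, in direct analogy with the Boltzmann distribution.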
Accumulation of entanglement in a continuous variable memory
We study the accumulation of entanglement in a memory device built out of two
continuous variable (CV) systems. We address the case of a qubit mediating an
indirect joint interaction between the CV systems. We show that, in striking
contrast to registers built out of two-dimensional Hilbert spaces,
entanglement exceeding a single ebit can be efficiently accumulated in the
memory, even though no entangled resource is used. We study the protocol in an
immediately implementable setup, assessing the effects of the main
imperfections.
Comment: 4 pages, 3 figures, RevTeX
Renormalization Group and Quantum Information
The renormalization group is a tool that allows one to obtain a reduced
description of systems with many degrees of freedom while preserving the
relevant features. In the case of quantum systems, in particular,
one-dimensional systems defined on a chain, an optimal formulation is given by
White's "density matrix renormalization group". This formulation can be shown
to rely on concepts of the developing theory of quantum information.
Furthermore, White's algorithm can be connected with a peculiar type of
quantization, namely, angular quantization. This type of quantization arose in
connection with quantum gravity problems, in particular, the Unruh effect in
the problem of black-hole entropy and Hawking radiation. This connection
highlights the importance of quantum system boundaries, regarding the
concentration of quantum states on them, and helps us to understand the optimal
nature of White's algorithm.
Comment: 16 pages, 5 figures, accepted in Journal of Physics
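A minimal numerical sketch (Python, with names assumed here) of the density-matrix truncation at the heart of White's method: diagonalize the block's reduced density matrix and retain only the m most probable block states. This is only the core truncation step, not the full sweeping algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def truncate_block(psi, dim_block, dim_env, m):
    """One density-matrix truncation: keep the m most probable block states."""
    M = psi.reshape(dim_block, dim_env)      # pure state as a block x environment matrix
    rho = M @ M.conj().T                     # reduced density matrix of the block
    w, U = np.linalg.eigh(rho)               # eigenvalues = block-state probabilities
    order = np.argsort(w)[::-1]              # sort by decreasing weight
    kept = U[:, order[:m]]                   # retained block states (columns)
    discarded_weight = 1.0 - w[order[:m]].sum()
    return kept, discarded_weight

# Toy usage: a random normalized state on a 16 x 16 block/environment split.
psi = rng.standard_normal(16 * 16)
psi /= np.linalg.norm(psi)
kept, err = truncate_block(psi, 16, 16, m=4)
print("discarded weight:", err)
```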
From Reversible Quantum Microdynamics to Irreversible Quantum Transport
The transition from reversible microdynamics to irreversible transport can be
studied very efficiently with the help of the so-called projection method. We
give a concise introduction to that method, illustrate its power by using it to
analyze the well-known rate and quantum Boltzmann equations, and present, as a
new application, the derivation of a source term accounting for the spontaneous
creation of electron-positron pairs in strong fields. Thereby we emphasize the
fundamental importance of time scales: only if the various time scales
exhibited by the dynamics are widely disparate, can the evolution of the slower
degrees of freedom be described by a conventional Markovian transport equation;
otherwise, one must account for finite memory effects. We show how the
projection method can be employed to determine these time scales, and how, if
necessary, it allows one to include memory effects in a straightforward
manner. Finally, there is an appendix in which we discuss the concepts of
entropy and macroscopic irreversibility.
Comment: Review article, 78 pages, uuencoded compressed PostScript file
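Schematically, the projection method yields a generalized master equation of the Nakajima-Zwanzig type for the relevant (projected) part of the density operator; in a standard form (assuming an initial state with Pρ(0) = ρ(0), so the inhomogeneous term vanishes):

```latex
\partial_t \mathcal{P}\rho(t)
  = \mathcal{P}\mathcal{L}\,\mathcal{P}\rho(t)
  + \int_0^t \! ds\; \mathcal{P}\mathcal{L}\, e^{\mathcal{Q}\mathcal{L}(t-s)}\, \mathcal{Q}\mathcal{L}\,\mathcal{P}\rho(s),
\qquad \mathcal{Q} = 1 - \mathcal{P},
```

with L the Liouvillian. Only when the memory kernel decays on a time scale much shorter than that of the relevant degrees of freedom can the integral be collapsed into a time-local (Markovian) generator; otherwise the finite-memory form must be kept, as emphasized above.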
An information theoretic approach to statistical dependence: copula information
We discuss the connection between information and copula theories by showing
that a copula can be employed to decompose the information content of a
multivariate distribution into marginal and dependence components, with the
latter quantified by the mutual information. We define the information excess
as a measure of deviation from a maximum entropy distribution. The idea of
marginal invariant dependence measures is also discussed and used to show that
empirical linear correlation underestimates the amplitude of the actual
correlation in the case of non-Gaussian marginals. The mutual information is
shown to provide an upper bound for the asymptotic empirical log-likelihood of
a copula. An analytical expression for the information excess of T-copulas is
provided, allowing for simple model identification within this family. We
illustrate the framework in a financial data set.
Comment: to appear in Europhysics Letters
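The decomposition referred to above can be written compactly (standard copula notation, assumed here): with c(u_1, ..., u_n) the copula density of the joint distribution, the joint differential entropy splits into marginal and dependence parts,

```latex
h(X_1,\dots,X_n) = \sum_i h(X_i) - I(X_1,\dots,X_n),
\qquad
I(X_1,\dots,X_n) = \int_{[0,1]^n} c(u)\,\log c(u)\; du ,
```

so the mutual information is the negative entropy of the copula and is invariant under monotone transformations of the marginals; for instance, a bivariate Gaussian copula with parameter ρ gives I = -(1/2) log(1 - ρ²).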