1,799 research outputs found
A link between the maximum entropy approach and the variational entropy form
The maximum entropy approach, operating with a quite general entropy measure
and constraint, is considered. It is demonstrated that for a conditional or
parametrized probability distribution there is a "universal"
relation between the entropy rate and the functions appearing in the constraint.
It is shown that the recently proposed variational formulation of the entropic
functional can be obtained as a consequence of this relation, that is, from the
maximum entropy principle. This resolves certain puzzling points that appeared
in the variational approach.
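As a toy illustration of the maximum entropy machinery invoked above, here is a minimal numerical sketch. It assumes plain Shannon entropy and a single mean-value constraint (not the abstract's general entropy measure): the maximizing distribution is the exponential family $p_i \propto e^{-\lambda f_i}$, and the multiplier $\lambda$ enforcing the constraint is found by bisection.

```python
import numpy as np

def maxent_distribution(f, F, lam_lo=-50.0, lam_hi=50.0, tol=1e-10):
    """Shannon-entropy-maximizing distribution over discrete states with
    values f[i], subject to the constraint sum_i p_i * f[i] = F.
    The solution is p_i ~ exp(-lam * f_i); lam is found by bisection."""
    f = np.asarray(f, dtype=float)

    def mean_for(lam):
        w = np.exp(-lam * (f - f.min()))  # shift exponent for stability
        p = w / w.sum()
        return p @ f, p

    for _ in range(200):
        lam = 0.5 * (lam_lo + lam_hi)
        m, p = mean_for(lam)
        if abs(m - F) < tol:
            break
        # <f> decreases monotonically in lam, so tighten the bracket
        if m > F:
            lam_lo = lam
        else:
            lam_hi = lam
    return p, lam

# Example: three states with f = (0, 1, 2) and target mean 0.5
p, lam = maxent_distribution([0.0, 1.0, 2.0], 0.5)
```

The geometric structure of the solution (Boltzmann-like weights with a multiplier fixed by the constraint) is the "universal" relation at its simplest.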
Comparing Infrared Dirac-Born-Infeld Brane Inflation to Observations
We compare the Infrared Dirac-Born-Infeld (IR DBI) brane inflation model to
observations using a Bayesian analysis. The current data cannot distinguish it
from the $\Lambda$CDM model, but can give interesting constraints on
various microscopic parameters including the mass of the brane moduli
potential, the fundamental string scale, the charge or warp factor of throats,
and the number of mobile branes. We quantify some distinctive testable
predictions with stringy signatures, such as the large non-Gaussianity, and the
large, but regional, running of the spectral index. These results illustrate
how we may be able to probe aspects of string theory using cosmological
observations.
Comment: 54 pages, 13 figures. v2: non-Gaussianity constraint has been applied
to the model; parameter constraints have tightened significantly, conclusions
unchanged. References added; v3: minor revision, PRD version.
Orthogonality relations for triple modes at dielectric boundary surfaces
We work out the orthogonality relations for the set of Carniglia-Mandel
triple modes which provide a set of normal modes for the source-free
electromagnetic field in a background consisting of a passive dielectric
half-space and the vacuum. Due to the inherent computational
complexity of the problem, an efficient strategy to accomplish this task is
desirable, which is presented in the paper. Furthermore, we provide all main
steps for the various proofs pertaining to different combinations of triple
modes in the orthogonality integral.
Comment: 15 pages.
Rules for transition rates in nonequilibrium steady states
Just as transition rates in a canonical ensemble must respect the principle
of detailed balance, constraints exist on transition rates in driven steady
states. I derive those constraints, by maximum information-entropy inference,
and apply them to the steady states of driven diffusion and a sheared lattice
fluid. The resulting ensemble can potentially explain nonequilibrium phase
behaviour and, for steady shear, gives rise to stress-mediated long-range
interactions.
Comment: 4 pages. To appear in Physical Review Letters.
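The detailed-balance condition referenced above can be checked concretely. A minimal sketch (not the paper's driven-steady-state derivation), assuming standard Metropolis rates on a small ring of states: rates built this way satisfy $p_i W_{j\leftarrow i} = p_j W_{i\leftarrow j}$ with respect to the Boltzmann distribution.

```python
import numpy as np

# Build Metropolis transition rates on a ring of four states and verify
# that they satisfy detailed balance with the Boltzmann distribution.
E = np.array([0.0, 0.7, 0.3, 1.1])      # arbitrary state energies, kT = 1
n = len(E)
W = np.zeros((n, n))                     # W[j, i]: rate for i -> j
for i in range(n):
    for j in ((i - 1) % n, (i + 1) % n): # nearest neighbours on the ring
        W[j, i] = min(1.0, np.exp(-(E[j] - E[i])))  # Metropolis rule

p = np.exp(-E)
p /= p.sum()                             # Boltzmann distribution

# Detailed balance: p_i * W[j, i] == p_j * W[i, j] for every pair
for i in range(n):
    for j in range(n):
        assert abs(p[i] * W[j, i] - p[j] * W[i, j]) < 1e-12
```

The paper's point is that in a driven steady state the analogous constraints are no longer this simple pairwise symmetry, but they can still be derived by maximum information-entropy inference.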
Normal mode splitting and mechanical effects of an optical lattice in a ring cavity
A novel regime of atom-cavity physics is explored, arising when large atom
samples dispersively interact with high-finesse optical cavities. A stable far
detuned optical lattice of several million rubidium atoms is formed inside an
optical ring resonator by coupling equal amounts of laser light to each
propagation direction of a longitudinal cavity mode. An adjacent longitudinal
mode, detuned by about 3 GHz, is used to perform probe transmission spectroscopy
of the system. The atom-cavity coupling for the lattice beams and the probe is
dispersive and dissipation results only from the finite photon-storage time.
The observation of two well-resolved normal modes demonstrates the regime of
strong cooperative coupling. The details of the normal mode spectrum reveal
mechanical effects associated with the retroaction of the probe upon the
optical lattice.
Comment: 4 pages, 3 figures.
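The cooperative normal-mode splitting described above can be illustrated with a toy two-mode model (an illustrative assumption, not the experiment's full dispersive treatment): when a cavity mode couples to a collective atomic excitation with strength $g\sqrt{N}$, the resonant eigenfrequencies split by $2g\sqrt{N}$, so strong cooperative coupling means this splitting exceeds the linewidths.

```python
import numpy as np

# Two coupled modes: cavity field and collective atomic excitation.
# The collective coupling is g*sqrt(N); on resonance the normal modes
# split by 2*g*sqrt(N).  Units are arbitrary.
g, N = 0.1, 4.0e6            # single-atom coupling, atom number
gN = g * np.sqrt(N)          # collective coupling
delta = 0.0                  # cavity-atom detuning (resonant case)
H = np.array([[delta, gN],
              [gN,    0.0]])
w = np.linalg.eigvalsh(H)    # normal-mode frequencies, ascending
splitting = w[1] - w[0]      # equals 2*g*sqrt(N) on resonance
```

The $\sqrt{N}$ scaling is why samples of several million atoms reach the strongly coupled regime even with a modest single-atom coupling.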
Combining cosmological datasets: hyperparameters and Bayesian evidence
A method is presented for performing joint analyses of cosmological datasets,
in which the weight assigned to each dataset is determined directly by its own
statistical properties. The weights are considered in a Bayesian context as a
set of hyperparameters, which are then marginalised over in order to recover
the posterior distribution as a function only of the cosmological parameters of
interest. In the case of a Gaussian likelihood function, this marginalisation
may be performed analytically. Calculation of the Bayesian evidence for the
data, with and without the introduction of hyperparameters, enables a direct
determination of whether the data warrant the introduction of weights into the
analysis; this generalises the standard likelihood ratio approach to model
comparison. The method is illustrated by application to the classic toy problem
of fitting a straight line to a set of data. A cosmological illustration of the
technique is also presented, in which the latest measurements of the cosmic
microwave background power spectrum are used to infer constraints on
cosmological parameters.
Comment: 12 pages, 6 figures, submitted to MNRAS.
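The role of per-dataset weights can be sketched with the paper's toy problem of fitting a straight line. This is a simplification of the Bayesian treatment: the weights $\alpha_k$ are held fixed here, whereas the paper marginalises over them as hyperparameters.

```python
import numpy as np

def fit_line(datasets, alphas):
    """Joint weighted least-squares straight-line fit: minimise
    sum_k alpha_k * sum_i ((y_i - m*x_i - c) / sigma_i)**2."""
    A, b = np.zeros((2, 2)), np.zeros(2)
    for (x, y, sig), al in zip(datasets, alphas):
        w = al / sig**2                         # per-point weights
        X = np.stack([x, np.ones_like(x)], axis=1)
        A += X.T @ (w[:, None] * X)
        b += X.T @ (w * y)
    return np.linalg.solve(A, b)                # (slope, intercept)

x = np.linspace(0.0, 1.0, 20)
good = (x, 2.0 * x + 1.0, np.full_like(x, 0.1))   # consistent dataset
bad  = (x, 2.0 * x + 3.0, np.full_like(x, 0.1))   # systematically offset

m_eq,  c_eq  = fit_line([good, bad], [1.0, 1.0])  # equal weights
m_dwn, c_dwn = fit_line([good, bad], [1.0, 0.01]) # offset set down-weighted
```

With equal weights the fitted intercept splits the difference between the two datasets; down-weighting the discrepant set pulls the fit toward the consistent one. In the full method that down-weighting emerges automatically from the marginalisation, and the Bayesian evidence decides whether it is warranted.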
Cavity-induced temperature control of a two-level system
We consider a two-level atom interacting with a single mode of the
electromagnetic field in a cavity within the Jaynes-Cummings model. Initially,
the atom is thermal while the cavity is in a coherent state. The atom interacts
with the cavity field for a fixed time. After removing the atom from the cavity
and applying a laser pulse the atom will be in a thermal state again. Depending
on the interaction time with the cavity field the final temperature can be
varied over a large range. We discuss how this method can be used to cool the
internal degrees of freedom of atoms and create heat baths suitable for
studying thermodynamics at the nanoscale.
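The temperature bookkeeping behind this scheme can be sketched without solving the Jaynes-Cummings dynamics: any diagonal two-level state with populations $(p_g, p_e)$, $p_e < p_g$, is thermal at $T = \Delta E / (k_B \ln(p_g/p_e))$. Tuning the excited-state population (here a stand-in for the dependence on the cavity interaction time) sweeps the effective temperature over a large range. A minimal sketch with $\Delta E = k_B = 1$:

```python
import numpy as np

def effective_temperature(p_e, dE=1.0, kB=1.0):
    """Effective temperature of a diagonal two-level state with
    excited-state population p_e < 0.5 (ground population 1 - p_e)."""
    p_g = 1.0 - p_e
    return dE / (kB * np.log(p_g / p_e))

# Sweeping p_e toward 0.5 drives the effective temperature upward
for p_e in (0.05, 0.2, 0.4, 0.49):
    print(p_e, effective_temperature(p_e))
```

A nearly pure ground state corresponds to a cold internal state, while populations approaching 1/2 correspond to arbitrarily hot ones; the cavity interaction plus laser pulse is the knob that selects the point on this curve.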
Optimization of the transmission of observable expectation values and observable statistics in Continuous Variable Teleportation
We analyze the statistics of observables in continuous variable quantum
teleportation in the formalism of the characteristic function. We derive
expressions for average values of output state observables in particular
cumulants which are additive in terms of the input state and the resource of
teleportation. Working with Squeezed Bell-like states, which may be optimized
in a free parameter for better teleportation performance, we discuss the
relation between resources optimal for fidelity and for different observable
averages. We obtain the values of the free parameter which optimize the central
moments and cumulants up to fourth order. For the cumulants, the distortion
between the input and output states due to teleportation depends only on the
resource. We obtain optimal parameters for the second- and fourth-order
cumulants which do not depend on the squeezing of the resource. The
second-order central moment, which is equal to the second-order cumulant, and
the photon-number average are
optimized by the same resource. We show that the optimal fidelity resource,
found in reference (Phys. Rev. A {\bf 76}, 022301 (2007)) to depend also on the
characteristics of the input, tends for high squeezing to the resource which
optimizes the second-order moments. A similar behavior is obtained for the
resource which optimizes the photon statistics, treated here using the sum of
the squared differences in the photon probabilities of the input and output
states as the distortion measure. This is interpreted to mean that the
distortion associated with the second-order moments dominates the behavior of
the output state for large squeezing of the resource. The optimal fidelity and
optimal photon-statistics resources are compared, and it is shown that for
mixtures of Fock states they are equivalent.
Comment: 25 pages, 11 figures.
On classical models of spin
The reason for recalling this old paper is the ongoing discussion on the
attempts of circumventing certain assumptions leading to the Bell theorem
(Hess-Philipp, Accardi). If I correctly understand the intentions of these
authors, the idea is to make use of the following logical loophole inherent in
the proof of the Bell theorem: Probabilities of counterfactual events A and A'
do not have to coincide with actually measured probabilities if measurements of
A and A' disturb each other, or for any other fundamental reason cannot be
performed simultaneously. It is generally believed that in the context of
classical probability theory (i.e. realistic hidden variables) probabilities of
counterfactual events can be identified with those of actually measured events.
In the paper I give an explicit counterexample to this belief. The "first
variation" on the Aerts model shows that counterfactual and actual problems
formulated for the same classical system may be unrelated. In the model the
first probability does not violate any classical inequality whereas the second
does. A peculiarity of the Bell inequality is that on the basis of an in
principle unobservable probability one derives probabilities of jointly
measurable random variables, a fact that additionally obscures the logical
meaning of the
construction. The existence of the loophole does not change the fact that I was
not able to construct a local model violating the inequality with all the other
loopholes eliminated.
Comment: published as Found. Phys. Lett. 3 (1992) 24
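The classical bound that such local models must respect can be verified by brute force. A minimal sketch of the standard CHSH bookkeeping (not the Aerts model itself): for deterministic local hidden-variable assignments, the combination $S = E(a,b) + E(a,b') + E(a',b) - E(a',b')$ never exceeds 2 in magnitude, which is why a local model reproducing a violation would be so remarkable.

```python
import itertools

# Enumerate all 16 deterministic local strategies: each side assigns a
# fixed outcome +/-1 to each of its two measurement settings.
best = 0
for A, Ap, B, Bp in itertools.product((-1, 1), repeat=4):
    S = A * B + A * Bp + Ap * B - Ap * Bp
    best = max(best, abs(S))
print(best)  # prints 2
```

Since any local hidden-variable model is a mixture of these deterministic strategies, $|S| \le 2$ follows for all of them; the quantum bound is $2\sqrt{2}$.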
Escort mean values and the characterization of power-law-decaying probability densities
Escort mean values (or $q$-moments) constitute useful theoretical tools for
describing basic features of some probability densities such as those which
asymptotically decay like {\it power laws}. They naturally appear in the study
of many complex dynamical systems, particularly those obeying nonextensive
statistical mechanics, a current generalization of the Boltzmann-Gibbs theory.
They recover the standard mean values (or moments) for $q=1$. Here we discuss the
characterization of a (non-negative) probability density by a suitable set of
all its escort mean values together with the set of all associated normalizing
quantities, provided that all of them converge. This opens the door to a
natural extension of the well known characterization, for the $q=1$ instance,
of a distribution in terms of the standard moments, provided that {\it all} of
them have {\it finite} values. This question would be especially relevant in
connection with probability densities having {\it divergent} values for all
nonvanishing standard moments higher than a given one (e.g., probability
densities asymptotically decaying as power-laws), for which the standard
approach is not applicable. The Cauchy-Lorentz distribution, whose second and
higher even order moments diverge, constitutes a simple illustration of the
interest of this investigation. In this context, we also address some
mathematical subtleties with the aim of clarifying some aspects of an
interesting non-linear generalization of the Fourier Transform, namely the
so-called $q$-Fourier Transform.
Comment: 20 pages (2 Appendices have been added)
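The Cauchy-Lorentz example can be made concrete numerically (a sketch assuming the standard escort density $f_q(x) \propto p(x)^q$): for $p(x) = 1/(\pi(1+x^2))$ the ordinary second moment diverges, but the $q=2$ escort second moment $\langle x^2 \rangle_q = \int x^2 p^q\,dx \,/ \int p^q\,dx$ is finite, and analytically equals 1.

```python
import numpy as np

# Escort moments of the Cauchy-Lorentz density p(x) = 1/(pi*(1+x^2)).
# x^2 * p(x) ~ 1/x at large |x|, so the ordinary second moment diverges;
# x^2 * p(x)^2 ~ 1/x^2, so the q = 2 escort second moment converges.
x = np.linspace(-2000.0, 2000.0, 400_001)
dx = x[1] - x[0]
p = 1.0 / (np.pi * (1.0 + x**2))

q = 2.0
norm_q = np.sum(p**q) * dx                    # normalizing quantity int p^q
escort_x2 = np.sum(x**2 * p**q) * dx / norm_q # q = 2 escort second moment
print(escort_x2)                              # close to 1 (exact value: 1)
```

This is the sense in which the set of escort mean values, together with their normalizing quantities, can characterize densities whose higher standard moments all diverge.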
- …