LISA Source Confusion
The Laser Interferometer Space Antenna (LISA) will detect thousands of
gravitational wave sources. Many of these sources will be overlapping in the
sense that their signals will have a non-zero cross-correlation. Such overlaps
lead to source confusion, which adversely affects how well we can extract
information about the individual sources. Here we study how source confusion
impacts parameter estimation for galactic compact binaries, with emphasis on
the effects of the number of overlapping sources, the time of observation, the
gravitational wave frequencies of the sources, and the degree of the signal
correlations. Our main findings are that the parameter resolution decays
exponentially with the number of overlapping sources, and super-exponentially
with the degree of cross-correlation. We also find that an extended mission
lifetime is key to disentangling the source confusion as the parameter
resolution for overlapping sources improves much faster than the usual square
root of the observation time.
Comment: 8 pages, 14 figures
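As a rough illustration of the overlap at the heart of source confusion (a minimal sketch, not the paper's analysis; the frequencies, uniform time sampling, and flat noise weighting are placeholder assumptions), the snippet below computes the normalized cross-correlation of two nearby monochromatic signals as a function of observation time:

```python
import numpy as np

def overlap(f1, f2, T, dt=50.0):
    """Normalized cross-correlation of two monochromatic signals over [0, T]."""
    t = np.arange(0.0, T, dt)
    h1 = np.cos(2 * np.pi * f1 * t)
    h2 = np.cos(2 * np.pi * f2 * t)
    inner = lambda a, b: np.sum(a * b) * dt
    return inner(h1, h2) / np.sqrt(inner(h1, h1) * inner(h2, h2))

f1 = 3.0e-3                    # a ~mHz galactic binary (placeholder value)
f2 = f1 + 1.0e-8               # a nearby, overlapping source
for years in (1, 2, 4):
    T = years * 3.15576e7      # observation time in seconds
    print(f"{years} yr: overlap = {overlap(f1, f2, T):+.3f}")
```

The overlap falls off roughly as a sinc of the frequency difference times the observation time, which is one way to see why an extended mission helps disentangle neighbouring binaries.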
The Minimum Description Length Principle and Model Selection in Spectropolarimetry
It is shown that the two-part Minimum Description Length Principle can be
used to discriminate among different models that can explain a given observed
dataset. The description length is chosen to be the sum of the lengths of the
message needed to encode the model plus the message needed to encode the data
when the model is applied to the dataset. It is verified that the proposed
principle can efficiently distinguish the model that correctly fits the
observations while avoiding over-fitting. The capabilities of this criterion
are shown in two simple problems for the analysis of observed
spectropolarimetric signals. The first is the de-noising of observations with
the aid of the PCA technique. The second is the selection of the optimal number
of parameters in LTE inversions. We propose this criterion as a quantitative
approach for distinguishing the most plausible model among a set of proposed
models. This quantity is straightforward to implement as an additional output of
existing inversion codes.
Comment: Accepted for publication in the Astrophysical Journal
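A minimal sketch of the two-part principle, assuming a generic polynomial-fitting problem rather than the spectropolarimetric applications treated in the paper: the data term is the Gaussian code length of the residuals and the model term uses the common (k/2) log2 n bits-per-parameter approximation.

```python
import numpy as np

def mdl_polyfit(x, y, max_degree=8):
    """Select a polynomial degree by a simple two-part MDL score (in bits)."""
    n = len(x)
    scores = {}
    for k in range(1, max_degree + 1):
        coeffs = np.polyfit(x, y, k - 1)             # k free parameters
        resid = y - np.polyval(coeffs, x)
        sigma2 = max(np.mean(resid ** 2), 1e-12)
        data_bits = 0.5 * n * np.log2(2 * np.pi * np.e * sigma2)  # L(data | model)
        model_bits = 0.5 * k * np.log2(n)                         # L(model)
        scores[k - 1] = data_bits + model_bits
    return min(scores, key=scores.get), scores

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)
y = 1.0 - 2.0 * x + 0.5 * x ** 3 + rng.normal(0, 0.1, x.size)    # cubic + noise
best, _ = mdl_polyfit(x, y)
print("selected polynomial degree:", best)
```

Adding parameters beyond the true degree keeps shrinking the residuals, but not by enough bits to pay for the extra model cost, which is how the criterion avoids over-fitting.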
Time's Barbed Arrow: Irreversibility, Crypticity, and Stored Information
We show why the amount of information communicated between the past and
future--the excess entropy--is not in general the amount of information stored
in the present--the statistical complexity. This is a puzzle, and a
long-standing one, since the latter is what is required for optimal prediction,
but the former describes observed behavior. We lay out a classification scheme
for dynamical systems and stochastic processes that determines when these two
quantities are the same or different. We do this by developing closed-form
expressions for the excess entropy in terms of optimal causal predictors and
retrodictors--the epsilon-machines of computational mechanics. A process's
causal irreversibility and crypticity are key determining properties.
Comment: 4 pages, 2 figures
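In standard computational-mechanics notation (reconstructed here from the abstract; the paper's conventions take precedence), the quantities being compared are

E = I[\overleftarrow{X}; \overrightarrow{X}], \qquad C_\mu = H[\mathcal{S}^+], \qquad \chi \equiv C_\mu - E \ge 0,

with the closed form E = I[\mathcal{S}^+; \mathcal{S}^-] expressing the excess entropy as the mutual information between the forward and reverse causal states. The crypticity \chi is the information a process stores in the present that is never communicated between past and future, which is why stored information can exceed the information visible in observed behavior.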
Entropy exchange and entanglement in the Jaynes-Cummings model
The Jaynes-Cummings model is the simplest fully quantum model that describes
the interaction between light and matter. We extend a previous analysis by
Phoenix and Knight (S. J. D. Phoenix, P. L. Knight, Annals of Physics 186,
381) of the JCM by considering mixed states of both the light and matter. We
present examples of qualitatively different entropic correlations. In
particular, we explore the regime of entropy exchange between light and matter,
i.e. where the rates of change of the two entropies are anti-correlated. This
behavior contrasts with the case of pure light-matter states, in which the two
rates of change are positively correlated and in fact identical. We give
an analytical derivation of the anti-correlation phenomenon and discuss the
regime of its validity. Finally, we show a strong correlation between the
region of the Bloch sphere characterized by entropy exchange and that
characterized by minimal entanglement as measured by the negative eigenvalues
of the partially transposed density matrix.
Comment: 8 pages, 5 figures
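The entanglement diagnostic named in the last sentence can be illustrated with a short standalone sketch that sums the magnitudes of the negative eigenvalues of the partially transposed density matrix; the Werner-like test state is a placeholder, not a state produced by the Jaynes-Cummings dynamics.

```python
import numpy as np

def partial_transpose(rho, dims=(2, 2), sys=1):
    """Partial transpose of a bipartite density matrix on subsystem `sys`."""
    dA, dB = dims
    r = rho.reshape(dA, dB, dA, dB)                  # indices (a, b, a', b')
    axes = (2, 1, 0, 3) if sys == 0 else (0, 3, 2, 1)
    return r.transpose(axes).reshape(dA * dB, dA * dB)

def negativity(rho, dims=(2, 2)):
    """Sum of magnitudes of the negative partial-transpose eigenvalues."""
    evals = np.linalg.eigvalsh(partial_transpose(rho, dims))
    return float(np.abs(evals[evals < 0]).sum())

# Bell state mixed with white noise (placeholder test state): entangled for p > 1/3.
bell = np.zeros(4)
bell[0] = bell[3] = 1.0 / np.sqrt(2.0)
for p in (0.2, 0.4, 0.8):
    rho = p * np.outer(bell, bell) + (1.0 - p) * np.eye(4) / 4.0
    print(f"p = {p}: negativity = {negativity(rho):.3f}")
```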
Measuring the effective complexity of cosmological models
We introduce a statistical measure of the effective model complexity, called
the Bayesian complexity. We demonstrate that the Bayesian complexity can be
used to assess how many effective parameters a set of data can support and that
it is a useful complement to the model likelihood (the evidence) in model
selection questions. We apply this approach to recent measurements of cosmic
microwave background anisotropies combined with the Hubble Space Telescope
measurement of the Hubble parameter. Using mildly non-informative priors, we
show how the 3-year WMAP data improves on the first-year data by being able to
measure both the spectral index and the reionization epoch at the same time. We
also find that a non-zero curvature is strongly disfavored. We conclude that
although current data could constrain at least seven effective parameters, only
six of them are required in a scheme based on the Lambda-CDM concordance
cosmology.
Comment: 9 pages, 4 figures, revised version accepted for publication in PRD, updated with WMAP3 results
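Schematically (the exact estimator and the choice of point estimate should be taken from the paper), the Bayesian complexity is the posterior-averaged effective chi-square minus its value at a point estimate,

C_b = \overline{\chi^2_{\rm eff}(\theta)} - \chi^2_{\rm eff}(\tilde{\theta}), \qquad \chi^2_{\rm eff}(\theta) \equiv -2 \ln \mathcal{L}(\theta),

where the bar denotes a posterior average and \tilde{\theta} is a point estimate such as the posterior mean. Directions of parameter space that the data cannot constrain do not move the fit away from the point estimate and so do not contribute, which is why C_b counts the effective parameters the data can support.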
Fluctuation Theorem with Information Exchange: Role of Correlations in Stochastic Thermodynamics
We establish the fluctuation theorem in the presence of information exchange
between a nonequilibrium system and other degrees of freedom such as an
observer and a feedback controller, where the amount of information exchange is
added to the entropy production. The resulting generalized second law sets the
fundamental limit of energy dissipation and energy cost during the information
exchange. Our results apply not only to feedback-controlled processes but also
to a much broader class of information exchanges, and provide a unified
framework for the nonequilibrium thermodynamics of measurement and feedback control.
Comment: To appear in PR
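A schematic form of such a relation (sign conventions for the information term vary, so the paper's definitions take precedence) is

\langle e^{-\sigma + \Delta I} \rangle = 1 \quad\Longrightarrow\quad \langle \sigma \rangle \ge \langle \Delta I \rangle,

with \sigma the total entropy production and \Delta I the change in mutual information between the system and the observer or controller: a measurement (\Delta I > 0) bounds the dissipation from below, while feedback that consumes correlation (\Delta I < 0) permits apparently negative entropy production.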
Abstract composition rule for relativistic kinetic energy in the thermodynamical limit
We demonstrate by simple mathematical considerations that a power-law tailed
distribution in the kinetic energy of relativistic particles can be a limiting
distribution seen in relativistic heavy ion experiments. We prove that the
infinite repetition of an arbitrary composition rule on an infinitesimal amount
leads to a rule with a formal logarithm. As a consequence the stationary
distribution of energy in the thermodynamical limit follows the composed
function of the Boltzmann-Gibbs exponential with this formal logarithm. In
particular, interactions described as solely functions of the relative
four-momentum squared lead to kinetic energy distributions of the
Tsallis-Pareto (cut power-law) form in the high energy limit.
Comment: Submitted to Europhysics Letters. LaTeX, 3 eps figures
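Schematically (with a as a placeholder deformation parameter; the paper derives its value from the interaction): if repeated composition is governed by a rule h(x, y) with formal logarithm L, so that L(h(x, y)) = L(x) + L(y), the stationary energy distribution is the composed function

f(E) \propto e^{-L(E)/T},

and a formal logarithm of the form L(E) = \frac{1}{a} \ln(1 + a E) turns the Boltzmann-Gibbs exponential into the Tsallis-Pareto cut power law f(E) \propto (1 + a E)^{-1/(aT)}.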
Information Flow through a Chaotic Channel: Prediction and Postdiction at Finite Resolution
We reconsider the persistence of information under the dynamics of the
logistic map in order to discuss communication through a nonlinear channel
where the sender can set the initial state of the system with finite
resolution, and the recipient measures it with the same accuracy. We separate
out the contributions of global phase space shrinkage and local phase space
contraction and expansion to the uncertainty in predicting and postdicting the
state of the system. Thus, we determine how the amplification parameter, the
time lag, and the resolution influence the possibility for communication. A
novel representation for real numbers is introduced that allows for a
visualization of the flow of information between scales.
Comment: 14 pages, 13 figures
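A toy numerical sketch of the finite-resolution setting (not the paper's decomposition into global shrinkage and local expansion; the map parameter, resolution, and sample size are placeholder choices): iterate the logistic map, coarse-grain sender and receiver to the same number of bins, and estimate how much information about the initial state survives after t steps.

```python
import numpy as np

def mutual_information_bits(x, y, bins):
    """Plug-in estimate of I(X;Y) in bits from paired samples on [0, 1]."""
    joint, _, _ = np.histogram2d(x, y, bins=bins, range=[[0, 1], [0, 1]])
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)              # marginal of x
    py = pxy.sum(axis=0, keepdims=True)              # marginal of y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

r = 4.0                          # fully chaotic logistic map (placeholder choice)
bins = 32                        # finite resolution: 5 bits per measurement
rng = np.random.default_rng(1)
x0 = rng.uniform(0.0, 1.0, 200_000)   # sender sets the initial state
x = x0.copy()
for t in range(1, 9):
    x = r * x * (1.0 - x)        # one step of the logistic map
    mi = mutual_information_bits(x0, x, bins)
    print(f"t = {t}: I(x0; x_t) ~ {mi:.2f} bits")
```

The estimated information drops by roughly one bit per iteration, consistent with the fully developed logistic map's Lyapunov exponent of ln 2.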