Data Assimilation: A Mathematical Introduction
These notes provide a systematic mathematical treatment of the subject of data assimilation.
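The notes themselves are the reference for the full treatment; as a hedged, minimal illustration of the kind of algorithm the subject is built on, the Python sketch below (all names illustrative, not taken from the notes) implements the standard Kalman analysis step, which blends a Gaussian forecast with a noisy observation.

    # Minimal sketch of the Kalman analysis step, the canonical data
    # assimilation update (illustrative; not code from the notes).
    import numpy as np

    def kalman_update(m, C, y, H, R):
        """Blend forecast mean m and covariance C with observation y.

        H is the observation operator, R the observation noise covariance.
        """
        S = H @ C @ H.T + R                    # innovation covariance
        K = C @ H.T @ np.linalg.inv(S)         # Kalman gain
        m_post = m + K @ (y - H @ m)           # corrected mean
        C_post = (np.eye(len(m)) - K @ H) @ C  # corrected covariance
        return m_post, C_post

    # Example: observe a 1D state directly, with noise variance 0.5.
    m, C = np.array([0.0]), np.array([[2.0]])
    y, H, R = np.array([1.0]), np.array([[1.0]]), np.array([[0.5]])
    print(kalman_update(m, C, y, H, R))  # mean 0.8, covariance 0.4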
Maximum Fidelity
The most fundamental problem in statistics is the inference of an unknown
probability distribution from a finite number of samples. For a specific
observed data set, answers to the following questions would be desirable: (1)
Estimation: Which candidate distribution provides the best fit to the observed
data? (2) Goodness-of-fit: How concordant is this distribution with the
observed data? (3) Uncertainty: How concordant are other candidate
distributions with the observed data? A simple unified approach for univariate
data, called "maximum fidelity", is presented that addresses these
traditionally distinct statistical notions. Maximum fidelity is a strict frequentist
approach that is fundamentally based on model concordance with the observed
data. The fidelity statistic is a general information measure based on the
coordinate-independent cumulative distribution and critical yet previously
neglected symmetry considerations. An approximation for the null distribution
of the fidelity allows its direct conversion to absolute model concordance (p
value). Fidelity maximization allows identification of the most concordant
model distribution, generating a method for parameter estimation, with
neighboring, less concordant distributions providing the "uncertainty" in this
estimate. Maximum fidelity provides an optimal approach for parameter
estimation (superior to maximum likelihood) and a generally optimal approach
for goodness-of-fit assessment of arbitrary models applied to univariate data.
Extensions to binary data, binned data, multidimensional data, and classical
parametric and nonparametric statistical tests are described. Maximum fidelity
provides a philosophically consistent, robust, and seemingly optimal foundation
for statistical inference. All findings are presented in an elementary way to
be immediately accessible to all researchers utilizing statistical analysis.
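The fidelity statistic itself is defined only in the paper; as a hedged stand-in from the same family of approaches based on the coordinate-independent cumulative distribution, the Python sketch below estimates parameters by maximum spacing estimation: it transforms the sorted data through a candidate CDF and maximizes the mean log spacing of the result. Parameter values whose statistic is nearly as good play the role the abstract assigns to "uncertainty". All function and variable names here are illustrative.

    # Hedged stand-in for fidelity maximization: maximum spacing estimation
    # on the CDF-transformed order statistics (the paper's actual fidelity
    # statistic is not reproduced here).
    import numpy as np
    from scipy import optimize, stats

    def neg_mean_log_spacing(params, x):
        """Smaller value = candidate CDF more concordant with the data."""
        mu, log_sigma = params
        u = stats.norm.cdf(np.sort(x), loc=mu, scale=np.exp(log_sigma))
        spacings = np.diff(np.concatenate(([0.0], u, [1.0])))
        return -np.mean(np.log(np.clip(spacings, 1e-300, None)))

    rng = np.random.default_rng(0)
    x = rng.normal(loc=2.0, scale=1.5, size=200)   # synthetic sample
    fit = optimize.minimize(neg_mean_log_spacing, x0=[0.0, 0.0], args=(x,),
                            method="Nelder-Mead")
    print("mu =", round(fit.x[0], 3), "sigma =", round(np.exp(fit.x[1]), 3))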
RAINIER: A Simulation Tool for Distributions of Excited Nuclear States and Cascade Fluctuations
A new code named RAINIER has been developed that simulates the γ-ray
decay of discrete and quasi-continuum nuclear levels for a user-specified range
of energy, angular momentum, and parity including a realistic treatment of
level spacing and transition width fluctuations. A similar program, DICEBOX,
uses the Monte Carlo method to simulate level and width fluctuations but is
restricted to γ-ray decay from no more than two initial states such as
de-excitation following thermal neutron capture. On the other hand, modern
reaction codes such as TALYS and EMPIRE populate a wide range of states in the
residual nucleus prior to γ-ray decay, but do not go beyond the use of
deterministic functions and therefore neglect cascade fluctuations. RAINIER
combines both capabilities, Monte Carlo fluctuation sampling and broad
initial-state population, and can therefore be used to determine
quasi-continuum properties through comparison with experimental data. Several
examples are given that demonstrate how cascade fluctuations influence
experimental high-resolution γ-ray spectra from reactions that populate
a wide range of initial states.
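RAINIER's actual algorithms are specified in the paper; as a hedged illustration of what a "realistic treatment of level spacing and transition width fluctuations" conventionally involves, the Python sketch below samples nearest-neighbor level spacings from the Wigner surmise and partial widths from a Porter-Thomas (chi-squared with one degree of freedom) distribution.

    # Standard fluctuation models alluded to in the abstract (illustrative;
    # not RAINIER's source code).
    import numpy as np

    rng = np.random.default_rng(42)

    def wigner_spacings(n, mean_spacing=1.0):
        """Level spacings with p(s) = (pi*s/2) * exp(-pi*s**2/4)."""
        # Inverse-CDF sampling: F(s) = 1 - exp(-pi*s**2/4)
        u = rng.random(n)
        return mean_spacing * np.sqrt(-4.0 * np.log(1.0 - u) / np.pi)

    def porter_thomas_widths(n, mean_width=1.0):
        """Partial widths: scaled chi-squared with one degree of freedom."""
        return mean_width * rng.chisquare(df=1, size=n)

    levels = np.cumsum(wigner_spacings(1000))  # fluctuating level scheme
    widths = porter_thomas_widths(1000)        # fluctuating decay widths
    print(f"mean spacing ~ {np.mean(np.diff(levels)):.3f}, "
          f"width variance/mean^2 ~ {np.var(widths)/np.mean(widths)**2:.2f}")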
How unitary cosmology generalizes thermodynamics and solves the inflationary entropy problem
We analyze cosmology assuming unitary quantum mechanics, using a tripartite
partition into system, observer and environment degrees of freedom. This
generalizes the second law of thermodynamics to "The system's entropy can't
decrease unless it interacts with the observer, and it can't increase unless it
interacts with the environment." The former follows from the quantum Bayes
Theorem we derive. We show that because of the long-range entanglement created
by cosmological inflation, the cosmic entropy decreases exponentially rather
than linearly with the number of bits of information observed, so that a given
observer can reduce entropy by much more than the amount of information her
brain can store. Indeed, we argue that as long as inflation has occurred in a
non-negligible fraction of the volume, almost all sentient observers will find
themselves in a post-inflationary low-entropy Hubble volume, and we humans have
no reason to be surprised that we do so as well, which solves the so-called
inflationary entropy problem. An arguably worse problem for unitary cosmology
involves gamma-ray-burst constraints on the "Big Snap", a fourth cosmic
doomsday scenario alongside the "Big Crunch", "Big Chill" and "Big Rip", where
an increasingly granular nature of expanding space modifies our life-supporting
laws of physics.
Our tripartite framework also clarifies when it is valid to make the popular
quantum gravity approximation that the Einstein tensor equals the quantum
expectation value of the stress-energy tensor, and how problems with recent
attempts to explain dark energy as gravitational backreaction from
super-horizon scale fluctuations can be understood as a failure of this
approximation.
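The paper's rigorous argument is in the text; as a toy numerical illustration of the claim that decoherence increases von Neumann entropy, the Python sketch below entangles a system qubit with an unobserved environment qubit and traces the environment out.

    # Toy check (not the paper's proof): entangling a system with an
    # unobserved environment raises the system's von Neumann entropy.
    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr(rho log2 rho), in bits."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]
        return float(-np.sum(evals * np.log2(evals)))

    # System qubit in a pure superposition: zero entropy.
    psi_sys = np.array([1.0, 1.0]) / np.sqrt(2)
    rho_pure = np.outer(psi_sys, psi_sys.conj())
    print("before decoherence:", von_neumann_entropy(rho_pure))  # 0.0

    # Entangle with a two-level environment (a CNOT-like interaction),
    # then trace the environment out.
    psi_joint = np.zeros(4)
    psi_joint[0] = psi_sys[0]   # |0>_sys |0>_env
    psi_joint[3] = psi_sys[1]   # |1>_sys |1>_env
    rho_joint = np.outer(psi_joint, psi_joint.conj())
    rho_sys = rho_joint.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
    print("after decoherence:", von_neumann_entropy(rho_sys))    # 1.0 bit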