Application of Bayesian model averaging to measurements of the primordial power spectrum
Cosmological parameter uncertainties are often stated assuming a particular
model, neglecting the model uncertainty, even when Bayesian model selection is
unable to identify a conclusive best model. Bayesian model averaging is a
method for assessing parameter uncertainties in situations where there is also
uncertainty in the underlying model. We apply model averaging to the estimation
of the parameters associated with the primordial power spectra of curvature and
tensor perturbations. We use CosmoNest and MultiNest to compute the model
evidences and posteriors, using cosmic microwave background data from WMAP, ACBAR,
BOOMERanG and CBI, plus large-scale structure data from the SDSS DR7. We find
that the model-averaged 95% credible interval for the spectral index using all
of the data is 0.940 < n_s < 1.000, where n_s is specified at a pivot scale
0.015 Mpc^{-1}. For the tensors, model averaging can tighten the credible upper
limit, depending on prior assumptions.
Comment: 7 pages with 7 figures included
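The model-averaging step described above can be sketched numerically: per-model posteriors for a shared parameter are combined with weights proportional to evidence times model prior. This is a minimal illustration only; the Gaussian posteriors, evidence values, and the two hypothetical models are assumptions, not the paper's results.

```python
import numpy as np

# Minimal sketch of Bayesian model averaging (BMA) over two hypothetical
# models sharing the spectral-index parameter n_s. All numbers are
# illustrative assumptions, not the paper's values.
ns_grid = np.linspace(0.90, 1.05, 1501)
dx = ns_grid[1] - ns_grid[0]

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

# Per-model posteriors p(n_s | d, M_k), approximated as Gaussians on the grid.
post_m1 = gaussian(ns_grid, 1.000, 0.004)
post_m2 = gaussian(ns_grid, 0.963, 0.012)

# Log evidences ln Z_k = ln p(d | M_k) (e.g. from nested sampling) and priors.
log_Z = np.array([-10.0, -9.2])     # illustrative values
prior_M = np.array([0.5, 0.5])

# Posterior model probabilities: p(M_k | d) proportional to Z_k * p(M_k).
w = np.exp(log_Z - log_Z.max()) * prior_M
w /= w.sum()

# Model-averaged posterior: p(n_s | d) = sum_k p(n_s | d, M_k) p(M_k | d).
post_bma = w[0] * post_m1 + w[1] * post_m2
post_bma /= post_bma.sum() * dx     # renormalize on the grid

# Central 95% credible interval of the averaged posterior.
cdf = np.cumsum(post_bma) * dx
lo = ns_grid[np.searchsorted(cdf, 0.025)]
hi = ns_grid[np.searchsorted(cdf, 0.975)]
print(f"model weights: {np.round(w, 3)}, 95% interval: {lo:.3f} < n_s < {hi:.3f}")
```

Note how the averaged interval inherits tails from both models, which is why model averaging can widen or tighten limits depending on the evidence ratio and priors.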
Synthetic LISA: Simulating Time Delay Interferometry in a Model LISA
We report on three numerical experiments on the implementation of Time-Delay
Interferometry (TDI) for LISA, performed with Synthetic LISA, a C++/Python
package that we developed to simulate the LISA science process at the level of
scientific and technical requirements. Specifically, we study the laser-noise
residuals left by first-generation TDI when the LISA armlengths have a
realistic time dependence; we characterize the armlength-measurements
accuracies that are needed to have effective laser-noise cancellation in both
first- and second-generation TDI; and we estimate the quantization and
telemetry bitdepth needed for the phase measurements. Synthetic LISA generates
synthetic time series of the LISA fundamental noises, as filtered through all
the TDI observables; it also provides a streamlined module to compute the TDI
responses to gravitational waves according to a full model of TDI, including
the motion of the LISA array and the temporal and directional dependence of the
armlengths. We discuss the theoretical model that underlies the simulation, its
implementation, and its use in future investigations on system characterization
and data-analysis prototyping for LISA.
Comment: 18 pages, 14 EPS figures, REVTeX 4. Accepted PRD version. See
http://www.vallis.org/syntheticlisa for information on the Synthetic LISA
software package
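The laser-noise cancellation studied above can be illustrated with a toy model that is far simpler than Synthetic LISA: a static unequal-arm Michelson with integer-sample delays, where the first-generation TDI X combination cancels the laser noise exactly while the naive Michelson combination does not. The armlength values and noise model are arbitrary assumptions for illustration.

```python
import numpy as np

# Toy sketch of first-generation TDI laser-noise cancellation for a static,
# unequal-arm Michelson. One laser noise p(t); delays are integer samples.
rng = np.random.default_rng(0)

N = 10_000
p = rng.standard_normal(N)          # laser phase-noise samples

L1, L2 = 167, 251                   # one-way armlengths in samples (arbitrary)

def delay(x, d):
    """Shift x by d samples, zero-padded at the start."""
    out = np.zeros_like(x)
    out[d:] = x[:len(x) - d]
    return out

# Round-trip phase measurements on the two arms, each polluted by laser noise:
#   s_i(t) = p(t - 2 L_i) - p(t)
s1 = delay(p, 2 * L1) - p
s2 = delay(p, 2 * L2) - p

# Naive Michelson combination: laser noise does NOT cancel when L1 != L2.
naive = s1 - s2

# TDI X combination: re-delay each arm by the other arm's round trip, so every
# laser-noise term appears twice with opposite signs and cancels.
X = s1 - s2 - delay(s1, 2 * L2) + delay(s2, 2 * L1)

# Ignore the start-up transient where the zero padding has not cleared.
settle = 2 * (L1 + L2)
print(np.abs(naive[settle:]).max(), np.abs(X[settle:]).max())
```

With time-dependent armlengths this cancellation becomes approximate, which is exactly the residual behavior the paper quantifies for first- versus second-generation TDI.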
Forecasting the Pricing Kernel of IBNR Claims Development in Property-Casualty Insurance
Quantum trajectories for the realistic measurement of a solid-state charge qubit
We present a new model for the continuous measurement of a coupled quantum
dot charge qubit. We model the effects of a realistic measurement, namely
adding noise to, and filtering, the current through the detector. This is
achieved by embedding the detector in an equivalent circuit for measurement.
Our aim is to describe the evolution of the qubit state conditioned on the
macroscopic output of the external circuit. We achieve this by generalizing a
recently developed quantum trajectory theory for realistic photodetectors [P.
Warszawski, H. M. Wiseman and H. Mabuchi, Phys. Rev. A 65, 023802 (2002)] to
treat solid-state detectors. This yields stochastic equations whose (numerical)
solutions are the "realistic quantum trajectories" of the conditioned qubit
state. We derive our general theory in the context of a low transparency
quantum point contact. Areas of application for our theory and its relation to
previous work are discussed.
Comment: 7 pages, 2 figures. Shorter, significantly modified, updated version
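The conditioning described above can be sketched in its most idealized form, before any circuit filtering is added: a diffusive quantum trajectory for a qubit under continuous measurement of sigma_z, in which the conditioned Bloch-vector component z localizes at +1 or -1. This is a textbook-style sketch with an assumed measurement strength, not the paper's realistic-detector model (whose point is precisely to go beyond this idealization).

```python
import numpy as np

# Minimal sketch of a diffusive quantum trajectory for a charge qubit under
# ideal continuous QND measurement (no detector noise or filtering).
rng = np.random.default_rng(1)

k = 1.0            # measurement strength (assumed units)
dt = 1e-3
n_steps = 50_000

z = 0.0            # Bloch z component <sigma_z>; start in an equal superposition
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))    # Wiener increment from the noisy record
    # Ito equation for the conditioned pure state under sigma_z measurement:
    #   dz = sqrt(k) (1 - z^2) dW
    # The measurement gradually localizes the qubit at z = +1 or z = -1.
    z += np.sqrt(k) * (1.0 - z ** 2) * dW
    z = np.clip(z, -1.0, 1.0)            # guard against Euler-Maruyama overshoot

print(z)   # close to +1 or -1 once the measurement has localized the charge
```

Embedding the detector in an external circuit, as the paper does, replaces the white-noise record dW with a filtered, noisy macroscopic current, which modifies this conditioning equation.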
A selected bibliography on parameter optimization methods suitable for hybrid computation
Peer Reviewed
http://deepblue.lib.umich.edu/bitstream/2027.42/68333/2/10.1177_003754976700800610.pd
Precursors of extreme increments
We investigate precursors and predictability of extreme increments in a time
series. The events we focus on are large increments between successive time
steps. We are especially interested in understanding how the
quality of the predictions depends on the strategy to choose precursors, on the
size of the event and on the correlation strength. We study the prediction of
extreme increments analytically in an AR(1) process, and numerically in wind
speed recordings and long-range correlated ARMA data. We evaluate the success
of predictions via receiver operating characteristic (ROC) curves. Furthermore,
we observe an increase of the quality of predictions with increasing event size
and with decreasing correlation in all examples. Both effects can be understood
by using the likelihood ratio as a summary index for smooth ROC curves.
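The prediction scheme sketched above can be reproduced in a few lines for the AR(1) case: alarm on a precursor threshold, sweep the threshold, and trace the ROC curve. The correlation strength, event size, and precursor strategy below are illustrative assumptions in the spirit of the abstract, not the paper's exact settings.

```python
import numpy as np

# Sketch: precursor-based prediction of extreme increments in an AR(1)
# process, evaluated with an ROC curve. Parameter values are illustrative.
rng = np.random.default_rng(2)

a = 0.75                        # AR(1) correlation strength
N = 200_000
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + rng.standard_normal()

inc = x[1:] - x[:-1]            # increments between successive time steps
pre = x[:-1]                    # precursor: the value preceding each increment

d = 3.0                         # event size: increment larger than d
event = inc > d

# Alarm strategy: warn when the current value is low (for a < 1 a large
# upward increment is more likely to start from a small x). Sweeping the
# alarm threshold traces the ROC curve: hit rate vs false-alarm rate.
thresholds = np.quantile(pre, np.linspace(0.0, 1.0, 201))
hits = np.array([np.mean(pre[event] < th) for th in thresholds])
fas = np.array([np.mean(pre[~event] < th) for th in thresholds])

# Area under the ROC curve (trapezoid rule): 0.5 = no skill, 1.0 = perfect.
auc = np.sum(0.5 * (hits[1:] + hits[:-1]) * np.diff(fas))
print(f"event rate {event.mean():.4f}, ROC AUC {auc:.3f}")
```

Rerunning with larger d or smaller a shows the trend reported above: prediction quality grows with event size and with decreasing correlation.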
State and dynamical parameter estimation for open quantum systems
Following the evolution of an open quantum system requires full knowledge of
its dynamics. In this paper we consider open quantum systems for which the
Hamiltonian is "uncertain". In particular, we treat in detail a simple system
similar to that considered by Mabuchi [Quant. Semiclass. Opt. 8, 1103 (1996)]:
a radiatively damped atom driven by an unknown Rabi frequency (as
would occur for an atom at an unknown point in a standing light wave). By
measuring the environment of the system, knowledge about the system state, and
about the uncertain dynamical parameter, can be acquired. We find that these
two sorts of knowledge acquisition (quantified by the posterior distribution
for the Rabi frequency, and the conditional purity of the system, respectively) are quite
distinct processes, which are not strongly correlated. Also, the quality and
quantity of knowledge gain depend strongly on the type of monitoring scheme. We
compare five different detection schemes (direct, adaptive, homodyne of the
x quadrature, homodyne of the y quadrature, and heterodyne) using four
different measures of the knowledge gain (Shannon information about the Rabi
frequency, variance in the Rabi frequency, long-time system purity, and
short-time system purity).
Comment: 14 pages, 18 figures
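The parameter-estimation half of this problem can be illustrated with a deliberately stripped-down sketch: Bayesian inference of an unknown Rabi frequency from a photon-counting record, using only the standard steady-state excited population of a resonantly driven, radiatively damped atom. This ignores the conditional state evolution (the backaction that the paper actually treats), and all parameter values below are assumptions for illustration.

```python
import numpy as np

# Toy Bayesian estimation of an unknown Rabi frequency Omega from photon
# counts. Uses only the steady-state click rate, not conditional dynamics.
rng = np.random.default_rng(3)

gamma = 1.0                          # radiative decay rate (sets the units)
omega_true = 3.0                     # true, "unknown" Rabi frequency
omegas = np.linspace(0.5, 6.0, 56)   # candidate values, flat prior

def p_e(omega):
    # Steady-state excited population for resonant driving:
    #   P_e = (Omega/gamma)^2 / (1 + 2 (Omega/gamma)^2)
    s = (omega / gamma) ** 2
    return s / (1.0 + 2.0 * s)

dt = 1e-3                            # time bin, dt << 1/gamma
n_bins = 20_000_000                  # total observation time T = n_bins * dt
# Each bin yields a click with probability gamma * P_e * dt; the total click
# count is a sufficient statistic here, so we can sample it directly.
n_click = rng.binomial(n_bins, gamma * p_e(omega_true) * dt)

# Bernoulli log-likelihood of the record for each candidate Omega.
rates = gamma * p_e(omegas) * dt
log_post = n_click * np.log(rates) + (n_bins - n_click) * np.log(1.0 - rates)
post = np.exp(log_post - log_post.max())
post /= post.sum()                   # normalized posterior over the grid

omega_map = omegas[np.argmax(post)]
print(f"clicks: {n_click}, MAP Omega: {omega_map:.1f} (true: {omega_true})")
```

Because P_e saturates at 1/2 for large Omega, the posterior broadens markedly at high drive strength; the monitoring-scheme comparisons in the paper address exactly this kind of sensitivity difference between detection strategies.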