Computationally Efficient Nonparametric Importance Sampling
The variance reduction established by importance sampling strongly depends on
the choice of the importance sampling distribution. A good choice is often hard
to achieve especially for high-dimensional integration problems. Nonparametric
estimation of the optimal importance sampling distribution (known as
nonparametric importance sampling) is a reasonable alternative to parametric
approaches. In this article, nonparametric variants of both the self-normalized
and the unnormalized importance sampling estimator are proposed and
investigated. A common criticism of nonparametric importance sampling is its
increased computational burden compared to parametric methods. We address this
problem to a large degree by using the linear blend frequency polygon
estimator instead of a kernel estimator. Mean square error convergence
properties are investigated, leading to recommendations for the efficient
application of nonparametric importance sampling. In particular, we show that
nonparametric importance sampling asymptotically attains optimal importance
sampling variance. The efficiency of nonparametric importance sampling
algorithms heavily relies on the computational efficiency of the employed
nonparametric estimator. The linear blend frequency polygon outperforms kernel
estimators on criteria such as sampling and evaluation efficiency.
Furthermore, it is compatible with the inversion method for sample
generation. This allows our algorithms to be combined with other variance reduction
techniques such as stratified sampling. Empirical evidence for the usefulness
of the suggested algorithms is obtained by means of three benchmark integration
problems. As an application we estimate the distribution of the queue length of
a spam filter queueing system based on real data.
Comment: 29 pages, 7 figures
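The self-normalized estimator and the inversion-method sampling described in this abstract can be illustrated with a small sketch. The following pure-Python example uses a simple histogram proposal as a stand-in for the linear blend frequency polygon; the toy integrand, grid, bin count, and pilot-sample size are all illustrative choices, not the paper's setup.

```python
import math
import random

random.seed(1)

# Goal: estimate I = E_f[h(X)] for f = N(0,1) and h(x) = x^2 (true value 1),
# using a histogram-style proposal sampled by the inversion method.

def f(x):                               # target density: standard normal
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def h(x):                               # integrand of interest
    return x * x

# Pilot stage: crude samples from f shape the nonparametric proposal.
pilot = [random.gauss(0.0, 1.0) for _ in range(2000)]

lo, hi, nbins = -5.0, 5.0, 40
width = (hi - lo) / nbins
counts = [0] * nbins
for x in pilot:
    if lo <= x < hi:
        counts[int((x - lo) / width)] += 1

# Tilt bin masses toward |h| * f, the shape of the variance-optimal
# importance sampling density; the small constants avoid empty bins.
mass = []
for i, c in enumerate(counts):
    mid = lo + (i + 0.5) * width
    mass.append((c / len(pilot) + 1e-4) * (abs(h(mid)) + 1e-4))
total = sum(mass)
mass = [m / total for m in mass]

def g_density(x):                       # piecewise-constant proposal density
    return mass[int((x - lo) / width)] / width

def sample_g():                         # inversion method on the bin CDF
    u, acc = random.random(), 0.0
    for i, m in enumerate(mass):
        if acc + m >= u:
            frac = (u - acc) / m        # uniform position inside the bin
            return lo + (i + frac) * width
        acc += m
    return hi - 1e-9

# Self-normalized importance sampling estimator: sum(w*h) / sum(w).
n = 20000
num = den = 0.0
for _ in range(n):
    x = sample_g()
    w = f(x) / g_density(x)
    num += w * h(x)
    den += w
estimate = num / den
print(estimate)                         # close to the true value 1
```

Because sampling inverts an explicit piecewise CDF, stratifying the uniform draws (as the abstract notes for stratified sampling) would be a one-line change to how `u` is generated.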
Bayesian inference on compact binary inspiral gravitational radiation signals in interferometric data
Presented is a description of a Markov chain Monte Carlo (MCMC) parameter
estimation routine for use with interferometric gravitational radiation data
in searches for binary neutron star inspiral signals. Five parameters
associated with the inspiral can be estimated, and summary statistics are
produced. Advanced MCMC methods were implemented, including importance
resampling and prior distributions based on detection probability, in order to
increase the efficiency of the code. An example is presented from an
application using realistic, albeit fictitious, data.
Comment: submitted to Classical and Quantum Gravity. 14 pages, 5 figures
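The importance resampling mentioned in this abstract can be sketched in a few lines: draw candidates from the prior, weight them by the likelihood, and resample in proportion to the weights. This toy example estimates a single Gaussian mean rather than the five inspiral parameters; the data, prior range, and sample sizes are invented for illustration.

```python
import bisect
import math
import random

random.seed(2)

# Toy data: observations from N(mu_true, 1) with mu_true = 2.
data = [2.1, 1.8, 2.4, 2.0, 1.7, 2.3]

def log_likelihood(mu):
    return sum(-0.5 * (y - mu) ** 2 for y in data)

# Stage 1: candidates from a broad flat prior, mu ~ Uniform(-10, 10).
m = 3000
candidates = [random.uniform(-10.0, 10.0) for _ in range(m)]
logw = [log_likelihood(mu) for mu in candidates]

# Stage 2: normalized weights (subtract the max for numerical stability).
wmax = max(logw)
w = [math.exp(lw - wmax) for lw in logw]
total = sum(w)
w = [x / total for x in w]

# Stage 3: resample with replacement via the cumulative weights;
# the resampled points approximate draws from the posterior.
cum, acc = [], 0.0
for wi in w:
    acc += wi
    cum.append(acc)

posterior = [candidates[min(bisect.bisect_left(cum, random.random()), m - 1)]
             for _ in range(1000)]
post_mean = sum(posterior) / len(posterior)
print(post_mean)                        # near the data mean, about 2.05
```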
Bayesian coherent analysis of in-spiral gravitational wave signals with a detector network
The present operation of the ground-based network of gravitational-wave laser
interferometers in "enhanced" configuration brings the search for gravitational
waves into a regime where detection is highly plausible. The development of
techniques that allow us to discriminate a signal of astrophysical origin from
instrumental artefacts in the interferometer data and to extract the full range
of information is one of the primary goals of the current work. Here we
report the details of a Bayesian approach to the problem of inference for
gravitational wave observations using a network of instruments, for the
computation of the Bayes factor between two hypotheses and the evaluation of
the marginalised posterior density functions of the unknown model parameters.
The numerical algorithm to tackle the notoriously difficult problem of the
evaluation of large multi-dimensional integrals is based on a technique known
as Nested Sampling, which provides an attractive alternative to more
traditional Markov-chain Monte Carlo (MCMC) methods. We discuss the details of
the implementation of this algorithm and its performance against a Gaussian
model of the background noise, considering the specific case of the signal
produced by the in-spiral of binary systems of black holes and/or neutron
stars, although the method is completely general and can be applied to other
classes of sources. We also demonstrate the utility of this approach by
introducing a new coherence test to distinguish between the presence of a
coherent signal of astrophysical origin in the data of multiple instruments and
the presence of incoherent accidental artefacts, and the effects on the
estimation of the source parameters as a function of the number of instruments
in the network.
Comment: 22 pages
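The Nested Sampling algorithm this abstract relies on can be sketched minimally: maintain a set of "live" points drawn from the prior, repeatedly discard the worst one while shrinking the prior volume, and accumulate the evidence integral. The one-dimensional model below, the rejection step used to sample the constrained prior, and all tuning constants are illustrative simplifications; a production sampler for gravitational-wave inference replaces the rejection step with something far more efficient.

```python
import math
import random

random.seed(3)

# Evidence Z = integral of L(theta) * prior(theta) d(theta).
# Toy model: prior Uniform(0, 1), Gaussian likelihood centred at 0.5.
# Because the likelihood is a normalized density inside [0, 1],
# the true log-evidence is approximately 0.
sigma = 0.05

def loglike(theta):
    return (-0.5 * ((theta - 0.5) / sigma) ** 2
            - math.log(sigma * math.sqrt(2.0 * math.pi)))

def logaddexp(a, b):
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

nlive, niter = 200, 1000
live = [random.random() for _ in range(nlive)]
logZ = -1e300                            # log(0)
logX = 0.0                               # log prior volume remaining

for i in range(niter):
    # Lowest-likelihood live point sets the current threshold.
    logls = [loglike(t) for t in live]
    worst = min(range(nlive), key=lambda k: logls[k])
    logl_star = logls[worst]
    # Expected volume shrinkage is a factor e^{-1/nlive} per iteration.
    new_logX = -(i + 1) / nlive
    logw = logl_star + math.log(math.exp(logX) - math.exp(new_logX))
    logZ = logaddexp(logZ, logw)
    logX = new_logX
    # Replace the worst point with a prior draw above the threshold
    # (naive rejection sampling -- the simplest possible choice).
    while True:
        cand = random.random()
        if loglike(cand) > logl_star:
            live[worst] = cand
            break

# Add the contribution of the remaining live points.
for t in live:
    logZ = logaddexp(logZ, loglike(t) + logX - math.log(nlive))

print(logZ)                              # near 0 for this toy model
```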
Computational Methods for Parameter Estimation in Climate Models
Intensive computational methods have been used by Earth scientists in a wide range of problems in data inversion and uncertainty quantification, such as earthquake epicenter location and climate projections. To quantify the uncertainties resulting from a range of plausible model configurations, it is necessary to estimate a multidimensional probability distribution. The computational cost of estimating these distributions for geoscience applications is impractical with traditional methods such as Metropolis/Gibbs algorithms, because simulation costs limit the number of experiments that can reasonably be obtained. Several alternative sampling strategies have been proposed that could improve sampling efficiency, including Multiple Very Fast Simulated Annealing (MVFSA) and Adaptive Metropolis algorithms. The performance of these proposed sampling strategies is evaluated with a surrogate climate model that approximates the noise and response behavior of a realistic atmospheric general circulation model (AGCM). The surrogate model is fast enough that its evaluation can be embedded in these Monte Carlo algorithms. We show that adaptive methods can be superior to MVFSA in approximating the known posterior distribution with fewer forward evaluations. However, the adaptive methods can also be limited by inadequate sample mixing. The Single Component and Delayed Rejection Adaptive Metropolis algorithms were found to resolve these limitations, although challenges remain in approximating multi-modal distributions. The results show that these advanced methods of statistical inference can provide practical solutions to the climate model calibration problem and to the challenges of quantifying climate projection uncertainties. The computational methods would also be useful for problems outside climate prediction, particularly those where sampling is limited by the availability of computational resources.
Funding: National Science Foundation OCE-0415251; CONACyT-Mexico 159764; Institute for Geophysics
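The Adaptive Metropolis idea evaluated in this abstract can be sketched in one dimension: the proposal scale is tuned on the fly from the chain's own running variance, so the sampler does not need hand-tuned step sizes. The toy target, burn-in length, and tuning constants below are illustrative, not the study's AGCM setup.

```python
import math
import random

random.seed(4)

# Toy target: a Gaussian posterior N(3, 2^2), known only up to a constant.
def logpost(x):
    return -0.5 * ((x - 3.0) / 2.0) ** 2

n = 20000
x = 0.0
chain = []
mean, m2 = 0.0, 0.0                      # running mean and sum of squares
scale = 1.0                              # initial proposal std deviation

for i in range(1, n + 1):
    # Standard Metropolis accept/reject with a Gaussian random walk.
    prop = x + random.gauss(0.0, scale)
    if math.log(random.random()) < logpost(prop) - logpost(x):
        x = prop
    chain.append(x)
    # Update running moments (Welford) and adapt after a short burn-in.
    delta = x - mean
    mean += delta / i
    m2 += delta * (x - mean)
    if i > 500:
        # 2.38^2 times the chain variance is the classic Adaptive
        # Metropolis scaling in one dimension; the floor keeps the
        # proposal from collapsing early in the run.
        scale = math.sqrt(max(2.38 ** 2 * m2 / (i - 1), 1e-6))

post = chain[n // 2:]                    # discard the first half as burn-in
est_mean = sum(post) / len(post)
print(est_mean)                          # near the target mean of 3
```

In the multidimensional setting the scalar variance is replaced by the chain's empirical covariance matrix, which is what lets the method cope with the correlated parameters typical of climate model calibration.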