Rejoinder: Harold Jeffreys's Theory of Probability Revisited
We are grateful to all discussants of our revisitation for their strong
support of our enterprise and for their overall agreement with our perspective.
Further discussions with them and other leading statisticians showed that the
legacy of Theory of Probability is alive and lasting. [arXiv:0804.3173]
Comment: Published at http://dx.doi.org/10.1214/09-STS284REJ in Statistical
Science (http://www.imstat.org/sts/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
Comments on "Particle Markov chain Monte Carlo" by C. Andrieu, A. Doucet, and R. Hollenstein
This is the compilation of our comments submitted to the Journal of the Royal
Statistical Society, Series B, to be published within the discussion of the
Read Paper of Andrieu, Doucet and Hollenstein.
Comment: 7 pages, 4 figures
In praise of the referee
There has been a lively debate in many fields, including statistics and
related applied fields such as psychology and biomedical research, on possible
reforms of the scholarly publishing system. Currently, referees contribute so
much to improve scientific papers, both directly through constructive criticism
and indirectly through the threat of rejection. We discuss ways in which new
approaches to journal publication could continue to make use of the valuable
efforts of peer reviewers.
Comment: 13 pages
Bayesian optimization using sequential Monte Carlo
We consider the problem of optimizing a real-valued continuous function f
using a Bayesian approach, where the evaluations of f are chosen sequentially
by combining prior information about f, which is described by a random
process model, and past evaluation results. The main difficulty with this
approach is to be able to compute the posterior distributions of quantities of
interest which are used to choose evaluation points. In this article, we decide
to use a Sequential Monte Carlo (SMC) approach.
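The abstract above describes the setting only in outline; as a rough, hedged sketch of sequential Bayesian optimization (a random-process model of f plus a sequential choice of evaluation points), here is a plain Gaussian-process loop in Python. It uses a lower-confidence-bound rule on a candidate grid rather than the paper's SMC machinery, and the objective f is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    # Invented toy objective to minimise; its minimum is near x = -0.47.
    return np.sin(3 * x) + 0.5 * x**2

def rbf(a, b, ell=0.5):
    # Squared-exponential covariance with unit amplitude.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

def gp_posterior(x_obs, y_obs, x_new, jitter=1e-6):
    """Posterior mean and std of a zero-mean GP at the points x_new."""
    K = rbf(x_obs, x_obs) + jitter * np.eye(len(x_obs))
    Ks = rbf(x_obs, x_new)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - np.sum(v**2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

grid = np.linspace(-2.0, 2.0, 401)   # candidate evaluation points
x_obs = rng.uniform(-2.0, 2.0, 3)    # a few initial evaluations
y_obs = f(x_obs)
for _ in range(15):
    mu, sd = gp_posterior(x_obs, y_obs, grid)
    lcb = mu - 2.0 * sd              # lower confidence bound
    x_next = grid[np.argmin(lcb)]    # most promising candidate
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, f(x_next))

print(x_obs[np.argmin(y_obs)], y_obs.min())
```

The grid-based acquisition step is exactly where computing posterior quantities becomes the bottleneck in harder problems, which is what motivates approaches such as the SMC one above.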
Properties of Nested Sampling
Nested sampling is a simulation method for approximating marginal likelihoods
proposed by Skilling (2006). We establish that nested sampling has an
approximation error that vanishes at the standard Monte Carlo rate and that
this error is asymptotically Gaussian. We show that the asymptotic variance of
the nested sampling approximation typically grows linearly with the dimension
of the parameter. We discuss the applicability and efficiency of nested
sampling in realistic problems, and we compare it with two current methods for
computing marginal likelihood. We propose an extension that avoids resorting to
Markov chain Monte Carlo to obtain the simulated points.
Comment: Revision submitted to Biometrika
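As a hedged illustration of the method the abstract analyses (not the paper's own code), here is a minimal Python sketch of Skilling's nested sampling on a toy problem: uniform prior on [-5, 5], standard Gaussian likelihood, so the true log-evidence is very close to log(0.1). The constrained draw uses naive rejection sampling, which is only feasible in toy settings; practical implementations use constrained MCMC, which is precisely the step the paper's proposed extension avoids.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: uniform prior on [-5, 5], Gaussian N(0, 1) likelihood.
def log_likelihood(theta):
    return -0.5 * theta**2 - 0.5 * np.log(2 * np.pi)

def sample_prior(size):
    return rng.uniform(-5.0, 5.0, size)

def nested_sampling(n_live=500, n_iter=2000):
    live = sample_prior(n_live)
    live_logl = log_likelihood(live)
    log_z = -np.inf
    # Prior mass of the first shell: X_0 - X_1, with X_i ~ exp(-i / n_live).
    log_width = np.log(1.0 - np.exp(-1.0 / n_live))
    for _ in range(n_iter):
        worst = np.argmin(live_logl)
        log_z = np.logaddexp(log_z, log_width + live_logl[worst])
        # Replace the worst live point by a prior draw with higher
        # likelihood (naive rejection; real codes use constrained MCMC).
        while True:
            theta = sample_prior(1)[0]
            if log_likelihood(theta) > live_logl[worst]:
                break
        live[worst], live_logl[worst] = theta, log_likelihood(theta)
        log_width -= 1.0 / n_live  # remaining prior mass shrinks geometrically
    # Add the contribution of the remaining live points.
    rest = -n_iter / n_live - np.log(n_live) + live_logl
    log_z = np.logaddexp(log_z, np.logaddexp.reduce(rest))
    return log_z

print(nested_sampling())  # should be close to log(0.1) ~ -2.30
```

The Monte Carlo error of the estimate shrinks at the standard rate in the number of live points, consistent with the asymptotic results the abstract states.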
Sequential quasi-Monte Carlo: Introduction for Non-Experts, Dimension Reduction, Application to Partly Observed Diffusion Processes
SMC (Sequential Monte Carlo) is a class of Monte Carlo algorithms for
filtering and related sequential problems. Gerber and Chopin (2015) introduced
SQMC (Sequential quasi-Monte Carlo), a QMC version of SMC. This paper has two
objectives: (a) to introduce Sequential Monte Carlo to the QMC community, whose
members are usually less familiar with state-space models and particle
filtering; (b) to extend SQMC to the filtering of continuous-time state-space
models, where the latent process is a diffusion. A recurring point in the paper
will be the notion of dimension reduction, that is, how to implement SQMC in
such a way that it provides good performance despite the high dimension of the
problem.
Comment: To be published in the proceedings of MCQMC 2016
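For readers meeting SMC for the first time, the basic bootstrap particle filter (plain SMC, without the quasi-Monte Carlo layer of SQMC) can be sketched in a few lines. The model below is a toy linear-Gaussian state-space model, chosen because the exact filtering means are then available from the Kalman filter; everything here is an illustration, not the SQMC algorithm of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-Gaussian state-space model:
#   x_t = 0.9 x_{t-1} + N(0, 1),   y_t = x_t + N(0, 1)
rho, sx, sy, T = 0.9, 1.0, 1.0, 50

x = np.zeros(T)
y = np.zeros(T)
for t in range(T):
    x[t] = (rho * x[t - 1] if t else 0.0) + sx * rng.standard_normal()
    y[t] = x[t] + sy * rng.standard_normal()

def bootstrap_filter(y, n_particles=2000):
    """Bootstrap particle filter: propagate particles through the state
    equation, reweight by the observation likelihood, resample.
    Returns the filtering means E[x_t | y_{1:t}]."""
    particles = sx * rng.standard_normal(n_particles)  # draws from p(x_0)
    means = np.zeros(len(y))
    for t in range(len(y)):
        if t:
            particles = rho * particles + sx * rng.standard_normal(n_particles)
        logw = -0.5 * ((y[t] - particles) / sy) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means[t] = np.sum(w * particles)
        # Multinomial resampling (simplest choice; many variants exist)
        particles = rng.choice(particles, size=n_particles, p=w)
    return means

print(bootstrap_filter(y)[:5])
```

SQMC replaces the i.i.d. uniforms driving the propagation and resampling steps with low-discrepancy point sets, which is where the dimension-reduction question discussed in the paper arises.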
On Particle Learning
This document is the aggregation of six discussions of Lopes et al. (2010)
that we submitted to the proceedings of the Ninth Valencia Meeting, held in
Benidorm, Spain, on June 3-8, 2010, in conjunction with Hedibert Lopes' talk at
this meeting, and of a further discussion of the rejoinder by Lopes et al.
(2010). The main point in those discussions is the potential for degeneracy in
the particle learning methodology, related with the exponential forgetting of
the past simulations. We illustrate in particular the resulting difficulties in
the case of mixtures.
Comment: 14 pages, 9 figures, discussions on the invited paper of Lopes,
Carvalho, Johannes, and Polson, for the Ninth Valencia International Meeting
on Bayesian Statistics, held in Benidorm, Spain, on June 3-8, 2010. To appear
in Bayesian Statistics 9, Oxford University Press (except for the final
discussion)
Harold Jeffreys's Theory of Probability Revisited
Published exactly seventy years ago, Jeffreys's Theory of Probability (1939)
has had a unique impact on the Bayesian community and is now considered to be
one of the main classics in Bayesian Statistics as well as the initiator of the
objective Bayes school. In particular, its advances on the derivation of
noninformative priors as well as on the scaling of Bayes factors have had a
lasting impact on the field. However, the book reflects the characteristics of
the time, especially in terms of mathematical rigor. In this paper we point out
the fundamental aspects of this reference work, especially the thorough
coverage of testing problems and the construction of both estimation and
testing noninformative priors based on functional divergences. Our major aim
here is to help modern readers navigate this difficult text and concentrate
on passages that are still relevant today.
Comment: This paper is commented on in [arXiv:1001.2967], [arXiv:1001.2968],
[arXiv:1001.2970], [arXiv:1001.2975], [arXiv:1001.2985], [arXiv:1001.3073].
Rejoinder in [arXiv:0909.1008]. Published at
http://dx.doi.org/10.1214/09-STS284 in Statistical Science
(http://www.imstat.org/sts/) by the Institute of Mathematical Statistics
(http://www.imstat.org)
Discussions on "Riemann manifold Langevin and Hamiltonian Monte Carlo methods"
This is a collection of discussions of "Riemann manifold Langevin and
Hamiltonian Monte Carlo methods" by Girolami and Calderhead, to appear in the
Journal of the Royal Statistical Society, Series B.
Comment: 6 pages, one figure
Kernel Sequential Monte Carlo
We propose kernel sequential Monte Carlo (KSMC), a framework for sampling from static target densities. KSMC is a family of
sequential Monte Carlo algorithms that are based on building emulator
models of the current particle system in a reproducing kernel Hilbert
space. We here focus on modelling nonlinear covariance structure and
gradients of the target. The emulator’s geometry is adaptively updated
and subsequently used to inform local proposals. Unlike in adaptive
Markov chain Monte Carlo, continuous adaptation does not compromise
convergence of the sampler. KSMC combines the strengths of sequential
Monte Carlo and kernel methods: superior performance for multimodal
targets and the ability to estimate model evidence as compared to Markov
chain Monte Carlo, and the emulator’s ability to represent targets that
exhibit high degrees of nonlinearity. As KSMC does not require access to
target gradients, it is particularly applicable on targets whose gradients
are unknown or prohibitively expensive. We describe necessary tuning
details and demonstrate the benefits of the proposed methodology on
a series of challenging synthetic and real-world examples.
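As a hedged illustration of the general idea of adapting SMC moves to the current particle system, here is a minimal tempered SMC sampler in Python. It uses a plain empirical covariance of the particles to scale random-walk moves, standing in for the paper's kernel emulator, and a Gaussian toy target whose log-evidence is known; all names and settings are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(3)

d = 2

def log_target(x):
    # Unnormalised target exp(-||x||^2 / 2); true log-evidence is log(2*pi).
    return -0.5 * np.sum(x**2, axis=-1)

def log_prior(x):
    # Normalised N(0, 9 I) starting distribution.
    return -0.5 * np.sum(x**2, axis=-1) / 9.0 - 0.5 * d * np.log(2 * np.pi * 9.0)

def smc_sampler(n=2000, n_temps=21, n_mh=5):
    temps = np.linspace(0.0, 1.0, n_temps)
    x = 3.0 * rng.standard_normal((n, d))   # draws from the starting distribution
    log_z = 0.0                             # running log-evidence estimate
    for t_prev, t in zip(temps[:-1], temps[1:]):
        # Incremental importance weights for the tempered bridge.
        logw = (t - t_prev) * (log_target(x) - log_prior(x))
        log_z += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
        w = np.exp(logw - logw.max())
        w /= w.sum()
        x = x[rng.choice(n, size=n, p=w)]   # multinomial resampling
        # Adapt the move to the particle system (cf. the emulator idea):
        # random-walk Metropolis scaled by the particles' empirical covariance.
        cov = np.cov(x.T) + 1e-9 * np.eye(d)
        L = np.linalg.cholesky((2.38**2 / d) * cov)
        def log_gamma(z):
            return (1 - t) * log_prior(z) + t * log_target(z)
        for _ in range(n_mh):
            prop = x + rng.standard_normal((n, d)) @ L.T
            accept = np.log(rng.uniform(size=n)) < log_gamma(prop) - log_gamma(x)
            x[accept] = prop[accept]
    return x, log_z

samples, log_z = smc_sampler()
print(log_z)  # should be close to log(2*pi) ~ 1.84
```

As in KSMC, the adaptation here is refreshed at every temperature step without endangering convergence, because each move kernel is only required to leave the current tempered distribution invariant.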