Riemann-Langevin Particle Filtering in Track-Before-Detect
Track-before-detect (TBD) is a powerful approach in which sensor measurements
are provided to the tracker directly, without pre-detection. Due to the
measurement model non-linearities, online state estimation in TBD is most
commonly solved via particle filtering. Existing particle filters for TBD do
not incorporate measurement information in their proposal distribution. The
Langevin Monte Carlo (LMC) is a sampling method whose proposal is able to
exploit all available knowledge of the posterior (that is, both prior and
measurement information). This letter synthesizes recent advances in LMC-based
filtering to describe the Riemann-Langevin particle filter and introduces its
novel application to TBD. The benefits of our approach are illustrated in a
challenging low-noise scenario.
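The gradient-informed proposal at the heart of LMC-based filtering can be sketched with a Metropolis-adjusted Langevin (MALA) step on a toy one-dimensional target. This is an illustrative stand-in, not the letter's Riemann-Langevin filter itself: the standard-normal target and the step size are assumptions made here for the sketch.

```python
import math
import random

def log_pi(x):
    # Log-density of a standard normal target (illustrative stand-in for the
    # TBD posterior that the Langevin proposal is designed to exploit).
    return -0.5 * x * x

def grad_log_pi(x):
    # Score (gradient of the log-density); this is the "measurement
    # information" that plain bootstrap-style proposals ignore.
    return -x

def mala_step(x, eps, rng=random):
    """One Metropolis-adjusted Langevin step: drift along the score, add a
    Gaussian perturbation, then apply a Metropolis correction."""
    x_prop = x + 0.5 * eps**2 * grad_log_pi(x) + eps * rng.gauss(0.0, 1.0)

    def log_q(to, frm):
        # Log of the (asymmetric) Langevin proposal density, up to a constant
        mean = frm + 0.5 * eps**2 * grad_log_pi(frm)
        return -((to - mean) ** 2) / (2 * eps**2)

    log_alpha = (log_pi(x_prop) + log_q(x, x_prop)) - (log_pi(x) + log_q(x_prop, x))
    return x_prop if math.log(rng.random()) < log_alpha else x

random.seed(0)
x = 3.0  # deliberately poor initialisation
samples = []
for _ in range(20000):
    x = mala_step(x, eps=0.9)
    samples.append(x)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

Even from a poor starting point, the score-driven drift pulls the chain toward the high-density region, which is the same mechanism that lets an LMC proposal use both prior and measurement information.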
Bayesian subset simulation
We consider the problem of estimating a probability of failure α,
defined as the volume of the excursion set of a function f above a given threshold u, under a given
probability measure on the input space. In this article, we combine the popular
subset simulation algorithm (Au and Beck, Probab. Eng. Mech. 2001) and our
sequential Bayesian approach for the estimation of a probability of failure
(Bect, Ginsbourger, Li, Picheny and Vazquez, Stat. Comput. 2012). This makes it
possible to estimate α when the number of evaluations of f is very
limited and α is very small. The resulting algorithm is called Bayesian
subset simulation (BSS). A key idea, as in the subset simulation algorithm, is
to estimate the probabilities of a sequence of excursion sets of f above
intermediate thresholds, using a sequential Monte Carlo (SMC) approach. A
Gaussian process prior on f is used to define the sequence of densities
targeted by the SMC algorithm, and drive the selection of evaluation points of
f to estimate the intermediate probabilities. Adaptive procedures are
proposed to determine the intermediate thresholds and the number of evaluations
to be carried out at each stage of the algorithm. Numerical experiments
illustrate that BSS achieves significant savings in the number of function
evaluations with respect to other Monte Carlo approaches.
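For orientation, the underlying subset simulation mechanism (without the Gaussian-process layer that BSS adds on top) can be sketched on a toy problem where the limit-state function is cheap and the exact failure probability is known in closed form. All names and tuning constants below are illustrative assumptions, not the article's implementation.

```python
import math
import random

def f(x):
    # Toy limit-state function: failure = {f(x) > u}. A cheap stand-in for
    # the expensive simulator whose evaluations BSS seeks to economise.
    return sum(xi * xi for xi in x)

def subset_simulation(dim=2, u=12.0, n=2000, p0=0.1, rng=random):
    """Plain subset simulation (Au & Beck 2001): reach the rare event
    {f > u} through adaptive intermediate thresholds, each conditional
    probability being roughly p0."""
    # Initial Monte Carlo population from the standard-normal input measure
    pop = [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(n)]
    prob = 1.0
    for _ in range(50):  # safety cap on the number of levels
        vals = sorted((f(x) for x in pop), reverse=True)
        u_k = vals[int(p0 * n)]  # adaptive threshold: top-p0 quantile
        if u_k >= u:
            # Final level: count samples already in the failure domain
            prob *= sum(f(x) > u for x in pop) / n
            return prob
        prob *= p0
        seeds = [x for x in pop if f(x) > u_k]
        # Repopulate by conditional random-walk Metropolis restricted to {f > u_k}
        pop = []
        while len(pop) < n:
            x = list(rng.choice(seeds))
            for _ in range(5):
                cand = [xi + 0.8 * rng.gauss(0.0, 1.0) for xi in x]
                # log-ratio of the standard-normal input density
                log_ratio = -0.5 * (f(cand) - f(x))
                if f(cand) > u_k and math.log(rng.random()) < log_ratio:
                    x = cand
            pop.append(x)
    return prob

random.seed(2)
p_hat = subset_simulation()
# For this toy problem f(x) = ||x||^2 is chi-squared with 2 degrees of
# freedom, so the exact answer is P(chi2_2 > 12) = exp(-6) ~ 2.5e-3.
```

The estimated probability is a product of moderate conditional probabilities, which is why the method reaches very small failure probabilities with far fewer samples than naive Monte Carlo would need.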
Improving Bayesian Graph Convolutional Networks using Markov Chain Monte Carlo Graph Sampling
In the modern age of social media and networks, graph representations of real-world phenomena have become crucial. Often, we are interested in understanding how entities in a graph are interconnected. Graph Neural Networks (GNNs) have proven to be a very useful tool in a variety of graph learning tasks, including node classification, link prediction, and edge classification. However, in most of these tasks, the graph data we are working with may be noisy and may contain spurious edges; that is, there is considerable uncertainty associated with the underlying graph structure. A recent approach to modeling this uncertainty is to adopt a Bayesian framework and view the graph as a random variable, with probabilities associated with model parameters. Introducing the Bayesian paradigm to graph-based models, specifically for semi-supervised node classification, has been shown to yield higher classification accuracies. However, the method of graph inference proposed in recent work does not take into account the structure of the graph. In this paper, we propose Neighborhood Random Walk Sampling (NRWS), a Markov Chain Monte Carlo (MCMC) based graph sampling algorithm that utilizes graph structure, improves diversity among connections, and yields consistently competitive classification results compared to the state-of-the-art in semi-supervised node classification.
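Since the abstract does not spell out NRWS itself, the general idea of structure-aware random-walk sampling on a graph can be sketched as follows. The adjacency list, step count, and edge-collection rule are illustrative assumptions made for this sketch, not the paper's algorithm.

```python
import random

def random_walk_edge_sample(adj, n_steps, rng=random):
    """Generic neighbourhood random walk over a graph: repeatedly hop to a
    uniformly chosen neighbour of the current node and record the traversed
    edges. This illustrates structure-aware sampling in the spirit of NRWS;
    the paper's exact proposal and acceptance rule are not reproduced."""
    node = rng.choice(sorted(adj))  # random starting node
    edges = set()
    for _ in range(n_steps):
        nxt = rng.choice(adj[node])
        # store undirected edges in canonical (min, max) order
        edges.add((min(node, nxt), max(node, nxt)))
        node = nxt
    return edges

# Toy undirected graph (adjacency lists): a 5-cycle plus the chord 0-2
adj = {0: [1, 2, 4], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [0, 3]}
random.seed(3)
sampled = random_walk_edge_sample(adj, 200)
```

Because each hop is restricted to the current node's neighbourhood, the sampled subgraph respects the connectivity of the original graph, which is the structural information the abstract argues earlier graph-inference methods ignored.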
The age-redshift relation for Luminous Red Galaxies in the Sloan Digital Sky Survey
We present a detailed analysis of 17,852 quiescent, Luminous Red Galaxies
(LRGs) selected from Sloan Digital Sky Survey (SDSS) Data Release Seven (DR7)
spanning a redshift range of 0.0 < z < 0.4. These galaxies are co-added into
four equal bins of velocity dispersion and luminosity to produce high
signal-to-noise spectra (>100 Å^{-1}), thus facilitating accurate measurements
of the standard Lick absorption-line indices. In particular, we have carefully
corrected and calibrated these indices onto the commonly used Lick/IDS system,
thus allowing us to compare these data with other measurements in the
literature, and derive realistic ages, metallicities ([Z/H]) and alpha-element
abundance ratios ([alpha/Fe]) for these galaxies using Simple Stellar
Population (SSP) models. We use these data to study the relationship of these
galaxy parameters with redshift, and find little evidence for evolution in
metallicity or alpha-elements (especially for our intermediate mass samples).
This demonstrates that our subsamples are consistent with purely passive
evolution (i.e. no chemical evolution) and represent a homogeneous population over this
redshift range. We also present the age-redshift relation for these LRGs and
clearly see a decrease in their age with redshift (5 Gyr over the redshift
range studied here), which is fully consistent with the cosmological lookback
times in a concordance Lambda CDM universe. We also find that our most massive
sample of LRGs is younger than the lower-mass samples. We provide
these data now to help future cosmological and galaxy evolution studies of
LRGs, and provide in the appendices of this paper the required methodology and
information to calibrate SDSS spectra onto the Lick/IDS system. Comment: 26 pages, with several appendices containing data. Accepted for
publication in MNRAS
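The quoted consistency with cosmological lookback times can be checked with a back-of-the-envelope evaluation of the flat Lambda CDM lookback-time integral. The cosmological parameters used below (H0 = 70 km/s/Mpc, Omega_m = 0.3) are assumptions for this sketch, not values taken from the paper.

```python
import math

def lookback_time_gyr(z, h0=70.0, omega_m=0.3, n=10000):
    """Lookback time in a flat Lambda CDM universe:
       t_L(z) = (1/H0) * Integral_0^z dz' / [(1+z') E(z')],
       with E(z) = sqrt(Omega_m (1+z)^3 + Omega_Lambda)."""
    omega_l = 1.0 - omega_m
    hubble_time_gyr = 977.8 / h0  # 1/H0 in Gyr, for H0 in km/s/Mpc
    dz = z / n
    total = 0.0
    for i in range(n):  # midpoint-rule quadrature
        zp = (i + 0.5) * dz
        e = math.sqrt(omega_m * (1 + zp) ** 3 + omega_l)
        total += dz / ((1 + zp) * e)
    return hubble_time_gyr * total

# Lookback time at the upper end of the sample's redshift range
t04 = lookback_time_gyr(0.4)
```

With these assumed parameters the lookback time at z = 0.4 comes out at roughly 4 Gyr, the same order as the several-Gyr age decrease the abstract reports over 0.0 < z < 0.4.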
Monte Carlo Methods in Practice and Efficiency Enhancements via Parallel Computation
Monte Carlo methods are crucial when dealing with advanced problems in Bayesian inference. Indeed, common approaches such as Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) can be endlessly adapted to tackle the most complex problems. What is important then is to construct efficient algorithms, and significant attention in the literature is devoted to developing algorithms that mix well, have low computational complexity and can scale up to large datasets. One of the most commonly used and straightforward approaches is to speed up Monte Carlo algorithms by running them in parallel computing environments. The compute time of Monte Carlo algorithms is random and can vary depending on the current state of the Markov chain. Other computing-infrastructure related factors, such as competing jobs on the same processor, or memory bandwidth, which are prevalent in shared computing architectures such as cloud computing, can also affect this compute time. However, many algorithms running in parallel require the processors to communicate every so often, and for that we must ensure that they are simultaneously ready and any idle wait time is minimised. This can be done by employing a framework known as Anytime Monte Carlo, which imposes a real-time deadline on parallel computations.
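A minimal sketch of the real-time-deadline idea, assuming a simple random-walk Metropolis kernel: the chain runs against a wall clock rather than a fixed iteration count, so parallel workers given the same deadline are simultaneously ready to communicate. (The Anytime framework's correction for the bias induced by interrupting a chain mid-computation is deliberately omitted from this sketch.)

```python
import math
import random
import time

def anytime_metropolis(log_pi, x0, deadline_s, step=0.5, rng=random):
    """Random-walk Metropolis run until a wall-clock deadline: the function
    returns whatever samples were completed when time ran out, instead of a
    fixed number of iterations."""
    x, lp = x0, log_pi(x0)
    samples = [x]
    t_end = time.monotonic() + deadline_s
    while time.monotonic() < t_end:
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_pi(cand)
        if math.log(rng.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

random.seed(4)
# Standard-normal target as an illustrative stand-in; 0.2 s budget
chain = anytime_metropolis(lambda x: -0.5 * x * x, 0.0, deadline_s=0.2)
```

Note that the number of samples is random: it depends on per-iteration compute time, which is exactly the variability (from chain state, competing jobs, memory bandwidth) that motivates imposing the deadline in the first place.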
The contributions in this thesis include novel applications of the Anytime framework to construct efficient Anytime MCMC and SMC algorithms which make use of parallel computing in order to perform inference for advanced problems. Examples of such problems investigated include models in which the likelihood cannot be evaluated analytically, and changepoint models, which are often used to model the heterogeneity of sequential data but are difficult to infer due to the unknown number and locations of the changepoints. This thesis also focuses on the difficult task of performing parameter inference in single-molecule microscopy, a category of models in which the arrival rate of observations is not uniformly distributed and measurement models have complex forms. These issues are exacerbated when molecules have trajectories described by stochastic differential equations.
The original contributions of this thesis are organised in Chapters 4-6. Chapter 4 shows the development of a novel Anytime parallel tempering algorithm and demonstrates the performance enhancements the Anytime framework brings to parallel tempering, an algorithm which runs multiple interacting MCMC chains in order to explore the state space more efficiently. In Chapter 5, a general Anytime SMC sampler is developed for performing changepoint inference using reversible jump MCMC (RJ-MCMC), an algorithm that takes into account the unknown number of changepoints by including transdimensional MCMC updates. The workings of the algorithm are illustrated on a particularly complex changepoint model, and once again the improvements in performance brought by employing the Anytime framework are demonstrated. Chapter 6 moves away from the Anytime framework, and presents a novel and general SMC approach to performing parameter inference for molecules with stochastic trajectories.