1,101 research outputs found

    Riemann-Langevin Particle Filtering in Track-Before-Detect

    Track-before-detect (TBD) is a powerful approach that consists of providing the tracker with sensor measurements directly, without pre-detection. Due to the non-linearities of the measurement model, online state estimation in TBD is most commonly solved via particle filtering. Existing particle filters for TBD do not incorporate measurement information in their proposal distribution. The Langevin Monte Carlo (LMC) is a sampling method whose proposal can exploit all available knowledge of the posterior (that is, both prior and measurement information). This letter synthesizes recent advances in LMC-based filtering to describe the Riemann-Langevin particle filter and introduces its novel application to TBD. The benefits of our approach are illustrated in a challenging low-noise scenario.
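
    As a rough illustration of the measurement-informed proposal this letter builds on, here is a minimal sketch of one particle-filter step whose proposal takes a single unadjusted Langevin step toward the measurement. It uses a toy one-dimensional linear-Gaussian model rather than the Riemann-manifold variant described in the letter, and the model parameters and step size are assumed values chosen for illustration only.

```python
# Minimal sketch: a particle-filter step with a Langevin (gradient-informed) proposal.
# Toy 1-D linear-Gaussian model; parameters and step size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 500        # number of particles
q_var = 0.5    # process-noise variance of the prior dynamics x_k = x_{k-1} + v_k
r_var = 0.1    # measurement-noise variance of y_k = x_k + w_k
eps = 0.1      # Langevin step size (assumed; would be tuned in practice)

def langevin_proposal_mean(x_prev, y):
    # One unadjusted Langevin step from x_prev toward the filtering posterior of x_k,
    # using the gradient of log p(y|x) + log p(x|x_prev) evaluated at x = x_prev
    # (the transition term vanishes there, so only the measurement gradient remains).
    grad = -(x_prev - y) / r_var
    return x_prev + 0.5 * eps * grad

def langevin_pf_step(particles, weights, y):
    # Measurement-informed proposal: Langevin drift plus Gaussian exploration noise.
    mean = langevin_proposal_mean(particles, y)
    proposed = mean + np.sqrt(eps) * rng.standard_normal(N)

    # Importance weights: (likelihood x prior transition) / proposal density.
    log_lik = -0.5 * (y - proposed) ** 2 / r_var
    log_prior = -0.5 * (proposed - particles) ** 2 / q_var
    log_prop = -0.5 * (proposed - mean) ** 2 / eps
    logw = np.log(weights) + log_lik + log_prior - log_prop
    w = np.exp(logw - logw.max())
    w /= w.sum()

    # Multinomial resampling.
    idx = rng.choice(N, size=N, p=w)
    return proposed[idx], np.full(N, 1.0 / N)

particles = rng.standard_normal(N)
weights = np.full(N, 1.0 / N)
particles, weights = langevin_pf_step(particles, weights, y=1.2)
print("posterior mean estimate:", particles.mean())
```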

    Bayesian subset simulation

    We consider the problem of estimating a probability of failure $\alpha$, defined as the volume of the excursion set of a function $f:\mathbb{X} \subseteq \mathbb{R}^{d} \to \mathbb{R}$ above a given threshold, under a given probability measure on $\mathbb{X}$. In this article, we combine the popular subset simulation algorithm (Au and Beck, Probab. Eng. Mech. 2001) and our sequential Bayesian approach for the estimation of a probability of failure (Bect, Ginsbourger, Li, Picheny and Vazquez, Stat. Comput. 2012). This makes it possible to estimate $\alpha$ when the number of evaluations of $f$ is very limited and $\alpha$ is very small. The resulting algorithm is called Bayesian subset simulation (BSS). A key idea, as in the subset simulation algorithm, is to estimate the probabilities of a sequence of excursion sets of $f$ above intermediate thresholds, using a sequential Monte Carlo (SMC) approach. A Gaussian process prior on $f$ is used to define the sequence of densities targeted by the SMC algorithm, and drive the selection of evaluation points of $f$ to estimate the intermediate probabilities. Adaptive procedures are proposed to determine the intermediate thresholds and the number of evaluations to be carried out at each stage of the algorithm. Numerical experiments illustrate that BSS achieves significant savings in the number of function evaluations with respect to other Monte Carlo approaches.
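
    For context, the sketch below implements plain subset simulation (the Au and Beck algorithm cited above) on a cheap toy function: intermediate thresholds are set adaptively as quantiles, and each level is re-populated by Metropolis moves restricted to the current excursion set. The Gaussian-process prior and sequential selection of evaluation points that define BSS are not reproduced here, and the limit-state function, target threshold, and MCMC settings are illustrative assumptions.

```python
# Minimal sketch of plain subset simulation for a small failure probability
# alpha = P(f(X) > t_final) under a standard normal measure on R^d.
import numpy as np

rng = np.random.default_rng(1)

d = 2                      # input dimension (assumed)
f = lambda x: x.sum(-1)    # toy limit-state function (assumed)
t_final = 5.0              # failure threshold
n = 2000                   # samples per level
p0 = 0.1                   # conditional probability targeted at each level
sigma = 1.0                # random-walk proposal standard deviation

x = rng.standard_normal((n, d))
y = f(x)
alpha = 1.0

while True:
    # Intermediate threshold: the (1 - p0) empirical quantile, capped at t_final.
    t = min(np.quantile(y, 1.0 - p0), t_final)
    mask = y > t
    alpha *= mask.mean()
    if t >= t_final:
        break

    # Re-populate the level by Metropolis moves that stay in the excursion set {f > t}.
    seeds = x[mask]
    x = np.repeat(seeds, int(np.ceil(n / len(seeds))), axis=0)[:n]
    y = f(x)
    for _ in range(5):                      # a few Metropolis sweeps per level
        prop = x + sigma * rng.standard_normal(x.shape)
        # Accept w.r.t. the standard normal density, restricted to f > t.
        log_acc = -0.5 * (prop ** 2 - x ** 2).sum(-1)
        accept = (np.log(rng.random(n)) < log_acc) & (f(prop) > t)
        x[accept] = prop[accept]
        y = f(x)

print("estimated failure probability:", alpha)
```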

    Improving Bayesian Graph Convolutional Networks using Markov Chain Monte Carlo Graph Sampling

    In the modern age of social media and networks, graph representations of real-world phenomena have become increasingly important. Often, we are interested in understanding how entities in a graph are interconnected. Graph Neural Networks (GNNs) have proven to be a very useful tool in a variety of graph learning tasks including node classification, link prediction, and edge classification. However, in most of these tasks, the graph data we are working with may be noisy and may contain spurious edges; that is, there is a lot of uncertainty associated with the underlying graph structure. Recent approaches model this uncertainty within a Bayesian framework, viewing the graph as a random variable with probabilities associated with the model parameters. Introducing the Bayesian paradigm to graph-based models, specifically for semi-supervised node classification, has been shown to yield higher classification accuracies. However, the method of graph inference proposed in recent work does not take into account the structure of the graph. In this paper, we propose Neighborhood Random Walk Sampling (NRWS), a Markov Chain Monte Carlo (MCMC) based graph sampling algorithm that utilizes graph structure, improves diversity among connections, and yields consistently competitive classification results compared to the state-of-the-art in semi-supervised node classification.
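
    As an illustration of the random-walk-based graph sampling the abstract alludes to, the sketch below samples a subgraph by walking over an adjacency dictionary with restarts. It is not the NRWS algorithm proposed in the paper; the toy graph, walk length, and restart probability are assumed values used only to show the general pattern of structure-aware edge sampling.

```python
# Minimal sketch: sample a subgraph with a random walk (with restarts) over an
# adjacency dict. Illustrative only; not the paper's NRWS algorithm.
import random

def random_walk_sample(adj, walk_len=1000, restart=0.15, seed=0):
    """Collect the edges visited by a restarting random walk and return the subgraph."""
    rng = random.Random(seed)
    start = rng.choice(list(adj))
    current = start
    visited_edges = set()
    for _ in range(walk_len):
        if rng.random() < restart or not adj[current]:
            current = start               # restart keeps the walk near the seed's neighborhood
            continue
        nxt = rng.choice(list(adj[current]))
        visited_edges.add(tuple(sorted((current, nxt))))
        current = nxt
    # Rebuild the sampled (undirected) subgraph from the visited edges.
    sampled = {}
    for u, v in visited_edges:
        sampled.setdefault(u, set()).add(v)
        sampled.setdefault(v, set()).add(u)
    return sampled

# Toy undirected graph: a 5-cycle with one chord (0-2).
adj = {0: {1, 2, 4}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4}, 4: {0, 3}}
print(random_walk_sample(adj, walk_len=50))
```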

    The age-redshift relation for Luminous Red Galaxies in the Sloan Digital Sky Survey

    We present a detailed analysis of 17,852 quiescent, Luminous Red Galaxies (LRGs) selected from Sloan Digital Sky Survey (SDSS) Data Release Seven (DR7), spanning a redshift range of 0.0 < z < 0.4. These galaxies are co-added into four equal bins of velocity dispersion and luminosity to produce high signal-to-noise spectra (S/N > 100 Å^{-1}), thus facilitating accurate measurements of the standard Lick absorption-line indices. In particular, we have carefully corrected and calibrated these indices onto the commonly used Lick/IDS system, thus allowing us to compare these data with other measurements in the literature, and derive realistic ages, metallicities ([Z/H]) and alpha-element abundance ratios ([alpha/Fe]) for these galaxies using Simple Stellar Population (SSP) models. We use these data to study the relationship of these galaxy parameters with redshift, and find little evidence for evolution in metallicity or alpha-elements (especially for our intermediate mass samples). This demonstrates that our subsamples are consistent with purely passive evolution (i.e. no chemical evolution) and represent a homogeneous population over this redshift range. We also present the age-redshift relation for these LRGs and clearly see a decrease in their age with redshift (5 Gyr over the redshift range studied here), which is fully consistent with the cosmological lookback times in a concordance Lambda CDM universe. We also see that our most massive sample of LRGs is the youngest, compared to the lower-mass galaxies. We provide these data now to help future cosmological and galaxy evolution studies of LRGs, and provide in the appendices of this paper the required methodology and information to calibrate SDSS spectra onto the Lick/IDS system. (26 pages, with several appendices containing data; accepted for publication in MNRAS.)
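
    A quick way to put the reported age decrease in context is to compute the lookback-time difference across 0.0 < z < 0.4 in a concordance Lambda CDM cosmology, which is the comparison the abstract draws for a purely passively evolving population. The cosmological parameters below are assumed values, not necessarily those adopted in the paper.

```python
# Lookback-time difference over the redshift range of the LRG sample, under an
# assumed concordance cosmology (H0 and Om0 are illustrative values).
from astropy.cosmology import FlatLambdaCDM

cosmo = FlatLambdaCDM(H0=70.0, Om0=0.3)

z_lo, z_hi = 0.0, 0.4
dt = cosmo.lookback_time(z_hi) - cosmo.lookback_time(z_lo)
print(f"Lookback-time difference over {z_lo} < z < {z_hi}: {dt:.2f}")
# A purely passively evolving population should appear younger at z = 0.4 by
# roughly this amount.
```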