Efficient Bayesian inference for harmonic models via adaptive posterior factorization
NOTICE: this is the author's version of a work that was accepted for publication in Neurocomputing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Neurocomputing, Vol. 72, Issues 1-3 (2008). DOI: 10.1016/j.neucom.2007.12.05
Taming outliers in pulsar-timing datasets with hierarchical likelihoods and Hamiltonian sampling
Pulsar-timing datasets have been analyzed with great success using
probabilistic treatments based on Gaussian distributions, with applications
ranging from studies of neutron-star structure to tests of general relativity
and searches for nanosecond gravitational waves. As for other applications of
Gaussian distributions, outliers in timing measurements pose a significant
challenge to statistical inference, since they can bias the estimation of
timing and noise parameters, and affect reported parameter uncertainties. We
describe and demonstrate a practical end-to-end approach to perform Bayesian
inference of timing and noise parameters robustly in the presence of outliers,
and to identify these probabilistically. The method is fully self-consistent
(i.e., the outlier probabilities are inferred jointly with the posterior
distributions of the timing and noise parameters), and it relies on efficient
sampling of the
hierarchical form of the pulsar-timing likelihood. Such sampling has recently
become possible with a "no-U-turn" Hamiltonian sampler coupled to a highly
customized reparametrization of the likelihood; this code is described
elsewhere, but it is already available online. We recommend our method as a
standard step in the preparation of pulsar-timing-array datasets: even if
statistical inference is not affected, follow-up studies of outlier candidates
can reveal unseen problems in radio observations and timing measurements;
furthermore, confidence in the results of gravitational-wave searches will only
benefit from stringent statistical evidence that datasets are clean and
outlier-free.
Comment: 6 pages, 2 figures, RevTeX 4.
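The hierarchical treatment described above can be illustrated with a minimal sketch: each timing residual is modeled as drawn from a two-component Gaussian mixture (an inlier component and a broad outlier component), and Bayes' rule gives the posterior probability that each point is an outlier. The function name and the specific widths and prior outlier fraction below are illustrative assumptions, not the paper's actual parametrization.

```python
import numpy as np

def outlier_probabilities(residuals, sigma_in, sigma_out, p_out):
    """Posterior probability that each residual is an outlier under a
    two-component Gaussian mixture likelihood (illustrative sketch).

    sigma_in  : std. dev. of the inlier (well-modeled) component
    sigma_out : std. dev. of the broad outlier component
    p_out     : prior probability that any given point is an outlier
    """
    def norm_pdf(x, s):
        return np.exp(-0.5 * (x / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

    lik_in = (1.0 - p_out) * norm_pdf(residuals, sigma_in)
    lik_out = p_out * norm_pdf(residuals, sigma_out)
    # Bayes' rule: posterior odds of the outlier component per data point
    return lik_out / (lik_in + lik_out)

# Three well-behaved residuals and one gross outlier (in arbitrary units)
res = np.array([0.1, -0.2, 5.0, 0.05])
probs = outlier_probabilities(res, sigma_in=0.3, sigma_out=3.0, p_out=0.05)
```

In the full method these mixture responsibilities are not computed at fixed parameters but sampled jointly with the timing and noise parameters, which is what the hierarchical likelihood and the no-U-turn sampler make tractable.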
Migrating Individuals and Probabilistic Models on DEDAS: a Comparison on Continuous Functions
One of the most promising areas in which probabilistic graphical models have shown incipient activity is the field of heuristic optimization and, in particular, Estimation of Distribution Algorithms (EDAs). EDAs constitute a well-known family of Evolutionary Computation techniques, similar to Genetic Algorithms. Due to their inherent parallelism, different research lines have been studied in an attempt to improve EDAs in terms of execution time and/or accuracy. Among these proposals, we focus on the so-called island-based models. This approach defines several islands (EDA instances) running independently and exchanging information with a given frequency. The information sent by the islands can be a set of individuals or a probabilistic model. This paper presents a comparative study of both information-exchange techniques for a univariate EDA (UMDAg) over a wide set of parameters and problems – the standard benchmark developed for the IEEE Workshop on Evolutionary Algorithms and other Metaheuristics for Continuous Optimization Problems of the ISDA 2009 Conference. The study concludes that the configurations based on migrating individuals obtain better results.
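The core loop of a continuous univariate EDA such as UMDAg can be sketched as follows: each generation fits an independent Gaussian per variable to the selected (elite) individuals and samples the next population from that model. This is a minimal single-island sketch; the function name, population sizes, and the sphere test function are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

def umda_g(objective, dim, pop_size=100, n_select=30, generations=60, seed=0):
    """Minimal continuous UMDA (UMDAg) sketch: each generation fits an
    independent Gaussian per variable to the truncation-selected elite."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.full(dim, 5.0)
    for _ in range(generations):
        # Sample a population from the current univariate Gaussian model
        pop = rng.normal(mu, sigma, size=(pop_size, dim))
        fitness = np.apply_along_axis(objective, 1, pop)
        # Truncation selection: keep the n_select best individuals
        elite = pop[np.argsort(fitness)[:n_select]]
        # Re-estimate the model (small floor keeps sigma positive)
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-12
    return mu

sphere = lambda x: float(np.sum(x ** 2))  # minimized at the origin
best = umda_g(sphere, dim=5)
```

An island-based variant would run several such instances in parallel and periodically exchange either elite individuals or the fitted `(mu, sigma)` model itself, which is exactly the design choice the paper compares.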
Reliability approach for safe designing on a locking system
The aim of this work is to predict the failure probability of a locking system. This failure probability is assessed using complementary methods: the First-Order Reliability Method (FORM) and the Second-Order Reliability Method (SORM) as approximate methods, and Monte Carlo simulation as the reference method. Both types of method are implemented in specific software [Phimeca software. Software for reliability analysis developed by Phimeca Engineering S.A.] used in this study. For the Monte Carlo simulations, a response surface, based on experimental design and finite element calculations [Abaqus/Standard User's Manual vol. I.], is built so that the relation between the random input variables and the structural responses can be established. Applying these reliability methods to two configurations of the locking system shows the high robustness of the first configuration and enables design improvements for the second.
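The reference Monte Carlo approach mentioned above amounts to sampling the random inputs, evaluating a limit-state function g(X) (here via the response surface rather than the full finite-element model), and estimating P(g ≤ 0). A generic sketch, with a hypothetical resistance-minus-load limit state standing in for the actual locking-system response surface:

```python
import numpy as np

def failure_probability(limit_state, sample_inputs, n=200_000, seed=1):
    """Crude Monte Carlo estimate of the failure probability P(g(X) <= 0),
    the reference method against which FORM/SORM are checked."""
    rng = np.random.default_rng(seed)
    x = sample_inputs(rng, n)          # draw n samples of the random inputs
    g = limit_state(x)                 # evaluate the limit-state function
    return float(np.mean(g <= 0.0))    # fraction of samples in failure domain

# Hypothetical example: resistance R ~ N(10, 1) versus load S ~ N(7, 1);
# failure occurs when the load exceeds the resistance (g = R - S <= 0).
def sample(rng, n):
    return np.column_stack([rng.normal(10, 1, n), rng.normal(7, 1, n)])

pf = failure_probability(lambda x: x[:, 0] - x[:, 1], sample)
```

For this linear Gaussian limit state the exact answer is Phi(-3/sqrt(2)) ≈ 0.017, which is also what FORM returns exactly; the value of FORM/SORM in practice is reaching small failure probabilities at a tiny fraction of the Monte Carlo evaluation cost.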
Open TURNS: An industrial software for uncertainty quantification in simulation
The need to assess robust performance for complex systems and to meet tighter
regulatory requirements (security, safety, environmental control, health
impacts, etc.) has led to the emergence of a new industrial simulation
challenge: to take uncertainties into account when dealing with complex
numerical simulation frameworks. Therefore, a generic methodology has emerged
from the joint effort of several industrial companies and academic
institutions. EDF R&D, Airbus Group and Phimeca Engineering started a
collaboration at the beginning of 2005, joined by IMACS in 2014, for the
development of an Open Source software platform dedicated to uncertainty
propagation by probabilistic methods, named OpenTURNS for Open source Treatment
of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial
challenges attached to uncertainties, which are transparency, genericity,
modularity and multi-accessibility. This paper focuses on OpenTURNS and
presents its main features: OpenTURNS is open-source software released under
the LGPL license; it takes the form of a C++ library with a Python textual
user interface (TUI) and runs under both Linux and Windows. All the
methodological tools are
described in the different sections of this paper: uncertainty quantification,
uncertainty propagation, sensitivity analysis and metamodeling. A section also
explains the generic wrapper mechanism used to link OpenTURNS to any external code. The
paper illustrates as much as possible the methodological tools on an
educational example that simulates the height of a river and compares it to the
height of a dyke that protects industrial facilities. Finally, it gives an
overview of the main developments planned for the next few years.
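The river/dyke example can be paraphrased as a plain Monte Carlo study: propagate uncertain hydraulic inputs through a simplified water-height model and estimate the probability that the river exceeds the dyke. The distributions, parameter values, and the Manning-Strickler-style height formula below are illustrative assumptions, not the ones actually used in the OpenTURNS educational example, and the sketch uses plain NumPy rather than the OpenTURNS API.

```python
import numpy as np

# Illustrative uncertain inputs (assumed, not the paper's values):
# river discharge Q (heavy-tailed) and Strickler friction coefficient Ks.
rng = np.random.default_rng(42)
n = 100_000
flow = rng.gumbel(1000.0, 300.0, n)       # discharge Q [m^3/s]
strickler = rng.normal(30.0, 5.0, n)      # friction coefficient Ks

# Simplified Manning-Strickler-style water height for a wide rectangular
# channel of width B = 300 m and slope 1e-3:  h ~ (Q / (Ks * B * sqrt(s)))^0.6
height = (np.clip(flow, 1.0, None)
          / (np.clip(strickler, 5.0, None) * 300.0 * np.sqrt(1e-3))) ** 0.6

dyke = 3.0                                 # assumed dyke height [m]
p_overflow = float(np.mean(height > dyke)) # probability of overtopping
```

In OpenTURNS the same study would be expressed with distribution objects, a composed random vector, and an event, which is what makes the methodology generic across external simulation codes.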
A Hierarchical Bayesian Model for Frame Representation
In many signal processing problems, it may be fruitful to represent the
signal under study in a frame. If a probabilistic approach is adopted, it
then becomes necessary to estimate the hyper-parameters characterizing the
probability distribution of the frame coefficients. This problem is difficult
since in general the frame synthesis operator is not bijective. Consequently,
the frame coefficients are not directly observable. This paper introduces a
hierarchical Bayesian model for frame representation. The posterior
distribution of the frame coefficients and model hyper-parameters is derived.
Hybrid Markov Chain Monte Carlo algorithms are subsequently proposed to sample
from this posterior distribution. The generated samples are then exploited to
estimate the hyper-parameters and the frame coefficients of the target signal.
Validation experiments show that the proposed algorithms provide an accurate
estimation of the frame coefficients and hyper-parameters. Applications to
practical image-denoising problems show the impact of the resulting Bayesian
estimation on the quality of the recovered signal.