Conjugate Bayes for probit regression via unified skew-normal distributions
Regression models for dichotomous data are ubiquitous in statistics. Besides
being useful for inference on binary responses, these methods also serve as
building blocks in more complex formulations, such as density regression,
nonparametric classification and graphical models. Within the Bayesian
framework, inference proceeds by updating the priors for the coefficients,
typically set to be Gaussians, with the likelihood induced by probit or logit
regressions for the responses. In this updating, the apparent absence of a
tractable posterior has motivated a variety of computational methods, including
Markov chain Monte Carlo routines and algorithms that approximate the
posterior. Despite being routinely implemented, Markov chain Monte Carlo
strategies face mixing or time-inefficiency issues in large p and small n
studies, whereas approximate routines fail to capture the skewness typically
observed in the posterior. This article proves that the posterior distribution
for the probit coefficients has a unified skew-normal kernel, under Gaussian
priors. Such a novel result allows efficient Bayesian inference for a wide
class of applications, especially in large p and small-to-moderate n studies
where state-of-the-art computational methods face notable issues. These
advances are outlined in a genetic study, and further motivate the development
of a wider class of conjugate priors for probit models along with methods to
obtain independent and identically distributed samples from the unified
skew-normal posterior.
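The conjugacy result suggests a direct sampling scheme. The sketch below, in Python, draws i.i.d. posterior samples via a unified skew-normal representation of this form, assuming a N(xi, Omega) prior and the standard additive SUN construction (a Gaussian term plus a multivariate truncated-Gaussian term). The parameterization is reconstructed here, not taken verbatim from the article, and the naive rejection step for the truncated component is a placeholder that only scales to small n.

```python
import numpy as np

def sun_probit_posterior_samples(X, y, xi, Omega, n_draws=1000, seed=0):
    """i.i.d. draws from the probit-coefficient posterior under a
    N(xi, Omega) prior, via a unified skew-normal representation:
    beta = xi + omega * (V0 + Delta @ inv(Gamma) @ V1), with V0 Gaussian
    and V1 a truncated Gaussian. Naive rejection is used for V1, so this
    sketch is only practical for small n."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Xbar = (2.0 * y - 1.0)[:, None] * X          # sign-flipped design
    omega = np.sqrt(np.diag(Omega))              # marginal prior scales
    Omega_bar = Omega / np.outer(omega, omega)   # prior correlation matrix
    S = Xbar @ Omega @ Xbar.T + np.eye(n)
    s = np.sqrt(np.diag(S))
    Gamma = S / np.outer(s, s)                   # SUN correlation block
    Delta = (Omega_bar * omega[None, :]) @ Xbar.T / s[None, :]
    gamma = (Xbar @ xi) / s                      # truncation thresholds
    A = np.linalg.solve(Gamma, Delta.T)          # inv(Gamma) @ Delta.T, (n, p)
    cov0 = Omega_bar - Delta @ A                 # covariance of V0
    draws = np.empty((n_draws, p))
    for t in range(n_draws):
        V0 = rng.multivariate_normal(np.zeros(p), cov0)
        while True:                 # V1 ~ N(0, Gamma) given V1 > -gamma
            V1 = rng.multivariate_normal(np.zeros(n), Gamma)
            if np.all(V1 > -gamma):
                break
        draws[t] = xi + omega * (V0 + A.T @ V1)
    return draws
```

The rejection step is the bottleneck: its acceptance rate decays quickly with n, which is why dedicated truncated-normal samplers are used in practice.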
A new adaptive response surface method for reliability analysis
The response surface method is a convenient tool for assessing the reliability of a wide range of structural mechanical problems. In particular, adaptive schemes, which iteratively refine the experimental design close to the limit state, have received much attention. However, it is generally difficult to handle a large number of variables and to control the approximation error. The method proposed in this paper addresses both points using a sparse response surface and a relevant criterion for the accuracy of the results. To this end, a response surface is built from an initial Latin Hypercube Sampling (LHS), with the most significant terms selected by statistical criteria and cross-validation. At each step, the LHS is refined in a region of interest defined with respect to an importance level of the probability density at the design point. Two convergence criteria are used in the procedure: the first concerns the localization of the region of interest, the second the quality of the response surface. Finally, a bootstrap method is used to quantify the influence of the response surface error on the estimated probability of failure. The method is applied to several examples and the results are discussed.
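As an illustration of the adaptive loop described above, here is a minimal Python sketch under strong simplifications: a full (rather than sparse) quadratic basis, a fixed refinement box instead of the paper's importance-level region and convergence criteria, and no bootstrap error analysis. The limit-state function g and all sizes are hypothetical placeholders.

```python
import numpy as np
from scipy.stats import norm, qmc

def g(x):
    # hypothetical limit-state function; failure when g(x) <= 0
    return 3.0 - x[:, 0] - x[:, 1] ** 2

def quad_features(x):
    # full quadratic basis (a stand-in for the sparse, criterion-selected basis)
    n, d = x.shape
    cols = [np.ones(n)] + [x[:, i] for i in range(d)]
    cols += [x[:, i] * x[:, j] for i in range(d) for j in range(i, d)]
    return np.column_stack(cols)

def adaptive_rsm_pf(dim=2, n_init=12, n_add=8, n_iter=5, n_mc=200_000, seed=0):
    rng = np.random.default_rng(seed)
    lhs = qmc.LatinHypercube(d=dim, seed=seed)
    X = norm.ppf(lhs.random(n_init))        # initial LHS in standard-normal space
    Y = g(X)
    U = rng.standard_normal((n_mc, dim))    # MC population for Pf and refinement
    for _ in range(n_iter):
        coef, *_ = np.linalg.lstsq(quad_features(X), Y, rcond=None)
        g_hat = quad_features(U) @ coef
        fail = g_hat <= 0.0
        if fail.any():       # approximate design point: failing point nearest origin
            center = U[fail][np.argmin(np.linalg.norm(U[fail], axis=1))]
        else:                # fallback: point closest to the surrogate limit state
            center = U[np.argmin(np.abs(g_hat))]
        # refine the design with new LHS points in a box around the design point
        X_new = center + 2.0 * (lhs.random(n_add) - 0.5)
        X, Y = np.vstack([X, X_new]), np.concatenate([Y, g(X_new)])
    return np.mean(quad_features(U) @ coef <= 0.0)

print(adaptive_rsm_pf())   # crude failure-probability estimate on the surrogate
```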
Particle algorithms for optimization on binary spaces
We discuss a unified approach to stochastic optimization of pseudo-Boolean
objective functions based on particle methods, including the cross-entropy
method and simulated annealing as special cases. We point out the need for
auxiliary sampling distributions, that is, parametric families on binary spaces,
which are able to reproduce complex dependency structures, and illustrate their
usefulness in our numerical experiments. We provide numerical evidence that
particle-driven optimization algorithms based on parametric families yield
superior results on strongly multi-modal optimization problems while local
search heuristics outperform them on easier problems.
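For concreteness, here is a minimal Python sketch of one of the special cases mentioned, the cross-entropy method, using the simplest auxiliary family: independent Bernoulli marginals, which cannot reproduce the complex dependency structures the paper argues for. The objective and all tuning constants are illustrative.

```python
import numpy as np

def cross_entropy_binary(f, d, n_pop=200, elite_frac=0.1, n_iter=50,
                         smooth=0.7, seed=0):
    """Maximize a pseudo-Boolean function f: {0,1}^d -> R with the
    cross-entropy method over independent Bernoulli marginals."""
    rng = np.random.default_rng(seed)
    p = np.full(d, 0.5)                       # Bernoulli success probabilities
    n_elite = max(1, int(elite_frac * n_pop))
    best_x, best_f = None, -np.inf
    for _ in range(n_iter):
        X = (rng.random((n_pop, d)) < p).astype(float)   # sample particles
        scores = np.apply_along_axis(f, 1, X)
        elite = X[np.argsort(scores)[-n_elite:]]         # top particles
        if scores.max() > best_f:
            best_f, best_x = scores.max(), X[scores.argmax()].copy()
        # smoothed parameter update toward the elite empirical mean
        p = smooth * elite.mean(axis=0) + (1.0 - smooth) * p
        p = np.clip(p, 0.01, 0.99)            # keep exploration alive
    return best_x, best_f

# toy usage: a (hypothetical) quadratic pseudo-Boolean objective
Q = np.array([[2.0, -1.0, 0.0], [-1.0, 3.0, -2.0], [0.0, -2.0, 1.0]])
print(cross_entropy_binary(lambda z: z @ Q @ z, d=3))
```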
Metamodel-based importance sampling for structural reliability analysis
Structural reliability methods aim at computing the probability of failure of
systems with respect to some prescribed performance functions. In modern
engineering such functions usually resort to running an expensive-to-evaluate
computational model (e.g. a finite element model). In this respect, simulation
methods, which may require a large number of model runs, cannot be used
directly. Surrogate
models such as quadratic response surfaces, polynomial chaos expansions or
kriging (which are built from a limited number of runs of the original model)
are then introduced as a substitute of the original model to cope with the
computational cost. In practice, though, it is almost impossible to quantify
the error made by this substitution. In this paper we propose to use a kriging
surrogate of the performance function as a means to build a quasi-optimal
importance sampling density. The probability of failure is eventually obtained
as the product of an augmented probability computed by substituting the
meta-model for the original performance function and a correction term which
ensures that there is no bias in the estimation even if the meta-model is not
fully accurate. The approach is applied to analytical and finite element
reliability problems and proves efficient up to 100 random variables.
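A minimal Python sketch of the estimator structure described above, assuming a Gaussian-process surrogate from scikit-learn and the usual probabilistic classification function pi(x) = Phi(-mu(x)/sigma(x)): the failure probability is the product of an augmented probability, computed on the surrogate alone, and a correction term evaluated on samples from the quasi-optimal density. The naive rejection step used to sample that density has an acceptance rate roughly equal to the augmented probability, so it is only workable here because the toy failure event is not extremely rare; g and all sizes are placeholders.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def g(x):
    # hypothetical performance function; failure when g(x) <= 0
    return 4.0 - x[:, 0] - x[:, 1]

rng = np.random.default_rng(0)
dim, n_doe = 2, 30
X_doe = rng.standard_normal((n_doe, dim))             # small experimental design
gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_doe, g(X_doe))

def pi(x):
    # probabilistic classification: P[g(x) <= 0] under the kriging surrogate
    mu, sigma = gp.predict(x, return_std=True)
    return norm.cdf(-mu / np.maximum(sigma, 1e-12))

U = rng.standard_normal((200_000, dim))               # standard-normal inputs
pis = pi(U)
p_aug = pis.mean()                                    # augmented probability

# sample the quasi-optimal density h(x) proportional to pi(x) f(x)
# by rejection, which is valid since pi(x) <= 1
acc = U[rng.random(len(U)) < pis][:500]
alpha = np.mean((g(acc) <= 0.0) / pi(acc))            # unbiasedness correction
print("Pf estimate:", p_aug * alpha)                  # true model g called only
                                                      # on the accepted samples
```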
Cyclotron resonant scattering feature simulations. I. Thermally averaged cyclotron scattering cross sections, mean free photon-path tables, and electron momentum sampling
Electron cyclotron resonant scattering features (CRSFs) are observed as
absorption-like lines in the spectra of X-ray pulsars. A significant fraction
of the computing time for Monte Carlo simulations of these quantum mechanical
features is spent on the calculation of the mean free path for each individual
photon before scattering, since it involves a complex numerical integration
over the scattering cross section and the (thermal) velocity distribution of
the scattering electrons.
We aim to numerically calculate interpolation tables which can be used in
CRSF simulations to sample the mean free path of the scattering photon and the
momentum of the scattering electron. The tables also contain all the
information required for sampling the scattering electron's final spin.
The tables were calculated using an adaptive Simpson integration scheme. The
energy and angle grids were refined until a prescribed accuracy was reached. The
tables are used by our simulation code to produce artificial CRSF spectra. The
electron momenta sampled during these simulations were analyzed and justified
using theoretically determined boundaries.
We present a complete set of tables suited for mean free path calculations of
Monte Carlo simulations of the cyclotron scattering process for conditions
expected in typical X-ray pulsar accretion columns (0.01 < B/B_{crit} <= 0.12,
where B_{crit} = 4.413x10^{13} G and 3 keV <= kT < 15 keV). The sampling of the tables
is chosen such that the results have an estimated relative error of at most
1/15 for all points in the grid. The tables are available online at
http://www.sternwarte.uni-erlangen.de/research/cyclo.
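The physics-specific integrand is beyond a short example, but the adaptive Simpson scheme itself is standard. Below is a minimal, self-contained Python version that bisects intervals until a Richardson-type local error estimate meets the tolerance, the same refine-until-accurate pattern used for the table grids; the placeholder integrand stands in for the cross section times the thermal electron distribution.

```python
import math

def adaptive_simpson(f, a, b, tol=1e-8, max_depth=50):
    """Adaptive Simpson quadrature: bisect until the local Richardson
    error estimate for each subinterval is below its tolerance share."""
    def simp(fa, fm, fb, h):
        return h / 6.0 * (fa + 4.0 * fm + fb)
    def recurse(a, b, fa, fm, fb, whole, tol, depth):
        m = 0.5 * (a + b)
        flm, frm = f(0.5 * (a + m)), f(0.5 * (m + b))
        left, right = simp(fa, flm, fm, m - a), simp(fm, frm, fb, b - m)
        err = left + right - whole
        if depth >= max_depth or abs(err) < 15.0 * tol:
            return left + right + err / 15.0     # Richardson extrapolation
        return (recurse(a, m, fa, flm, fm, left, 0.5 * tol, depth + 1)
                + recurse(m, b, fm, frm, fb, right, 0.5 * tol, depth + 1))
    fa, fm, fb = f(a), f(0.5 * (a + b)), f(b)
    return recurse(a, b, fa, fm, fb, simp(fa, fm, fb, b - a), tol, 0)

# usage: tabulate values of a placeholder integrand on a parameter grid
table = {t: adaptive_simpson(lambda x: math.exp(-x * x), 0.0, t)
         for t in (0.5, 1.0, 2.0)}
```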
Distributed Verification of Rare Properties using Importance Splitting Observers
Rare properties remain a challenge for statistical model checking (SMC) due
to the quadratic scaling of variance with rarity. We address this with a
variance reduction framework based on lightweight importance splitting
observers. These expose the model-property automaton to allow the construction
of score functions for high-performance algorithms.
The confidence intervals defined for importance splitting make it appealing
for SMC, but optimising its performance in the standard way makes distribution
inefficient. We show how it is possible to achieve equivalently good results in
less time by distributing simpler algorithms. We first explore the challenges
posed by importance splitting and present an algorithm optimised for
distribution. We then define a specific bounded time logic that is compiled
into memory-efficient observers to monitor executions. Finally, we demonstrate
our framework on a number of challenging case studies.
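To make the splitting idea concrete, here is a minimal Python sketch on a toy static rare event, not the executions-and-observers setting of the paper: fixed levels of a score function decompose the rare-event probability into a product of conditional probabilities, each estimated by a particle population that is resampled from survivors and diversified with Metropolis moves preserving the conditioned distribution. The score function, levels, and population size are illustrative.

```python
import numpy as np

def fixed_level_splitting(score, dim, levels, n=2000, mcmc_steps=10, seed=0):
    """Estimate P[score(X) >= levels[-1]] for X ~ N(0, I_dim) as a product
    of conditional level-crossing probabilities (fixed-level splitting)."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, dim))
    p_hat = 1.0
    for level in levels:
        alive = np.apply_along_axis(score, 1, X) >= level
        if not alive.any():
            return 0.0                    # extinction: add levels or particles
        p_hat *= alive.mean()
        # clone survivors, then diversify with autoregressive Metropolis moves
        # that leave N(0, I) conditioned on {score >= level} invariant
        X = X[alive][rng.integers(0, alive.sum(), n)]
        for _ in range(mcmc_steps):
            prop = 0.9 * X + np.sqrt(1.0 - 0.9 ** 2) * rng.standard_normal(X.shape)
            ok = np.apply_along_axis(score, 1, prop) >= level
            X[ok] = prop[ok]
    return p_hat

# toy usage: P[x1 + ... + x4 >= 9] for X ~ N(0, I4); exact value ~ 3.4e-6
print(fixed_level_splitting(np.sum, dim=4, levels=[3.0, 5.0, 7.0, 9.0]))
```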