Particle algorithms for optimization on binary spaces
We discuss a unified approach to stochastic optimization of pseudo-Boolean
objective functions based on particle methods, including the cross-entropy
method and simulated annealing as special cases. We point out the need for
auxiliary sampling distributions, that is, parametric families on binary spaces
able to reproduce complex dependency structures, and illustrate their
usefulness in our numerical experiments. We provide numerical evidence that
particle-driven optimization algorithms based on parametric families yield
superior results on strongly multi-modal optimization problems, while local
search heuristics outperform them on easier problems.
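The cross-entropy method named above, specialized to binary spaces, can be sketched as follows. This is an illustrative sketch, not the paper's algorithm: it uses the simplest parametric family, a product of independent Bernoulli distributions, whereas the abstract argues for richer families that capture dependencies. All function and parameter names are placeholders.

```python
import numpy as np

def cross_entropy_binary(f, d, n_samples=200, n_elite=20, n_iters=50,
                         smoothing=0.7, rng=None):
    """Cross-entropy maximization of a pseudo-Boolean f: {0,1}^d -> R.

    Sampling distribution: product of independent Bernoullis with mean
    vector p (the simplest parametric family on the binary space).
    """
    rng = np.random.default_rng(rng)
    p = np.full(d, 0.5)                       # start from the uniform distribution
    best_x, best_val = None, -np.inf
    for _ in range(n_iters):
        # draw a population of binary particles from the current family member
        X = (rng.random((n_samples, d)) < p).astype(int)
        vals = np.array([f(x) for x in X])
        elite = X[np.argsort(vals)[-n_elite:]]  # keep the highest-scoring particles
        i = int(np.argmax(vals))
        if vals[i] > best_val:
            best_val, best_x = vals[i], X[i].copy()
        # smoothed update of the parameters toward the elite sample mean
        p = smoothing * elite.mean(axis=0) + (1 - smoothing) * p
    return best_x, best_val
```

Replacing the product family with one that models dependencies between coordinates changes only the sampling and fitting steps; the particle loop stays the same.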
mfEGRA: Multifidelity Efficient Global Reliability Analysis through Active Learning for Failure Boundary Location
This paper develops mfEGRA, a multifidelity active learning method using
data-driven adaptively refined surrogates for failure boundary location in
reliability analysis. This work addresses the prohibitive cost of Monte Carlo
reliability analysis with expensive-to-evaluate high-fidelity models by using
cheaper-to-evaluate approximations of the
high-fidelity model. The method builds on the Efficient Global Reliability
Analysis (EGRA) method, which is a surrogate-based method that uses adaptive
sampling for refining Gaussian process surrogates for failure boundary location
using a single-fidelity model. Our method introduces a two-stage adaptive
sampling criterion that uses a multifidelity Gaussian process surrogate to
leverage multiple information sources with different fidelities. The method
combines the expected feasibility criterion from EGRA with a one-step lookahead
information gain to refine the surrogate around the failure boundary. The
computational savings from mfEGRA depend on the discrepancy between the
models and on the cost of evaluating the lower-fidelity models relative to
the high-fidelity model. We show that accurate estimation of
reliability using mfEGRA leads to computational savings of 46% for an
analytic multimodal test problem and 24% for a three-dimensional acoustic horn
problem, when compared to single-fidelity EGRA. We also show the effect of
using a priori drawn Monte Carlo samples in the implementation for the acoustic
horn problem, where mfEGRA leads to computational savings of 45% for the
three-dimensional case and 48% for a rarer-event four-dimensional case as
compared to single-fidelity EGRA.
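The expected feasibility criterion that mfEGRA inherits from EGRA can be written down concretely. The sketch below implements the single-fidelity expected feasibility function of Bichon et al. for a Gaussian process prediction with mean `mu` and standard deviation `sigma` at a candidate point, with failure threshold `zbar` and the conventional band half-width of two standard deviations; it is an assumption-laden illustration of the EGRA ingredient only, not of mfEGRA's multifidelity two-stage criterion or its information-gain term.

```python
import math

def _pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def _cdf(z):
    """Standard normal distribution function via erf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_feasibility(mu, sigma, zbar=0.0):
    """Expected feasibility function (EFF) from EGRA (Bichon et al.).

    Measures how likely the GP prediction at a point is to lie within
    eps = 2*sigma of the failure threshold zbar; adaptive sampling adds
    the point that maximizes this quantity.
    """
    eps = 2.0 * sigma
    t  = (zbar - mu) / sigma          # standardized distance to the threshold
    tm = (zbar - eps - mu) / sigma    # lower edge of the band
    tp = (zbar + eps - mu) / sigma    # upper edge of the band
    return ((mu - zbar) * (2.0 * _cdf(t) - _cdf(tm) - _cdf(tp))
            - sigma * (2.0 * _pdf(t) - _pdf(tm) - _pdf(tp))
            + eps * (_cdf(tp) - _cdf(tm)))
```

The EFF peaks where the predicted mean sits on the threshold and the predictive uncertainty is large, which is what steers refinement toward the failure boundary.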
Catching Super Massive Black Hole Binaries Without a Net
The gravitational wave signals from coalescing Supermassive Black Hole
Binaries are prime targets for the Laser Interferometer Space Antenna (LISA).
With optimal data processing techniques, the LISA observatory should be able to
detect black hole mergers anywhere in the Universe. The challenge is to find
ways to dig the signals out of a combination of instrument noise and the large
foreground from stellar mass binaries in our own galaxy. The standard procedure
of matched filtering against a grid of templates can be computationally
prohibitive, especially when the black holes are spinning or the mass ratio is
large. Here we develop an alternative approach based on Metropolis-Hastings
sampling and simulated annealing that is orders of magnitude cheaper than a
grid search. We demonstrate our approach on simulated LISA data streams that
contain the signals from binary systems of Schwarzschild Black Holes, embedded
in instrument noise and a foreground containing 26 million galactic binaries.
The search algorithm is able to accurately recover the 9 parameters that
describe the black hole binary without first having to remove any of the bright
foreground sources, even when the black hole system has a low signal-to-noise
ratio.

Comment: 4 pages, 3 figures; refined search algorithm, added low-SNR example
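The combination of Metropolis-Hastings sampling and simulated annealing described above can be sketched generically. This is a toy illustration of the technique, not the authors' LISA search pipeline: the cooling schedule, step size, and all names are assumptions, and the test target below is a simple Gaussian log-likelihood rather than a gravitational-wave model.

```python
import numpy as np

def annealed_mh_search(log_like, theta0, step, n_steps=5000, T0=10.0, rng=None):
    """Metropolis-Hastings search with simulated annealing.

    The acceptance ratio is tempered by a "heat" T that cools geometrically
    from T0 down to 1, so early iterations roam broadly over the parameter
    space and later iterations refine the dominant mode.
    """
    rng = np.random.default_rng(rng)
    theta = np.asarray(theta0, dtype=float)
    ll = log_like(theta)
    best_theta, best_ll = theta.copy(), ll
    cool = (1.0 / T0) ** (1.0 / n_steps)      # geometric cooling factor per step
    T = T0
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(theta.shape)  # Gaussian proposal
        ll_prop = log_like(prop)
        # tempered Metropolis-Hastings acceptance test
        if np.log(rng.random()) < (ll_prop - ll) / T:
            theta, ll = prop, ll_prop
            if ll > best_ll:
                best_theta, best_ll = theta.copy(), ll
        T = max(1.0, T * cool)
    return best_theta, best_ll
```

Because each step needs only one likelihood evaluation, a chain of a few thousand steps replaces the dense template grid of a matched-filter search, which is the source of the claimed orders-of-magnitude savings.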