mfEGRA: Multifidelity Efficient Global Reliability Analysis through Active Learning for Failure Boundary Location
This paper develops mfEGRA, a multifidelity active learning method using
data-driven adaptively refined surrogates for failure boundary location in
reliability analysis. The work addresses the prohibitive cost of
reliability analysis using Monte Carlo sampling for expensive-to-evaluate
high-fidelity models by using cheaper-to-evaluate approximations of the
high-fidelity model. The method builds on the Efficient Global Reliability
Analysis (EGRA) method, which is a surrogate-based method that uses adaptive
sampling for refining Gaussian process surrogates for failure boundary location
using a single-fidelity model. Our method introduces a two-stage adaptive
sampling criterion that uses a multifidelity Gaussian process surrogate to
leverage multiple information sources with different fidelities. The method
combines the expected feasibility criterion from EGRA with a one-step lookahead
information gain to refine the surrogate around the failure boundary. The
computational savings from mfEGRA depend on the discrepancy between the
models and on the cost of evaluating the lower-fidelity models relative
to the high-fidelity model. We show that accurate estimation of
reliability using mfEGRA leads to computational savings of 46% for an
analytic multimodal test problem and 24% for a three-dimensional acoustic horn
problem, when compared to single-fidelity EGRA. We also show the effect of
using a priori drawn Monte Carlo samples in the implementation for the acoustic
horn problem, where mfEGRA leads to computational savings of 45% for the
three-dimensional case and 48% for a rarer event four-dimensional case as
compared to single-fidelity EGRA.
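The expected feasibility criterion that mfEGRA inherits from EGRA can be sketched concretely. The following is a minimal stdlib-only sketch of the expected feasibility function (EFF) of Bichon et al.'s EGRA, assuming a Gaussian-process posterior mean `mu` and standard deviation `sigma` of the limit-state function at a candidate point; the function and argument names are illustrative, not taken from the paper's implementation.

```python
import math

def _pdf(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def _cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def expected_feasibility(mu, sigma, zbar=0.0):
    """Expected feasibility function (EFF) from EGRA (Bichon et al., 2008).

    Scores how likely a candidate point is to lie near the failure
    boundary {x : g(x) = zbar}, given the GP posterior mean `mu` and
    std `sigma` of the limit state g there.  Adaptive sampling picks
    the point where the EFF is largest."""
    eps = 2.0 * sigma                      # half-width of the band around zbar
    t = (zbar - mu) / sigma
    tl = (zbar - eps - mu) / sigma
    th = (zbar + eps - mu) / sigma
    return ((mu - zbar) * (2.0 * _cdf(t) - _cdf(tl) - _cdf(th))
            - sigma * (2.0 * _pdf(t) - _pdf(tl) - _pdf(th))
            + eps * (_cdf(th) - _cdf(tl)))
```

The EFF is largest where the surrogate mean sits on the boundary and the predictive uncertainty is high, which is exactly where another limit-state evaluation is most informative; mfEGRA's two-stage criterion additionally weighs which fidelity to query.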
Meta-models for structural reliability and uncertainty quantification
A meta-model (or a surrogate model) is the modern name for what was
traditionally called a response surface. It is intended to mimic the behaviour
of a computational model M (e.g. a finite element model in mechanics) while
being inexpensive to evaluate, in contrast to the original model which may take
hours or even days of computer processing time. In this paper various types of
meta-models that have been used in the last decade in the context of structural
reliability are reviewed. More specifically classical polynomial response
surfaces, polynomial chaos expansions and kriging are addressed. It is shown
how the need for error estimates and adaptivity in their construction has
brought this type of approach to a high level of efficiency. A new technique
that solves the problem of the potential bias in the estimation of a
probability of failure through the use of meta-models is finally presented.
Comment: Keynote lecture, Fifth Asian-Pacific Symposium on Structural
Reliability and its Applications (5th APSSRA), May 2012, Singapore
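To make the meta-model idea concrete, here is a minimal 1-D kriging-style sketch: a Gaussian-process interpolant with a squared-exponential correlation fitted to a handful of evaluations of an "expensive" model (here just `sin`, purely for illustration). The function names and hyperparameters are illustrative assumptions, not the paper's formulation, which also covers polynomial response surfaces and polynomial chaos expansions.

```python
import numpy as np

def kriging_fit(X, y, length=0.5, nugget=1e-8):
    """Fit a minimal kriging (Gaussian-process) surrogate in 1-D with a
    squared-exponential correlation; returns a cheap predictor that
    interpolates the expensive model's outputs."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / length ** 2)
    K[np.diag_indices_from(K)] += nugget   # jitter for numerical stability
    alpha = np.linalg.solve(K, y)

    def predict(x):
        x = np.atleast_1d(np.asarray(x, float))
        k = np.exp(-0.5 * (x[:, None] - X[None, :]) ** 2 / length ** 2)
        return k @ alpha

    return predict

# Stand-in for an expensive computational model: sin(x) at 8 design points
X_train = np.linspace(0.0, np.pi, 8)
surrogate = kriging_fit(X_train, np.sin(X_train))
```

Once fitted, the surrogate can be evaluated millions of times in a Monte Carlo loop at negligible cost, which is what makes meta-model-based reliability estimation tractable; the error estimates and adaptivity discussed in the paper govern where further design points are added.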
Entropy Measures in Machine Fault Diagnosis: Insights and Applications
Entropy, as a complexity measure, has been widely applied for time series analysis. One preeminent example is the design of machine condition monitoring and industrial fault diagnostic systems.
The occurrence of failures in a machine typically leads to non-linear characteristics in the measurements, caused by instantaneous variations, which can increase the complexity of the system response. Entropy measures are well suited to quantifying such dynamic changes in the underlying process and to distinguishing between different system conditions.
However, notions of entropy are defined differently in various contexts (e.g., information theory and dynamical systems theory), which may confound researchers in the applied sciences. In this paper, we systematically review the theoretical development of some fundamental entropy measures and clarify the relations among them. Typical entropy-based applications in machine fault diagnostic systems are then summarized. Further, we explain where and how these measures can be useful for future data-driven fault diagnosis methodologies. Finally, potential research trends in this area are discussed, with the intent of improving online entropy estimation and expanding its applicability to a wider range of intelligent fault diagnostic systems.
The detection of globular clusters in galaxies as a data mining problem
We present an application of self-adaptive supervised learning classifiers
derived from the Machine Learning paradigm, to the identification of candidate
Globular Clusters in deep, wide-field, single band HST images. Several methods
provided by the DAME (Data Mining & Exploration) web application were tested
and compared on the NGC1399 HST data described in Paolillo 2011. The best
results were obtained using a Multi-Layer Perceptron with a Quasi-Newton
learning rule, which achieved a classification accuracy of 98.3%, with a completeness of
97.8% and 1.6% of contamination. An extensive set of experiments revealed that
the use of accurate structural parameters (effective radius, central surface
brightness) does improve the final result, but only by 5%. It is also shown
that the method is also capable of retrieving extreme sources (for instance, very
extended objects) that are missed by more traditional approaches.
Comment: Accepted 2011 December 12; Received 2011 November 28; in original
form 2011 October 1
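The paper's best classifier, a Multi-Layer Perceptron trained with a Quasi-Newton rule, has a close analogue in scikit-learn's `MLPClassifier` with the quasi-Newton `lbfgs` solver. The sketch below trains such a model on synthetic two-class data standing in for source features; the features, class balance, and network size are illustrative assumptions, not the NGC1399 HST catalogue or the DAME configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Illustrative stand-ins for source features (NOT the HST catalogue):
# "contaminants" (class 0) and "cluster-like" sources (class 1) drawn
# from well-separated Gaussians in a 3-feature space.
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 3)),
               rng.normal(3.0, 1.0, size=(200, 3))])
y = np.array([0] * 200 + [1] * 200)

# Quasi-Newton training corresponds to scikit-learn's "lbfgs" solver
clf = MLPClassifier(hidden_layer_sizes=(16,), solver="lbfgs",
                    max_iter=500, random_state=0)
clf.fit(X, y)
acc = clf.score(X, y)
```

In the paper's setting the completeness/contamination trade-off matters as much as raw accuracy, so in practice one would report per-class recall (completeness) and false-positive rate (contamination) rather than a single score.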
Witnessing eigenstates for quantum simulation of Hamiltonian spectra
The efficient calculation of Hamiltonian spectra, a problem often intractable
on classical machines, can find application in many fields, from physics to
chemistry. Here, we introduce the concept of an "eigenstate witness" and
through it provide a new quantum approach that combines variational methods
and phase estimation to approximate eigenvalues for both ground and excited
states. This protocol is experimentally verified on a programmable silicon
quantum photonic chip, a mass-manufacturable platform, which embeds entangled
state generation, arbitrary controlled-unitary operations, and projective
measurements. Both ground and excited states are experimentally found with
fidelities >99%, and their eigenvalues are estimated with 32 bits of precision.
We also investigate and discuss the scalability of the approach and study its
performance through numerical simulations of more complex Hamiltonians. This
result shows promising progress towards quantum chemistry on quantum computers.
Comment: 9 pages, 4 figures, plus Supplementary Material [New version with
minor typos corrected.]
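The variational half of the protocol, minimizing the energy of a parametrized trial state to approximate an eigenvalue, can be illustrated classically. This is a toy numerical analogue of a variational eigensolver for a hypothetical 2x2 Hamiltonian, not the paper's photonic implementation or its eigenstate-witness construction.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy 2x2 Hamiltonian (illustrative; the experiment targets molecular ones)
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def energy(theta):
    """Rayleigh quotient <psi(theta)|H|psi(theta)> for the one-parameter
    real trial state |psi(theta)> = (cos theta, sin theta)."""
    psi = np.array([np.cos(theta), np.sin(theta)])
    return float(psi @ H @ psi)

# Variational principle: the minimum over trial states upper-bounds, and
# here (full parametrization) equals, the lowest eigenvalue.
res = minimize_scalar(energy, bounds=(0.0, np.pi), method="bounded")
ground = res.fun
exact = np.linalg.eigvalsh(H)[0]          # exact lowest eigenvalue for comparison
```

On the chip, the energy evaluation is replaced by measurements on the entangled photonic state and phase estimation refines the eigenvalue once the witness certifies an (approximate) eigenstate; the classical outer loop over parameters is the same in spirit.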