Meta-models for structural reliability and uncertainty quantification
A meta-model (or a surrogate model) is the modern name for what was
traditionally called a response surface. It is intended to mimic the behaviour
of a computational model M (e.g. a finite element model in mechanics) while
being inexpensive to evaluate, in contrast to the original model which may take
hours or even days of computer processing time. In this paper various types of
meta-models that have been used in the last decade in the context of structural
reliability are reviewed. More specifically, classical polynomial response
surfaces, polynomial chaos expansions and kriging are addressed. It is shown
how the need for error estimates and adaptivity in their construction has
brought these approaches to a high level of efficiency. A new technique
that addresses the potential bias in the estimation of a probability of
failure through the use of meta-models is finally presented.
Comment: Keynote lecture, Fifth Asian-Pacific Symposium on Structural
Reliability and its Applications (5th APSSRA), May 2012, Singapore
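As a rough illustration of the surrogate idea (not of the adaptive, bias-corrected techniques the paper reviews), the following minimal sketch fits a quadratic polynomial response surface to a handful of runs of a toy limit-state function and then estimates the failure probability P_f = P[g(X) < 0] by Monte Carlo on the cheap surrogate. The limit state g, the design size, and the Gaussian inputs are all hypothetical choices for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Expensive" model: failure occurs where the limit state g(x) < 0.
def g(x):
    return 3.0 - x[..., 0] - 0.5 * x[..., 1] ** 2

# Small experimental design (pretend each model run is costly).
X = rng.normal(size=(30, 2))
y = g(X)

# Full quadratic polynomial response surface fitted by least squares.
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack(
        [np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2]
    )

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

def surrogate(X):
    return features(X) @ coef

# Monte Carlo on the cheap surrogate instead of the expensive model.
Xmc = rng.normal(size=(100_000, 2))
pf = np.mean(surrogate(Xmc) < 0.0)
```

Here the true model happens to be quadratic, so the surrogate reproduces it essentially exactly; the bias question the paper addresses arises precisely when the surrogate cannot.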
Polynomial-Chaos-based Kriging
Computer simulation has become the standard tool in many engineering fields
for designing and optimizing systems, as well as for assessing their
reliability assessment. To cope with demanding analyses such as
optimization and reliability, surrogate models (a.k.a. meta-models) have
been increasingly investigated in the last decade. Polynomial Chaos
Expansions (PCE) and Kriging are two popular non-intrusive meta-modelling
techniques. PCE approximates the computational model with a series of
orthonormal polynomials in the input variables, where the polynomials are
chosen consistently with the probability distributions of those input
variables. On the other hand, Kriging assumes that
the computer model behaves as a realization of a Gaussian random process whose
parameters are estimated from the available computer runs, i.e. input vectors
and response values. These two techniques have been developed more or less in
parallel so far with little interaction between the researchers in the two
fields. In this paper, PC-Kriging is derived as a new non-intrusive
meta-modeling approach combining PCE and Kriging. A sparse set of orthonormal
polynomials (PCE) approximates the global behavior of the computational model
whereas Kriging manages the local variability of the model output. An adaptive
algorithm similar to the least angle regression algorithm determines the
optimal sparse set of polynomials. PC-Kriging is validated on various benchmark
analytical functions which are easy to sample for reference results. From the
numerical investigations it is concluded that PC-Kriging performs at least
as well as the two distinct meta-modelling techniques, and often better. A
larger gain in accuracy is obtained when the experimental design has a
limited size, which is an asset when dealing with demanding computational
models.
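A bare-bones, one-dimensional sketch of the PC-Kriging idea (a Hermite PCE trend for the global behaviour plus Gaussian-process interpolation of the residuals for the local variability) might look as follows. The toy model, the polynomial degree, the kernel length-scale, and the design are illustrative assumptions, not the paper's adaptive implementation:

```python
import math
import numpy as np

def model(x):                      # "expensive" model to be surrogated
    return np.sin(3 * x) + 0.3 * x ** 2

def hermite_basis(x, degree=4):
    # Probabilists' Hermite polynomials He_n (orthogonal w.r.t. a
    # standard normal input), normalized by sqrt(n!) to be orthonormal.
    H = [np.ones_like(x), x]
    for n in range(1, degree):
        H.append(x * H[n] - n * H[n - 1])
    return np.column_stack(
        [H[n] / math.sqrt(math.factorial(n)) for n in range(degree + 1)]
    )

def kernel(a, b, ell=0.5):
    # Squared-exponential covariance used by the Kriging part.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

# Experimental design: 25 runs of the model.
X = np.linspace(-2.0, 2.0, 25)
y = model(X)

# 1) PCE trend fitted by least squares (global behaviour).
Phi = hermite_basis(X)
beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# 2) Kriging of the residuals (local variability); a small nugget
#    keeps the covariance matrix numerically invertible.
resid = y - Phi @ beta
K = kernel(X, X) + 1e-8 * np.eye(len(X))
alpha = np.linalg.solve(K, resid)

def pc_kriging(x):
    return hermite_basis(x) @ beta + kernel(x, X) @ alpha

xs = np.linspace(-2.0, 2.0, 200)
max_err = float(np.max(np.abs(pc_kriging(xs) - model(xs))))
```

The split mirrors the abstract's description: neither the degree-4 trend nor a plain Kriging fit alone needs to be accurate, but their combination interpolates the design points and tracks the model in between.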
Dysfunctional Social Reinforcement Processing in Disruptive Behavior Disorders: A Functional Magnetic Resonance Imaging Study.
Objective: Prior functional magnetic resonance imaging (fMRI) work has revealed that children/adolescents with disruptive behavior disorders (DBDs) show dysfunctional reward/non-reward processing of non-social reinforcements in the context of instrumental learning tasks. Neural responsiveness to social reinforcements during instrumental learning, despite the importance of this for socialization, has not previously been investigated.
Methods: Twenty-nine healthy children/adolescents and 19 children/adolescents with DBDs performed the fMRI social/non-social reinforcement learning task. Participants responded to random fractal image stimuli and received social and non-social rewards/non-rewards according to their accuracy.
Results: Children/adolescents with DBDs showed significantly reduced responses within the caudate and posterior cingulate cortex (PCC) to non-social (financial) rewards and social non-rewards (the distress of others). Connectivity analyses revealed that children/adolescents with DBDs have decreased positive functional connectivity between the ventral striatum (VST) and ventromedial prefrontal cortex (vmPFC) seeds and the lateral frontal cortex in response to reward relative to non-reward, irrespective of its sociality. In addition, they showed decreased positive connectivity between the vmPFC seed and the amygdala in response to non-reward relative to reward.
Conclusion: These data indicate compromised reinforcement processing of both non-social rewards and social non-rewards in children/adolescents with DBDs within core regions for instrumental learning and reinforcement-based decision-making (caudate and PCC). In addition, children/adolescents with DBDs show dysfunctional interactions between the VST, vmPFC, and lateral frontal cortex in response to rewarded instrumental actions, potentially reflecting disruptions in attention to rewarded stimuli.
Application of Computational Intelligence Techniques to Process Industry Problems
In the last two decades there has been substantial progress in the
computational intelligence research field. This research effort has
produced powerful techniques for pattern recognition, data mining, data
modelling, etc. These techniques achieve high performance on traditional
data sets like the UCI machine learning repository. Unfortunately, such
data sources usually contain clean data, free of the outliers, missing
values, feature collinearity, and other problems common to real-life
industrial data. The presence of faulty data samples can have very harmful
effects on the models: if presented during training, they can cause
sub-optimal performance of the trained model or, in the worst case,
destroy the knowledge the model has learnt so far. For these reasons the
application of current modelling techniques to industrial problems has
developed into a research field of its own. Based on a discussion of the
properties and issues of the data and of the state-of-the-art modelling
techniques in the process industry, this paper presents a novel unified
approach to the development of predictive models in the process industry.
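As a toy illustration of how a single faulty sample can corrupt a trained model, the sketch below uses an ordinary least-squares fit standing in for the predictive models discussed above: one sensor-style outlier visibly biases the estimated slope, and a simple median/MAD residual filter restores it. The data, the 3.5-MAD threshold, and the filter itself are illustrative assumptions, not the paper's approach:

```python
import numpy as np

rng = np.random.default_rng(2)

# Clean process data: y = 2x + 1 plus small noise, with one faulty
# sample (e.g. a stuck sensor) at the end of the record.
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, 50)
y[-1] = 200.0

A = np.column_stack([x, np.ones_like(x)])

def fit_slope(A, y):
    return np.linalg.lstsq(A, y, rcond=None)[0][0]

naive_slope = fit_slope(A, y)      # biased by the single faulty sample

# Simple robust preprocessing: flag samples whose fit residual is far
# from the median residual (3.5 x median-absolute-deviation rule).
res = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
dev = np.abs(res - np.median(res))
keep = dev < 3.5 * np.median(dev)

clean_slope = fit_slope(A[keep], y[keep])
```

One bad point out of fifty is enough to shift the fitted slope well away from 2, which is the kind of damage to "so far learnt knowledge" the abstract warns about.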
Human capital and entrepreneurial success: a meta-analytical review
The study meta-analytically integrates results from three decades of human capital research in entrepreneurship. Based on 70 independent samples (N = 24,733), we found a significant but small relationship between human capital and success (r(c) = .098). We examined theoretically derived moderators of this relationship referring to conceptualizations of human capital, to context, and to measurement of success. The relationship was higher for outcomes of human capital investments (knowledge/skills) than for human capital investments (education/experience), for human capital with high task-relatedness compared to low task-relatedness, for young businesses compared to old businesses, and for the dependent variable size compared to growth or profitability. Findings are relevant for practitioners (lenders, policy makers, educators) and for future research. Our findings show that future research should pursue moderator approaches to study the effects of human capital on success. Further, human capital is most important if it is task-related and if it consists of outcomes of human capital investments rather than human capital investments; this suggests that research should overcome a static view of human capital and should rather investigate the processes of learning, knowledge acquisition, and the transfer of knowledge to entrepreneurial tasks
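For readers unfamiliar with the mechanics, a bare-bones Hunter-Schmidt-style meta-analytic integration of correlations can be sketched as follows. The five study correlations and sample sizes are invented for illustration and are unrelated to the 70 samples analysed in the paper:

```python
import numpy as np

# Hypothetical per-study correlations r_i and sample sizes n_i
# (illustrative only; not the paper's data).
r = np.array([0.05, 0.12, 0.20, 0.02, 0.15])
n = np.array([500, 120, 80, 900, 300])

# Sample-size-weighted mean correlation across studies.
r_bar = np.sum(n * r) / np.sum(n)

# Observed variance of the correlations across studies...
var_r = np.sum(n * (r - r_bar) ** 2) / np.sum(n)

# ...versus the variance expected from sampling error alone
# (bare-bones Hunter-Schmidt estimate).
var_e = (1.0 - r_bar ** 2) ** 2 / (np.mean(n) - 1.0)

# Residual variance above sampling error hints at moderators,
# which is the kind of moderator analysis the study pursues.
var_res = var_r - var_e
```

A positive residual variance is the signal that the overall correlation masks systematic between-study differences, motivating the moderator approach the abstract recommends.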
Mechanism Deduction from Noisy Chemical Reaction Networks
We introduce KiNetX, a fully automated meta-algorithm for the kinetic
analysis of complex chemical reaction networks derived from semi-accurate but
efficient electronic structure calculations. It is designed to (i) accelerate
the automated exploration of such networks, and (ii) cope with model-inherent
errors in electronic structure calculations on elementary reaction steps. We
developed and implemented KiNetX to possess three features. First, KiNetX
evaluates the kinetic relevance of every species in a (yet incomplete) reaction
network to confine the search for new elementary reaction steps only to those
species that are considered possibly relevant. Second, KiNetX identifies and
eliminates all kinetically irrelevant species and elementary reactions to
reduce a complex network graph to a comprehensible mechanism. Third, KiNetX
estimates the sensitivity of species concentrations toward changes in
individual rate constants (derived from relative free energies), which allows
us to systematically select the most efficient electronic structure model for
each elementary reaction given a predefined accuracy. The novelty of KiNetX
consists in the rigorous propagation of correlated free-energy uncertainty
through all steps of our kinetic analysis. To examine the performance of KiNetX,
we developed AutoNetGen. It semirandomly generates chemistry-mimicking reaction
networks by encoding chemical logic into their underlying graph structure.
AutoNetGen allows us to consider a vast number of distinct chemistry-like
scenarios and, hence, to assess the importance of rigorous uncertainty
propagation in a statistical context. Our results reveal that KiNetX reliably
supports the deduction of product ratios, dominant reaction pathways, and
possibly other network properties from semi-accurate electronic structure data.
Comment: 36 pages, 4 figures, 2 tables
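The kind of analysis KiNetX automates can be caricatured on a toy network. The sketch below integrates mass-action kinetics for A -> B -> C with explicit Euler and propagates a hypothetical log-normal rate-constant uncertainty, standing in for free-energy errors, by simple resampling. The network, rates, and error model are illustrative assumptions, not KiNetX itself:

```python
import numpy as np

rng = np.random.default_rng(3)

def integrate(k1, k2, t_end=10.0, dt=1e-3):
    """Explicit-Euler integration of A -k1-> B -k2-> C (mass action)."""
    a, b, c = 1.0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        ra, rb = k1 * a, k2 * b
        a -= dt * ra
        b += dt * (ra - rb)
        c += dt * rb
    return a, b, c

# Nominal rate constants (arbitrary units).
k1, k2 = 1.0, 0.5
a, b, c = integrate(k1, k2)

# Free-energy errors enter the rate constants through an exponential
# (transition-state theory), i.e. as multiplicative, roughly log-normal
# perturbations; propagate them to the product yield by resampling.
sigma = 0.3
final_c = np.array([
    integrate(k1 * np.exp(sigma * rng.normal()),
              k2 * np.exp(sigma * rng.normal()))[2]
    for _ in range(50)
])
spread = float(final_c.std())
```

Even this caricature shows why uncertainty propagation matters: a modest free-energy error fans out into a distribution of product yields rather than a single number, which is what KiNetX tracks rigorously and with correlations.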