On the Influence of the Data Sampling Interval on Computer-Derived K-Indices
The K index was devised by Bartels et al. (1939) to provide objective
monitoring of irregular geomagnetic activity. It was then routinely
used to monitor the magnetic activity at permanent magnetic observatories as
well as at temporary stations. The increasing number of digital and sometimes
unmanned observatories and the creation of INTERMAGNET put the question of
computer production of K at the centre of the debate. Four algorithms were
selected during the Vienna meeting (1991) and endorsed by IAGA for the computer
production of K indices. We used one of them (the FMI algorithm) to investigate
the impact of the geomagnetic data sampling interval on computer-produced K
values by comparing the K values derived at the Port-aux-Francais magnetic
observatory for the period 1 January 2009 to 31 May 2010 from magnetic data
series with different sampling intervals (the shortest: 1 second; the longest:
1 minute). The impact is investigated on both the 3-hour range values and the
K index series, as a function of the activity level, for low and moderate
geomagnetic activity.
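
For readers unfamiliar with the derivation, the sketch below (Python,
illustrative only) shows the core of any computer K procedure: the range of the
horizontal component over each 3-hour interval is converted to K through a
quasi-logarithmic, station-specific scale. The limits used here are the Niemegk
ones with an assumed K9 lower limit of 500 nT, the input data are synthetic,
and the removal of the regular daily (Sr) variation that the FMI algorithm
performs is omitted.

    import numpy as np

    # Quasi-logarithmic conversion: lower limits (nT) of each K class for a
    # station whose K = 9 lower limit is 500 nT (the Niemegk scale). For
    # Port-aux-Francais the limits would be rescaled to the local K9 value,
    # which is assumed here for illustration.
    K9_LIMIT = 500.0
    LIMITS = np.array([0, 5, 10, 20, 40, 70, 120, 200, 330, 500]) * (K9_LIMIT / 500.0)

    def k_from_range(range_nT):
        """Map a 3-hour range (nT) to a K value 0..9."""
        return int(np.searchsorted(LIMITS, range_nT, side="right") - 1)

    def three_hour_ranges(h, samples_per_3h):
        """Range (max - min) of the horizontal component per 3-hour block.
        The real FMI algorithm first subtracts an estimate of the regular
        daily (Sr) variation; that step is omitted in this sketch."""
        n_blocks = len(h) // samples_per_3h
        blocks = h[: n_blocks * samples_per_3h].reshape(n_blocks, -1)
        return blocks.max(axis=1) - blocks.min(axis=1)

    # Synthetic example: one day of 1-second data, then decimated to 1 minute
    rng = np.random.default_rng(0)
    t = np.arange(86400)
    h_1s = 20 * np.sin(2 * np.pi * t / 86400) + rng.normal(0, 2, t.size)
    h_1min = h_1s[::60]  # naive decimation; real data would be filtered first

    k_1s = [k_from_range(r) for r in three_hour_ranges(h_1s, 3 * 3600)]
    k_1min = [k_from_range(r) for r in three_hour_ranges(h_1min, 3 * 60)]
    print(k_1s, k_1min)  # eight K values per day for each sampling interval

Because the extremes within a 3-hour block shrink as the sampling interval
grows, the two series of ranges, and occasionally the resulting K values,
differ; quantifying that difference on real observatory data is the comparison
the abstract describes.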
Global Sensitivity Analysis of Stochastic Computer Models with joint metamodels
The global sensitivity analysis method, used to quantify the influence of
uncertain input variables on the response variability of a numerical model, is
applicable to deterministic computer codes (for which the same set of input
variables always gives the same output value). This paper proposes a global
sensitivity analysis methodology for stochastic computer codes, whose output
variability is induced by uncontrollable variables. The framework of the
joint modeling of the mean and dispersion of heteroscedastic data is used. To
deal with the complexity of computer experiment outputs, nonparametric joint
models (based on Generalized Additive Models and Gaussian processes) are
discussed. The relevance of these new models is analyzed in terms of the
obtained variance-based sensitivity indices with two case studies. Results show
that the joint modeling approach leads to accurate sensitivity index estimates
even when clear heteroscedasticity is present.
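
As a rough illustration of the joint-modeling idea (a sketch, not the paper's
implementation), the following Python code fits one Gaussian-process metamodel
to the mean and a second one to the dispersion of a toy heteroscedastic code,
then combines them into variance-based indices; the toy code, the kernels and
the pick-freeze estimator are all assumptions made for the example.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(1)

    # Toy "stochastic computer code": the mean depends on x1 and x2, while
    # the noise level depends on x1 only, so the output is heteroscedastic.
    def stochastic_code(x):
        mean = np.sin(2 * np.pi * x[:, 0]) + 0.5 * x[:, 1]
        sigma = 0.1 + 0.4 * x[:, 0]
        return mean + sigma * rng.normal(size=len(x))

    X = rng.uniform(0, 1, size=(300, 2))
    y = stochastic_code(X)

    # Joint metamodel, part 1: a GP for the mean component m(x) = E[Y | X = x]
    gp_mean = GaussianProcessRegressor(RBF([0.2, 0.2]) + WhiteKernel(0.1),
                                       normalize_y=True).fit(X, y)

    # Part 2: a GP for the dispersion component, fitted on log squared
    # residuals so that the predicted conditional variance stays positive
    res2 = (y - gp_mean.predict(X)) ** 2
    gp_disp = GaussianProcessRegressor(RBF([0.2, 0.2]) + WhiteKernel(0.5),
                                       normalize_y=True).fit(X, np.log(res2 + 1e-12))

    # First-order Sobol index of x1 by pick-freeze sampling on the mean model
    N = 20000
    A = rng.uniform(0, 1, size=(N, 2))
    B = rng.uniform(0, 1, size=(N, 2))
    AB = np.column_stack([A[:, 0], B[:, 1]])  # x1 frozen from A, x2 resampled
    mA, mB, mAB = (gp_mean.predict(s) for s in (A, B, AB))
    var_explained = np.var(np.concatenate([mA, mB]))
    var_noise = np.exp(gp_disp.predict(A)).mean()  # E[Var(Y|X)] from the GP
    var_total = var_explained + var_noise
    S1 = np.mean(mA * (mAB - mB)) / var_total
    print(f"S1(x1) ~ {S1:.2f}, intrinsic-noise share ~ {var_noise / var_total:.2f}")

Fitting the dispersion on log squared residuals is one simple way to keep the
predicted conditional variance positive; the GAM- and GP-based joint models
discussed in the paper play the same two roles.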
OpenTURNS: An industrial software for uncertainty quantification in simulation
The need to assess robust performance of complex systems and to meet tighter
regulatory requirements (security, safety, environmental control, health
impacts, etc.) has led to the emergence of a new industrial simulation
challenge: taking uncertainties into account when dealing with complex
numerical simulation frameworks. In response, a generic methodology has emerged
from the joint effort of several industrial companies and academic
institutions. EDF R&D, Airbus Group and Phimeca Engineering started a
collaboration at the beginning of 2005, joined by IMACS in 2014, for the
development of an Open Source software platform dedicated to uncertainty
propagation by probabilistic methods, named OpenTURNS for Open source Treatment
of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial
challenges attached to uncertainties, which are transparency, genericity,
modularity and multi-accessibility. This paper focuses on OpenTURNS and
presents its main features: OpenTURNS is open-source software released under
the LGPL license, taking the form of a C++ library with a Python TUI, and it
runs under both Linux and Windows. All the methodological tools are
described in the different sections of this paper: uncertainty quantification,
uncertainty propagation, sensitivity analysis and metamodeling. A section also
explains the generic wrapper mechanism used to link OpenTURNS to any external code. The
paper illustrates the methodological tools, as far as possible, on an
educational example that simulates the height of a river and compares it to the
height of a dyke protecting industrial facilities. Finally, it gives an
overview of the main developments planned for the next few years.
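
To make the workflow concrete, here is a minimal sketch of such a study in the
OpenTURNS Python TUI, in the spirit of the river/dyke educational example; the
analytic flood model, the distributions and every numerical value below are
illustrative assumptions, not the paper's.

    import math
    import openturns as ot

    # Simplified flood model: water level as a function of the flow rate Q,
    # the Strickler coefficient Ks and the river/bank levels Zv, Zm.
    B_WIDTH, L_LENGTH = 300.0, 5000.0  # river width and length (m), assumed

    def flood_model(x):
        Q, Ks, Zv, Zm = x
        Q, Ks = max(Q, 1.0), max(Ks, 1.0)  # guard against rare negative draws
        H = (Q / (Ks * B_WIDTH * math.sqrt((Zm - Zv) / L_LENGTH))) ** 0.6
        return [H + Zv]  # water level (m)

    model = ot.PythonFunction(4, 1, flood_model)

    # Step 1, uncertainty quantification: a probabilistic input model
    # (distributions and parameters are illustrative, not the paper's)
    distribution = ot.ComposedDistribution([
        ot.Normal(1013.0, 300.0),  # Q (m^3/s)
        ot.Normal(30.0, 7.5),      # Ks
        ot.Uniform(49.0, 51.0),    # Zv (m)
        ot.Uniform(54.0, 56.0),    # Zm (m)
    ])

    # Step 2, uncertainty propagation by plain Monte Carlo sampling
    inputVector = ot.RandomVector(distribution)
    outputVector = ot.CompositeRandomVector(model, inputVector)
    sample = outputVector.getSample(10000)
    print("mean water level (m):", sample.computeMean()[0])

    # Probability that the water level exceeds a dyke crest at 53 m (assumed)
    p_overflow = sum(1 for h in sample if h[0] > 53.0) / sample.getSize()
    print("P(overflow) ~", p_overflow)

The same output random vector could then feed the sensitivity-analysis or
metamodeling classes of the library, which is the step-by-step path the paper
walks through.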
Global sensitivity analysis of computer models with functional inputs
Global sensitivity analysis is used to quantify the influence of uncertain
input parameters on the response variability of a numerical model. The common
quantitative methods are applicable to computer codes with scalar input
variables. This paper aims to illustrate different variance-based sensitivity
analysis techniques, based on the so-called Sobol indices, when some input
variables are functional, such as stochastic processes or random spatial
fields. In this work, we focus on CPU-intensive computer codes, which require a
preliminary metamodeling step before performing the sensitivity analysis. We
propose the use of the joint modeling approach, i.e., modeling simultaneously
the mean and the dispersion of the code outputs using two interlinked
Generalized Linear Models (GLM) or Generalized Additive Models (GAM). The
"mean" model is used to estimate the sensitivity indices of each scalar input
variable, while the "dispersion" model yields the total sensitivity index of
the functional input variables. The proposed approach is compared with
classical sensitivity analysis methodologies on an analytical function. Lastly,
it is applied to a real industrial computer code that simulates nuclear fuel
irradiation.
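
The variance decomposition behind this two-model split can be checked
numerically. In the sketch below (an assumed analytical function, not the
paper's), a scalar input x and a functional input, a random process entering
through its time average, drive the output; for clarity the known conditional
mean and variance stand in for the fitted GLM/GAM joint model, and the total
index of the functional input is the expected dispersion over the total
variance.

    import numpy as np

    rng = np.random.default_rng(2)

    # Analytical function with one scalar input x and one functional input,
    # a random process entering through its time average. For fixed x the
    # output is random, and its conditional variance grows with x.
    def code(x, n_steps=100):
        eps = rng.normal(size=(len(x), n_steps)).mean(axis=1)
        return np.sin(np.pi * x) + (0.5 + x) * eps

    N = 50000
    x = rng.uniform(0, 1, N)
    y = code(x)

    # Variance decomposition: Var(Y) = Var_x(E[Y|x]) + E_x[Var(Y|x)]. Here
    # E[Y|x] = sin(pi x) and Var(Y|x) = (0.5 + x)^2 / n_steps are known in
    # closed form; the joint GLM/GAM models would estimate them from data.
    var_total = y.var()
    var_mean = np.sin(np.pi * x).var()          # explained by the scalar input
    disp_mean = ((0.5 + x) ** 2 / 100).mean()   # expected conditional variance

    S_x = var_mean / var_total        # first-order index of the scalar input
    ST_func = disp_mean / var_total   # total index of the functional input
    print(f"S(x) ~ {S_x:.3f}, ST(functional) ~ {ST_func:.3f}")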