    Automatic differentiation algorithms in model analysis

    Author: M.J. Huiskes
    Date: 19 March 2002

    In this thesis, automatic differentiation algorithms and derivative-based methods are combined to develop efficient tools for model analysis. Automatic differentiation algorithms comprise a class of algorithms aimed at computing derivatives of functions that are represented as computer code. Derivative-based methods that may be implemented using these algorithms are presented for sensitivity analysis and statistical inference, particularly in the context of nonlinear parameter estimation.

    Local methods of sensitivity analysis are discussed for both explicit and implicit relations between variables. Particular attention is paid to the propagation of uncertainty, and to the subsequent decomposition of output uncertainty into the contributions of the various sources of input uncertainty.

    Statistical methods are presented for the computation of accurate inferential information for nonlinear parameter estimation problems by means of higher-order derivatives of the model functions. Methods are also discussed for assessing the appropriateness of model structure complexity in relation to the quality of the data.

    To realize and demonstrate the potential of routines for model analysis based on automatic differentiation, a software library is developed: a C++ library for the analysis of nonlinear models that can be represented by differentiable functions, in which the methods for parameter estimation, statistical inference, model selection and sensitivity analysis are implemented. Several experiments are performed to assess the performance of the library. The application of the derivative-based methods and the routines of the library is further demonstrated by means of a number of case studies in ecological assessment. In two studies, large parameter estimation procedures for fish stock assessment are analyzed, for the Pacific halibut and North Sea herring species. The derivative-based methods of sensitivity analysis are applied in a study on the contribution of Russian forests to the global carbon cycle.
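
    The thesis library itself is written in C++ and is not reproduced here, but the core idea of forward-mode automatic differentiation, propagating a value together with its derivative through every arithmetic operation, fits in a few lines. The Dual class and the example function below are a minimal Python sketch of that idea, not the library's API.

        # Minimal forward-mode automatic differentiation via dual numbers.
        # Illustrative sketch only; not the API of the thesis' C++ library.
        import math

        class Dual:
            """A value paired with its derivative; each operation applies the chain rule."""
            def __init__(self, value, deriv=0.0):
                self.value, self.deriv = value, deriv

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)

            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                # Product rule: (uv)' = u'v + uv'
                return Dual(self.value * other.value,
                            self.deriv * other.value + self.value * other.deriv)

            __rmul__ = __mul__

            def sin(self):
                # Chain rule: (sin u)' = cos(u) * u'
                return Dual(math.sin(self.value), math.cos(self.value) * self.deriv)

        def f(x):
            # An arbitrary differentiable model function: f(x) = x sin(x) + 3x
            return x * x.sin() + 3 * x

        # Seed the input derivative with 1.0 to obtain df/dx at x = 2.0.
        x = Dual(2.0, 1.0)
        y = f(x)
        print(y.value, y.deriv)   # f(2) and f'(2) = sin(2) + 2 cos(2) + 3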

    Design of Experiments for Screening

    The aim of this paper is to review methods of designing screening experiments, ranging from designs originally developed for physical experiments to those especially tailored to experiments on numerical models. The strengths and weaknesses of the various designs for screening variables in numerical models are discussed. First, classes of factorial designs for experiments to estimate main effects and interactions through a linear statistical model are described, specifically regular and nonregular fractional factorial designs, supersaturated designs and systematic fractional replicate designs. Generic issues of aliasing, bias and cancellation of factorial effects are discussed. Second, group screening experiments are considered, including factorial group screening and sequential bifurcation. Third, random sampling plans are discussed, including Latin hypercube sampling and sampling plans to estimate elementary effects. Fourth, a variety of modelling methods commonly employed with screening designs are briefly described. Finally, a novel study demonstrates six screening methods on two frequently used exemplars and compares their performance.
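
    Of the sampling plans listed above, the one for estimating elementary effects (the Morris method) is compact enough to sketch. The randomized one-at-a-time trajectories below follow the standard construction; the toy test function and all numerical settings are illustrative assumptions, not the exemplars studied in the paper.

        # Sketch of elementary-effects (Morris) screening with random
        # one-at-a-time trajectories. Test function g and all settings are
        # illustrative, not taken from the paper.
        import numpy as np

        rng = np.random.default_rng(0)

        def g(x):
            # Toy model on [0, 1]^3: x0 and x1 matter, x2 is nearly inert.
            return 4.0 * x[0] + 2.0 * x[1] ** 2 + 0.05 * x[2]

        k, r, delta = 3, 20, 0.25          # inputs, trajectories, step size
        effects = [[] for _ in range(k)]

        for _ in range(r):
            x = rng.uniform(0.0, 1.0 - delta, size=k)   # random base point
            for i in rng.permutation(k):                # perturb one factor at a time
                x_step = x.copy()
                x_step[i] += delta
                effects[i].append((g(x_step) - g(x)) / delta)
                x = x_step                              # walk along the trajectory

        # mu* (mean absolute effect) ranks influence; sigma flags
        # nonlinearity or interactions.
        for i in range(k):
            ee = np.array(effects[i])
            print(f"x{i}: mu* = {np.abs(ee).mean():.3f}, sigma = {ee.std():.3f}")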

    EEG sleep stages identification based on weighted undirected complex networks

    Sleep scoring is important in sleep research because errors in scoring a patient's sleep electroencephalography (EEG) recordings can cause serious problems such as incorrect diagnosis, medication errors, and misinterpretation of the recordings. The aim of this research is to develop a new automatic method for EEG sleep stage classification based on a statistical model and weighted brain networks. Methods: Each EEG segment is partitioned into a number of blocks using a sliding window technique, and a set of statistical features is extracted from each block. As a result, a vector of features is obtained to represent each EEG segment. The vector of features is then mapped into a weighted undirected network. Different structural and spectral attributes of the networks are extracted and forwarded to a least squares support vector machine (LS-SVM) classifier. The networks' attributes are also thoroughly investigated; it is found that the networks' characteristics vary with the sleep stages, and each sleep stage is best represented by the key features of its network. Results: The proposed method is evaluated using two datasets acquired from different EEG channels (Pz-Oz and C3-A2), scored according to the R&K and AASM standards, without pre-processing of the original EEG data. The results obtained by the LS-SVM are compared with those of naïve Bayes, k-nearest neighbour and multi-class SVM classifiers, and the proposed method is also compared with other benchmark sleep stage classification methods. The comparison demonstrates that the proposed method has an advantage in scoring sleep stages from single-channel EEG signals. Conclusions: An average accuracy of 96.74% is obtained with the C3-A2 channel according to the AASM standard, and 96% with the Pz-Oz channel based on the R&K standard.
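
    The pipeline described above can be pictured with a toy sketch: slide a window over an EEG epoch, compute a few statistical features per block, and connect blocks whose feature vectors are similar into a weighted undirected network whose attributes then feed the classifier. The window length, feature set and similarity-based edge weights below are illustrative guesses, not the paper's exact construction.

        # Toy sketch: sliding-window statistics -> weighted undirected network.
        # Window size, features and edge-weight rule are illustrative guesses.
        import numpy as np

        rng = np.random.default_rng(1)
        eeg = rng.standard_normal(3000)        # stand-in for a 30 s epoch at 100 Hz

        win = 250                              # block length (illustrative)
        blocks = eeg.reshape(-1, win)          # non-overlapping sliding windows

        def features(b):
            # A few statistical features per block: mean, std, third moment, range.
            return np.array([b.mean(), b.std(), ((b - b.mean()) ** 3).mean(), np.ptp(b)])

        F = np.array([features(b) for b in blocks])    # one feature vector per block

        # Weighted undirected network: nodes are blocks, edge weights decay with
        # the Euclidean distance between feature vectors.
        n = len(F)
        W = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                W[i, j] = W[j, i] = np.exp(-np.linalg.norm(F[i] - F[j]))

        # Simple structural/spectral attributes that could feed a classifier
        # such as the LS-SVM used in the paper.
        strength = W.sum(axis=1)               # weighted node degree
        spectrum = np.linalg.eigvalsh(W)       # graph spectrum
        print(strength.mean(), spectrum[-1])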

    Uncertainty and sensitivity analysis of functional risk curves based on Gaussian processes

    A functional risk curve gives the probability of an undesirable event as a function of the value of a critical parameter of the physical system under consideration. In many applications, this curve is built using phenomenological numerical models which simulate complex physical phenomena. To avoid CPU-time-expensive numerical models, we propose to use Gaussian process regression to build functional risk curves. An algorithm is given to provide confidence bounds that account for this approximation. Two methods for the global sensitivity analysis of the effect of the model's random input parameters on the functional risk curve are also studied. In particular, the PLI sensitivity indices make it possible to understand the effect of misjudging the input parameters' probability density functions.
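
    The construction can be sketched with a generic GP library: fit a surrogate to a modest number of expensive model runs, then read a risk curve P(Y > threshold) off the surrogate, with crude bounds derived from its posterior standard deviation. scikit-learn stands in for the GP machinery here; the toy model, threshold and Monte Carlo scheme are assumptions, and neither the paper's confidence-bound algorithm nor the PLI indices are reproduced.

        # Sketch: GP surrogate of an expensive model, then a functional risk
        # curve P(Y > threshold | s) with naive bounds from the GP posterior.
        # Toy model, kernel and threshold are illustrative assumptions.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(2)

        def expensive_model(s, x):
            # Stand-in for a phenomenological simulator: s is the critical
            # parameter, x a random input.
            return s * np.sin(3.0 * x) + x

        # Train the surrogate on a small design of (s, x) runs.
        S = rng.uniform(0.0, 2.0, size=(40, 2))        # columns: s, x
        y = expensive_model(S[:, 0], S[:, 1])
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6).fit(S, y)

        threshold = 1.5
        x_mc = rng.uniform(0.0, 2.0, size=500)         # Monte Carlo on the random input

        for s in np.linspace(0.2, 2.0, 5):             # sweep the critical parameter
            X = np.column_stack([np.full_like(x_mc, s), x_mc])
            mean, std = gp.predict(X, return_std=True)
            p_mid = (mean > threshold).mean()          # one point of the risk curve
            p_lo = (mean - 2 * std > threshold).mean() # crude bounds from the
            p_hi = (mean + 2 * std > threshold).mean() # surrogate's posterior
            print(f"s = {s:.1f}: P(Y > {threshold}) ~ {p_mid:.2f} in [{p_lo:.2f}, {p_hi:.2f}]")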

    OpenTURNS: An industrial software for uncertainty quantification in simulation

    The need to assess robust performance for complex systems and to comply with tightening regulatory processes (security, safety, environmental control, health impacts, etc.) has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. A generic methodology has therefore emerged from the joint effort of several industrial companies and academic institutions. EDF R&D, Airbus Group and Phimeca Engineering started a collaboration at the beginning of 2005, joined by IMACS in 2014, to develop an open-source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS for Open source Treatment of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial challenges attached to uncertainties: transparency, genericity, modularity and multi-accessibility. This paper focuses on OpenTURNS and presents its main features: OpenTURNS is open-source software released under the LGPL license, takes the form of a C++ library with a Python textual user interface, and runs under both Linux and Windows. All the methodological tools are described in the different sections of this paper: uncertainty quantification, uncertainty propagation, sensitivity analysis and metamodeling. A section also explains the generic wrapper mechanism used to link OpenTURNS to any external code. Wherever possible, the paper illustrates the methodological tools on an educational example that simulates the height of a river and compares it to the height of a dyke protecting industrial facilities. Finally, it gives an overview of the main developments planned for the next few years.
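
    The river-versus-dyke example lends itself to a short sketch with the OpenTURNS Python interface: define the uncertain inputs, wrap the simulator as a function, propagate the uncertainty by Monte Carlo and estimate an overflow probability. The distributions and constants below are simplified placeholders, not the paper's calibrated flood model, and class names follow older OpenTURNS releases.

        # Simplified river-vs-dyke sketch with the OpenTURNS Python interface.
        # Distributions and constants are placeholders, not the paper's model;
        # class names follow older OpenTURNS releases and may differ in newer ones.
        import numpy as np
        import openturns as ot

        # Uncertain inputs: river flow and riverbed roughness (illustrative choices).
        flow = ot.Normal(1000.0, 100.0)
        roughness = ot.Normal(30.0, 5.0)
        inputs = ot.ComposedDistribution([flow, roughness])

        def river_height(x):
            # Toy water-height formula standing in for the hydraulic model:
            # H = (Q / (Ks * width * sqrt(slope)))^0.6, width 300 m, slope 5e-4.
            qv, kv = x
            return [(qv / (kv * 300.0 * np.sqrt(5e-4))) ** 0.6]

        g = ot.PythonFunction(2, 1, river_height)

        # Propagate by simple Monte Carlo and estimate P(height > dyke height).
        sample = inputs.getSample(10000)
        heights = np.array(g(sample)).ravel()
        dyke = 3.0                     # dyke height in metres (placeholder)
        print("P(overflow) ~", (heights > dyke).mean())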