
    Local-global neural networks: a new approach for nonlinear time series modelling

    In this paper, the Local-Global Neural Networks model is proposed within the context of time series models. This formulation encompasses several existing nonlinear models and also admits the Mixture of Experts approach. We place emphasis on the linear expert case and extensively discuss the theoretical aspects of the model: stationarity conditions; existence, consistency, and asymptotic normality of the parameter estimates; and model identifiability. A model-building strategy is also considered, and the whole procedure is illustrated with two real time series.
    Keywords: neural networks, nonlinear models, time series, model identifiability, parameter estimation, model building, sunspot number.
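    The linear-expert case described above can be illustrated with a minimal sketch: two linear AR(1) experts combined by a smooth logistic gate on the lagged value, so the model behaves locally like one linear expert and globally like a blend of both. The gate parameters and expert coefficients below are illustrative assumptions, not estimates from the paper.

    ```python
    import numpy as np

    def logistic_gate(y_lag, gamma=5.0, c=0.0):
        """Smooth transition weight in [0, 1], driven by the lagged value."""
        return 1.0 / (1.0 + np.exp(-gamma * (y_lag - c)))

    def predict(y_lag):
        """One-step-ahead prediction: gated combination of two linear experts."""
        expert_low = 0.2 + 0.5 * y_lag    # linear AR(1) expert for the "low" regime
        expert_high = -0.1 + 0.9 * y_lag  # linear AR(1) expert for the "high" regime
        w = logistic_gate(y_lag)
        return (1.0 - w) * expert_low + w * expert_high

    # At the gate's centre (y_lag = 0) both experts contribute equally,
    # so the prediction is the average of the two intercepts.
    print(predict(0.0))  # -> 0.05
    ```

    In practice the gate and expert parameters would be estimated jointly, e.g. by nonlinear least squares, which is where the stationarity and identifiability conditions discussed in the abstract come into play.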

    Application of response surface methodology to stiffened panel optimization

    In a multilevel optimization framework, the use of surrogate models to approximate optimization constraints allows great time savings. Among available metamodelling techniques, we chose Neural Networks to perform regression of static mechanical criteria, namely the buckling and collapse reserve factors of a stiffened panel, which are constraints of our subsystem optimization problem. Due to the highly nonlinear behaviour of these functions with respect to the loading and design variables, we encountered difficulties in obtaining an approximation of sufficient quality over the whole design space. In particular, the behaviour of the approximated function can vary greatly depending on the values of the loading variables. We show how prior knowledge of the influence of the variables allows us to build an efficient Mixture of Experts model, leading to a good approximation of the constraints. Optimization benchmark processes are computed to measure the time savings and the effects on optimum feasibility and objective value due to the use of the surrogate models as constraints. Finally, we see that, while efficient, this Mixture of Experts model could still be improved by additional learning techniques.

    Surrogate modeling approximation using a mixture of experts based on EM joint estimation

    An automatic method to combine several local surrogate models is presented. This method is intended to build accurate and smooth approximations of discontinuous functions that are to be used in structural optimization problems. It strongly relies on the Expectation-Maximization (EM) algorithm for Gaussian mixture models (GMM). For regression, the inputs are clustered together with their output values by means of parameter estimation of the joint distribution. A local expert (linear, quadratic, artificial neural network, or moving least squares) is then built on each cluster. Lastly, the local experts are combined using the Gaussian mixture model parameters found by the EM algorithm to obtain a global model. This method is tested on both mathematical test cases and an engineering optimization problem from aeronautics, and is found to improve the accuracy of the approximation.
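    The pipeline sketched in the abstract — EM on the joint (x, y) distribution, one local expert per cluster, then a blend weighted by the clusters' posterior probabilities in x — can be sketched as follows. This is a minimal one-dimensional illustration assuming linear experts and a synthetic step-like target; it is not the paper's implementation.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Illustrative discontinuous target (an assumption, not from the paper).
    rng = np.random.default_rng(0)
    x = rng.uniform(-1, 1, 400)
    y = np.where(x < 0, -1 + 0.5 * x, 1 + 0.5 * x) + 0.05 * rng.normal(size=x.size)

    # EM on the joint (x, y) distribution clusters inputs together with outputs.
    gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
    labels = gmm.fit_predict(np.column_stack([x, y]))

    # One local linear expert per cluster (ordinary least squares).
    experts = [np.polyfit(x[labels == k], y[labels == k], 1) for k in range(2)]

    def predict(x_new):
        """Blend the local experts with the GMM posterior weights in x."""
        mu = gmm.means_[:, 0]                # x-marginal means of each component
        var = gmm.covariances_[:, 0, 0]      # x-marginal variances
        dens = gmm.weights_ * np.exp(-0.5 * (x_new - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        w = dens / dens.sum()                # posterior cluster probabilities
        return sum(w[k] * np.polyval(experts[k], x_new) for k in range(2))

    # Far from the discontinuity, one expert dominates and the prediction
    # follows that regime; near x = 0 the blend transitions smoothly.
    print(predict(-0.8), predict(0.8))
    ```

    The smoothness of the global model comes from the Gaussian posterior weights, which cross over gradually near the discontinuity instead of switching experts abruptly.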