
    Dynamic data-driven model reduction: adapting reduced models from incomplete data

    This work presents a data-driven online adaptive model reduction approach for systems that undergo dynamic changes. Classical model reduction constructs a reduced model of a large-scale system in an offline phase and then keeps the reduced model unchanged during the evaluations in an online phase; however, if the system changes online, the reduced model may fail to predict the behavior of the changed system. Rebuilding the reduced model from scratch is often too expensive in time-critical and real-time environments. We introduce a dynamic data-driven adaptation approach that adapts the reduced model from incomplete sensor data obtained from the system during the online computations. The updates to the reduced models are derived directly from the incomplete data, without recourse to the full model. Our adaptivity approach approximates the missing values in the incomplete sensor data with gappy proper orthogonal decomposition. These approximate data are then used to derive low-rank updates to the reduced basis and the reduced operators. In our numerical examples, incomplete data with 30–40% known values are sufficient to recover the reduced model that would be obtained via rebuilding from scratch.

    Funding: United States. Air Force Office of Scientific Research (AFOSR MURI on multi-information sources of multi-physics systems, Award Number FA9550-15-1-0038); United States. Dept. of Energy (Applied Mathematics Program, Award DE-FG02 08ER2585); United States. Dept. of Energy (Applied Mathematics Program, Award DE-SC0009297)
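The filling-in step the abstract names, gappy proper orthogonal decomposition, can be sketched as follows: fit the coefficients of a precomputed POD basis to the known sensor entries by least squares, then evaluate the full basis to approximate the missing entries. Everything below (dimensions, basis, mask) is a synthetic placeholder, not the paper's benchmark, and the subsequent low-rank update of the reduced operators is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic snapshot data and a POD basis (assumed setup): states are
# combinations of a few smooth modes.
n, m, r = 100, 50, 5
modes = np.array([np.sin((k + 1) * np.linspace(0, np.pi, n))
                  for k in range(r)]).T            # (n, r)
snapshots = modes @ rng.normal(size=(r, m))        # (n, m)
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
U = U[:, :r]                                       # POD basis

# A new state with only ~35% of entries observed (gappy sensor data).
x_true = modes @ rng.normal(size=r)
mask = rng.random(n) < 0.35                        # known sensor locations

# Gappy POD: fit the basis coefficients to the known entries only,
# then evaluate the full basis to approximate the missing entries.
coeffs, *_ = np.linalg.lstsq(U[mask], x_true[mask], rcond=None)
x_filled = U @ coeffs

err = np.linalg.norm(x_filled - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.2e}")
```

Because the true state lies in the span of the basis here, a ~35% sample suffices for near-exact recovery, which mirrors the abstract's observation that 30–40% known values recover the reduced model.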

    Predicting Spatio-Temporal Time Series Using Dimension Reduced Local States

    We present a method for both cross estimation and iterated time series prediction of spatio-temporal dynamics based on reconstructed local states, PCA dimension reduction, and local modelling using nearest-neighbour methods. The effectiveness of this approach is shown for (noisy) data from a (cubic) Barkley model, the Bueno-Orovio-Cherry-Fenton model, and the Kuramoto-Sivashinsky model.
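The iterated nearest-neighbour prediction loop can be illustrated on a toy scalar series: embed the series in delay coordinates, find the stored state closest to the current one, and take its successor as the forecast. The paper works with reconstructed local states of spatio-temporal PDE data and includes a PCA reduction step; both are simplified away here, and all sizes below are illustrative.

```python
import numpy as np

# Toy data: a sampled sine wave stands in for the PDE time series.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t)

d = 5                                        # embedding dimension (assumed)
# Local states: delay vectors [x_i, ..., x_{i+d-1}] and their successors.
states = np.array([series[i:i + d] for i in range(len(series) - d)])
targets = series[d:]

# Train on the first part of the series, then predict iteratively:
# each forecast is fed back in to form the next query state.
train_states, train_targets = states[:1500], targets[:1500]
current = series[1495:1500].copy()
preds = []
for _ in range(20):
    idx = np.argmin(np.linalg.norm(train_states - current, axis=1))
    nxt = train_targets[idx]                 # successor of the nearest state
    preds.append(nxt)
    current = np.append(current[1:], nxt)    # slide the window forward

true = series[1500:1520]
print("max abs prediction error:", np.max(np.abs(np.array(preds) - true)))
```

Because the sine is nearly periodic, close matches exist in the training states and the iterated forecast stays accurate over the short horizon.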

    Manifold interpolation and model reduction

    One approach to parametric and adaptive model reduction is via the interpolation of orthogonal bases, subspaces, or positive definite system matrices. In all these cases, the sampled inputs stem from matrix sets that feature a geometric structure and thus form so-called matrix manifolds. This work, which will be featured as a chapter in the upcoming Handbook on Model Order Reduction (P. Benner, S. Grivet-Talocia, A. Quarteroni, G. Rozza, W.H.A. Schilders, L.M. Silveira, eds., to appear with DE GRUYTER), reviews the numerical treatment of the most important matrix manifolds that arise in the context of model reduction. Moreover, the principal approaches to data interpolation and Taylor-like extrapolation on matrix manifolds are outlined and complemented by algorithms in pseudo-code.

    Comment: 37 pages, 4 figures; featured chapter of the upcoming "Handbook on Model Order Reduction"
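As a minimal illustration of interpolation on one such matrix manifold, the sketch below blends two symmetric positive definite (SPD) matrices in the matrix-logarithm domain (log-Euclidean interpolation). It is illustrative of the general idea, not an algorithm taken verbatim from the chapter, and the example matrices are arbitrary.

```python
import numpy as np

def sym_logm(S):
    # Matrix logarithm of an SPD matrix via its eigendecomposition.
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.log(w)) @ V.T

def sym_expm(S):
    # Matrix exponential of a symmetric matrix via its eigendecomposition.
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.exp(w)) @ V.T

def interp_spd(A, B, t):
    # Log-Euclidean interpolation: blend in the log domain, then map back.
    # The result is SPD for every t, including extrapolation (t outside
    # [0, 1]), where naive entrywise blending can lose definiteness.
    return sym_expm((1 - t) * sym_logm(A) + t * sym_logm(B))

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[3.0, -0.2], [-0.2, 2.0]])
C = interp_spd(A, B, 0.5)
print("interpolant eigenvalues:", np.linalg.eigvalsh(C))  # all positive
```

At t = 0 and t = 1 the interpolant reproduces the endpoint matrices, which is the basic consistency requirement any manifold interpolation scheme must meet.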

    Decomposing feature-level variation with Covariate Gaussian Process Latent Variable Models

    The interpretation of complex high-dimensional data typically requires the use of dimensionality reduction techniques to extract explanatory low-dimensional representations. However, in many real-world problems these representations may not be sufficient to aid interpretation on their own, and it would be desirable to interpret the model in terms of the original features themselves. Our goal is to characterise how feature-level variation depends on latent low-dimensional representations, external covariates, and non-linear interactions between the two. In this paper, we propose to achieve this through a structured kernel decomposition in a hybrid Gaussian Process model which we call the Covariate Gaussian Process Latent Variable Model (c-GPLVM). We demonstrate the utility of our model on simulated examples and applications in disease progression modelling from high-dimensional gene expression data in the presence of additional phenotypes. In each setting we show how the c-GPLVM can extract low-dimensional structures from high-dimensional data sets whilst allowing a breakdown of feature-level variability that is not present in other commonly used dimensionality reduction approaches.
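The structured kernel decomposition the abstract refers to, additive latent and covariate terms plus a multiplicative interaction term, can be sketched with plain squared-exponential kernels. All names, sizes, and lengthscales below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def rbf(X, Y, lengthscale=1.0):
    # Squared-exponential kernel matrix between row inputs X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(1)
Z = rng.normal(size=(8, 2))   # latent low-dimensional representations
c = rng.normal(size=(8, 1))   # external covariate (e.g. a phenotype)

# Structured decomposition: latent effect + covariate effect + interaction,
# i.e. k = k_z + k_c + k_z * k_c.
K = rbf(Z, Z) + rbf(c, c) + rbf(Z, Z) * rbf(c, c)

# Sums and elementwise (Schur) products of valid kernels are again valid
# kernels, so K is a symmetric positive semi-definite GP covariance.
print("min eigenvalue:", np.linalg.eigvalsh(K).min())
```

Decomposing the covariance this way is what later lets the model attribute feature-level variability to the latent space, the covariate, or their interaction.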

    Identification of flexible structures for robust control

    Documentation is provided of the authors' experience with modeling and identification of an experimental flexible structure for the purpose of control design, with the primary aim being to motivate some important research directions in this area. A multi-input/multi-output (MIMO) model of the structure is generated using the finite element method. This model is inadequate for control design, due to its large variation from the experimental data. Chebyshev polynomials are employed to fit the data with single-input/multi-output (SIMO) transfer function models. Combining these SIMO models leads to a MIMO model with more modes than the original finite element model. To find a physically motivated model, an ad hoc model reduction technique which uses a priori knowledge of the structure is developed. The ad hoc approach is compared with balanced realization model reduction to determine its benefits. Descriptions of the errors between the model and experimental data are formulated for robust control design. Plots of selected transfer function models and experimental data are included.
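The Chebyshev-fitting step can be illustrated with NumPy's Chebyshev least-squares routines on a synthetic response curve; the data, the target function, and the polynomial degree below are placeholders, not the experiment's measurements or the authors' fitting procedure.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical smooth response data on the Chebyshev interval [-1, 1].
x = np.linspace(-1, 1, 200)
y = np.exp(x) * np.cos(3 * x)        # stand-in for measured response data

# Least-squares fit in the Chebyshev basis, then evaluation of the fit.
coeffs = C.chebfit(x, y, deg=15)
y_fit = C.chebval(x, coeffs)

print("max fit error:", np.max(np.abs(y_fit - y)))
```

Fitting in the Chebyshev basis rather than the monomial basis keeps the least-squares problem well conditioned, which matters when the fitted transfer-function models are reused downstream for control design.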

    Goods Versus Characteristics: Dimension Reduction and Revealed Preference

    This paper compares the goods and characteristics models of the consumer within a non-parametric revealed preference framework. Of primary interest is to make a comparison on the basis of predictive success that takes into account dimension reduction. This allows us to nonparametrically identify the model which best fits the data. We implement these procedures on household panel data from the UK milk market. The primary result is that the better fit of the characteristics model is entirely attributable to dimension reduction.

    Keywords: characteristics; demand; dimension reduction; nested models; revealed preference
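The revealed-preference machinery behind such an exercise can be sketched as a check of the Generalized Axiom of Revealed Preference (GARP) on price/quantity observations; the data below are hypothetical, not the UK milk panel, and the paper's predictive-success comparison is not reproduced.

```python
import numpy as np

def satisfies_garp(prices, quantities):
    # prices[i], quantities[i]: prices faced and bundle chosen in period i.
    n = len(prices)
    expenditure = prices @ quantities.T      # expenditure[i, j] = p_i . q_j
    # i is directly revealed preferred to j if bundle j was affordable
    # when bundle i was chosen: p_i . q_i >= p_i . q_j.
    R = expenditure.diagonal()[:, None] >= expenditure
    # Transitive closure of the relation (Warshall's algorithm).
    for k in range(n):
        R = R | (R[:, [k]] & R[[k], :])
    # GARP: if i is revealed preferred to j, bundle i must not have been
    # strictly cheaper than j's own bundle at j's prices.
    for i in range(n):
        for j in range(n):
            if R[i, j] and expenditure[j, j] > expenditure[j, i]:
                return False
    return True

# Hypothetical observations: two goods over three periods.
prices = np.array([[1.0, 2.0], [2.0, 1.0], [1.5, 1.5]])
quantities = np.array([[4.0, 1.0], [1.0, 4.0], [2.0, 2.0]])
print("GARP satisfied:", satisfies_garp(prices, quantities))
```

A dataset passes this check exactly when it is consistent with maximization of some well-behaved utility function, which is the sense in which either the goods or the characteristics model can "fit" the panel.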
