
    Probabilistic Reduced-Order Modeling for Stochastic Partial Differential Equations

    We discuss a Bayesian formulation of coarse-graining (CG) for PDEs whose coefficients (e.g. material parameters) exhibit random, fine-scale variability. The direct solution of such problems requires grids fine enough to resolve this variability, which unavoidably entails the repeated solution of very large systems of algebraic equations. We establish a physically inspired, data-driven coarse-grained model which learns a low-dimensional set of microstructural features that are predictive of the fine-grained (FG) model response. Once learned, those features provide a sharp distribution over the coarse-scale effective coefficients of the PDE that are most suitable for predicting the fine-scale model output. This ultimately allows the computationally expensive FG model to be replaced by a generative probabilistic model based on multiple evaluations of the much cheaper CG model. Sparsity-enforcing priors further increase predictive efficiency and reveal the microstructural features that are important in predicting the FG response. Moreover, the model yields probabilistic rather than single-point predictions, which enables quantification of the unavoidable epistemic uncertainty that is present due to the information loss incurred during coarse-graining.
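    The idea of mapping microstructural features to a distribution over effective coarse-scale coefficients can be illustrated with a minimal sketch. This is not the paper's actual formulation: it assumes a 1D diffusion problem (where the homogenized coefficient is known to be the harmonic mean of the fine-scale field), uses a single hand-picked feature (the mean log-coefficient), and replaces the full Bayesian CG model with a conjugate Bayesian linear regression. All function names and parameter values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def fine_scale_field(n_cells):
        # log-normal random coefficient field (a stand-in for material variability)
        return np.exp(rng.normal(0.0, 0.5, size=n_cells))

    def effective_coefficient(a):
        # in 1D, the homogenized diffusion coefficient is the harmonic mean
        return 1.0 / np.mean(1.0 / a)

    # training set: microstructural feature -> log effective coefficient
    n_train, n_cells = 200, 64
    X = np.empty((n_train, 2))
    y = np.empty(n_train)
    for i in range(n_train):
        a = fine_scale_field(n_cells)
        X[i] = [1.0, np.mean(np.log(a))]   # bias + one candidate feature
        y[i] = np.log(effective_coefficient(a))

    # conjugate Bayesian linear regression: noise variance sigma2, prior N(0, tau2 I)
    sigma2, tau2 = 1e-3, 10.0
    A = X.T @ X / sigma2 + np.eye(2) / tau2   # posterior precision of the weights
    m = np.linalg.solve(A, X.T @ y / sigma2)  # posterior mean of the weights

    # probabilistic (not single-point) prediction for a new microstructure:
    # the predictive variance quantifies the epistemic uncertainty from coarse-graining
    a_new = fine_scale_field(n_cells)
    x_new = np.array([1.0, np.mean(np.log(a_new))])
    pred_mean = x_new @ m
    pred_var = sigma2 + x_new @ np.linalg.solve(A, x_new)
    ```

    A sparsity-enforcing prior would replace the isotropic Gaussian prior above with one that drives the weights of uninformative features to zero, so that only the features that matter for the FG response survive.
    
    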

    A Generalized Probabilistic Learning Approach for Multi-Fidelity Uncertainty Propagation in Complex Physical Simulations

    Two of the most significant challenges in uncertainty propagation pertain to the high computational cost of simulating complex physical models and the high dimension of the random inputs. In applications of practical interest, both of these problems are encountered, and standard methods for uncertainty quantification either fail or are not feasible. To overcome the current limitations, we propose a probabilistic multi-fidelity framework that can exploit lower-fidelity model versions of the original problem in a small data regime. The approach circumvents the curse of dimensionality by learning dependencies between the outputs of high-fidelity models and lower-fidelity models instead of explicitly accounting for the high-dimensional inputs. We complement the information provided by a low-fidelity model with a low-dimensional set of informative features of the stochastic input, which are discovered by employing a combination of supervised and unsupervised dimensionality reduction techniques. The goal of our analysis is an efficient and accurate estimation of the full probabilistic response for a high-fidelity model. Despite the incomplete and noisy information that low-fidelity predictors provide, we demonstrate that accurate and certifiable estimates for the quantities of interest can be obtained in the small data regime, i.e., with significantly fewer high-fidelity model runs than state-of-the-art methods for uncertainty propagation. We illustrate our approach by applying it to challenging numerical examples such as Navier-Stokes flow simulations and monolithic fluid-structure interaction problems. (Comment: 31 pages, 14 figures)
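    The core mechanism of learning a low-dimensional map from low-fidelity outputs (plus a few input features) to the high-fidelity output, then propagating uncertainty with cheap runs only, can be sketched as follows. This is an illustrative toy, not the paper's framework: both "models" are cheap analytic stand-ins, the informative feature is chosen by hand rather than discovered by dimensionality reduction, and the learned map is a simple conjugate Bayesian linear regression. All names and values are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def high_fidelity(xi):
        # "expensive" model output (here a cheap stand-in for illustration)
        return np.sin(3 * xi.mean()) + 0.1 * xi[0]

    def low_fidelity(xi):
        # cheap, biased approximation of the same quantity of interest
        return np.sin(3 * xi.mean())

    d = 50                 # high-dimensional stochastic input
    n_lo, n_hi = 2000, 15  # many cheap runs, very few expensive runs

    # small data regime: a handful of expensive runs to learn the map
    # (low-fidelity output, input feature) -> high-fidelity output
    Xi_hi = rng.normal(size=(n_hi, d))
    Z = np.array([[1.0, low_fidelity(x), x[0]] for x in Xi_hi])  # x[0]: informative feature
    y = np.array([high_fidelity(x) for x in Xi_hi])

    # Bayesian linear regression in the 3-dim surrogate space,
    # sidestepping the 50-dim input space entirely
    sigma2, tau2 = 1e-4, 10.0
    A = Z.T @ Z / sigma2 + np.eye(3) / tau2
    m = np.linalg.solve(A, Z.T @ y / sigma2)

    # uncertainty propagation using only cheap low-fidelity runs
    Xi_lo = rng.normal(size=(n_lo, d))
    Z_lo = np.array([[1.0, low_fidelity(x), x[0]] for x in Xi_lo])
    qoi_mean = (Z_lo @ m).mean()   # estimated mean of the high-fidelity QoI
    ```

    The expensive model is evaluated only n_hi = 15 times; the full probabilistic response is then estimated from the 2000 cheap runs pushed through the learned map.
    
    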