26 research outputs found

    A hybrid FETI-DP method for non-smooth random partial differential equations

    A domain decomposition approach exploiting the localization of random parameters in high-dimensional random PDEs is presented. For high efficiency, surrogate models in multi-element representations are computed locally when possible. This makes use of a stochastic Galerkin FETI-DP formulation of the underlying problem with localized representations of the involved input random fields. The local parameter space associated with a subdomain is explored by a subdivision into regions where the parametric surrogate accuracy can be trusted and regions where Monte Carlo sampling has to be employed instead. A heuristic adaptive algorithm carries out a problem-dependent hp refinement in a stochastic multi-element sense, enlarging the trusted surrogate region in the local parametric space as far as possible. This results in an efficient global parameter-to-solution sampling scheme that makes use of local parametric smoothness exploration in the surrogate construction. Adequately structured problems for this scheme occur naturally when uncertainties are defined on subdomains, e.g., in a multi-physics setting, or when the Karhunen-Loève expansion of a random field can be localized. The efficiency of this hybrid technique is demonstrated with numerical benchmark problems illustrating the identification of trusted (possibly higher-order) surrogate regions and non-trusted sampling regions.
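
    The hybrid sampling step described above can be illustrated with a minimal sketch (not the authors' implementation): the local parameter domain is partitioned into elements, each flagged as trusted (evaluate the local surrogate) or non-trusted (fall back to a direct solve, i.e. plain Monte Carlo). The element structure, the linear surrogate, and the stand-in solver below are hypothetical placeholders.

        import numpy as np

        def hybrid_sample(y, elements, direct_solve):
            """Evaluate the parameter-to-solution map at a parameter sample y."""
            for elem in elements:
                lo, hi = elem["bounds"]                # hyper-rectangle in local parameter space
                if np.all(y >= lo) and np.all(y <= hi):
                    if elem["trusted"]:
                        return elem["surrogate"](y)    # cheap local surrogate evaluation
                    return direct_solve(y)             # non-trusted region: full solve (MC fallback)
            raise ValueError("sample outside the partitioned parameter domain")

        # toy usage: one trusted element with a linear surrogate, one sampling element
        elements = [
            {"bounds": (np.array([-1.0]), np.array([0.0])),
             "trusted": True, "surrogate": lambda y: 2.0 * y[0] + 1.0},
            {"bounds": (np.array([0.0]), np.array([1.0])),
             "trusted": False, "surrogate": None},
        ]
        solve = lambda y: np.sin(y[0])                 # stands in for an expensive PDE solve
        print(hybrid_sample(np.array([-0.5]), elements, solve))
        print(hybrid_sample(np.array([0.5]), elements, solve))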

    A local hybrid surrogate‐based finite element tearing interconnecting dual‐primal method for nonsmooth random partial differential equations

    A domain decomposition approach for high-dimensional random partial differential equations exploiting the localization of random parameters is presented. To obtain high efficiency, surrogate models in multielement representations in the parameter space are constructed locally when possible. The method makes use of a stochastic Galerkin finite element tearing interconnecting dual-primal formulation of the underlying problem with localized representations of the involved input random fields. Each local parameter space associated with a subdomain is explored by a subdivision into regions where either the parametric surrogate accuracy can be trusted or where one has to resort to Monte Carlo sampling instead. A heuristic adaptive algorithm carries out a problem-dependent hp-refinement in a stochastic multielement sense, anisotropically enlarging the trusted surrogate region as far as possible. This results in an efficient global parameter-to-solution sampling scheme that makes use of local parametric smoothness exploration for the surrogate construction. Adequately structured problems for this scheme occur naturally when uncertainties are defined on subdomains, for example, in a multiphysics setting, or when the Karhunen–Loève expansion of a random field can be localized. The efficiency of the proposed hybrid technique is assessed with numerical benchmark problems illustrating the identification of trusted (possibly higher-order) surrogate regions and non-trusted sampling regions.
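
    The adaptive hp-refinement mentioned in the two abstracts above can be sketched as a simple greedy loop, assuming a per-element error indicator is available; the indicator, tolerances, and the isotropic bisection below are hypothetical simplifications of the paper's heuristic, which refines anisotropically and classifies trusted regions.

        def hp_refine(elements, error_indicator, tol=1e-3, max_degree=5, max_iter=20):
            """Greedy hp-refinement of a 1D stochastic multi-element partition."""
            for _ in range(max_iter):
                worst = max(elements, key=error_indicator)
                if error_indicator(worst) <= tol:
                    break                               # every element can be trusted
                if worst["degree"] < max_degree:
                    worst["degree"] += 1                # p-refinement: richer local surrogate
                else:
                    lo, hi = worst["bounds"]            # h-refinement: bisect the element
                    mid = 0.5 * (lo + hi)
                    elements.remove(worst)
                    elements += [{"bounds": (lo, mid), "degree": 1},
                                 {"bounds": (mid, hi), "degree": 1}]
            return elements

        # toy usage with a synthetic indicator that decays with element size and degree
        elems = [{"bounds": (0.0, 1.0), "degree": 1}]
        indicator = lambda e: (e["bounds"][1] - e["bounds"][0]) / e["degree"]
        print(hp_refine(elems, indicator, tol=0.1))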

    Less interaction with forward models in Langevin dynamics

    Ensemble methods have become ubiquitous for the solution of Bayesian inference problems. State-of-the-art Langevin samplers such as the Ensemble Kalman Sampler (EKS), Affine Invariant Langevin Dynamics (ALDI), or its extension using weighted covariance estimates rely on successive evaluations of the forward model or its gradient. A main drawback of these methods is hence the vast number of required forward calls, as well as their possible lack of convergence for more involved posterior measures such as multimodal distributions. The goal of this paper is to address these challenges to some extent. First, several adaptive ensemble enrichment strategies that successively enlarge the number of particles in the underlying Langevin dynamics are discussed, which in turn lead to a significant reduction of the total number of forward calls. Second, analytical consistency guarantees of the ensemble enrichment method are provided for linear forward models. Third, to address more involved target distributions, the method is extended by applying adapted Langevin dynamics based on a homotopy formalism, for which convergence is proved. Finally, numerical investigations of several benchmark problems illustrate the possible gain of the proposed method in comparison to state-of-the-art Langevin samplers.
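
    As a rough illustration of the kind of dynamics discussed above, the following sketch performs a simplified ALDI-type interacting-particle update with empirical-covariance preconditioning and a naive ensemble enrichment step; the paper's adaptive enrichment criteria and its homotopy extension are not reproduced here, and the function names and parameters are hypothetical.

        import numpy as np

        def aldi_type_step(X, grad_Phi, dt):
            """One simplified ALDI-type update for all particles (columns of X)."""
            d, J = X.shape
            m = X.mean(axis=1, keepdims=True)
            D = (X - m) / np.sqrt(J)                   # scaled deviations, so C = D @ D.T
            C = D @ D.T                                # empirical covariance
            drift = -C @ grad_Phi(X) + (d + 1) / J * (X - m)
            noise = np.sqrt(2.0 * dt) * D @ np.random.randn(J, J)   # covariance 2*dt*C per particle
            return X + dt * drift + noise

        def enrich(X, n_new, spread=1.0):
            """Hypothetical enrichment step: add particles scattered around the ensemble mean."""
            d, _ = X.shape
            m = X.mean(axis=1, keepdims=True)
            return np.hstack([X, m + spread * np.random.randn(d, n_new)])

        # toy usage: standard Gaussian target, Phi(x) = |x|^2 / 2, so grad Phi(x) = x
        grad_Phi = lambda X: X
        X = np.random.randn(2, 10)
        for k in range(500):
            if k == 250:
                X = enrich(X, n_new=10)                # enlarge the ensemble mid-run
            X = aldi_type_step(X, grad_Phi, dt=0.01)
        print(X.mean(axis=1))                          # should be close to zero
        print(np.cov(X))                               # should be close to the identity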

    Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion

    A novel method for the accurate functional approximation of possibly highly concentrated probability densities is developed. It is based on the combination of several modern techniques such as transport maps and nonintrusive reconstructions of low-rank tensor representations. The central idea is to carry out computations for statistical quantities of interest, such as moments, with respect to a convenient reference measure that is approximated by a numerical transport, leading to a perturbed prior. Subsequently, a coordinate transformation leads to a beneficial setting for the further function approximation. An efficient layer-based transport construction is realized by using the Variational Monte Carlo (VMC) method. The convergence analysis covers all terms introduced by the different (deterministic and statistical) approximations in the Hellinger distance and the Kullback-Leibler divergence. Important applications are presented, and in particular the context of Bayesian inverse problems is illuminated, which is a central motivation for the developed approach. Several numerical examples illustrate the efficacy with densities of different complexity.
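
    A minimal sketch of the coordinate-transformation idea above, using a single affine transport layer that whitens a concentrated Gaussian target toward a standard normal reference; the mean, covariance, and target density below are illustrative stand-ins, not the paper's layered VMC construction.

        import numpy as np

        def affine_transport(mu, L):
            """Affine map x = mu + L @ y from reference to target coordinates."""
            def forward(y):
                return mu + L @ y
            def pullback(target_density):
                # change of variables: rho_ref(y) = rho_target(mu + L y) * |det L|
                det = np.abs(np.linalg.det(L))
                return lambda y: target_density(mu + L @ y) * det
            return forward, pullback

        # toy usage: a concentrated 2D Gaussian target, whitened by its Cholesky factor
        mu = np.array([1.0, -2.0])
        Sigma = np.array([[1e-2, 0.0], [0.0, 4e-4]])
        L = np.linalg.cholesky(Sigma)
        target = lambda x: np.exp(-0.5 * (x - mu) @ np.linalg.solve(Sigma, x - mu)) \
            / (2 * np.pi * np.sqrt(np.linalg.det(Sigma)))
        forward, pullback = affine_transport(mu, L)
        rho_ref = pullback(target)
        print(rho_ref(np.zeros(2)))   # equals the standard normal density at 0, i.e. 1/(2*pi)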

    Low-rank tensor reconstruction of concentrated densities with application to Bayesian inversion

    This paper presents a novel method for the accurate functional approximation of possibly highly concentrated probability densities. It is based on the combination of several modern techniques such as transport maps and low-rank approximations via a nonintrusive tensor train reconstruction. The central idea is to carry out computations for statistical quantities of interest, such as moments, based on a convenient representation of a reference density for which accurate numerical methods can be employed. Since the transport from target to reference can usually not be determined exactly, one has to cope with a perturbed reference density due to a numerically approximated transport map. By the introduction of a layered approximation and appropriate coordinate transformations, the problem is split into a set of independent approximations in separately chosen orthonormal basis functions, combining the notions of h- and p-refinement (i.e., "mesh size" and polynomial degree). An efficient low-rank representation of the perturbed reference density is achieved via the Variational Monte Carlo method. This nonintrusive regression technique reconstructs the map in the tensor train format. An a priori convergence analysis with respect to the error terms introduced by the different (deterministic and statistical) approximations is derived in the Hellinger distance and the Kullback–Leibler divergence. Important applications are presented, and in particular the context of Bayesian inverse problems is illuminated, which is a main motivation for the developed approach. Several numerical examples illustrate the efficacy with densities of different complexity and degrees of perturbation of the transport to the reference density. The (superior) convergence is demonstrated in comparison to Monte Carlo and Markov chain Monte Carlo methods.
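
    The Variational Monte Carlo reconstruction mentioned in both abstracts amounts to an empirical least-squares regression in an orthonormal basis; the sketch below shows this in a single dimension with normalized probabilists' Hermite polynomials and omits the tensor train format entirely, so it only illustrates the regression step, not the full method of the paper.

        import numpy as np
        from math import factorial
        from numpy.polynomial.hermite_e import hermevander

        def vmc_regression(f, n_samples, degree, seed=0):
            """Least-squares fit of f in normalized probabilists' Hermite polynomials,
            with sample points drawn from the standard normal reference measure."""
            rng = np.random.default_rng(seed)
            y = rng.standard_normal(n_samples)
            V = hermevander(y, degree)                 # columns He_0(y), ..., He_degree(y)
            V = V / np.sqrt([factorial(k) for k in range(degree + 1)])  # E[He_k^2] = k!
            coeffs, *_ = np.linalg.lstsq(V, f(y), rcond=None)
            return coeffs

        # toy usage: f(y) = y^2 = 1 * psi_0 + sqrt(2) * psi_2 in the normalized basis
        print(vmc_regression(lambda y: y**2, n_samples=2000, degree=4))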