
    Preconditioning of the background error covariance matrix in data assimilation for the Caspian Sea

    Data Assimilation (DA) is an uncertainty quantification technique used to improve numerical forecasts by incorporating observed data into prediction models. Since a crucial issue in DA models is the ill-conditioning of the covariance matrices involved, it is essential to introduce preconditioning methods into DA software. Here we present first studies concerning the introduction of two different preconditioning methods in a DA software we are developing, named S3DVAR, which implements a Scalable Three-Dimensional Variational Data Assimilation model for assimilating sea surface temperature (SST) values collected in the Caspian Sea, using the Regional Ocean Modeling System (ROMS) with observations provided by the Group for High Resolution Sea Surface Temperature (GHRSST). We also present the algorithmic strategies we employ.
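    The S3DVAR code itself is not shown in the abstract, so the following is only a minimal, illustrative sketch of one common way to precondition the background error covariance B in a 3D-Var cost function: the control-variable transform x = x_b + V v with B = V V^T, which removes B^{-1} from the minimization and improves conditioning. All dimensions, operators, and error statistics below are assumptions for illustration, not the paper's configuration.

        # Hypothetical toy 3D-Var with control-variable-transform preconditioning.
        # Not the S3DVAR implementation; sizes and statistics are illustrative.
        import numpy as np
        from scipy.optimize import minimize

        n, m = 50, 20                                   # state size, number of obs
        rng = np.random.default_rng(0)

        # Assumed background error covariance (exponential correlations + jitter).
        dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
        B = 0.5 * np.exp(-dist / 5.0) + 1e-6 * np.eye(n)
        R_inv = np.eye(m) / 0.1 ** 2                    # obs error precision
        H = np.zeros((m, n))                            # observe m random components
        H[np.arange(m), rng.choice(n, m, replace=False)] = 1.0
        x_b = np.zeros(n)                               # background state
        y = H @ rng.normal(size=n) + 0.1 * rng.normal(size=m)   # synthetic obs

        # Control-variable transform: x = x_b + V v with B = V V^T,
        # so the background term becomes 0.5 * v.v and B is never inverted.
        V = np.linalg.cholesky(B)

        def J(v):
            d = H @ (x_b + V @ v) - y
            return 0.5 * v @ v + 0.5 * d @ (R_inv @ d)

        def grad_J(v):
            d = H @ (x_b + V @ v) - y
            return v + V.T @ (H.T @ (R_inv @ d))

        res = minimize(J, np.zeros(n), jac=grad_J, method="L-BFGS-B")
        x_a = x_b + V @ res.x                           # analysis state
        print("analysis cost:", res.fun)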

    A scalable space-time domain decomposition approach for solving large-scale nonlinear regularized inverse ill-posed problems in 4D variational data assimilation

    We develop innovative algorithms for solving the strong-constraint formulation of four-dimensional variational data assimilation in large-scale applications. We present a space-time decomposition approach that employs overlapping domain decomposition along both the spatial and temporal directions and involves partitioning of both the solution and the operators. Starting from the global functional defined on the entire domain, we obtain regularized local functionals on the set of subdomains, providing order reduction of both the predictive and the data assimilation models. We analyze the convergence of the algorithm and its performance in terms of reduction of time complexity and algorithmic scalability. The numerical experiments are carried out on the shallow water equations on the sphere, according to the setup available at the Ocean Synthesis/Reanalysis Directory provided by Hamburg University.
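    The paper's full space-time algorithm and shallow-water setup are not reproduced in the abstract; the sketch below only illustrates the temporal side of the idea on a toy linear model: a strong-constraint 4D-Var window split into two overlapping subwindows whose local functionals add an assumed overlap-consistency penalty, solved with Schwarz-like outer iterations. Model, weights, and window layout are illustrative assumptions.

        # Hypothetical toy time-domain decomposition for strong-constraint 4D-Var.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        n, K = 8, 12                              # state size, number of time steps
        M = np.roll(np.eye(n), 1, axis=0)         # toy linear dynamics (cyclic shift)
        H = np.eye(n)                             # observe the full state
        x_b = np.zeros(n)                         # background initial condition
        x_true = rng.normal(size=n)

        def trajectory(x0, steps):
            """Propagate x0 forward, returning states at steps 0..steps."""
            xs = [x0]
            for _ in range(steps):
                xs.append(M @ xs[-1])
            return xs

        obs = [x + 0.05 * rng.normal(size=n) for x in trajectory(x_true, K)]

        # Two overlapping time subwindows: steps [0, 7] and [5, 12] (overlap 5..7).
        windows = [(0, 7), (5, 12)]

        def local_cost(x0_local, win, y_overlap):
            """Local functional: background + local obs misfit
            + regularization pulling the overlap toward the neighbour's trajectory."""
            t0, t1 = win
            xs = trajectory(x0_local, t1 - t0)
            j = 0.0
            if t0 == 0:                                         # background only at t=0
                j += 0.5 * np.sum((x0_local - x_b) ** 2)
            for k, x in enumerate(xs):                          # local observation term
                j += 0.5 * np.sum((H @ x - obs[t0 + k]) ** 2) / 0.05 ** 2
            for k, y in y_overlap.items():                      # overlap consistency
                j += 0.5 * np.sum((xs[k - t0] - y) ** 2) / 0.01
            return j

        # Schwarz-like outer iterations: each subwindow solves its local problem,
        # then shares its overlap states with the neighbour.
        x0_loc = [x_b.copy(), x_b.copy()]
        for _ in range(3):
            overlaps = []
            for i, win in enumerate(windows):
                xs = trajectory(x0_loc[i], win[1] - win[0])
                overlaps.append({k: xs[k - win[0]] for k in range(5, 8)})
            for i, win in enumerate(windows):
                res = minimize(local_cost, x0_loc[i],
                               args=(win, overlaps[1 - i]), method="L-BFGS-B")
                x0_loc[i] = res.x
        print("subwindow initial conditions:", [np.round(x, 2) for x in x0_loc])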

    Subsurface Characterization using Ensemble-based Approaches with Deep Generative Models

    Estimating spatially distributed properties such as hydraulic conductivity (K) from available sparse measurements is a great challenge in subsurface characterization, and the use of inverse modeling is limited for ill-posed, high-dimensional applications due to computational costs and poor prediction accuracy with sparse datasets. In this paper, we combine the Wasserstein Generative Adversarial Network with Gradient Penalty (WGAN-GP), a deep generative model that can accurately capture complex subsurface structure, and the Ensemble Smoother with Multiple Data Assimilation (ES-MDA), an ensemble-based inversion method, for accurate and accelerated subsurface characterization. WGAN-GP is trained to generate high-dimensional K fields from a low-dimensional latent space, and ES-MDA then updates the latent variables by assimilating the available measurements. Several subsurface examples are used to evaluate the accuracy and efficiency of the proposed method, and the main features of the unknown K fields are characterized accurately with reliable uncertainty quantification. Furthermore, the estimation performance is compared with a widely used variational (optimization-based) inversion approach, and the proposed approach outperforms the variational method, especially for the channelized and fractured field examples. We explain this superior performance by visualizing the objective function in the latent space: because of the nonlinear, aggressive dimension reduction performed by the generative model, the objective-function surface becomes extremely complex, whereas the ensemble approximation can smooth out the multi-modal surface during minimization. This suggests that, when combined with deep generative models, the ensemble-based approach outperforms the variational approach at the cost of additional forward model runs, unless convergence-ensuring modifications are implemented in the variational inversion.
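    A trained WGAN-GP generator is not available from the abstract, so the sketch below stands a simple linear placeholder in for G(z) and shows how an ES-MDA update of the low-dimensional latent variables can look when sparse measurements are assimilated. The generator, forward map, ensemble size, inflation schedule, and noise levels are all illustrative assumptions, not the paper's setup.

        # Hypothetical ES-MDA update of latent variables of a generative model.
        import numpy as np

        rng = np.random.default_rng(2)
        latent_dim, n_cells, n_obs, n_ens = 4, 100, 10, 200

        def generator(z):
            """Placeholder for a trained G(z): latent vector -> K field."""
            basis = np.sin(np.outer(np.arange(n_cells),
                                    np.arange(1, latent_dim + 1)) * 0.1)
            return basis @ z

        obs_idx = rng.choice(n_cells, n_obs, replace=False)

        def forward(z):
            """Forward model: here simply sampling K at measurement locations."""
            return generator(z)[obs_idx]

        z_true = rng.normal(size=latent_dim)
        sigma_d = 0.05
        d_obs = forward(z_true) + sigma_d * rng.normal(size=n_obs)

        # ES-MDA: assimilate the same data n_mda times with inflated obs error.
        n_mda = 4
        alphas = [n_mda] * n_mda                     # sum of 1/alpha equals 1
        Z = rng.normal(size=(n_ens, latent_dim))     # prior latent ensemble ~ N(0, I)
        C_d = sigma_d ** 2 * np.eye(n_obs)

        for alpha in alphas:
            D = np.array([forward(z) for z in Z])                    # predicted data
            d_pert = d_obs + np.sqrt(alpha) * sigma_d * rng.normal(size=(n_ens, n_obs))
            Zc, Dc = Z - Z.mean(0), D - D.mean(0)                    # centred anomalies
            C_zd = Zc.T @ Dc / (n_ens - 1)
            C_dd = Dc.T @ Dc / (n_ens - 1)
            K_gain = C_zd @ np.linalg.inv(C_dd + alpha * C_d)        # Kalman-like gain
            Z = Z + (d_pert - D) @ K_gain.T

        print("posterior latent mean:", Z.mean(0).round(2))
        print("true latent:          ", z_true.round(2))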