
    Variational Data Assimilation via Sparse Regularization

    This paper studies the role of sparse regularization in a properly chosen basis for variational data assimilation (VDA) problems. Specifically, it focuses on data assimilation of noisy and down-sampled observations while the state variable of interest exhibits sparsity in the real or transformed domain. We show that in the presence of sparsity, the ℓ1-norm regularization produces more accurate and stable solutions than the classic data assimilation methods. To motivate further developments of the proposed methodology, assimilation experiments are conducted in the wavelet and spectral domains using the linear advection-diffusion equation.
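    The core idea, an ℓ1-penalized least-squares recovery of a sparse state from noisy down-sampled observations, can be sketched with a generic proximal-gradient (ISTA) solver. This is not the paper's own implementation; the observation operator, penalty weight, and test state below are illustrative assumptions.

    ```python
    import numpy as np

    def soft_threshold(v, t):
        # Elementwise soft-thresholding: the proximal operator of t * ||.||_1.
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(H, y, lam, n_iter=500):
        # Minimize 0.5 * ||H x - y||^2 + lam * ||x||_1 by iterative
        # soft-thresholding (proximal gradient descent).
        L = np.linalg.norm(H, 2) ** 2  # Lipschitz constant of the gradient
        x = np.zeros(H.shape[1])
        for _ in range(n_iter):
            grad = H.T @ (H @ x - y)
            x = soft_threshold(x - grad / L, lam / L)
        return x

    # Toy setup: a sparse true state observed through a random
    # down-sampling operator with additive noise.
    rng = np.random.default_rng(0)
    x_true = np.zeros(50)
    x_true[[5, 20, 35]] = [2.0, -1.5, 1.0]
    H = rng.standard_normal((30, 50)) / np.sqrt(30)
    y = H @ x_true + 0.01 * rng.standard_normal(30)
    x_hat = ista(H, y, lam=0.05)
    ```

    With fewer observations than unknowns, the ℓ1 penalty selects the few active coefficients, which is the behavior the abstract contrasts with classic (quadratic) data assimilation.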

    Variational Downscaling, Fusion and Assimilation of Hydrometeorological States via Regularized Estimation

    Improved estimation of hydrometeorological states from down-sampled observations and background model forecasts in a noisy environment has been a subject of growing research in the past decades. Here, we introduce a unified framework that ties together the problems of downscaling, data fusion and data assimilation as ill-posed inverse problems. This framework seeks solutions beyond the classic least squares estimation paradigms by imposing proper regularization constraints, consistent with the degree of smoothness and probabilistic structure of the underlying state. We review relevant regularization methods in derivative space and extend classic formulations of the aforementioned problems with particular emphasis on hydrologic and atmospheric applications. Informed by the statistical characteristics of the state variable of interest, the central results of the paper suggest that proper regularization can lead to a more accurate and stable recovery of the true state and hence more skillful forecasts. In particular, using the Tikhonov and Huber regularization in the derivative space, the promise of the proposed framework is demonstrated in static downscaling and fusion of synthetic multi-sensor precipitation data, while a data assimilation numerical experiment is presented using the heat equation in a variational setting.
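    Tikhonov regularization in derivative space, one of the two penalties the abstract highlights, has a simple closed form. The sketch below is a minimal illustration, not the paper's experiment: the block-averaging observation operator, penalty weight, and smooth test state are assumptions.

    ```python
    import numpy as np

    def tikhonov_derivative(H, y, lam):
        # Solve min_x ||H x - y||^2 + lam * ||D x||^2, where D is the
        # first-order difference operator (smoothness in derivative space).
        # Normal equations: (H^T H + lam D^T D) x = H^T y.
        n = H.shape[1]
        D = np.diff(np.eye(n), axis=0)  # (n-1) x n first-difference matrix
        A = H.T @ H + lam * (D.T @ D)
        return np.linalg.solve(A, H.T @ y)

    # Downscaling toy: a coarse 4-to-1 block-averaging observation
    # of a smooth state, recovered on the fine grid.
    n = 40
    x_true = np.sin(np.linspace(0, np.pi, n))
    H = np.kron(np.eye(n // 4), np.ones(4) / 4)  # 4-to-1 averaging
    rng = np.random.default_rng(1)
    y = H @ x_true + 0.01 * rng.standard_normal(n // 4)
    x_hat = tikhonov_derivative(H, y, lam=0.1)
    ```

    The derivative penalty makes the otherwise under-determined downscaling problem well-posed: the normal-equations matrix is invertible because constant vectors, the null space of D, are seen by the averaging operator.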

    Simple Square Smoothing Regularization Operators

    Tikhonov regularization of linear discrete ill-posed problems is often applied with a finite difference regularization operator that approximates a low-order derivative. These operators are generally represented by a banded rectangular matrix with fewer rows than columns. They therefore cannot be applied in iterative methods that are based on the Arnoldi process, which requires the regularization operator to be represented by a square matrix. This paper discusses two approaches to circumvent this difficulty: zero-padding the rectangular matrices to make them square and extending the rectangular matrix to a square circulant. We also describe how to combine these operators by weighted averaging and with orthogonal projection. Applications to Arnoldi and Lanczos bidiagonalization-based Tikhonov regularization, as well as to truncated iteration with a range-restricted minimal residual method, are presented.
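    The two square-ification strategies the abstract names can be illustrated on the first-order difference operator. This is a sketch of the general idea only; the weighted-averaging and orthogonal-projection combinations from the paper are not shown.

    ```python
    import numpy as np

    def first_difference(n):
        # Banded (n-1) x n first-order finite-difference operator:
        # row i maps x to x[i+1] - x[i].
        return np.diff(np.eye(n), axis=0)

    def zero_padded(n):
        # Approach 1: append a zero row so the operator becomes
        # square (n x n), leaving its action on x unchanged.
        L = first_difference(n)
        return np.vstack([L, np.zeros((1, n))])

    def circulant_extension(n):
        # Approach 2: extend to a square circulant, i.e. wrap the
        # difference periodically so the last row is x[0] - x[n-1].
        return -np.eye(n) + np.roll(np.eye(n), 1, axis=1)

    Lz = zero_padded(6)
    Lc = circulant_extension(6)
    ```

    Both square operators agree with the banded rectangular one on the first n-1 rows; they differ only in how the final row is filled in, which matters for Arnoldi-based iterations that need a square regularization matrix.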

    Blind Minimax Estimation

    We consider the linear regression problem of estimating an unknown, deterministic parameter vector based on measurements corrupted by colored Gaussian noise. We present and analyze blind minimax estimators (BMEs), which consist of a bounded parameter set minimax estimator, whose parameter set is itself estimated from measurements. Thus, one does not require any prior assumption or knowledge, and the proposed estimator can be applied to any linear regression problem. We demonstrate analytically that the BMEs strictly dominate the least-squares estimator, i.e., they achieve lower mean-squared error for any value of the parameter vector. Both Stein's estimator and its positive-part correction can be derived within the blind minimax framework. Furthermore, our approach can be readily extended to a wider class of estimation problems than Stein's estimator, which is defined only for white noise and non-transformed measurements. We show through simulations that the BMEs generally outperform previous extensions of Stein's technique.
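    The positive-part Stein estimator that the abstract says falls out of the blind minimax framework can be checked numerically in its classic white-noise special case. This is a textbook sketch, not the paper's BME construction; the dimension, parameter vector, and trial count are illustrative assumptions.

    ```python
    import numpy as np

    def positive_part_james_stein(y, sigma2=1.0):
        # Positive-part Stein shrinkage of y toward zero under white
        # Gaussian noise with known variance sigma2. For dimension >= 3
        # it dominates the least-squares (identity) estimate.
        p = y.size
        shrink = 1.0 - (p - 2) * sigma2 / np.dot(y, y)
        return max(shrink, 0.0) * y

    # Monte Carlo comparison of mean-squared error against least squares.
    rng = np.random.default_rng(2)
    theta = np.full(10, 0.5)  # a fixed, unknown-to-the-estimator parameter
    n_trials = 2000
    mse_ls, mse_js = 0.0, 0.0
    for _ in range(n_trials):
        y = theta + rng.standard_normal(10)
        mse_ls += np.sum((y - theta) ** 2)
        mse_js += np.sum((positive_part_james_stein(y) - theta) ** 2)
    mse_ls /= n_trials
    mse_js /= n_trials
    ```

    The shrinkage estimator's empirical risk comes in below the least-squares risk of p = 10, which is the dominance property the abstract generalizes to colored noise and transformed measurements.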