    A new robust and efficient estimator for ill-conditioned linear inverse problems with outliers

    Solving a linear inverse problem may involve difficulties such as the presence of outliers and a mixing matrix with a large condition number. In such cases a regularized robust estimator is needed. We propose a new tau-type regularized robust estimator that is simultaneously highly robust against outliers, highly efficient under purely Gaussian noise, and stable when the mixing matrix has a large condition number. We also propose an algorithm to compute the estimates, based on regularized iteratively reweighted least squares; a basic and a fast version of the algorithm are given. Finally, we test the performance of the proposed approach in numerical experiments and compare it with other estimators. Our estimator provides superior robustness, even with up to 40% outliers, while performing close to the optimal maximum likelihood estimator in the outlier-free case.
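
    The regularized IRLS idea behind the proposed algorithm is easy to sketch. Below is a minimal Python illustration, assuming a Huber loss as a stand-in for the paper's tau-type loss and a Tikhonov (ridge) penalty for the ill-conditioned mixing matrix; the tuning constant c and penalty lam are illustrative choices, not the paper's.

```python
import numpy as np

def robust_regularized_irls(A, y, lam=0.1, c=1.345, n_iter=50, tol=1e-8):
    """Regularized IRLS sketch: min_x sum_i rho(y_i - a_i^T x) + lam*||x||^2,
    with rho a Huber loss (illustrative stand-in for a tau-type loss)."""
    p = A.shape[1]
    x = np.linalg.lstsq(A, y, rcond=None)[0]      # ordinary LS initialization
    for _ in range(n_iter):
        r = y - A @ x
        # Robust scale estimate (normalized MAD) so weights are scale-free
        s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)          # Huber weights in (0, 1]
        Aw = A * w[:, None]
        # Weighted, Tikhonov-regularized normal equations
        x_new = np.linalg.solve(Aw.T @ A + lam * np.eye(p), Aw.T @ y)
        if np.linalg.norm(x_new - x) <= tol * (1.0 + np.linalg.norm(x)):
            return x_new
        x = x_new
    return x
```

    Each pass recomputes a robust scale from the residuals and downweights large ones, so gross outliers lose influence, while the ridge term keeps the reweighted normal equations well conditioned even when A has a large condition number.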

    Variational Downscaling, Fusion and Assimilation of Hydrometeorological States via Regularized Estimation

    Improved estimation of hydrometeorological states from down-sampled observations and background model forecasts in a noisy environment has been a subject of growing research in the past decades. Here, we introduce a unified framework that ties together the problems of downscaling, data fusion, and data assimilation as ill-posed inverse problems. This framework seeks solutions beyond the classic least squares estimation paradigms by imposing proper regularization, that is, constraints consistent with the degree of smoothness and the probabilistic structure of the underlying state. We review relevant regularization methods in derivative space and extend classic formulations of the aforementioned problems, with particular emphasis on hydrologic and atmospheric applications. Informed by the statistical characteristics of the state variable of interest, the central results of the paper suggest that proper regularization can lead to a more accurate and stable recovery of the true state and hence more skillful forecasts. In particular, using Tikhonov and Huber regularization in derivative space, the promise of the proposed framework is demonstrated in static downscaling and fusion of synthetic multi-sensor precipitation data, while a data assimilation numerical experiment is presented using the heat equation in a variational setting.
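
    As a concrete illustration of regularization in derivative space, the sketch below solves a 1-D variant of the recovery problem, min_x ||Hx - y||^2 + lam * Huber(Dx), by plain gradient descent. The observation operator H, the first-difference operator D, and all parameter values are assumptions for the example, not the paper's setup.

```python
import numpy as np

def first_difference(n):
    """First-order difference operator D mapping a state to derivative space."""
    D = np.zeros((n - 1, n))
    i = np.arange(n - 1)
    D[i, i], D[i, i + 1] = -1.0, 1.0
    return D

def huber_grad(z, delta):
    """Elementwise gradient of the Huber penalty."""
    return np.where(np.abs(z) <= delta, z, delta * np.sign(z))

def regularized_recovery(H, y, lam=1.0, delta=0.1, n_iter=2000):
    """Gradient-descent sketch of min_x ||H x - y||^2 + lam * Huber(D x)."""
    n = H.shape[1]
    D = first_difference(n)
    x = H.T @ y                                   # coarse back-projection start
    # Conservative step size from a Lipschitz bound on the gradient
    step = 1.0 / (2.0 * np.linalg.norm(H, 2) ** 2
                  + lam * np.linalg.norm(D, 2) ** 2)
    for _ in range(n_iter):
        grad = 2.0 * H.T @ (H @ x - y) + lam * D.T @ huber_grad(D @ x, delta)
        x = x - step * grad
    return x
```

    With H an averaging operator (each row averaging a block of fine-scale cells), this corresponds to the static downscaling case; replacing the Huber penalty with the squared norm ||Dx||^2 gives the Tikhonov variant.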

    Significance Regression: Robust Regression for Collinear Data

    This paper examines robust linear multivariable regression from collinear data. A brief review of M-estimators discusses the strengths of this approach for tolerating outliers and/or perturbations in the error distributions, and reveals that M-estimation may be unreliable if the data exhibit collinearity. Next, significance regression (SR) is discussed; SR is a successful method for treating collinearity but is not robust. A new significance regression algorithm for the weighted-least-squares error criterion (SR-WLS) is developed. Using the weights computed via M-estimation with the SR-WLS algorithm yields an effective method that robustly mitigates collinearity problems. Numerical examples illustrate the main points.
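
    The abstract's recipe, M-estimation weights feeding a collinearity-tolerant weighted solve, can be sketched as follows. Since the SR-WLS step itself is not reproduced here, a weighted ridge solve stands in as the stabilized step, and the Huber tuning constant and penalty value are illustrative assumptions.

```python
import numpy as np

def huber_weights(r, c=1.345):
    """Huber M-estimation weights from MAD-scaled residuals."""
    s = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12
    u = np.abs(r) / s
    return np.where(u <= c, 1.0, c / u)

def robust_collinear_fit(X, y, alpha=1.0, n_iter=25):
    """Alternate M-estimation weights with a collinearity-tolerant WLS solve.
    A weighted ridge step stands in for the paper's SR-WLS step."""
    p = X.shape[1]
    b = np.zeros(p)
    for _ in range(n_iter):
        w = huber_weights(y - X @ b)              # downweight outlying residuals
        Xw = X * w[:, None]
        # Stabilized weighted normal equations: (X'WX + alpha*I) b = X'W y
        b = np.linalg.solve(Xw.T @ X + alpha * np.eye(p), Xw.T @ y)
    return b
```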

    Vector Approximate Message Passing for the Generalized Linear Model

    The generalized linear model (GLM), where a random vector $\boldsymbol{x}$ is observed through a noisy, possibly nonlinear, function of a linear transform output $\boldsymbol{z}=\boldsymbol{Ax}$, arises in a range of applications such as robust regression, binary classification, quantized compressed sensing, phase retrieval, photon-limited imaging, and inference from neural spike trains. When $\boldsymbol{A}$ is large and i.i.d. Gaussian, the generalized approximate message passing (GAMP) algorithm is an efficient means of MAP or marginal inference, and its performance can be rigorously characterized by a scalar state evolution. For general $\boldsymbol{A}$, though, GAMP can misbehave. Damping and sequential updating help to robustify GAMP, but their effects are limited. Recently, a "vector AMP" (VAMP) algorithm was proposed for additive white Gaussian noise channels. VAMP extends AMP's guarantees from i.i.d. Gaussian $\boldsymbol{A}$ to the larger class of rotationally invariant $\boldsymbol{A}$. In this paper, we show how VAMP can be extended to the GLM. Numerical experiments show that the proposed GLM-VAMP is much more robust to ill-conditioning in $\boldsymbol{A}$ than damped GAMP.
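
    For intuition on the two-stage structure that GLM-VAMP extends, here is a minimal Python sketch of VAMP for the linear AWGN case y = Ax + N(0, 1/gamma_w) with a sparsity-promoting soft-threshold denoiser; GLM-VAMP replaces this Gaussian output channel with a general likelihood. The denoiser choice and all parameter values are illustrative assumptions.

```python
import numpy as np

def soft(r, t):
    """Soft-thresholding: MAP denoiser for a sparsity-promoting prior."""
    return np.sign(r) * np.maximum(np.abs(r) - t, 0.0)

def vamp_sketch(A, y, gamma_w=1.0, lam=0.1, n_iter=30):
    """Minimal VAMP sketch for y = A x + N(0, 1/gamma_w) with sparse x."""
    n = A.shape[1]
    AtA, Aty = A.T @ A, A.T @ y
    r1, gamma1 = Aty.copy(), 1.0                  # extrinsic mean / precision
    x1 = np.zeros(n)
    clip = lambda a: min(max(a, 1e-6), 1.0 - 1e-6)
    for _ in range(n_iter):
        # Denoising stage (separable prior)
        x1 = soft(r1, lam / gamma1)
        alpha1 = clip(np.mean(np.abs(x1) > 0))    # divergence of the denoiser
        eta1 = gamma1 / alpha1
        gamma2 = eta1 - gamma1
        r2 = (eta1 * x1 - gamma1 * r1) / gamma2   # extrinsic message to LMMSE
        # LMMSE stage (Gaussian likelihood)
        C = np.linalg.inv(gamma_w * AtA + gamma2 * np.eye(n))
        x2 = C @ (gamma_w * Aty + gamma2 * r2)
        alpha2 = clip(gamma2 * np.trace(C) / n)
        eta2 = gamma2 / alpha2
        gamma1 = eta2 - gamma2
        r1 = (eta2 * x2 - gamma2 * r2) / gamma1   # extrinsic message back
    return x1
```

    The LMMSE stage handles the full matrix exactly, which is why the scheme tolerates ill-conditioned and rotationally invariant $\boldsymbol{A}$ where plain AMP's separable updates break down.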

    Collinearity and consequences for estimation: a study and simulation
