
    Robustness analysis of a Maximum Correntropy framework for linear regression

    In this paper we formulate a solution to the robust linear regression problem in a general framework of correntropy maximization. Our formulation yields a unified class of estimators which includes the Gaussian and Laplacian kernel-based correntropy estimators as special cases. An analysis of the robustness properties is then provided. The analysis includes a quantitative characterization of the informativity degree of the regression, which is appropriate for studying the stability of the estimator. Using this tool, a sufficient condition is expressed under which the parametric estimation error is shown to be bounded. An explicit expression of the bound is given, and its numerical computation is discussed. For illustration purposes, two special cases are numerically studied. Comment: 10 pages, 5 figures, to appear in Automatica
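    The Gaussian-kernel special case admits a simple iteratively reweighted least-squares (half-quadratic) solver. The sketch below is illustrative rather than the paper's algorithm; the function name, the toy data, and the fixed bandwidth `sigma` are assumptions:

    ```python
    import numpy as np

    def mcc_regression(X, y, sigma=1.0, n_iter=50):
        """Gaussian-kernel maximum-correntropy linear regression via
        iteratively reweighted least squares (half-quadratic form)."""
        theta = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares init
        for _ in range(n_iter):
            r = y - X @ theta                          # current residuals
            w = np.exp(-r**2 / (2 * sigma**2))         # Gaussian kernel weights
            W = np.diag(w)
            theta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        return theta

    # Toy data: a clean linear model with a few gross outliers.
    rng = np.random.default_rng(0)
    X = np.column_stack([np.ones(100), rng.normal(size=100)])
    theta_true = np.array([1.0, 2.0])
    y = X @ theta_true + 0.05 * rng.normal(size=100)
    y[:5] += 20.0                                      # gross outliers
    theta_hat = mcc_regression(X, y, sigma=0.5)
    ```

    The exponential weight drives the influence of large residuals toward zero, which is the mechanism behind the bounded-error behavior the abstract describes.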

    Correntropy Maximization via ADMM - Application to Robust Hyperspectral Unmixing

    In hyperspectral images, some spectral bands suffer from a low signal-to-noise ratio due to noisy acquisition and atmospheric effects, thus requiring robust techniques for the unmixing problem. This paper presents a robust supervised spectral unmixing approach for hyperspectral images. Robustness is achieved by writing the unmixing problem as the maximization of the correntropy criterion subject to the most commonly used constraints. Two unmixing problems are derived: the first considers fully constrained unmixing, with both the non-negativity and sum-to-one constraints, while the second deals with non-negativity and sparsity promotion of the abundances. The corresponding optimization problems are solved efficiently using an alternating direction method of multipliers (ADMM) approach. Experiments on synthetic and real hyperspectral images validate the performance of the proposed algorithms for different scenarios, demonstrating that correntropy-based unmixing is robust to outlier bands. Comment: 23 pages
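    For the fully constrained case, an ADMM split can handle the non-negativity constraint while the sum-to-one constraint enters the quadratic subproblem through its KKT system, with per-band Gaussian weights standing in for the correntropy criterion. This is a minimal single-pixel sketch under assumed toy endmembers, not the paper's implementation; `sigma`, `rho`, and the iteration counts are illustrative choices:

    ```python
    import numpy as np

    def correntropy_fcls(y, E, sigma=0.1, rho=1.0, n_outer=10, n_admm=100):
        """Fully constrained (non-negative, sum-to-one) unmixing of one pixel
        y against an endmember matrix E (bands x endmembers). ADMM splits off
        the non-negativity constraint; the sum-to-one constraint is enforced
        in the a-update via a KKT system; per-band Gaussian correntropy
        weights down-weight corrupted bands."""
        L, P = E.shape
        a = np.full(P, 1.0 / P)
        z, u = a.copy(), np.zeros(P)
        w = np.ones(L)                                 # per-band weights
        for _ in range(n_outer):
            # KKT matrix for the weighted, sum-to-one-constrained a-update
            H = E.T @ (w[:, None] * E) + rho * np.eye(P)
            K = np.block([[H, np.ones((P, 1))],
                          [np.ones((1, P)), np.zeros((1, 1))]])
            for _ in range(n_admm):
                rhs = np.concatenate([E.T @ (w * y) + rho * (z - u), [1.0]])
                a = np.linalg.solve(K, rhs)[:P]        # equality-constrained step
                z = np.maximum(a + u, 0.0)             # projection onto a >= 0
                u += a - z                             # scaled dual update
            r = y - E @ z
            w = np.exp(-r**2 / (2 * sigma**2))         # correntropy reweighting
        return z

    # Toy pixel: 50 bands, 3 endmembers, 5 badly corrupted bands.
    rng = np.random.default_rng(1)
    E = rng.uniform(0.0, 1.0, size=(50, 3))
    a_true = np.array([0.5, 0.3, 0.2])
    y = E @ a_true
    y[:5] += 5.0
    a_hat = correntropy_fcls(y, E)
    ```

    The corrupted bands end up with near-zero weights, so the abundance estimate is driven by the clean bands while both constraints are satisfied.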

    Multi-kernel Correntropy Regression: Robustness, Optimality, and Application on Magnetometer Calibration

    This paper investigates the robustness and optimality of the multi-kernel correntropy (MKC) criterion in linear regression. We first derive an upper error bound for a scalar regression problem in the presence of arbitrarily large outliers and show that the kernel bandwidth should be neither too small nor too large in the sense of the lowest upper error bound. Meanwhile, we find that the proposed MKC is associated with a specific heavy-tailed distribution whose tail heaviness is controlled solely by the kernel bandwidth. Interestingly, this distribution becomes the Gaussian distribution when the bandwidth is set to infinity, which allows one to tackle both Gaussian and non-Gaussian problems. We propose an expectation-maximization (EM) algorithm that estimates the parameter vector and the kernel bandwidths alternately. The results show that our algorithm is equivalent to traditional linear regression under Gaussian noise and outperforms the conventional method under heavy-tailed noise. Both numerical simulations and experiments on a magnetometer calibration application verify the effectiveness of the proposed method.
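    A simple alternation in this spirit, weighted least squares for the parameter vector followed by a robust (MAD-style) bandwidth update from the residuals, can be sketched as follows. It is an illustrative stand-in, not the paper's EM algorithm or its multi-kernel bandwidth update:

    ```python
    import numpy as np

    def alternating_correntropy_fit(X, y, n_iter=30):
        """Alternate between (1) Gaussian-kernel weighted least squares for
        theta and (2) a robust bandwidth update from the median absolute
        residual (illustrative, not the MKC paper's EM scheme)."""
        theta = np.linalg.lstsq(X, y, rcond=None)[0]
        for _ in range(n_iter):
            r = y - X @ theta
            # robust bandwidth from the median absolute residual
            sigma = max(1.4826 * np.median(np.abs(r)), 1e-6)
            w = np.exp(-r**2 / (2 * sigma**2))         # kernel weights
            theta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        return theta, sigma

    # Toy heavy-tailed data: Laplace noise plus gross contamination.
    rng = np.random.default_rng(2)
    X = np.column_stack([np.ones(200), rng.normal(size=200)])
    theta_true = np.array([0.5, -1.5])
    y = X @ theta_true + rng.laplace(scale=0.1, size=200)
    y[:10] += 10.0
    theta_hat, sigma_hat = alternating_correntropy_fit(X, y)
    ```

    Tying the bandwidth to a robust residual scale keeps it neither too small nor too large, matching the trade-off the error-bound analysis highlights.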