
    Maximum Correntropy Kalman Filter

    The traditional Kalman filter (KF) is derived under the well-known minimum mean square error (MMSE) criterion, which is optimal under the Gaussian assumption. However, when the signals are non-Gaussian, especially when the system is disturbed by heavy-tailed impulsive noise, the performance of the KF deteriorates seriously. To improve the robustness of the KF against impulsive noise, we propose in this work a new Kalman filter, called the maximum correntropy Kalman filter (MCKF), which adopts the robust maximum correntropy criterion (MCC) as the optimality criterion instead of the MMSE. As in the traditional KF, the state mean and covariance propagation equations are used to give prior estimates of the state and covariance matrix in the MCKF. A novel fixed-point algorithm is then used to update the posterior estimates. A sufficient condition that guarantees the convergence of the fixed-point algorithm is given. Illustrative examples are presented to demonstrate the effectiveness and robustness of the new algorithm. (Comment: 11 pages, 11 figures, 7 tables)
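    To make the fixed-point idea above concrete, here is a minimal, hypothetical sketch of a correntropy-weighted measurement update for a 1-D linear model. It only reweights the innovation with a Gaussian kernel; the function names and the reduction to a scalar state are ours, not the paper's exact algorithm.

```python
import numpy as np

def gaussian_kernel(e, sigma):
    """Gaussian kernel G_sigma(e) = exp(-e^2 / (2 sigma^2)) used by correntropy."""
    return np.exp(-0.5 * (e / sigma) ** 2)

def mcc_update_1d(x_prior, p_prior, z, h, r, sigma=2.0, iters=20, tol=1e-6):
    """Illustrative 1-D correntropy-weighted measurement update (a sketch only).

    Hypothetical simplification: the measurement noise variance r is effectively
    inflated by 1/G_sigma(innovation), so large (outlier) innovations are
    down-weighted.  The weight depends on the current posterior estimate,
    which gives the fixed-point character of the update.
    """
    x = x_prior
    k = 0.0
    for _ in range(iters):
        e = (z - h * x) / np.sqrt(r)                      # normalized innovation
        w = gaussian_kernel(e, sigma)                     # correntropy weight in (0, 1]
        k = p_prior * h * w / (h * h * p_prior * w + r)   # weighted Kalman gain
        x_new = x_prior + k * (z - h * x_prior)
        if abs(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    p_post = (1.0 - k * h) * p_prior
    return x, p_post
```

    When the weight is close to 1 this reduces to the ordinary Kalman update, while a large (outlier) innovation drives the weight toward 0 and the prior estimate is essentially retained.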

    Multi-Kernel Correntropy for Robust Learning

    As a novel similarity measure defined as the expectation of a kernel function between two random variables, correntropy has been successfully applied in robust machine learning and signal processing to combat large outliers. The kernel function in correntropy is usually a zero-mean Gaussian kernel. In a recent work, the concept of mixture correntropy (MC) was proposed to improve the learning performance, where the kernel function is a mixture of Gaussian kernels, namely a linear combination of several zero-mean Gaussian kernels with different widths. In both correntropy and mixture correntropy, however, the center of the kernel function is always located at zero. In the present work, to further improve the learning performance, we propose the concept of multi-kernel correntropy (MKC), in which each component of the Gaussian mixture kernel can be centered at a different location. The properties of the MKC are investigated and an efficient approach is proposed to determine the free parameters in MKC. Experimental results show that the learning algorithms under the maximum multi-kernel correntropy criterion (MMKCC) can outperform those under the original maximum correntropy criterion (MCC) and the maximum mixture correntropy criterion (MMCC). (Comment: 10 pages, 5 figures)
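    For readers comparing the three criteria named above, the following display gives one plausible rendering of the definitions; the symbols (bandwidths sigma_m, mixture weights alpha_m, centers c_m) are our notation, not necessarily the paper's.

```latex
% Correntropy with a zero-mean Gaussian kernel of bandwidth \sigma:
\[
V(X,Y) = \mathbb{E}\bigl[G_{\sigma}(X-Y)\bigr],
\qquad
G_{\sigma}(e) = \exp\!\Bigl(-\tfrac{e^{2}}{2\sigma^{2}}\Bigr).
\]
% Mixture correntropy (MC): several zero-mean Gaussian kernels with different
% widths, \alpha_m \ge 0, \sum_m \alpha_m = 1:
\[
V_{\mathrm{MC}}(X,Y) = \mathbb{E}\Bigl[\textstyle\sum_{m=1}^{M} \alpha_m\, G_{\sigma_m}(X-Y)\Bigr].
\]
% Multi-kernel correntropy (MKC): each component may be centered at c_m \neq 0:
\[
V_{\mathrm{MKC}}(X,Y) = \mathbb{E}\Bigl[\textstyle\sum_{m=1}^{M} \alpha_m\, G_{\sigma_m}(X-Y-c_m)\Bigr].
\]
```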

    Maximum Correntropy Derivative-Free Robust Kalman Filter and Smoother

    We consider the problem of robust estimation, involving filtering and smoothing, for nonlinear state space models disturbed by heavy-tailed impulsive noises. To deal with heavy-tailed noises and improve the robustness of the traditional nonlinear Gaussian Kalman filter and smoother, we propose in this work a general framework for robust filtering and smoothing, which adopts a new maximum correntropy criterion to replace the minimum mean square error criterion for state estimation. To facilitate understanding, we present our robust framework in conjunction with the cubature Kalman filter and smoother. A half-quadratic optimization method is utilized to solve the formulated robust estimation problems, which leads to a new maximum correntropy derivative-free robust Kalman filter and smoother. Simulation results show that the proposed methods achieve a substantial performance improvement over the conventional and existing robust ones with only a slight increase in computational time.
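    As a rough illustration of the half-quadratic step mentioned above (our notation, not the paper's): maximizing a sum of Gaussian kernels of the residuals can be tackled by alternating between fixing kernel weights and solving an ordinary weighted least-squares (i.e., Gaussian) estimation problem.

```latex
% At iteration k, with residuals e_i(x) and Gaussian kernel G_\sigma:
\[
w_i^{(k)} = G_{\sigma}\!\bigl(e_i(x^{(k)})\bigr),
\qquad
x^{(k+1)} = \arg\min_{x} \sum_i w_i^{(k)}\, e_i(x)^{2}.
\]
% Large residuals receive small weights, which is what yields robustness
% to heavy-tailed impulsive noise while keeping each inner problem Gaussian.
```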

    Minimum Error Entropy Kalman Filter

    To date, most linear and nonlinear Kalman filters (KFs) have been developed under the Gaussian assumption and the well-known minimum mean square error (MMSE) criterion. In order to improve robustness with respect to impulsive (or heavy-tailed) non-Gaussian noises, the maximum correntropy criterion (MCC) has recently been used to replace the MMSE criterion in developing several robust Kalman-type filters. To deal with more complicated non-Gaussian noises, such as noises from multimodal distributions, in the present paper we develop a new Kalman-type filter, called the minimum error entropy Kalman filter (MEE-KF), by using the minimum error entropy (MEE) criterion instead of the MMSE or MCC. Similar to the MCC-based KFs, the proposed filter is also an online, recursive algorithm in which the propagation equations are used to give prior estimates of the state and covariance matrix, and a fixed-point algorithm is used to update the posterior estimates. In addition, the minimum error entropy extended Kalman filter (MEE-EKF) is developed for performance improvement in nonlinear situations. The high accuracy and strong robustness of MEE-KF and MEE-EKF are confirmed by experimental results. (Comment: 12 pages, 4 figures)
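    For orientation, the MEE criterion is commonly implemented by maximizing the empirical information potential of the errors, equivalently minimizing an estimate of Renyi's quadratic entropy; the expression below is the standard sample form, in our notation.

```latex
% Sample errors e_1, ..., e_N and Gaussian kernel G_\sigma:
\[
\hat{V}(e) = \frac{1}{N^{2}} \sum_{i=1}^{N} \sum_{j=1}^{N} G_{\sigma}(e_i - e_j),
\qquad
\hat{H}_2(e) = -\log \hat{V}(e).
\]
% Minimizing \hat{H}_2 (i.e., maximizing \hat{V}) concentrates the error
% distribution, which is why MEE can cope with multimodal non-Gaussian noise.
```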

    Robustness of Maximum Correntropy Estimation Against Large Outliers

    The maximum correntropy criterion (MCC) has recently been successfully applied in robust regression, classification, and adaptive filtering, where the correntropy is maximized instead of minimizing the well-known mean square error (MSE) to improve robustness with respect to outliers (or impulsive noises). Considerable effort has been devoted to developing various robust adaptive algorithms under the MCC, but so far little insight has been gained into how the optimal solution is affected by outliers. In this work, we study this problem in the context of parameter estimation for a simple linear errors-in-variables (EIV) model in which all variables are scalar. Under certain conditions, we derive an upper bound on the absolute value of the estimation error and show that the optimal solution under the MCC can be very close to the true value of the unknown parameter even with outliers (whose values can be arbitrarily large) in both the input and output variables. Illustrative examples are presented to verify and clarify the theory. (Comment: 8 pages, 7 figures)
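    As a concrete instance of the setting described above (scalar EIV model, our notation): the unknown parameter a is estimated by maximizing the empirical correntropy of the regression residual computed from the noisy observations.

```latex
% Scalar errors-in-variables model: the true input x_i^* and output a x_i^*
% are observed with possibly outlier-contaminated noises u_i and v_i:
\[
x_i = x_i^{*} + u_i, \qquad y_i = a\,x_i^{*} + v_i .
\]
% MCC estimate of a with Gaussian kernel G_\sigma:
\[
\hat{a}_{\mathrm{MCC}} = \arg\max_{a}\; \frac{1}{N} \sum_{i=1}^{N} G_{\sigma}\!\bigl( y_i - a\,x_i \bigr).
\]
```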

    Maximum Correntropy Adaptive Filtering Approach for Robust Compressive Sensing Reconstruction

    Robust compressive sensing (CS) reconstruction has become an attractive research topic in recent years. Robust CS aims to reconstruct sparse signals under non-Gaussian (i.e., heavy-tailed) noises, where traditional CS reconstruction algorithms may perform very poorly because they rely on the l_2 norm of the residual vector in optimization. Most existing robust CS reconstruction algorithms are based on greedy pursuit methods or convex relaxation approaches. Recently, the adaptive filtering framework has been introduced to CS reconstruction, showing desirable efficiency and reconstruction performance under Gaussian noise. In this paper, we propose an adaptive filtering based robust CS reconstruction algorithm, called the l_0-regularized maximum correntropy criterion (l_0-MCC) algorithm, which combines the adaptive filtering framework and the maximum correntropy criterion (MCC). The MCC has recently been used successfully in adaptive filtering due to its robustness to impulsive non-Gaussian noises and its low computational complexity. We analyze theoretically the stability of the proposed l_0-MCC algorithm. A mini-batch based l_0-MCC (MB-l_0-MCC) algorithm is further developed to speed up the convergence. Comparison with existing robust CS reconstruction algorithms via simulations shows that the proposed l_0-MCC and MB-l_0-MCC algorithms can achieve significantly better performance than other algorithms.
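    One plausible way to write the optimization behind an l_0-MCC style algorithm (our notation; the paper's exact formulation and its l_0 surrogate may differ): the correntropy of the residual is maximized while an l_0 penalty encourages sparsity.

```latex
% Measurements y = A x + n with sparse x, Gaussian kernel G_\sigma, penalty \lambda > 0:
\[
\hat{x} = \arg\max_{x}\; \frac{1}{N} \sum_{i=1}^{N} G_{\sigma}\!\bigl( y_i - a_i^{\top} x \bigr)
          \;-\; \lambda \lVert x \rVert_{0},
\]
% where a_i^\top is the i-th row of A.  In adaptive-filtering implementations the
% non-differentiable \ell_0 term is typically replaced by a smooth surrogate such as
% \sum_j \bigl(1 - e^{-\beta |x_j|}\bigr), which yields a zero-attracting update term.
```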

    Robust Matrix Completion via Maximum Correntropy Criterion and Half Quadratic Optimization

    Robust matrix completion aims to recover a low-rank matrix from a subset of noisy entries perturbed by complex noises, where traditional matrix completion methods may perform poorly because they rely on the l_2 error norm in optimization. In this paper, we propose a novel and fast robust matrix completion method based on the maximum correntropy criterion (MCC). A correntropy-based error measure is utilized instead of the l_2-based error norm to improve robustness to noises. Using the half-quadratic optimization technique, the correntropy-based optimization can be transformed into a weighted matrix factorization problem. Two efficient algorithms are then derived: an alternating minimization based algorithm and an alternating gradient descent based algorithm. The proposed algorithms do not need to compute a singular value decomposition (SVD) at each iteration. Furthermore, an adaptive kernel selection strategy is proposed to accelerate convergence and improve performance. Comparison with existing robust matrix completion algorithms via simulations shows that the new methods can achieve better performance than existing state-of-the-art algorithms.
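    The sketch below illustrates the half-quadratic idea in code: each observed residual gets a Gaussian-kernel weight, and the weighted problem is treated as a plain factorization solved with alternating gradient steps. This is a minimal, hypothetical example under our own parameter choices, not the authors' algorithm.

```python
import numpy as np

def mcc_matrix_completion(M_obs, mask, rank=5, sigma=1.0, lr=0.01, iters=500, seed=0):
    """Illustrative correntropy-weighted matrix factorization (sketch only).

    M_obs : observed matrix (arbitrary values where mask == 0)
    mask  : 1.0 on observed entries, 0.0 elsewhere
    """
    rng = np.random.default_rng(seed)
    m, n = M_obs.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(iters):
        R = mask * (U @ V.T - M_obs)                  # residuals on observed entries
        W = np.exp(-R**2 / (2 * sigma**2)) * mask     # half-quadratic (correntropy) weights
        G = W * R                                     # weighted residuals
        U -= lr * (G @ V)                             # gradient step on U
        V -= lr * (G.T @ U)                           # gradient step on V
    return U @ V.T
```

    Note that no SVD is required; only matrix products appear in each iteration, which reflects the efficiency argument made in the abstract.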

    Diffusion Maximum Correntropy Criterion Algorithms for Robust Distributed Estimation

    Robust diffusion adaptive estimation algorithms based on the maximum correntropy criterion (MCC), including adaptation-to-combination MCC and combination-to-adaptation MCC, are developed to deal with distributed estimation over networks in impulsive (long-tailed) noise environments. The cost functions used in distributed estimation are in general based on the mean square error (MSE) criterion, which is desirable when the measurement noise is Gaussian. In non-Gaussian situations, such as the impulsive-noise case, MCC-based methods can achieve much better performance than MSE methods because they take into account higher-order statistics of the error distribution. The proposed methods can also outperform the robust diffusion least mean p-power (DLMP) and diffusion minimum error entropy (DMEE) algorithms. The mean and mean-square convergence analyses of the new algorithms are also carried out. (Comment: 17 pages, 10 figures)
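    To show the adaptation-then-combination structure in miniature, here is a plausible single step of an adapt-then-combine diffusion update with an MCC-weighted LMS adaptation. The function, shapes, and combination convention are our assumptions for illustration, not the authors' exact recursion.

```python
import numpy as np

def atc_dmcc_step(W, X, d, A, mu=0.05, sigma=2.0):
    """One illustrative adapt-then-combine (ATC) diffusion MCC step.

    W : (N, M) current estimate at each of the N nodes (one row per node)
    X : (N, M) current regressor at each node
    d : (N,)   measured response at each node
    A : (N, N) combination matrix with columns summing to 1
               (A[l, k] weights node l's intermediate estimate at node k)
    """
    e = d - np.sum(X * W, axis=1)            # per-node estimation errors
    g = np.exp(-e**2 / (2 * sigma**2))       # Gaussian-kernel (MCC) weights
    Psi = W + mu * (g * e)[:, None] * X      # adaptation: MCC-weighted LMS step
    W_new = A.T @ Psi                        # combination: w_k = sum_l A[l, k] * psi_l
    return W_new
```

    The kernel weight g suppresses updates driven by impulsive measurement errors at individual nodes before their estimates are shared with neighbors.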

    Complex Correntropy Function: properties, and application to a channel equalization problem

    The use of correntropy as a similarity measure has been increasing in different scenarios due to its well-known ability to extract higher-order statistical information from data. Recently, a new similarity measure between complex random variables, called complex correntropy, was defined. Based on a Gaussian kernel, it extends the benefits of correntropy to complex-valued data. However, its properties have not yet been formalized. This paper studies the properties of this new similarity measure and extends its definition to positive-definite kernels. The complex correntropy is applied to a channel equalization problem, where it achieves good results compared with other algorithms such as the complex least mean square (CLMS), complex recursive least squares (CRLS), and least absolute deviation (LAD) algorithms. (Comment: 24 pages, 9 figures)
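    For reference, a commonly used form of complex correntropy between complex random variables applies a Gaussian kernel to the magnitude of the complex error; normalization constants are omitted and the notation below is ours.

```latex
% C_1 = x_1 + j y_1,  C_2 = x_2 + j y_2, Gaussian kernel of bandwidth \sigma:
\[
V^{\mathbb{C}}_{\sigma}(C_1, C_2)
  = \mathbb{E}\!\left[ \exp\!\left( -\frac{|C_1 - C_2|^{2}}{2\sigma^{2}} \right) \right]
  = \mathbb{E}\!\left[ \exp\!\left( -\frac{(x_1 - x_2)^{2} + (y_1 - y_2)^{2}}{2\sigma^{2}} \right) \right].
\]
```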

    Marine Animal Classification with Correntropy Loss Based Multi-view Learning

    To analyze marine animal behavior, seasonal distribution, and abundance, digital imagery can be acquired by visual or LiDAR cameras. Depending on the quantity and properties of the acquired imagery, the animals are characterized either by features (shape, color, texture, etc.) or by dissimilarity matrices derived from different shape analysis methods (shape context, internal distance shape context, etc.). In both cases, multi-view learning is critical for integrating more than one set of features or dissimilarity matrices to achieve higher classification accuracy. This paper adopts the correntropy loss as the cost function in multi-view learning, which has favorable statistical properties for rejecting noise. For the case of features, correntropy loss based multi-view learning and its entrywise variation are developed based on the multi-view intact space learning algorithm. For the case of dissimilarity matrices, the robust Euclidean embedding algorithm is extended to its multi-view form with the correntropy loss function. Results on simulated data and real-world marine animal imagery show that the proposed algorithms can effectively enhance the classification rate and suppress noise under different noise conditions.
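    The noise-rejecting behavior referred to above comes from the boundedness of the correntropy-induced loss (C-loss); a standard form for a residual e, in our notation, is:

```latex
% Correntropy-induced loss with kernel bandwidth \sigma:
\[
\ell_{\sigma}(e) = \sigma^{2}\left( 1 - \exp\!\left( -\frac{e^{2}}{2\sigma^{2}} \right) \right).
\]
% \ell_\sigma behaves like e^2/2 for small residuals but saturates at \sigma^2 for
% large ones, so grossly corrupted samples contribute only a bounded amount to the cost.
```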