187 research outputs found

    Complex Correntropy Function: properties, and application to a channel equalization problem

    The use of correntropy as a similarity measure has been increasing in different scenarios due to its well-known ability to extract high-order statistical information from data. Recently, a new similarity measure between complex random variables was defined and called complex correntropy. Based on a Gaussian kernel, it extends the benefits of correntropy to complex-valued data. However, its properties had not yet been formalized. This paper studies the properties of this new similarity measure and extends its definition to positive-definite kernels. Complex correntropy is applied to a channel equalization problem, where good results are achieved in comparison with other algorithms such as the complex least mean square (CLMS), complex recursive least squares (CRLS), and least absolute deviation (LAD) algorithms.
    Comment: 24 pages, 9 figures
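    As a rough illustration of the definition above (not code from the paper), the sample estimator of complex correntropy with a Gaussian kernel takes only a few lines of NumPy; the kernel width sigma = 1.0 is an arbitrary assumed choice:

```python
import numpy as np

def complex_correntropy(x, y, sigma=1.0):
    """Sample estimate of complex correntropy with a Gaussian kernel.

    The kernel is exp(-(x - y)(x - y)^* / (2 sigma^2)), i.e. a Gaussian
    evaluated on the squared magnitude of the complex error.
    """
    e = np.asarray(x) - np.asarray(y)
    return np.mean(np.exp(-np.abs(e) ** 2 / (2 * sigma ** 2)))

rng = np.random.default_rng(0)
x = rng.normal(size=200) + 1j * rng.normal(size=200)
print(complex_correntropy(x, x))    # identical signals reach the kernel maximum 1.0
print(complex_correntropy(x, -x))   # dissimilar signals give a smaller value
```

    Identical inputs attain the kernel maximum of 1.0 and the value decays as the complex errors grow, which is the similarity behaviour the abstract refers to.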

    Multi-Kernel Correntropy for Robust Learning

    As a novel similarity measure, defined as the expectation of a kernel function between two random variables, correntropy has been successfully applied in robust machine learning and signal processing to combat large outliers. The kernel function in correntropy is usually a zero-mean Gaussian kernel. In a recent work, the concept of mixture correntropy (MC) was proposed to improve the learning performance, where the kernel function is a mixture Gaussian kernel, namely a linear combination of several zero-mean Gaussian kernels with different widths. In both correntropy and mixture correntropy, however, the center of the kernel function is always located at zero. In the present work, to further improve the learning performance, we propose the concept of multi-kernel correntropy (MKC), in which each component of the mixture Gaussian kernel can be centered at a different location. The properties of the MKC are investigated and an efficient approach is proposed to determine the free parameters in MKC. Experimental results show that learning algorithms under the maximum multi-kernel correntropy criterion (MMKCC) can outperform those under the original maximum correntropy criterion (MCC) and the maximum mixture correntropy criterion (MMCC).
    Comment: 10 pages, 5 figures
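    A hedged sketch of the MKC sample estimator described above: each mixture component is a Gaussian kernel with its own center and width. The weights, centers, and widths below are illustrative assumptions, not the paper's parameter-selection procedure:

```python
import numpy as np

def multi_kernel_correntropy(e, alphas, centers, sigmas):
    """Sample MKC of error samples e under a mixture of Gaussian kernels.

    Unlike plain (mixture) correntropy, each component may be centered
    away from zero.
    """
    e = np.asarray(e, dtype=float)[:, None]
    c = np.asarray(centers, dtype=float)
    s = np.asarray(sigmas, dtype=float)
    k = np.exp(-(e - c) ** 2 / (2 * s ** 2))   # one column per component
    return float(np.mean(k @ np.asarray(alphas)))

e = np.array([0.0, 0.5, -0.5, 3.0])
v = multi_kernel_correntropy(e, [0.5, 0.5], [0.0, 1.0], [1.0, 2.0])
print(v)
```

    With a single zero-centered component this reduces to the ordinary correntropy estimator, which is a quick sanity check on the implementation.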

    Robustness of Maximum Correntropy Estimation Against Large Outliers

    The maximum correntropy criterion (MCC) has recently been successfully applied in robust regression, classification and adaptive filtering, where the correntropy is maximized instead of minimizing the well-known mean square error (MSE), to improve robustness with respect to outliers (or impulsive noises). Considerable efforts have been devoted to developing various robust adaptive algorithms under MCC, but so far little insight has been gained as to how the optimal solution will be affected by outliers. In this work, we study this problem in the context of parameter estimation for a simple linear errors-in-variables (EIV) model where all variables are scalar. Under certain conditions, we derive an upper bound on the absolute value of the estimation error and show that the optimal solution under MCC can be very close to the true value of the unknown parameter even with outliers (whose values can be arbitrarily large) in both input and output variables. Illustrative examples are presented to verify and clarify the theory.
    Comment: 8 pages, 7 figures
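    The robustness claim can be illustrated with a toy scalar regression (a simplification of the paper's EIV setup, with outliers only in the output; the kernel width and outlier values are arbitrary). The MCC solution is found by a fixed-point iteration that reweights each sample by a Gaussian kernel of its residual, so gross outliers receive vanishing weight:

```python
import numpy as np

rng = np.random.default_rng(1)
a_true = 2.0
x = rng.normal(size=300)
y = a_true * x + 0.05 * rng.normal(size=300)
x[:5], y[:5] = 1.0, 50.0          # a few gross outliers in the output

# least-squares estimate: pulled away from a_true by the outliers
a_ls = np.sum(x * y) / np.sum(x * x)

# MCC fixed point: weighted least squares with Gaussian-kernel weights,
# so samples with huge residuals get near-zero weight
a_mcc, sigma = a_ls, 1.0
for _ in range(50):
    w = np.exp(-(y - a_mcc * x) ** 2 / (2 * sigma ** 2))
    a_mcc = np.sum(w * x * y) / np.sum(w * x * x)

print(a_ls, a_mcc)   # a_mcc stays close to a_true, a_ls does not
```

    The least-squares estimate is biased by the arbitrarily large outliers, while the MCC fixed point converges near the true parameter, mirroring the bound discussed in the abstract.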

    Diffusion Maximum Correntropy Criterion Algorithms for Robust Distributed Estimation

    Robust diffusion adaptive estimation algorithms based on the maximum correntropy criterion (MCC), including adaptation-to-combination MCC and combination-to-adaptation MCC, are developed to deal with distributed estimation over networks in impulsive (long-tailed) noise environments. The cost functions used in distributed estimation are in general based on the mean square error (MSE) criterion, which is desirable when the measurement noise is Gaussian. In non-Gaussian situations, such as the impulsive-noise case, MCC-based methods may achieve much better performance than the MSE methods, as they take into account higher-order statistics of the error distribution. The proposed methods can also outperform the robust diffusion least mean p-power (DLMP) and diffusion minimum error entropy (DMEE) algorithms. The mean and mean square convergence analyses of the new algorithms are also carried out.
    Comment: 17 pages, 10 figures
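    A minimal adaptation-to-combination sketch in the spirit of the algorithms above (not the paper's exact method): each node takes an MCC-weighted gradient step, then averages the intermediate estimates over its neighbourhood. The ring topology, uniform combination weights, step size, and kernel width are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
w_true = np.array([0.5, -1.0, 0.8])
n_nodes, mu, sigma = 4, 0.05, 1.0
# ring network: each node combines with itself and its two neighbours
neighbours = [[3, 0, 1], [0, 1, 2], [1, 2, 3], [2, 3, 0]]
W = np.zeros((n_nodes, 3))

for _ in range(2000):
    psi = np.empty_like(W)
    for k in range(n_nodes):
        x = rng.normal(size=3)
        # impulsive (long-tailed) measurement noise: rare large spikes
        v = rng.normal(scale=0.01) + (rng.random() < 0.05) * rng.normal(scale=10)
        e = w_true @ x + v - W[k] @ x
        # adaptation: LMS-style step scaled by the Gaussian kernel of the error,
        # which suppresses updates driven by impulsive samples
        psi[k] = W[k] + mu * np.exp(-e ** 2 / (2 * sigma ** 2)) * e * x
    # combination: uniform averaging of neighbourhood estimates
    W = np.array([psi[neighbours[k]].mean(axis=0) for k in range(n_nodes)])

print(np.linalg.norm(W - w_true, axis=1))  # small steady-state error at every node
```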

    Bias-Compensated Normalized Maximum Correntropy Criterion Algorithm for System Identification with Noisy Input

    This paper proposes a bias-compensated normalized maximum correntropy criterion (BCNMCC) algorithm, characterized by its low steady-state misalignment, for system identification with noisy input in an impulsive output noise environment. The normalized maximum correntropy criterion (NMCC) is derived from a correntropy-based cost function, which is rather robust with respect to impulsive noises. To deal with the noisy input, we introduce a bias-compensated vector (BCV) into the NMCC algorithm, and then an unbiasedness criterion and some reasonable assumptions are used to compute the BCV. Taking advantage of the BCV, the bias caused by the input noise can be effectively suppressed. System identification simulation results demonstrate that the proposed BCNMCC algorithm can outperform other related algorithms with noisy input, especially in an impulsive output noise environment.
    Comment: 14 pages, 4 figures

    Maximum correntropy criterion based sparse adaptive filtering algorithms for robust channel estimation under non-Gaussian environments

    Sparse adaptive channel estimation is one of the most important topics in broadband wireless communication systems due to its simplicity and robustness. So far, many sparsity-aware channel estimation algorithms have been developed based on the well-known minimum mean square error (MMSE) criterion, such as the zero-attracting least mean square (ZALMS) algorithm, which are robust under the Gaussian assumption. In non-Gaussian environments, however, these methods are often no longer robust, especially when systems are disturbed by random impulsive noises. To address this problem, we propose in this work a robust sparse adaptive filtering algorithm using the correntropy induced metric (CIM) penalized maximum correntropy criterion (MCC), rather than the conventional MMSE criterion, for robust channel estimation. Specifically, MCC is utilized to mitigate the impulsive noise while CIM is adopted to exploit the channel sparsity efficiently. Both theoretical analysis and computer simulations are provided to corroborate the proposed methods.
    Comment: 29 pages, 12 figures, accepted by the Journal of the Franklin Institute
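    A sketch of why the CIM works as a sparsity penalty (the kernel width below is an illustrative assumption, and the paper's normalization may differ): with a small width, each near-zero coefficient contributes almost nothing while each large coefficient contributes roughly a constant, so the squared CIM to the zero vector behaves like an l_0-norm surrogate:

```python
import numpy as np

def cim_sq(w, sigma=0.1):
    """Squared correntropy induced metric between w and the zero vector.

    Entries near zero contribute ~0; entries much larger than sigma
    each contribute ~1/len(w), approximating a scaled l_0 norm.
    """
    g = np.exp(-np.asarray(w, dtype=float) ** 2 / (2 * sigma ** 2))
    return float(np.mean(1.0 - g))

sparse = np.array([0.0, 0.0, 1.0, 0.0])
dense = np.array([0.5, 0.5, 0.5, 0.5])
print(cim_sq(sparse), cim_sq(dense))   # the sparse vector is penalized less
```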

    Minimum Error Entropy Kalman Filter

    To date, most linear and nonlinear Kalman filters (KFs) have been developed under the Gaussian assumption and the well-known minimum mean square error (MMSE) criterion. In order to improve robustness with respect to impulsive (or heavy-tailed) non-Gaussian noises, the maximum correntropy criterion (MCC) has recently been used to replace the MMSE criterion in developing several robust Kalman-type filters. To deal with more complicated non-Gaussian noises, such as noises from multimodal distributions, in the present paper we develop a new Kalman-type filter, called the minimum error entropy Kalman filter (MEE-KF), by using the minimum error entropy (MEE) criterion instead of the MMSE or MCC. Similar to the MCC-based KFs, the proposed filter is an online algorithm with a recursive process, in which the propagation equations give prior estimates of the state and covariance matrix, and a fixed-point algorithm updates the posterior estimates. In addition, the minimum error entropy extended Kalman filter (MEE-EKF) is also developed for performance improvement in nonlinear situations. The high accuracy and strong robustness of MEE-KF and MEE-EKF are confirmed by experimental results.
    Comment: 12 pages, 4 figures

    Maximum Correntropy Adaptive Filtering Approach for Robust Compressive Sensing Reconstruction

    Robust compressive sensing (CS) reconstruction has become an attractive research topic in recent years. Robust CS aims to reconstruct sparse signals under non-Gaussian (i.e., heavy-tailed) noises, where traditional CS reconstruction algorithms may perform very poorly due to utilizing the l_2 norm of the residual vector in optimization. Most existing robust CS reconstruction algorithms are based on greedy pursuit methods or convex relaxation approaches. Recently, the adaptive filtering framework has been introduced to deal with CS reconstruction, showing desirable efficiency and reconstruction performance under Gaussian noise. In this paper, we propose an adaptive-filtering-based robust CS reconstruction algorithm, called the l_0-regularized maximum correntropy criterion (l_0-MCC) algorithm, which combines the adaptive filtering framework and the maximum correntropy criterion (MCC). MCC has recently been successfully used in adaptive filtering due to its robustness to impulsive non-Gaussian noises and low computational complexity. We theoretically analyze the stability of the proposed l_0-MCC algorithm. A mini-batch-based l_0-MCC (MB-l_0-MCC) algorithm is further developed to speed up the convergence. Comparison with existing robust CS reconstruction algorithms is conducted via simulations, showing that the proposed l_0-MCC and MB-l_0-MCC algorithms can achieve significantly better performance than other algorithms.
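    A hedged sketch combining an MCC gradient step with a zero-attracting term derived from an exponential l_0-norm approximation, in the spirit of the algorithm above (the specific attractor, step sizes, and sparsity level are assumptions for illustration, not the paper's exact scheme):

```python
import numpy as np

rng = np.random.default_rng(3)
h = np.zeros(16)
h[[2, 7]] = [1.0, -0.5]                     # sparse system to identify
w = np.zeros(16)
mu, sigma, beta, rho = 0.05, 1.0, 5.0, 1e-4  # illustrative parameters

for _ in range(5000):
    x = rng.normal(size=16)
    # impulsive (heavy-tailed) noise: small Gaussian plus rare large spikes
    v = rng.normal(scale=0.01) + (rng.random() < 0.05) * rng.normal(scale=10)
    e = h @ x + v - w @ x
    # MCC gradient step: the Gaussian kernel downweights impulsive errors
    w += mu * np.exp(-e ** 2 / (2 * sigma ** 2)) * e * x
    # zero attractor from the exponential l_0 approximation
    # ||w||_0 ~ sum(1 - exp(-beta * |w_i|)); it shrinks only small taps
    w -= rho * beta * np.sign(w) * np.exp(-beta * np.abs(w))

print(np.round(w, 2))   # close to the sparse vector h despite the impulses
```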

    Maximum Correntropy Derivative-Free Robust Kalman Filter and Smoother

    We consider the problem of robust estimation, involving filtering and smoothing, for nonlinear state space models disturbed by heavy-tailed impulsive noises. To deal with heavy-tailed noises and improve the robustness of the traditional nonlinear Gaussian Kalman filter and smoother, we propose in this work a general framework of robust filtering and smoothing that adopts a new maximum correntropy criterion to replace the minimum mean square error criterion for state estimation. To facilitate understanding, we present our robust framework in conjunction with the cubature Kalman filter and smoother. A half-quadratic optimization method is utilized to solve the formulated robust estimation problems, leading to a new maximum correntropy derivative-free robust Kalman filter and smoother. Simulation results show that the proposed methods achieve a substantial performance improvement over the conventional and existing robust ones, with only a slight increase in computational time.

    Quantized Minimum Error Entropy Criterion

    Compared with traditional learning criteria such as the mean square error (MSE), the minimum error entropy (MEE) criterion is superior in nonlinear and non-Gaussian signal processing and machine learning. The argument of the logarithm in Renyi's entropy estimator, called the information potential (IP), is a popular MEE cost in information theoretic learning (ITL). The computational complexity of IP is, however, quadratic in the number of samples due to the double summation. This creates computational bottlenecks, especially for large-scale datasets. To address this problem, in this work we propose an efficient quantization approach to reduce the computational burden of IP, which decreases the complexity from O(N^2) to O(MN) with M << N. The new learning criterion is called the quantized MEE (QMEE). Some basic properties of QMEE are presented. Illustrative examples are provided to verify the excellent performance of QMEE.
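    The quantization idea can be sketched as follows (the simple online quantizer, its threshold, and the kernel width are illustrative assumptions): error samples within a threshold eps of an existing codeword merge into it, and the inner sum of the IP then runs over the M codewords weighted by their counts, giving O(MN) kernel evaluations instead of O(N^2):

```python
import numpy as np

def ip(e, sigma=1.0):
    """Information potential: O(N^2) double sum of Gaussian kernels."""
    d = e[:, None] - e[None, :]
    return float(np.mean(np.exp(-d ** 2 / (2 * sigma ** 2))))

def quantized_ip(e, sigma=1.0, eps=0.1):
    """Quantized IP: the inner sum runs over M << N codewords."""
    codebook, counts = [], []
    for s in e:                          # simple online quantizer
        if codebook:
            gaps = np.abs(np.array(codebook) - s)
            m = int(np.argmin(gaps))
            if gaps[m] <= eps:           # merge into the nearest codeword
                counts[m] += 1
                continue
        codebook.append(s)               # open a new codeword
        counts.append(1)
    c, M = np.array(codebook), np.array(counts)
    # (N, M) kernel matrix weighted by codeword counts: O(MN) evaluations
    k = np.exp(-(e[:, None] - c[None, :]) ** 2 / (2 * sigma ** 2))
    return float(np.mean(k @ M) / len(e))

rng = np.random.default_rng(4)
e = rng.normal(size=2000)
print(ip(e), quantized_ip(e))   # nearly equal, with far fewer kernel evaluations
```

    For Gaussian errors the two estimates agree closely while the quantized version evaluates only M codewords per sample, which is the complexity reduction the abstract describes.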