    Robustness analysis of a Maximum Correntropy framework for linear regression

    In this paper we formulate a solution of the robust linear regression problem in a general framework of correntropy maximization. Our formulation yields a unified class of estimators which includes the Gaussian and Laplacian kernel-based correntropy estimators as special cases. An analysis of the robustness properties is then provided. The analysis includes a quantitative characterization of the degree of informativity of the regression, which is appropriate for studying the stability of the estimator. Using this tool, a sufficient condition is given under which the parametric estimation error is shown to be bounded. An explicit expression of the bound is derived, and its numerical computation is discussed. For illustration purposes, two special cases are studied numerically.
    Comment: 10 pages, 5 figures. To appear in Automatica.
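
    As a rough illustration of the unified objective above, here is a minimal sketch of the empirical correntropy of the regression residuals for the Gaussian and Laplacian kernel special cases (the bandwidth sigma, the toy data, and the use of a generic optimizer are placeholder choices, not the paper's setup):

        import numpy as np
        from scipy.optimize import minimize

        def correntropy_objective(theta, X, y, sigma=1.0, kernel="gaussian"):
            """Empirical correntropy between y and X @ theta (higher is better)."""
            e = y - X @ theta                       # regression residuals
            if kernel == "gaussian":
                k = np.exp(-e**2 / (2 * sigma**2))
            elif kernel == "laplacian":
                k = np.exp(-np.abs(e) / sigma)
            else:
                raise ValueError("unknown kernel")
            return k.mean()

        # Toy data with gross outliers; the MCC estimate maximizes the objective.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 3))
        theta_true = np.array([1.0, -2.0, 0.5])
        y = X @ theta_true + 0.1 * rng.normal(size=200)
        y[:10] += 20.0                              # heavy outliers
        x0 = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares starting point
        est = minimize(lambda t: -correntropy_objective(t, X, y), x0)
        print(est.x)  # typically much closer to theta_true than plain least squares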

    Partial Maximum Correntropy Regression for Robust Trajectory Decoding from Noisy Epidural Electrocorticographic Signals

    Partial Least Squares Regression (PLSR) has shown admirable competence for predicting continuous variables from inter-correlated brain recordings in brain-computer interfaces. However, PLSR is formulated on the least-squares criterion and is therefore not robust to noise. The aim of this study is to propose a robust implementation of PLSR. To this end, the maximum correntropy criterion (MCC) is used to derive a new robust variant of PLSR, called Partial Maximum Correntropy Regression (PMCR). Half-quadratic optimization is utilized to compute the robust projectors for dimensionality reduction, and the regression coefficients are optimized by a fixed-point approach. We evaluate the proposed PMCR on a synthetic example and on the public Neurotycho electrocorticography (ECoG) datasets. The experimental results demonstrate that the proposed PMCR achieves better prediction performance than the conventional PLSR and existing variants, measured by three different performance indicators, in high-dimensional and noisy regression tasks. PMCR suppresses the performance degradation caused by adverse noise, improving the decoding robustness of the brain-computer interface.
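
    The fixed-point update for the regression coefficients can be viewed as an iteratively reweighted least-squares scheme in which each sample is weighted by its Gaussian-kernel value, which is the same idea half-quadratic optimization exploits. Below is a minimal sketch of that generic MCC fixed-point iteration (not the full PMCR with its latent projectors; sigma and the iteration count are placeholder choices):

        import numpy as np

        def mcc_regression_fixed_point(X, y, sigma=1.0, n_iter=50):
            """Fixed-point (half-quadratic) iteration for Gaussian-kernel MCC regression.

            Each step solves a weighted least-squares problem in which samples with
            large residuals receive exponentially small weights.
            """
            theta = np.linalg.lstsq(X, y, rcond=None)[0]      # ordinary least-squares start
            for _ in range(n_iter):
                e = y - X @ theta
                w = np.exp(-e**2 / (2 * sigma**2))            # per-sample kernel weights
                Xw = X * w[:, None]
                theta = np.linalg.solve(X.T @ Xw, Xw.T @ y)   # weighted least squares
            return theta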

    Investigation of the performance of multi-input multi-output detectors based on deep learning in non-Gaussian environments

    The next generation of wireless cellular communication networks must be energy efficient, extremely reliable, and low latency, which motivates the use of algorithms based on deep neural networks (DNNs) that achieve better bit error rate (BER) or symbol error rate (SER) performance than traditional, computationally complex multi-antenna (multi-input multi-output, MIMO) detectors. This paper examines deep neural networks and deep iterative detectors such as OAMP-Net, built on information-theoretic criteria such as the maximum correntropy criterion (MCC), for implementing MIMO detectors in non-Gaussian environments. The results show that the proposed method achieves better BER and SER performance.
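
    In such detectors the MCC typically replaces a mean-squared-error training loss, down-weighting the large errors produced by impulsive, non-Gaussian noise. A minimal sketch of a correntropy loss in PyTorch under that assumption (the bandwidth sigma and the tensor names are placeholders; this is not the paper's detector architecture):

        import torch

        def correntropy_loss(pred, target, sigma=1.0):
            """Negative empirical correntropy; minimizing it maximizes correntropy.

            Unlike MSE, large symbol errors caused by impulsive (non-Gaussian)
            noise are exponentially down-weighted instead of dominating the loss.
            """
            e = pred - target
            return -torch.exp(-e.pow(2) / (2 * sigma ** 2)).mean()

        # Hypothetical usage inside a training loop (detector, y_received and
        # x_transmitted are placeholder names, not from the paper):
        #   loss = correntropy_loss(detector(y_received), x_transmitted)
        #   loss.backward()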

    Maximum Correntropy Ensemble Kalman Filter

    In this article, a robust ensemble Kalman filter (EnKF), called MC-EnKF, is proposed for nonlinear state-space models to deal with filtering problems involving non-Gaussian observation noise. MC-EnKF is derived from the maximum correntropy criterion (MCC) with some technical approximations. Moreover, we propose an effective adaptive strategy for kernel bandwidth selection. In addition, the relation between the common EnKF and MC-EnKF is established: MC-EnKF converges to the common EnKF when the kernel bandwidth tends to infinity. This justification provides a complementary understanding of kernel bandwidth selection for MC-EnKF. In experiments, non-Gaussian observation noise significantly degrades the performance of the common EnKF for both linear and nonlinear systems, whereas the proposed MC-EnKF with a suitable kernel bandwidth maintains good performance at only a marginal increase in computational cost, demonstrating its robustness and efficiency against non-Gaussian observation noise.
    Comment: Accepted by the 62nd IEEE Conference on Decision and Control (CDC 2023).
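
    One common way to combine MCC with Kalman-type updates is to inflate the observation-noise covariance by the inverse of a Gaussian-kernel weight computed from the innovation; as the bandwidth grows, the weights approach one and the update reduces to the standard stochastic EnKF analysis, consistent with the limiting relation described above. A rough sketch of a single analysis step under these assumptions (diagonal R, linear observation operator H, all names placeholders; this is not necessarily the paper's exact derivation):

        import numpy as np

        def mcc_enkf_analysis(ensemble, y_obs, H, R, sigma=2.0, rng=None):
            """One correntropy-weighted stochastic EnKF analysis step (sketch).

            ensemble: (n_state, n_members) forecast ensemble
            y_obs:    (n_obs,) observation vector
            H:        (n_obs, n_state) linear observation operator
            R:        (n_obs, n_obs) nominal (diagonal) observation-noise covariance
            """
            rng = rng or np.random.default_rng()
            n_members = ensemble.shape[1]
            x_mean = ensemble.mean(axis=1, keepdims=True)
            A = ensemble - x_mean                              # state anomalies
            P = A @ A.T / (n_members - 1)                      # sample forecast covariance

            # Gaussian-kernel weights from the innovation: small weights inflate R,
            # so outlying observations influence the update less.  As sigma -> inf,
            # w -> 1 and the step reduces to the standard stochastic EnKF update.
            innovation = y_obs - H @ x_mean.ravel()
            w = np.exp(-innovation**2 / (2 * sigma**2))
            R_w = np.diag(np.diag(R) / np.maximum(w, 1e-8))    # per-observation inflation

            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R_w)     # Kalman gain
            perturbed = y_obs[:, None] + rng.multivariate_normal(
                np.zeros(len(y_obs)), R_w, size=n_members).T
            return ensemble + K @ (perturbed - H @ ensemble)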