
    Performance Analysis of l_0 Norm Constraint Least Mean Square Algorithm

    As one of the recently proposed algorithms for sparse system identification, the l_0 norm constraint Least Mean Square (l_0-LMS) algorithm modifies the cost function of the traditional method with a penalty on tap-weight sparsity. The performance of l_0-LMS is quite attractive compared with its various precursors. However, there has been no detailed study of its performance. This paper presents a comprehensive and thorough theoretical performance analysis of l_0-LMS for white Gaussian input data, based on some reasonable assumptions. Expressions for the steady-state mean square deviation (MSD) are derived and discussed with respect to the algorithm parameters and system sparsity. A parameter selection rule is established for achieving the best performance. Using a Taylor series approximation, the instantaneous behavior is also derived. In addition, the relationship between l_0-LMS and some previous works, as well as sufficient conditions for l_0-LMS to accelerate convergence, are established. Finally, all of the theoretical results are compared with simulations and are shown to agree well over a wide range of parameter settings.
    Comment: 31 pages, 8 figures
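    The abstract does not reproduce the update equation, but the idea of adding a tap-weight sparsity penalty to the LMS cost can be illustrated with a short sketch. Below, the l_0 norm is replaced by the commonly used approximation sum(1 - exp(-beta*|w_i|)), whose gradient contributes a zero-point attraction term; the parameter names and values (mu, kappa, beta) are illustrative assumptions rather than the paper's recommended settings.

```python
import numpy as np

def l0_lms_identify(x, d, num_taps, mu=0.005, kappa=5e-4, beta=10.0):
    """Minimal sketch of an l_0-LMS adaptive filter for sparse identification.

    The l_0 "norm" is approximated by sum(1 - exp(-beta*|w|)); its gradient
    adds a zero-point attraction term to the ordinary LMS update.  All
    parameter values here are illustrative assumptions.
    """
    w = np.zeros(num_taps)          # tap-weight estimate
    x_buf = np.zeros(num_taps)      # input regressor (most recent sample first)
    for n in range(len(x)):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = x[n]
        e = d[n] - w @ x_buf        # a-priori estimation error
        # standard LMS gradient step plus sparsity-inducing zero attractor
        zero_attract = beta * np.sign(w) * np.exp(-beta * np.abs(w))
        w = w + mu * e * x_buf - kappa * zero_attract
    return w
```

    A typical use would drive the filter with white Gaussian input x and the noisy output d of a sparse FIR system, then compare the steady-state MSD of the returned w against standard LMS.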

    Stochastic Behavior of the Nonnegative Least Mean Fourth Algorithm for Stationary Gaussian Inputs and Slow Learning

    Some system identification problems impose nonnegativity constraints on the parameters to be estimated, due to inherent physical characteristics of the unknown system. The nonnegative least-mean-square (NNLMS) algorithm and its variants make it possible to address this problem in an online manner. A nonnegative least-mean-fourth (NNLMF) algorithm has recently been proposed to improve the performance of these algorithms in cases where the measurement noise is not Gaussian. This paper provides a first theoretical analysis of the stochastic behavior of the NNLMF algorithm for stationary Gaussian inputs and slow learning. Simulation results illustrate the accuracy of the proposed analysis.
    Comment: 11 pages, 8 figures, submitted for publication
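    As a rough illustration of the algorithm family being analyzed, the sketch below combines a least-mean-fourth error criterion with the multiplicative weight scaling typically used in NNLMS-type algorithms to keep the estimates nonnegative. The update form, step size, and initialization are assumptions for illustration, not the exact recursion analyzed in the paper.

```python
import numpy as np

def nnlmf_identify(x, d, num_taps, mu=1e-4):
    """Sketch of a nonnegative least-mean-fourth (NNLMF) adaptive filter.

    The gradient step is scaled elementwise by the current (nonnegative)
    weights, so taps approaching zero are slowed rather than driven negative;
    the error enters through its third power because the cost is the fourth
    moment of the error.  Values here are illustrative assumptions.
    """
    w = np.full(num_taps, 1e-2)     # small positive initial weights
    x_buf = np.zeros(num_taps)
    for n in range(len(x)):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = x[n]
        e = d[n] - w @ x_buf
        # cubed error (least-mean-fourth) with weight-scaled update
        w = w + mu * (e ** 3) * w * x_buf
    return w
```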

    A Robust Zero-point Attraction LMS Algorithm on Near Sparse System Identification

    The newly proposed l_1 norm constraint zero-point attraction Least Mean Square algorithm (ZA-LMS) demonstrates excellent performance on exactly sparse system identification. However, ZA-LMS offers less of an advantage over standard LMS when the system is only near-sparse. Thus, in this paper, near-sparse system modeling with the Generalized Gaussian Distribution is first proposed, and sparsity is defined accordingly. Second, two modifications are made to the ZA-LMS algorithm. The l_1 norm penalty is replaced by a partial l_1 norm in the cost function, enhancing robustness without increasing the computational complexity. Moreover, the zero-point attraction term is weighted by the magnitude of the estimation error, which adjusts the zero-point attraction force dynamically. Combining the two improvements yields the Dynamic Windowing ZA-LMS (DWZA-LMS) algorithm, which shows better performance on near-sparse system identification. In addition, the mean square performance of the DWZA-LMS algorithm is analyzed. Finally, computer simulations demonstrate the effectiveness of the proposed algorithm and verify the results of the theoretical analysis.
    Comment: 20 pages, 11 figures
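    The two modifications described above can be sketched directly from the abstract: restrict the l_1 zero attractor to taps lying in a small window around zero (the partial l_1 norm) and scale it by the instantaneous error magnitude. The window threshold and step sizes below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def dwza_lms_identify(x, d, num_taps, mu=0.005, rho=5e-4, window=0.05):
    """Sketch of the Dynamic Windowing ZA-LMS idea from the abstract.

    Plain ZA-LMS subtracts rho*sign(w) from every tap (an l_1 zero attractor).
    Here the attractor (i) acts only on taps whose magnitude lies inside a
    small window around zero and (ii) is scaled by the error magnitude.
    """
    w = np.zeros(num_taps)
    x_buf = np.zeros(num_taps)
    for n in range(len(x)):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = x[n]
        e = d[n] - w @ x_buf
        in_window = np.abs(w) <= window          # taps subject to attraction
        attract = np.abs(e) * np.sign(w) * in_window
        w = w + mu * e * x_buf - rho * attract
    return w
```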