720 research outputs found
Performance Analysis of l_0 Norm Constraint Least Mean Square Algorithm
As one of the recently proposed algorithms for sparse system identification, the
l_0 norm constraint Least Mean Square (l_0-LMS) algorithm modifies the cost
function of the traditional method with a penalty on tap-weight sparsity. The
performance of l_0-LMS is quite attractive compared with its various
precursors. However, there has been no detailed study of its performance. This
paper presents a comprehensive theoretical performance analysis of
l_0-LMS for white Gaussian input data based on some reasonable assumptions.
Expressions for steady-state mean square deviation (MSD) are derived and
discussed with respect to algorithm parameters and system sparsity. The
parameter selection rule is established for achieving the best performance.
Approximated with a Taylor series, the instantaneous behavior is also derived. In
addition, the relationship between l_0-LMS and some previous methods, and the
sufficient conditions for l_0-LMS to accelerate convergence, are established.
Finally, all of the theoretical results are compared with simulations and are
shown to agree well over a wide range of parameter settings.
Comment: 31 pages, 8 figures
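The penalty described above can be illustrated with a short simulation. The sketch below adds the well-known zero-point attractor (the gradient of the approximation sum(1 - exp(-beta*|w|)) of the l_0 norm) to a plain LMS update; the step size, penalty weight, and steepness values are hypothetical choices for illustration, not parameters taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse unknown system: 64 taps, only 4 of them nonzero.
N = 64
h = np.zeros(N)
h[rng.choice(N, 4, replace=False)] = rng.standard_normal(4)

mu = 0.005     # LMS step size (hypothetical)
kappa = 5e-4   # weight of the sparsity penalty (hypothetical)
beta = 5.0     # steepness of the l_0-norm approximation (hypothetical)

w = np.zeros(N)      # adaptive filter estimate
x_buf = np.zeros(N)  # tap-delay line of the white Gaussian input
for n in range(5000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()
    d = h @ x_buf + 0.01 * rng.standard_normal()  # noisy desired signal
    e = d - w @ x_buf                             # a-priori error
    # Standard LMS gradient step plus the zero-point attractor, i.e. the
    # gradient of the penalty sum(1 - exp(-beta * |w|)) on the tap weights.
    w += mu * e * x_buf - kappa * beta * np.sign(w) * np.exp(-beta * np.abs(w))

msd = np.sum((w - h) ** 2)  # steady-state mean square deviation
```

Because the attractor decays exponentially in |w|, it pulls only small coefficients toward zero and leaves the large active taps essentially unbiased, which is what makes the steady-state MSD attractive on sparse systems.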
Sparsity-Aware Adaptive Algorithms Based on Alternating Optimization with Shrinkage
This letter proposes a novel sparsity-aware adaptive filtering scheme and
algorithms based on an alternating optimization strategy with shrinkage. The
proposed scheme employs a two-stage structure that consists of an alternating
optimization of a diagonally-structured matrix that speeds up the convergence
and an adaptive filter with a shrinkage function that forces the coefficients
with small magnitudes to zero. We devise alternating optimization least-mean
square (LMS) algorithms for the proposed scheme and analyze its mean-square
error. Simulations for a system identification application show that the
proposed scheme and algorithms outperform existing sparsity-aware algorithms in
convergence and tracking.
Comment: 10 pages, 3 figures. IEEE Signal Processing Letters, 201
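The two-stage structure described in this abstract can be sketched roughly as follows: an LMS-type update alternates between a diagonally-structured scaling of the input and the filter itself, and a soft-threshold shrinkage zeroes small coefficients after each step. All step sizes and the threshold are hypothetical illustration values, and the update rules are a plausible reading of the scheme rather than the authors' exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sparse unknown system with three active taps.
N = 32
h = np.zeros(N)
h[[3, 10, 20]] = [1.0, -0.5, 0.8]

mu_w, mu_p = 0.01, 0.001  # step sizes for filter and diagonal factor (hypothetical)
tau = 1e-4                # shrinkage threshold (hypothetical)

w = np.zeros(N)   # adaptive filter
p = np.ones(N)    # diagonal of the structured matrix
x_buf = np.zeros(N)
for n in range(4000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()
    d = h @ x_buf + 0.01 * rng.standard_normal()
    # Alternating optimization: gradient step on the diagonal factor first,
    # then on the filter, each minimizing the instantaneous squared error.
    e = d - w @ (p * x_buf)
    p += mu_p * e * w * x_buf
    e = d - w @ (p * x_buf)
    w += mu_w * e * p * x_buf
    # Shrinkage stage: force coefficients with small magnitudes toward zero.
    w = np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)

msd = np.sum((p * w - h) ** 2)  # deviation of the effective filter p*w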
A Robust Zero-point Attraction LMS Algorithm on Near Sparse System Identification
The newly proposed l_1 norm constraint zero-point attraction Least Mean
Square algorithm (ZA-LMS) demonstrates excellent performance on exact sparse
system identification. However, ZA-LMS has less advantage over standard LMS
when the system is near sparse. Thus, in this paper, firstly the near sparse
system modeling by the Generalized Gaussian Distribution is recommended, where the
sparsity is defined accordingly. Secondly, two modifications to the ZA-LMS
algorithm are made. The l_1 norm penalty is replaced by a partial l_1
norm in the cost function, enhancing robustness without increasing the
computational complexity. Moreover, the zero-point attraction term is weighted
by the magnitude of the estimation error, which adjusts the zero-point attraction
force dynamically. By combining the two improvements, Dynamic Windowing ZA-LMS
(DWZA-LMS) algorithm is further proposed, which shows better performance on
near sparse system identification. In addition, the mean square performance of
DWZA-LMS algorithm is analyzed. Finally, computer simulations demonstrate the
effectiveness of the proposed algorithm and verify the result of theoretical
analysis.
Comment: 20 pages, 11 figures
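The two modifications described above can be sketched in a few lines: a window restricts the attractor to small-magnitude taps (a partial-norm penalty), and the attraction force is scaled by the instantaneous error magnitude so it fades as the filter converges. The window width, attraction strength, and step size below are hypothetical illustration values, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(2)

# Near sparse system: a few dominant taps plus small nonzero tails.
N = 32
h = 0.01 * rng.standard_normal(N)
h[[2, 9, 17]] = [1.0, -0.7, 0.5]

mu = 0.01    # LMS step size (hypothetical)
rho = 1e-4   # zero-point attraction strength (hypothetical)
eps = 0.05   # window: only taps with |w| < eps are attracted (hypothetical)

w = np.zeros(N)
x_buf = np.zeros(N)
for n in range(4000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()
    d = h @ x_buf + 0.01 * rng.standard_normal()
    e = d - w @ x_buf
    # Partial-norm window keeps the attractor away from the dominant taps;
    # weighting by |e| makes the attraction force shrink near convergence.
    window = np.abs(w) < eps
    w += mu * e * x_buf - rho * np.abs(e) * np.sign(w) * window

msd = np.sum((w - h) ** 2)
```

The error-magnitude weighting is what gives the "dynamic" behavior: early on, when the error is large, small taps are pulled strongly toward zero, while at steady state the residual bias on the near-zero tail becomes negligible.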