As one of the recently proposed algorithms for sparse system identification,
the l0 norm constraint Least Mean Square (l0-LMS) algorithm modifies the cost
function of traditional LMS with a penalty on tap-weight sparsity. Its
performance is quite attractive compared with that of its various precursors;
however, it has not yet been studied in detail. This paper presents a
comprehensive theoretical performance analysis of l0-LMS for white Gaussian
input data, based on several reasonable assumptions.
Expressions for steady-state mean square deviation (MSD) are derived and
discussed with respect to the algorithm parameters and the system sparsity. A
parameter selection rule is established for achieving the best performance.
The instantaneous behavior is also derived using a Taylor series
approximation. In addition, the relationship between l0-LMS and several
previous algorithms is clarified, and sufficient conditions for l0-LMS to
accelerate convergence are established.
Finally, all of the theoretical results are compared with simulations and are
shown to agree well over a wide range of parameter settings.
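
The abstract does not spell out the update equation; as a point of reference, the sketch below illustrates one common formulation of l0-LMS for sparse system identification, in which the l0 norm penalty is approximated by an exponential function and yields a zero-attraction term added to the standard LMS update. The function name, parameter values, and toy sparse system are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def l0_lms_identify(x, d, num_taps, mu=0.01, kappa=2e-5, beta=5.0):
    """Sketch of an l0-LMS adaptive filter (assumed formulation).

    x        : input signal (1-D array)
    d        : desired signal, i.e. unknown-system output plus noise
    num_taps : length of the adaptive filter
    mu       : LMS step size
    kappa    : strength of the sparsity (zero-attraction) penalty
    beta     : shape parameter of the exponential l0-norm approximation
    """
    w = np.zeros(num_taps)                            # adaptive tap weights
    for n in range(num_taps - 1, len(x)):
        x_n = x[n - num_taps + 1:n + 1][::-1]         # regressor [x(n), ..., x(n-L+1)]
        e = d[n] - w @ x_n                            # estimation error
        # Standard LMS gradient step plus a zero-attraction term, obtained by
        # differentiating the exponential approximation of the l0-norm penalty,
        # which pulls small coefficients toward zero.
        zero_attract = beta * np.sign(w) * np.exp(-beta * np.abs(w))
        w = w + mu * e * x_n - kappa * zero_attract
    return w

# Hypothetical usage: identify a sparse FIR system driven by white Gaussian input.
rng = np.random.default_rng(0)
h = np.zeros(64)
h[[3, 17, 40]] = [1.0, -0.5, 0.3]                     # sparse unknown system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = l0_lms_identify(x, d, num_taps=64)
```

In this assumed form, the zero-attraction strength kappa trades steady-state misadjustment against how strongly near-zero taps are driven to exactly zero, which is the kind of parameter/sparsity trade-off the paper's MSD analysis addresses.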