As one of the recently proposed algorithms for sparse system identification, the
l0β-norm-constraint Least Mean Square (l0β-LMS) algorithm modifies the cost
function of the traditional LMS algorithm with a penalty on tap-weight sparsity.
The performance of l0β-LMS is quite attractive compared with that of its various
precursors; however, it has not yet been studied in detail. This
paper presents a comprehensive and thorough theoretical performance analysis of
l0β-LMS for white Gaussian input data, based on several reasonable assumptions.
Expressions for steady-state mean square deviation (MSD) are derived and
discussed with respect to algorithm parameters and system sparsity. A
parameter selection rule is established for achieving the best performance.
The instantaneous behavior is also derived using a Taylor series approximation. In
addition, the relationship between l0β-LMS and several prior algorithms is
examined, and sufficient conditions for l0β-LMS to accelerate convergence are established.
Finally, all of the theoretical results are compared with simulations and are
shown to agree well over a wide range of parameter settings.

Comment: 31 pages, 8 figures