Reweighted lp Constraint LMS-Based Adaptive Sparse Channel Estimation for Cooperative Communication System
This paper studies the issue of sparsity adaptive channel reconstruction in time-varying cooperative
communication networks using the amplify-and-forward transmission scheme. A new sparsity adaptive system
identification method is proposed, namely the reweighted lp-norm (0 < p < 1) penalized least mean square (LMS) algorithm.
The main idea of the algorithm is to add an lp-norm sparsity penalty to the cost function of the LMS algorithm. By doing
so, the weight factor becomes a balance parameter of the associated lp-norm adaptive sparse system identification.
Subsequently, the steady state of the coefficient misalignment vector is derived theoretically, and a performance upper
bound is provided which serves as a sufficient condition for precise reweighted lp-norm LMS channel estimation.
With this upper bound, we prove that the lp-norm (0 < p < 1) sparsity-inducing cost function is superior to the
reweighted l1-norm. An optimal selection of p for the lp-norm problem is studied to recover various sparse channel
vectors. Several experiments verify that the simulation results agree well with the theoretical analysis, and thus
demonstrate that the proposed algorithm has faster convergence and better steady-state behavior than other LMS
algorithms.
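The core idea described above, adding a sparsity penalty to the LMS cost function so that a zero-attraction term appears in the coefficient update, can be sketched as follows. This is a minimal illustrative implementation of an lp-norm (0 < p < 1) penalized LMS filter, not the paper's exact algorithm; all parameter names and default values (mu, rho, p, eps) are assumptions chosen for the example.

```python
import numpy as np

def lp_penalized_lms(x, d, num_taps, mu=0.01, rho=2e-4, p=0.5, eps=0.05):
    """Sketch of an lp-norm (0 < p < 1) penalized LMS filter.

    x: input signal, d: desired signal. Besides the usual LMS gradient
    step, each update subtracts a zero-attraction term
    rho * sign(w) / (eps + |w|)**(1 - p), which shrinks small taps
    toward zero and thus promotes a sparse coefficient estimate.
    Parameter names and values are illustrative, not from the paper.
    """
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]  # tap-input vector [x[n], x[n-1], ...]
        e = d[n] - w @ u                     # instantaneous estimation error
        # LMS gradient step plus lp-norm zero attractor
        w += mu * e * u - rho * np.sign(w) / (eps + np.abs(w)) ** (1 - p)
    return w
```

Here rho plays the role of the balance (weight) parameter mentioned in the abstract: larger values enforce sparsity more aggressively but bias the nonzero taps, which is the trade-off the paper's upper-bound analysis addresses.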
SPARSE CHANNEL ESTIMATION WITH lp-NORM AND REWEIGHTED l1-NORM PENALIZED LEAST MEAN SQUARES
ABSTRACT The least mean squares (LMS) algorithm is one of the most popular recursive parameter estimation methods. In its standard form it does not take into account any special characteristics that the parameterized model may have. Assuming that such a model is sparse in some domain (for example, it has a sparse impulse or frequency response), we aim at developing LMS algorithms that can adapt to the underlying sparsity and achieve better parameter estimates. In particular, the example of channel estimation with a sparse channel impulse response is considered. The proposed modifications of LMS are the lp-norm and reweighted l1-norm penalized LMS algorithms. Our simulation results confirm the superiority of the proposed algorithms over the standard LMS as well as other sparsity-aware modifications of LMS available in the literature.
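The reweighted l1-norm variant mentioned in this abstract can be sketched in the same way: the penalty gradient is scaled by a reweighting factor so that small taps are shrunk strongly while large taps are left nearly untouched. The sketch below follows the common reweighted zero-attracting LMS form from the sparse-LMS literature; parameter names and values are assumptions for illustration, not taken from this paper.

```python
import numpy as np

def reweighted_l1_lms(x, d, num_taps, mu=0.01, rho=2e-4, eps=10.0):
    """Sketch of a reweighted l1-norm penalized LMS filter.

    The attractor term rho * sign(w) / (1 + eps*|w|) applies a strong
    pull toward zero for small taps (|w| << 1/eps) and a weak one for
    large taps, approximating an l0-like penalty more closely than a
    plain l1 penalty does. Values here are illustrative assumptions.
    """
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]  # tap-input vector
        e = d[n] - w @ u                     # instantaneous error
        # LMS gradient step plus reweighted l1-norm zero attractor
        w += mu * e * u - rho * np.sign(w) / (1.0 + eps * np.abs(w))
    return w
```

Compared with plain l1-penalized (zero-attracting) LMS, the 1/(1 + eps*|w|) factor reduces the steady-state bias on the large, truly nonzero taps, which is the practical motivation for reweighting.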