Computer Science – Information Theory
Scientific paper
2012-03-07
13 pages, 8 figures
As one of the recently proposed algorithms for sparse system identification, the $l_0$ norm constraint Least Mean Square ($l_0$-LMS) algorithm modifies the cost function of the traditional method with a penalty on tap-weight sparsity. The performance of $l_0$-LMS is quite attractive compared with that of its various precursors. However, there has been no detailed study of its performance. This paper presents a comprehensive theoretical performance analysis of $l_0$-LMS for white Gaussian input data, based on several reasonable assumptions. Expressions for the steady-state mean square deviation (MSD) are derived and discussed with respect to the algorithm parameters and system sparsity. A parameter selection rule is established for achieving the best performance. Using a Taylor series approximation, the instantaneous behavior is also derived. In addition, the relationship between $l_0$-LMS and several prior algorithms is established, and sufficient conditions under which $l_0$-LMS accelerates convergence are given. Finally, all of the theoretical results are compared with simulations and are shown to agree well over a wide range of parameter settings.
Gu Yuantao
Jin Jeongwan
Su Guolong
Wang Jian