Computer Science – Information Theory
Scientific paper
2012-04-03
6 pages, 5 figures, to be included in 20th Iranian Conference on Electrical Engineering, IEEE, May 2012
We propose a new algorithm for the recovery of sparse signals from their compressively sensed samples. The proposed algorithm uses a strategy of gradual movement to estimate the positions of the non-zero samples of the sparse signal. We decompose each sample of the signal into two variables, namely "value" and "detector", through a weighted exponential function, and update both variables using the gradient descent method. As in traditional compressed sensing algorithms, the first variable is used to solve the Least Absolute Shrinkage and Selection Operator (Lasso) problem. As a new strategy, the second variable enters the regularization (l1-norm) term of the Lasso and gradually detects the non-zero elements. The presence of the second variable enables us to extend the vector of first variables to matrix form. This makes it possible to use the correlation matrix for a heuristic search when there are correlations among the samples of the signal. We compare the performance of the new algorithm with that of various algorithms for both uncorrelated and correlated sparsity. The results indicate the efficiency of the proposed method.
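The abstract does not spell out the exact value/detector update rule, so the following is only a minimal sketch of the general idea: a Lasso solved by iterative soft-thresholding in which an exponential weight, recomputed from the current estimate, gradually relaxes the l1 penalty on coordinates that emerge as non-zero (a stand-in for the paper's "detector" variable, not its exact parametrization). The function name `weighted_lasso_ista` and all parameter choices are illustrative assumptions.

```python
import numpy as np

def weighted_lasso_ista(A, y, lam=0.1, beta=5.0, n_iter=1000, step=None):
    """Sketch of a gradually reweighted Lasso:
        min_x 0.5 * ||y - A x||^2 + lam * sum_i w_i * |x_i|
    with w_i = exp(-beta * |x_i|) recomputed each iteration, so coordinates
    that grow large are penalized less (a detector-like role). This exponential
    reweighting is an assumption, not the paper's exact value/detector update.
    """
    m, n = A.shape
    if step is None:
        # 1/L, where L = ||A||_2^2 is the Lipschitz constant of the gradient
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of the data-fit term
        z = x - step * grad                # gradient descent step
        w = np.exp(-beta * np.abs(x))      # detector-like weights in [0, 1]
        # proximal (soft-thresholding) step with per-coordinate thresholds
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
    return x
```

On a random Gaussian sensing matrix with a handful of non-zero entries, this reweighted iteration typically recovers the sparse vector with less shrinkage bias than a plain Lasso, since the exponential weight vanishes on the detected atoms.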
Seyed Hossein Hosseini
Mahrokh G. Shayesteh
Gradually Atom Pruning for Sparse Reconstruction and Extension to Correlated Sparsity