Dictionary learning under global sparsity constraint

Computer Science – Data Structures and Algorithms

Scientific paper


Details

27 pages, 9 figures, 1 table

This paper proposes a new method for learning an overcomplete dictionary from training data samples. Unlike current methods, which enforce the same sparsity constraint on each input sample, the proposed method imposes a global sparsity constraint on the entire data set. This lets the dictionary atoms be allocated flexibly across samples, so the representation adapts to the complicated structures underlying the data set as a whole. Building on sparse coding and sparse PCA techniques, a simple algorithm is designed to implement the method, and its efficiency and convergence are analyzed theoretically. Experimental results on a series of signal and image data sets show that the method outperforms current dictionary learning methods in recovering the original dictionary, reconstructing the input data, and revealing salient data structures.
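
The abstract does not give implementation details, so the following is only a minimal sketch of the global-sparsity idea, not the authors' algorithm. It replaces the paper's sparse coding and sparse PCA machinery with two simple stand-ins: a hard-thresholding step that keeps a fixed budget of the largest coefficients across the entire coefficient matrix (a global constraint, as opposed to a per-sample one), and a MOD-style least-squares dictionary update. All names here (global_hard_threshold, learn_dictionary, budget) are illustrative assumptions.

    import numpy as np

    def global_hard_threshold(Z, budget):
        # Keep only the `budget` largest-magnitude coefficients across the
        # ENTIRE matrix Z, zeroing the rest. A per-sample constraint would
        # keep a fixed count in every column; the global version lets
        # simple samples spend fewer coefficients and complicated samples
        # spend more. (Ties at the threshold may keep slightly more than
        # `budget` entries.)
        flat = np.abs(Z).ravel()
        if budget >= flat.size:
            return Z
        thresh = np.partition(flat, -budget)[-budget]
        return np.where(np.abs(Z) >= thresh, Z, 0.0)

    def learn_dictionary(X, n_atoms, budget, n_iter=50, seed=0):
        # X: (n_features, n_samples). Alternate between (1) coding under
        # the global budget and (2) a MOD-style dictionary update with
        # column normalization.
        rng = np.random.default_rng(seed)
        D = rng.standard_normal((X.shape[0], n_atoms))
        D /= np.linalg.norm(D, axis=0, keepdims=True)
        Z = None
        for _ in range(n_iter):
            # Coding step: unconstrained least squares, then global thresholding.
            Z = np.linalg.lstsq(D, X, rcond=None)[0]
            Z = global_hard_threshold(Z, budget)
            # Update step: minimize ||X - D Z||_F^2 over D, then renormalize atoms.
            D = X @ np.linalg.pinv(Z)
            D /= np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-12)
        return D, Z

Under this reading, a per-sample method corresponds to thresholding column by column with budget / n_samples coefficients each, while the global budget allows the allocation to vary across samples.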
