Computer Science – Information Theory
Scientific paper
2010-10-22
4 pages, 2 figures
The power of sparse signal coding with learned dictionaries has been demonstrated in a variety of applications and fields, from signal processing to statistical inference and machine learning. However, the statistical properties of these models, such as underfitting or overfitting for given sets of data, are still not well characterized in the literature. This work aims to fill that gap by means of the Minimum Description Length (MDL) principle -- a well-established information-theoretic approach to statistical inference. The resulting framework yields a family of efficient sparse coding and modeling (dictionary learning) algorithms which, by virtue of the MDL principle, are completely parameter free. Furthermore, the framework naturally accommodates additional prior information in the model, such as Markovian dependencies. We demonstrate the performance of the proposed framework on image denoising and classification tasks.
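The abstract's central idea, selecting the sparsity level of a code by minimizing a description length rather than tuning a regularization parameter, can be illustrated with a small sketch. This is not the paper's algorithm; it is a minimal greedy pursuit (orthogonal matching pursuit) combined with a simplified two-part, BIC-style codelength, where the stopping point is chosen automatically as the model order that minimizes the total codelength. The dictionary and signal here are synthetic, and the codelength formula is a common textbook approximation, not the one derived in the paper.

```python
import numpy as np

def omp_mdl(D, y, max_k):
    """Greedy pursuit over dictionary D for signal y.

    Instead of a user-chosen sparsity level, the model order k is
    selected by minimizing a simplified two-part codelength:
        L(k) = (n/2) * log2(RSS_k / n) + (k/2) * log2(n)
    (data-given-model bits plus model bits, a BIC-like surrogate
    for the MDL criterion).
    """
    n, m = D.shape
    support = []
    residual = y.copy()
    best_len, best_x = np.inf, np.zeros(m)
    for k in range(1, max_k + 1):
        # Pick the atom most correlated with the current residual.
        corr = np.abs(D.T @ residual)
        corr[support] = -np.inf  # exclude atoms already chosen
        support.append(int(np.argmax(corr)))
        # Least-squares fit of y on the selected atoms.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
        rss = float(residual @ residual)
        # Two-part codelength for this model order.
        codelength = 0.5 * n * np.log2(rss / n + 1e-12) + 0.5 * k * np.log2(n)
        if codelength < best_len:
            best_len = codelength
            best_x = np.zeros(m)
            best_x[support] = coef
    return best_x
```

A quick synthetic usage: draw a random normalized dictionary, synthesize a signal from a few atoms plus noise, and let the codelength pick the model order with no sparsity parameter supplied by the user.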
Ignacio Ramírez
Guillermo Sapiro
Sparse coding and dictionary learning based on the MDL principle