Learning High-Dimensional Markov Forest Distributions: Analysis of Error Rates
Computer Science – Information Theory
Scientific paper
2010-05-05
Accepted to the Journal of Machine Learning Research (Feb 2011)
The problem of learning forest-structured discrete graphical models from i.i.d. samples is considered. An algorithm based on pruning the Chow-Liu tree through adaptive thresholding is proposed. It is shown that this algorithm is both structurally consistent and risk consistent, and that the error probability of structure learning decays faster than any polynomial in the number of samples when the model size is fixed. For the high-dimensional scenario, where the model size d and the number of edges k scale with the number of samples n, sufficient conditions on (n, d, k) are given for the algorithm to satisfy both structural and risk consistency. In addition, the extremal structures for learning are identified: we prove that, in terms of error rates for structure learning, the independent (resp. tree) model is the hardest (resp. easiest) to learn using the proposed algorithm.
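The forest-learning procedure described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact algorithm: the function names are invented here, and the fixed `threshold` argument stands in for the paper's adaptive, sample-size-dependent threshold, whose precise form is not reproduced.

```python
from collections import Counter
from math import log

def empirical_mutual_information(samples, i, j):
    """Plug-in estimate of I(X_i; X_j) from i.i.d. samples (rows)."""
    n = len(samples)
    pij = Counter((row[i], row[j]) for row in samples)  # joint counts
    pi = Counter(row[i] for row in samples)             # marginal of X_i
    pj = Counter(row[j] for row in samples)             # marginal of X_j
    mi = 0.0
    for (a, b), c in pij.items():
        mi += (c / n) * log((c / n) / ((pi[a] / n) * (pj[b] / n)))
    return mi

def chow_liu_forest(samples, threshold):
    """Chow-Liu tree via Kruskal's max-weight spanning tree on empirical
    mutual information, keeping only edges whose weight exceeds the
    threshold -- the pruning turns the tree into a forest."""
    d = len(samples[0])
    # All candidate edges, weighted by empirical mutual information.
    edges = sorted(
        ((empirical_mutual_information(samples, i, j), i, j)
         for i in range(d) for j in range(i + 1, d)),
        reverse=True,
    )
    parent = list(range(d))  # union-find for cycle detection

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    forest = []
    for w, i, j in edges:
        if w <= threshold:
            break  # remaining edges are even weaker; prune them all
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            forest.append((i, j))
    return forest
```

For example, on samples where the first two variables are identical and the third is empirically independent of both, only the strongly dependent pair survives the pruning: `chow_liu_forest([(0, 0, 0), (1, 1, 0), (0, 0, 1), (1, 1, 1)], 0.1)` returns `[(0, 1)]`.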
Animashree Anandkumar
Vincent Y. F. Tan
Alan S. Willsky