Lipschitz Parametrization of Probabilistic Graphical Models

Computer Science – Learning

Scientific paper

We show that the log-likelihood of several probabilistic graphical models is Lipschitz continuous with respect to the ℓp-norm of the parameters, and we discuss several implications of this Lipschitz parametrization. We present an upper bound on the Kullback-Leibler divergence that lets us interpret methods penalizing the ℓp-norm of parameter differences as minimizing that upper bound. The expected log-likelihood is lower bounded by the negative ℓp-norm, which sheds light on the generalization ability of probabilistic models. The exponential of the negative ℓp-norm appears in a lower bound on the Bayes error rate, which shows that it is reasonable to use parameters as features in algorithms that rely on metric spaces (e.g. classification, dimensionality reduction, clustering). Our results do not rely on specific algorithms for learning the structure or the parameters. We report preliminary results for activity recognition and temporal segmentation.
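To make the stated bounds concrete, here is a sketch of how they relate, assuming a Lipschitz constant K for the model family (the notation and constant are illustrative; the paper's exact constants may differ by model). The Lipschitz property says

\[ |\log p_{\theta}(x) - \log p_{\tilde\theta}(x)| \le K \,\|\theta - \tilde\theta\|_p \quad \text{for all } x, \]

so taking expectations under p_\theta immediately gives the Kullback-Leibler upper bound,

\[ \mathrm{KL}(p_{\theta} \,\|\, p_{\tilde\theta}) = \mathbb{E}_{p_\theta}\!\left[\log p_{\theta}(x) - \log p_{\tilde\theta}(x)\right] \le K \,\|\theta - \tilde\theta\|_p, \]

and, by the same step, the expected log-likelihood lower bound,

\[ \mathbb{E}_{p_\theta}\!\left[\log p_{\tilde\theta}(x)\right] \ge \mathbb{E}_{p_\theta}\!\left[\log p_{\theta}(x)\right] - K \,\|\theta - \tilde\theta\|_p. \]

The Bayes-error observation suggests treating estimated parameters themselves as feature vectors in a metric space. A minimal sketch of that idea, assuming per-window Gaussian fits, synthetic data, and scikit-learn's KMeans (none of these choices come from the paper; they are placeholders for any model and metric-based algorithm):

# Sketch: using estimated model parameters as features in a metric space.
# Assumptions (not from the paper): Gaussian models fit per time window,
# synthetic data, scikit-learn clustering. Illustrative only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic signal with two regimes of different mean and variance,
# mimicking a temporal-segmentation setting.
signal = np.concatenate([rng.normal(0.0, 1.0, 500),
                         rng.normal(3.0, 0.5, 500)])

# Fit a simple Gaussian (mean, std) in each non-overlapping window; the
# fitted parameter vector is the feature for that window.
window = 50
features = np.array([
    [signal[i:i + window].mean(), signal[i:i + window].std()]
    for i in range(0, len(signal) - window, window)
])

# Because nearby parameters imply nearby distributions (the KL bound
# above), distances between parameter vectors are meaningful, so a
# standard metric-space algorithm applies directly.
labels = KMeans(n_clusters=2, n_init=10).fit_predict(features)
print(labels)  # windows grouped into the two regimes

Here the ℓ2 distance between fitted (mean, std) vectors stands in for the ℓp-norm of parameter differences; any metric-based classifier or dimensionality-reduction method could be substituted for KMeans.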
