Distance Dependent Infinite Latent Feature Models
Scientific paper
Statistics – Machine Learning
2011-10-25
27 pages, 8 figures
Latent feature models are widely used to decompose data into a small number of components. Bayesian nonparametric variants of these models, which use the Indian buffet process (IBP) as a prior over latent features, allow the number of features to be determined from the data. We present a generalization of the IBP, the distance dependent Indian buffet process (dd-IBP), for modeling non-exchangeable data. It relies on a distance function defined between data points, biasing nearby data to share more features. The choice of distance function allows for many kinds of dependencies, including temporal or spatial. Further, the original IBP is a special case of the dd-IBP. In this paper, we develop the dd-IBP and theoretically characterize the distribution of how features are shared between data. We derive a Markov chain Monte Carlo sampler for a linear Gaussian model with a dd-IBP prior and study its performance on several data sets for which exchangeability is not a reasonable assumption.
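For readers unfamiliar with the Indian buffet process that the dd-IBP generalizes, the sketch below illustrates the standard (exchangeable) IBP generative process described above, in which the number of latent features grows with the data. This is only a minimal illustration of the special case, not the paper's dd-IBP or its MCMC sampler; the function name sample_ibp, the use of NumPy, and all parameter names are assumptions for the example.

```python
import numpy as np

def sample_ibp(n_customers, alpha, rng=None):
    """Draw a binary feature matrix Z from the standard IBP prior.

    Rows are data points ("customers"), columns are latent features
    ("dishes"). The dd-IBP of the paper additionally uses a distance
    function so that nearby data points share more features; this
    sketch covers only the exchangeable special case.
    """
    rng = np.random.default_rng() if rng is None else rng
    dishes = []  # dishes[k] = list of customers (1-indexed) who took dish k

    for n in range(1, n_customers + 1):
        # Take each previously sampled dish k with probability m_k / n,
        # where m_k is the number of earlier customers who took it.
        for takers in dishes:
            if rng.random() < len(takers) / n:
                takers.append(n)
        # Sample a Poisson(alpha / n) number of brand-new dishes.
        for _ in range(rng.poisson(alpha / n)):
            dishes.append([n])

    # Assemble the binary feature matrix Z of shape (n_customers, K).
    Z = np.zeros((n_customers, len(dishes)), dtype=int)
    for k, takers in enumerate(dishes):
        Z[np.array(takers) - 1, k] = 1
    return Z

# Example: 10 data points with concentration alpha = 2; the number of
# features K varies from draw to draw.
Z = sample_ibp(10, alpha=2.0, rng=np.random.default_rng(0))
print(Z.shape)
```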
David M. Blei
Peter I. Frazier
Samuel J. Gershman