Information-Maximization Clustering based on Squared-Loss Mutual Information
Statistics – Machine Learning
Scientific paper
2011-12-03
Information-maximization clustering learns a probabilistic classifier in an unsupervised manner so that the mutual information between feature vectors and cluster assignments is maximized. A notable advantage of this approach is that it involves only continuous optimization of model parameters, which is substantially easier to solve than discrete optimization of cluster assignments. However, existing methods still require solving non-convex optimization problems, so finding a good local optimum is not straightforward in practice. In this paper, we propose an alternative information-maximization clustering method based on a squared-loss variant of mutual information. This approach gives the clustering solution analytically, in a computationally efficient way, via kernel eigenvalue decomposition. Furthermore, we provide a practical model selection procedure that allows us to objectively optimize the tuning parameters included in the kernel function. Through experiments, we demonstrate the usefulness of the proposed approach.
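To make the idea of an analytic clustering solution via kernel eigenvalue decomposition concrete, the following is a minimal NumPy sketch in that spirit: build a kernel matrix over the data, take its leading eigenvectors, and derive cluster assignments from them. This is an illustration under simplifying assumptions, not the paper's exact estimator; in particular, the Gaussian kernel, the width parameter sigma, the sign-flipping and rectification heuristics, and the final argmax assignment rule are choices made here for the sketch, and the paper's model selection procedure for the kernel tuning parameters is not reproduced.

# Sketch only: kernel-eigendecomposition-based clustering under the
# simplifying assumptions stated above (Gaussian kernel, hand-picked sigma).
import numpy as np

def kernel_eigen_clustering(X, n_clusters, sigma=1.0):
    """Cluster the rows of X via eigendecomposition of a Gaussian kernel matrix."""
    # Pairwise squared Euclidean distances between samples.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian (RBF) kernel matrix; sigma is the tuning parameter that a
    # model selection procedure would choose in a principled way.
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))

    # Eigenvalue decomposition of the symmetric kernel matrix.
    eigvals, eigvecs = np.linalg.eigh(K)
    # Keep the eigenvectors belonging to the largest n_clusters eigenvalues.
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_clusters]]

    # Fix the arbitrary sign of each eigenvector so its components are mostly
    # nonnegative, then rectify and row-normalize to get soft assignment scores.
    top *= np.sign(top.sum(axis=0) + 1e-12)
    scores = np.maximum(top, 0.0)
    scores /= scores.sum(axis=1, keepdims=True) + 1e-12

    # Hard cluster labels: the component with the largest score per sample.
    return scores.argmax(axis=1), scores

if __name__ == "__main__":
    # Toy usage: two well-separated Gaussian blobs in 2-D.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2.0, 0.5, size=(50, 2)),
                   rng.normal(+2.0, 0.5, size=(50, 2))])
    labels, _ = kernel_eigen_clustering(X, n_clusters=2, sigma=1.0)
    print(labels)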
Hachiya Hirotaka
Kimura Manabu
Sugiyama Masashi
Yamada Makoto