Computer Science – Learning
Scientific paper
2012-02-14
We consider active, semi-supervised learning in an offline transductive setting. We show that a previously proposed error bound for active learning on undirected weighted graphs can be generalized by replacing graph cut with an arbitrary symmetric submodular function; arbitrary non-symmetric submodular functions can be used via symmetrization. Different choices of submodular function give different versions of the error bound, appropriate for different kinds of problems. Moreover, the bound is deterministic and holds for adversarially chosen labels. We show that exactly minimizing this error bound is NP-complete. However, we also introduce, for any submodular function, an associated active semi-supervised learning method that approximately minimizes the corresponding error bound. We show that the error bound is tight in the sense that no other bound of the same form is better. Our theoretical results are supported by experiments on real data.
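The two constructions named in the abstract can be made concrete. Below is a minimal illustrative sketch (not code from the paper): the graph cut function on an undirected weighted graph, which is symmetric submodular, and the standard symmetrization of an arbitrary normalized set function f, defined as f_sym(S) = f(S) + f(V \ S) − f(V). The toy graph and the square-root cardinality function are hypothetical examples chosen for illustration.

```python
def cut(weights, S):
    """Graph cut: total weight of edges with exactly one endpoint in S.

    `weights` maps undirected edges (u, v) to positive weights.
    Cut is symmetric: cut(S) == cut(V \\ S) for any vertex set V.
    """
    S = set(S)
    return sum(w for (u, v), w in weights.items() if (u in S) != (v in S))

def symmetrize(f, V):
    """Symmetrized version of a normalized set function f (f(empty) == 0):
    f_sym(S) = f(S) + f(V \\ S) - f(V).
    If f is submodular, f_sym is symmetric submodular.
    """
    V = frozenset(V)
    return lambda S: f(frozenset(S)) + f(V - frozenset(S)) - f(V)

# Toy graph: a triangle on vertices {0, 1, 2} with unit edge weights.
V = {0, 1, 2}
w = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}

# Cut is symmetric: {0} and its complement {1, 2} give the same value.
assert cut(w, {0}) == cut(w, {1, 2}) == 2.0

# A non-symmetric submodular function (concave of cardinality) and its
# symmetrization, which satisfies f_sym(S) == f_sym(V \ S).
f = lambda S: len(S) ** 0.5
f_sym = symmetrize(f, V)
assert abs(f_sym({0}) - f_sym({1, 2})) < 1e-9
```

The symmetrization identity is what lets the paper's bound, originally stated for graph cut, accept any submodular function: a non-symmetric choice is first passed through f_sym.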
Jeff A. Bilmes
Andrew Guillory
Active Semi-Supervised Learning using Submodular Functions