Active Semi-Supervised Learning using Submodular Functions

Computer Science – Learning

Scientific paper


Details

We consider active, semi-supervised learning in an offline transductive setting. We show that a previously proposed error bound for active learning on undirected weighted graphs can be generalized by replacing graph cut with an arbitrary symmetric submodular function; arbitrary non-symmetric submodular functions can be used via symmetrization. Different choices of submodular function yield different versions of the error bound, each suited to a different kind of problem. Moreover, the bound is deterministic and holds even for adversarially chosen labels. We show that exactly minimizing this error bound is NP-complete. However, for any submodular function we also introduce an associated active semi-supervised learning method that approximately minimizes the corresponding error bound. We show that the error bound is tight in the sense that no other bound of the same form is better. Our theoretical results are supported by experiments on real data.
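The two building blocks named in the abstract can be illustrated concretely. A minimal sketch in Python, where the graph, its edge weights, and the modular term used to break symmetry are all illustrative placeholders (not from the paper): the graph-cut function is a canonical symmetric submodular function, and an arbitrary submodular function can be symmetrized via the standard construction f_sym(S) = f(S) + f(V \ S) − f(V) − f(∅).

```python
from itertools import combinations

# Toy undirected weighted graph on vertices {0, 1, 2, 3};
# frozenset edge keys make the weights symmetric by construction.
V = frozenset({0, 1, 2, 3})
w = {
    frozenset({0, 1}): 2.0,
    frozenset({1, 2}): 1.0,
    frozenset({2, 3}): 3.0,
    frozenset({0, 3}): 0.5,
}

def cut(S):
    """Graph cut: total weight of edges with exactly one endpoint in S.
    Cut is symmetric (cut(S) == cut(V - S)) and submodular."""
    S = frozenset(S)
    return sum(wt for e, wt in w.items() if len(e & S) == 1)

def symmetrize(f):
    """Standard symmetrization of a submodular function:
    f_sym(S) = f(S) + f(V - S) - f(V) - f(empty).
    The result is symmetric and remains submodular."""
    def f_sym(S):
        S = frozenset(S)
        return f(S) + f(V - S) - f(V) - f(frozenset())
    return f_sym

def f(S):
    """An illustrative NON-symmetric submodular function:
    cut plus a modular term (the modular term breaks symmetry)."""
    S = frozenset(S)
    return cut(S) + 0.5 * len(S)

f_sym = symmetrize(f)
```

One can verify on all subsets of V that cut and f_sym are symmetric and that both satisfy the submodularity inequality f(A) + f(B) ≥ f(A ∪ B) + f(A ∩ B).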



