Feature Hashing for Large Scale Multitask Learning

Computer Science – Artificial Intelligence

Scientific paper


Details

Revision note: fixed broken theorem.


Empirical evidence suggests that hashing is an effective strategy for dimensionality reduction and practical nonparametric estimation. In this paper we provide exponential tail bounds for feature hashing and show that the interaction between random subspaces is negligible with high probability. We demonstrate the feasibility of this approach with experimental results for a new use case -- multitask learning with hundreds of thousands of tasks.
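For context, the hashed feature map analyzed here is the signed hashing trick: one hash sends each feature name to a bucket in an m-dimensional space, and a second, independent hash assigns it a +/-1 sign so that collision noise cancels in expectation; per-task copies of the features are obtained by hashing (task, feature) pairs into the same space. The following is a minimal Python sketch of that idea, for illustration only; the function names, the md5-based hash, and the sparse dictionary output are assumptions of this sketch, not the paper's implementation.

    from collections import defaultdict
    import hashlib


    def _bucket(token, salt, buckets):
        # Deterministic hash of a string into [0, buckets); md5 keeps the
        # sketch reproducible, a faster hash would be used in practice.
        digest = hashlib.md5(f"{salt}:{token}".encode("utf-8")).hexdigest()
        return int(digest, 16) % buckets


    def hashed_features(features, m=2 ** 18):
        # Map {feature_name: value} to a sparse m-dimensional vector:
        # one hash picks the bucket, a second independent hash picks a
        # +/-1 sign so that collision noise cancels in expectation.
        phi = defaultdict(float)
        for name, value in features.items():
            idx = _bucket(name, "index", m)
            sign = 1.0 if _bucket(name, "sign", 2) == 0 else -1.0
            phi[idx] += sign * value
        return dict(phi)


    def hashed_features_multitask(task_id, features, m=2 ** 18):
        # Personalized variant: hashing (task_id, feature) pairs drops each
        # task into its own, approximately orthogonal, random subspace of
        # the same m-dimensional space.
        tagged = {f"{task_id}|{name}": value for name, value in features.items()}
        return hashed_features(tagged, m)


    # Example: a shared global vector plus a per-task vector, both in R^m.
    doc = {"word:free": 1.0, "word:offer": 2.0}
    global_phi = hashed_features(doc)
    user_phi = hashed_features_multitask("user_42", doc)

Because the global map and every task-specific map land in the same m-dimensional space, the vectors can be summed or learned jointly without allocating a separate parameter block per task, which is what makes hundreds of thousands of tasks tractable.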

