COMET: A Recipe for Learning and Using Large Ensembles on Massive Data

Computer Science – Learning

Scientific paper

COMET is a single-pass MapReduce algorithm for learning on large-scale data. It builds multiple random forest ensembles on distributed blocks of data and merges them into a mega-ensemble. This approach is appropriate when learning from massive-scale data that is too large to fit on a single machine. For the best accuracy, importance-sampled voting (IVoting) should be used instead of bagging to generate the training subset for each decision tree in the random forest. Experiments with two large datasets (5GB and 50GB compressed) show that COMET compares favorably, in both accuracy and training time, to learning on a subsample of data with a serial algorithm. Finally, we propose a new Gaussian approach to lazy ensemble evaluation, which dynamically decides how many ensemble members to evaluate per data point; this can reduce evaluation cost by 100X or more.
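To illustrate the lazy-evaluation idea, here is a minimal sketch of Gaussian early stopping for majority voting. This is not the paper's implementation: the function name, parameter names, and the binary-classification setup are hypothetical, and the stopping rule shown is one plausible reading of "Gaussian approach" (model the unseen members' votes as Bernoulli draws and stop once a normal approximation says the leading class is very unlikely to flip).

```python
import math

def lazy_predict(members, x, alpha=0.001, min_evals=10):
    """Evaluate ensemble members one at a time, stopping early once a
    Gaussian (normal) approximation says the majority vote is settled.

    Hypothetical sketch, not the paper's code:
      members   - list of fitted binary classifiers, called as clf(x) -> 0/1
      alpha     - allowed probability that the final majority differs
      min_evals - members to evaluate before testing the stopping rule
    """
    n_total = len(members)
    votes = 0  # running count of votes for class 1
    for i, clf in enumerate(members, start=1):
        votes += clf(x)
        if i >= min_evals:
            p = votes / i                    # observed vote fraction so far
            remaining = n_total - i
            # Model the unseen members as i.i.d. Bernoulli(p) voters:
            # expected final vote count and its standard deviation.
            mean_final = votes + remaining * p
            sd_final = math.sqrt(remaining * p * (1 - p)) + 1e-12
            # z-distance from the decision threshold (half the ensemble),
            # then the two-sided normal tail probability of crossing it.
            z = abs(mean_final - n_total / 2) / sd_final
            tail = math.erfc(z / math.sqrt(2))
            if tail < alpha:
                break  # remaining members are very unlikely to flip the vote
    return int(votes / i >= 0.5), i  # (predicted class, members evaluated)
```

On confident points (most members agreeing), the loop stops after `min_evals` members instead of evaluating the full mega-ensemble, which is the source of the claimed evaluation speedup; hard points near the decision boundary fall back to evaluating many or all members.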
