Smoothness, Low-Noise and Fast Rates

Computer Science – Learning

Scientific paper

We establish an excess risk bound of order $H \mathcal{R}_n^2 + \sqrt{H L^*}\,\mathcal{R}_n$ for ERM with an $H$-smooth loss function and a hypothesis class with Rademacher complexity $\mathcal{R}_n$, where $L^*$ is the best risk achievable by the hypothesis class. For typical hypothesis classes where $\mathcal{R}_n = \sqrt{R/n}$, this translates to a learning rate of order $RH/n$ in the separable ($L^* = 0$) case and $RH/n + \sqrt{L^* R H / n}$ more generally. We also provide similar guarantees for online and stochastic convex optimization of a smooth non-negative objective.
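
As a quick check of the substitution behind the stated rates, plugging the typical complexity $\mathcal{R}_n = \sqrt{R/n}$ into the excess risk bound gives

$$H \mathcal{R}_n^2 + \sqrt{H L^*}\,\mathcal{R}_n \;=\; H \cdot \frac{R}{n} + \sqrt{H L^*}\,\sqrt{\frac{R}{n}} \;=\; \frac{RH}{n} + \sqrt{\frac{L^* R H}{n}},$$

which reduces to the $RH/n$ rate in the separable case $L^* = 0$.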
