Smoothness, Low-Noise and Fast Rates
Computer Science – Learning
Scientific paper
2010-09-20
We establish an excess risk bound of order $H \Rad_n^2 + \sqrt{H L^*}\Rad_n$ for ERM with an $H$-smooth loss function and a hypothesis class with Rademacher complexity $\Rad_n$, where $L^*$ is the best risk achievable by the hypothesis class. For typical hypothesis classes, where $\Rad_n = \sqrt{R/n}$, this translates to a learning rate of order $RH/n$ in the separable ($L^*=0$) case, and $RH/n + \sqrt{L^* RH/n}$ more generally. We also provide similar guarantees for online and stochastic convex optimization of a smooth non-negative objective.
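To see how the stated bound interpolates between the fast $1/n$ rate and the slow $1/\sqrt{n}$ rate, the following minimal numerical sketch evaluates the bound for a class with $\Rad_n = \sqrt{R/n}$ (the function name and constant choices are illustrative, not from the paper):

```python
import math

def excess_risk_bound(n, H, R, L_star):
    """Evaluate the excess risk bound H*Rad_n^2 + sqrt(H*L*)*Rad_n
    for a hypothesis class with Rademacher complexity Rad_n = sqrt(R/n)."""
    rad_n = math.sqrt(R / n)
    return H * rad_n**2 + math.sqrt(H * L_star) * rad_n

# Separable case (L* = 0): only the H*Rad_n^2 = R*H/n term survives,
# so the bound decays like 1/n.
print(excess_risk_bound(100, 1.0, 1.0, 0.0))   # R*H/n = 1/100

# Non-separable case (L* > 0): the sqrt(L* R H / n) term dominates
# for large n, giving a 1/sqrt(n) rate.
print(excess_risk_bound(100, 1.0, 1.0, 0.25))
```

In the separable case the output is exactly $RH/n$; with $L^* > 0$ the square-root term eventually dominates, matching the $RH/n + \sqrt{L^* RH/n}$ rate stated in the abstract.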
Nathan Srebro
Karthik Sridharan
Ambuj Tewari