Fast rates in learning with dependent observations

Mathematics – Statistics Theory

Scientific paper

In this paper we tackle the problem of fast rates in time series forecasting from a statistical learning perspective. In a series of papers (e.g. Meir 2000, Modha and Masry 1998, Alquier and Wintenberger 2012) it has been shown that the main tools used in learning theory with i.i.d. observations can be extended to the prediction of time series. The main message of these papers is that, given a family of predictors, we are able to build a new predictor that predicts the series as well as the best predictor in the family, up to a remainder of order $1/\sqrt{n}$. It is known that this rate cannot be improved in general. In this paper, we show that in the particular case of the least squares loss, and under a strong assumption on the time series ($\phi$-mixing), the remainder is actually of order $1/n$. Thus, the optimal rate known for i.i.d. variables (see e.g. Tsybakov 2003) and for individual sequences (see e.g. Lugosi) is, for the first time, achieved for uniformly mixing processes. We also show that our method is optimal for aggregating sparse linear combinations of predictors.
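As a schematic illustration (our notation, not a statement taken from the paper): writing $R(f)$ for the risk of a predictor $f$, $\mathcal{F}$ for the family of predictors, and $\hat{f}$ for the aggregated predictor, the two regimes described above correspond to oracle inequalities of the form
\[
R(\hat{f}) \;\le\; \min_{f \in \mathcal{F}} R(f) \,+\, \frac{C}{\sqrt{n}} \quad\text{(general losses and series)}, \qquad
R(\hat{f}) \;\le\; \min_{f \in \mathcal{F}} R(f) \,+\, \frac{C'}{n} \quad\text{(least squares loss, $\phi$-mixing series)},
\]
where $C$ and $C'$ are constants that may depend on the family of predictors and on the dependence (mixing) properties of the series.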
