Achievability results for statistical learning under communication constraints
Computer Science – Information Theory
Scientific paper
2009-01-13
5 pages; to appear in Proc. ISIT 2009
The problem of statistical learning is to construct an accurate predictor of a random variable as a function of a correlated random variable on the basis of an i.i.d. training sample from their joint distribution. Allowable predictors are constrained to lie in some specified class, and the goal is to approach asymptotically the performance of the best predictor in the class. We consider two settings in which the learning agent only has access to rate-limited descriptions of the training data, and present information-theoretic bounds on the predictor performance achievable in the presence of these communication constraints. Our proofs do not assume any separation structure between compression and learning and rely on a new class of operational criteria specifically tailored to joint design of encoders and learning algorithms in rate-constrained settings.
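For readers who prefer a formal statement, the setup described in the abstract can be sketched as follows; the notation ($P_{XY}$, $\mathcal{F}$, $\ell$, $L$, $R$) is ours and is not taken from the paper. Given a training sample $(X_1, Y_1), \dots, (X_n, Y_n)$ drawn i.i.d. from an unknown joint distribution $P_{XY}$, a class $\mathcal{F}$ of predictors $f \colon \mathcal{X} \to \mathcal{Y}$, and a loss function $\ell$, the risk of a predictor is
\[
  L(f) = \mathbb{E}_{P_{XY}}\bigl[\ell\bigl(Y, f(X)\bigr)\bigr],
\]
and the unconstrained goal is to produce $\hat{f}_n \in \mathcal{F}$ satisfying
\[
  L(\hat{f}_n) \;\longrightarrow\; \inf_{f \in \mathcal{F}} L(f) \quad \text{as } n \to \infty .
\]
In the rate-constrained settings, the learner does not see the sample directly: an encoder maps the training data to a rate-limited description (for example, at most $nR$ bits, i.e. rate $R$ per sample), and the achievability bounds quantify the excess risk $L(\hat{f}_n) - \inf_{f \in \mathcal{F}} L(f)$ attainable as a function of $R$ and $n$. The precise form of the two settings considered in the paper is not specified in the abstract.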