Information Theoretic Limits on Learning Stochastic Differential Equations

Computer Science – Information Theory

Scientific paper


Details

6 pages, 2 figures, conference version


Consider the problem of learning the drift coefficient of a stochastic differential equation from a sample path. In this paper, we assume that the drift is parametrized by a high-dimensional vector. We address the question of how long the system needs to be observed in order to learn this vector of parameters. We prove a general lower bound on this time complexity by using a characterization of mutual information as the time integral of conditional variance, due to Kadota, Zakai, and Ziv. This general lower bound is applied to specific classes of linear and non-linear stochastic differential equations. In the linear case, the problem is that of learning a matrix of interaction coefficients. We evaluate our lower bound for ensembles of sparse and dense random matrices. The resulting estimates match the qualitative behavior of upper bounds achieved by computationally efficient procedures.
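The linear case described in the abstract — recovering a matrix of interaction coefficients from an observed sample path — can be illustrated with a small numerical sketch. The snippet below simulates a linear SDE dX_t = -A X_t dt + dW_t with Euler–Maruyama and estimates A by least-squares regression of the increments on the states. This is an illustrative assumption, not the paper's own estimator or notation; the matrix A, step size dt, and horizon are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

p = 5             # dimension of the system (illustrative)
dt = 0.01         # discretization step (illustrative)
n_steps = 20_000  # observation horizon T = n_steps * dt

# A random, diagonally dominant interaction matrix, so that -A is stable.
A = 0.1 * rng.normal(size=(p, p)) + np.eye(p)

# Euler-Maruyama simulation of dX_t = -A X_t dt + dW_t.
X = np.zeros((n_steps + 1, p))
for t in range(n_steps):
    X[t + 1] = X[t] - (A @ X[t]) * dt + np.sqrt(dt) * rng.normal(size=p)

# Regress increments on states: dX[t] ≈ -dt * X[t] @ A.T,
# so the least-squares solution B of (X * dt) B = dX satisfies B ≈ -A.T.
dX = np.diff(X, axis=0)
B = np.linalg.lstsq(X[:-1] * dt, dX, rcond=None)[0]
A_hat = -B.T

err = np.linalg.norm(A_hat - A) / np.linalg.norm(A)
print(f"relative estimation error: {err:.3f}")
```

As the abstract's time-complexity question suggests, the relative error shrinks as the observation horizon T grows; rerunning with a larger `n_steps` tightens the estimate.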


