Dynamic importance sampling for uniformly recurrent Markov chains

Mathematics – Probability

Scientific paper


Details

Published in the Annals of Applied Probability (http://www.imstat.org/aap/) at http://dx.doi.org/10.1214/105051604000001016



Importance sampling is a variance reduction technique for the efficient estimation of rare-event probabilities by Monte Carlo. In standard importance sampling schemes, the system is simulated under an a priori fixed change of measure suggested by a large deviation lower bound analysis. Recent work, however, has shown that such schemes do not work well in many situations. In this paper we consider dynamic importance sampling in the setting of uniformly recurrent Markov chains. By "dynamic" we mean that, in the course of a single simulation, the change of measure can depend on the outcome of the simulation up to that time. Based on a control-theoretic approach to large deviations, the existence of asymptotically optimal dynamic schemes is demonstrated in great generality. The implementation of the dynamic schemes is carried out with the help of a limiting Bellman equation. Numerical examples are presented to contrast the dynamic and standard schemes.
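To make the contrast concrete, the following is a minimal sketch of the *standard* (static) scheme the abstract refers to: a fixed exponential change of measure, chosen from the large deviation rate, used to estimate a rare-event probability by Monte Carlo. The random-walk model, the tilt parameter, and all numerical values here are illustrative assumptions, not taken from the paper; the paper's dynamic schemes would instead let the tilt depend on the simulated state at each step.

```python
import math
import random

def rare_event_is(n=20, b=12.0, samples=100_000, seed=0):
    """Estimate P(S_n >= b) for S_n a sum of n i.i.d. N(0,1) steps,
    using static exponentially tilted importance sampling.

    The fixed tilt theta = b/n (the large-deviation choice for Gaussian
    increments) re-centers the walk on the rare event; each sample is
    reweighted by the likelihood ratio dP/dQ of the original law P
    against the tilted law Q."""
    rng = random.Random(seed)
    theta = b / n  # a priori fixed change of measure
    total = 0.0
    for _ in range(samples):
        s = 0.0       # current position of the walk
        logw = 0.0    # accumulated log likelihood ratio
        for _ in range(n):
            x = rng.gauss(theta, 1.0)  # simulate under the tilted law N(theta, 1)
            logw += -theta * x + 0.5 * theta * theta  # per-step log dP/dQ
            s += x
        if s >= b:
            total += math.exp(logw)  # weighted indicator of the rare event
    return total / samples
```

In a dynamic scheme, `theta` would be recomputed inside the inner loop as a function of the current state `s` and the remaining time, via the limiting Bellman equation; the static version above keeps it frozen for the whole trajectory, which is exactly the design the paper argues can fail to be asymptotically optimal.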



Profile ID: LFWR-SCP-O-684453
