A Lower Bound on the Complexity of Approximating the Entropy of a Markov Source

Computer Science – Information Theory

Scientific paper


Suppose that, for any $k \geq 1$, $\epsilon > 0$ and sufficiently large $\sigma$, we are given a black box that allows us to sample characters from a $k$th-order Markov source over the alphabet $\{0, \dots, \sigma - 1\}$. Even if we know the source has entropy either 0 or at least $\log (\sigma - k)$, there is still no algorithm that, with probability bounded away from $1/2$, guesses the entropy correctly after sampling at most $(\sigma - k)^{k/2 - \epsilon}$ characters.
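To make the intuition behind the bound concrete, here is a minimal sketch (not from the paper; all function names and parameter choices are illustrative assumptions) for the case $k = 1$. A zero-entropy source that deterministically walks a random cycle over the alphabet emits only fresh, never-repeated symbols in a short sample, exactly like a uniform source of entropy $\log \sigma$; until a context repeats — which, by a birthday-bound argument, takes on the order of $\sigma^{1/2}$ samples — a naive plug-in entropy estimator cannot tell the two apart.

```python
# Hedged illustration (not the paper's construction): a plug-in estimator
# fails to separate entropy 0 from entropy log2(sigma) on a short sample.
import math
import random
from collections import Counter, defaultdict

def zero_entropy_source(sigma, n, rng):
    """First-order source that deterministically walks a single random
    cycle over the alphabet, so its entropy rate is 0 -- yet a short
    sample shows only fresh, never-repeated symbols."""
    order = list(range(sigma))
    rng.shuffle(order)
    succ = {order[i]: order[(i + 1) % sigma] for i in range(sigma)}
    state = order[0]
    out = []
    for _ in range(n):
        out.append(state)
        state = succ[state]
    return out

def uniform_source(sigma, n, rng):
    """i.i.d. uniform source: entropy log2(sigma) bits per character."""
    return [rng.randrange(sigma) for _ in range(n)]

def plugin_conditional_entropy(sample):
    """Naive plug-in estimate of the first-order conditional entropy
    H(X_{t+1} | X_t) from empirical transition counts."""
    trans = defaultdict(Counter)
    for a, b in zip(sample, sample[1:]):
        trans[a][b] += 1
    total = len(sample) - 1
    h = 0.0
    for counts in trans.values():
        na = sum(counts.values())
        for c in counts.values():
            h -= (na / total) * (c / na) * math.log2(c / na)
    return h

rng = random.Random(0)
sigma, n = 10_000, 50  # n far below sqrt(sigma): contexts rarely repeat
h0 = plugin_conditional_entropy(zero_entropy_source(sigma, n, rng))
h1 = plugin_conditional_entropy(uniform_source(sigma, n, rng))
# Both estimates sit near 0 even though the true entropies are
# 0 and log2(sigma) ~= 13.3: with so few samples the two sources
# are statistically indistinguishable.
print(h0, h1)
```

The cycle construction guarantees the zero-entropy sample never revisits a state within $n \leq \sigma$ steps, which is what makes the short-sample views of the two sources coincide.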
