Computer Science – Information Theory
Scientific paper
2006-03-14
The relaxed conditions for the entropy rate and the examples have been taken out (to be part of another paper). The section about general pri
Consider a hidden Markov chain obtained as the observation process of an ordinary Markov chain corrupted by noise. Zuk et al. [13], [14] showed how, in principle, one can explicitly compute the derivatives of the entropy rate at extreme values of the noise. Namely, they showed that the derivatives of standard upper approximations to the entropy rate actually stabilize at an explicit finite time. We generalize this result to a natural class of hidden Markov chains called "Black Holes." We also discuss in depth special cases of binary Markov chains observed in binary symmetric noise, and give an abstract formula for the first derivative in terms of a measure on the simplex due to Blackwell.
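For orientation only (not part of the paper): the standard upper approximations mentioned in the abstract are the conditional entropies H(Y_n | Y_{n-1}, ..., Y_1), which decrease to the entropy rate. The sketch below, under assumed illustrative parameters (a symmetric binary Markov chain with transition probability 0.3 observed through a binary symmetric channel with crossover probability 0.1) and a hypothetical helper block_entropy, shows one straightforward way to compute these approximations numerically via the forward algorithm.

```python
import itertools
import numpy as np

def block_entropy(P, pi, eps, n):
    """H(Y_1..Y_n) in bits for a 2-state Markov chain (transition matrix P,
    stationary distribution pi) observed through a binary symmetric channel
    with crossover probability eps. Illustrative helper, not from the paper."""
    # Emission matrix of the channel: C[x, y] = P(Y = y | X = x)
    C = np.array([[1.0 - eps, eps],
                  [eps, 1.0 - eps]])
    H = 0.0
    for y in itertools.product((0, 1), repeat=n):
        # Forward recursion: alpha[x] = P(Y_1..Y_t, X_t = x)
        alpha = pi * C[:, y[0]]
        for t in range(1, n):
            alpha = (alpha @ P) * C[:, y[t]]
        p = alpha.sum()  # P(Y_1 = y_1, ..., Y_n = y_n)
        if p > 0:
            H -= p * np.log2(p)
    return H

# Assumed example: symmetric binary Markov chain, transition probability 0.3
p = 0.3
P = np.array([[1.0 - p, p],
              [p, 1.0 - p]])
pi = np.array([0.5, 0.5])  # stationary distribution of the symmetric chain
eps = 0.1                  # assumed crossover probability of the BSC

for n in range(1, 8):
    # H_n = H(Y_n | Y_1..Y_{n-1}) decreases monotonically to the entropy rate
    prev = block_entropy(P, pi, eps, n - 1) if n > 1 else 0.0
    Hn = block_entropy(P, pi, eps, n) - prev
    print(f"n={n}: upper approximation H_n = {Hn:.6f} bits")
```

The paper's result concerns how the derivatives of such approximations with respect to the noise parameter behave near extreme noise values; this sketch only computes the approximations themselves at a fixed eps.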
Guangyue Han
Brian Marcus