Computer Science – Information Theory
Scientific paper
2006-06-27
7 pages. Submitted to IEEE Transactions on Information Theory
We consider a pair of correlated two-sided processes {Z_n} and {S_n}, where the former is observable and the latter is hidden. The uncertainty in estimating Z_n from its finite past is H(Z_n|Z_0^{n-1}), and the uncertainty in estimating S_n from the same observations is H(S_n|Z_0^{n-1}); both are sequences in n. The limits of these sequences, and their existence, are of practical and theoretical interest. The first limit, if it exists, is the entropy rate; we call the second limit the estimation entropy. An example of a process jointly correlated with another is the hidden Markov process: a memoryless observation of a Markov state process whose state transitions are independent of past observations. We introduce a new representation of the hidden Markov process using an iterated function system, in which the state transitions are deterministically related to the process. This representation provides a unified framework for analyzing the two limiting entropies, yielding integral expressions for their values. The analysis shows that under mild conditions the limits exist, and it provides a simple method for computing the elements of the corresponding sequences.
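As a minimal numerical sketch of the first sequence, the conditional entropies H(Z_n|Z_0^{n-1}) of a hidden Markov process can be computed directly as entropy differences H(Z_0^n) − H(Z_0^{n-1}), with block probabilities obtained from the standard forward recursion. The two-state HMM parameters below are illustrative assumptions, not taken from the paper, and the brute-force enumeration stands in for the paper's iterated-function-system method, which it does not implement.

```python
import itertools
import math

# Hypothetical 2-state HMM with binary observations (illustrative parameters).
P = [[0.9, 0.1], [0.2, 0.8]]   # state transition matrix
B = [[0.7, 0.3], [0.4, 0.6]]   # emission probabilities per state
pi = [2 / 3, 1 / 3]            # stationary distribution of P

def seq_prob(obs):
    """Probability of an observation sequence via the forward recursion."""
    alpha = [pi[s] * B[s][obs[0]] for s in range(2)]
    for z in obs[1:]:
        alpha = [sum(alpha[r] * P[r][s] for r in range(2)) * B[s][z]
                 for s in range(2)]
    return sum(alpha)

def block_entropy(n):
    """H(Z_0^{n-1}) in bits, by exhaustive enumeration (small n only)."""
    return -sum(p * math.log2(p)
                for obs in itertools.product([0, 1], repeat=n)
                for p in [seq_prob(obs)] if p > 0)

# Elements of the sequence H(Z_n | Z_0^{n-1}) = H(Z_0^n) - H(Z_0^{n-1});
# the entropy rate is their limit as n grows.
cond = [block_entropy(n + 1) - block_entropy(n) for n in range(1, 8)]
```

For a stationary process this sequence is non-increasing, so the computed values give successively tighter upper bounds on the entropy rate; the exponential cost of enumeration is what the integral expressions in the paper avoid.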