Word-Valued Sources: an Ergodic Theorem, an AEP and the Conservation of Entropy
Computer Science – Information Theory
Scientific paper, 21 pages
2009-04-24
A word-valued source $\mathbf{Y} = Y_1, Y_2, \ldots$ is a discrete random process formed by sequentially encoding the symbols of a random process $\mathbf{X} = X_1, X_2, \ldots$ with codewords from a codebook $\mathscr{C}$. Such processes appear frequently in information theory (in particular, in the analysis of source-coding algorithms), so it is of interest to give conditions on $\mathbf{X}$ and $\mathscr{C}$ under which $\mathbf{Y}$ will satisfy an ergodic theorem and possess an Asymptotic Equipartition Property (AEP). In this correspondence, we prove the following: (1) if $\mathbf{X}$ is asymptotically mean stationary, then $\mathbf{Y}$ satisfies a pointwise ergodic theorem and possesses an AEP; and (2) if the codebook $\mathscr{C}$ is prefix-free, then the entropy rate of $\mathbf{Y}$ equals the entropy rate of $\mathbf{X}$ normalized by the average codeword length.
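As a concrete illustration of result (2) (an example of our own, not taken from the paper): suppose $\mathbf{X}$ is i.i.d. with $P(X=a)=1/2$ and $P(X=b)=P(X=c)=1/4$, and the prefix-free codebook is $\mathscr{C}(a)=0$, $\mathscr{C}(b)=10$, $\mathscr{C}(c)=11$. Then
$$H(\mathbf{X}) = \tfrac{1}{2}\log_2 2 + \tfrac{1}{4}\log_2 4 + \tfrac{1}{4}\log_2 4 = 1.5 \ \text{bits/symbol}, \qquad \bar{L} = \tfrac{1}{2}(1) + \tfrac{1}{4}(2) + \tfrac{1}{4}(2) = 1.5,$$
so the entropy rate of the binary word-valued source is $H(\mathbf{Y}) = H(\mathbf{X})/\bar{L} = 1$ bit per symbol, consistent with the observation that this code maps a dyadic i.i.d. source to i.i.d. fair bits.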
Kim Blackmore, Leif Hanlen, Roy Timo