Computer Science – Information Theory
Scientific paper
2010-11-16
26 pages, 6 figures, Submitted to IEEE Transactions on Information Theory
We present a new lossy compressor for discrete-valued sources. To code a sequence $x^n$, the encoder assigns a cost to each possible reconstruction sequence, finds the one that minimizes this cost, and describes it losslessly to the decoder via a universal lossless compressor. The cost of each sequence is a linear combination of its distance from $x^n$ and a linear function of its $k^{\rm th}$ order empirical distribution. The structure of the cost function allows the encoder to employ the Viterbi algorithm to recover the minimizer. We identify a choice of the coefficients comprising the linear function of the empirical distribution that ensures the algorithm universally achieves the optimum rate-distortion performance of any stationary ergodic source in the limit of large $n$, provided that $k$ diverges as $o(\log n)$. Iterative techniques for approximating the coefficients, which alleviate the computational burden of finding the optimal coefficients, are proposed and studied.
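The core search step described in the abstract can be sketched as a dynamic program: states are the last $k-1$ reconstruction symbols, and each transition pays the per-symbol distortion plus the coefficient attached to the resulting $k$-tuple. The following Python sketch is an illustrative assumption about that structure, not the paper's implementation; the names `coeff` (a map from $k$-tuples to real coefficients), `d` (a per-symbol distortion), and `lam` (the distortion weight) are hypothetical, and the uniform zero-cost initialization of all start states is a simplification.

```python
from itertools import product

def viterbi_min_cost(x, alphabet, k, coeff, d, lam):
    """Minimize sum_i [ lam * d(x[i], y[i]) + coeff[y[i-k+1:i+1]] ]
    over reconstruction sequences y, by dynamic programming over
    states = the last (k-1) reconstruction symbols.
    Returns (minimum cost, minimizing sequence as a list).
    Hypothetical sketch: all start states get cost 0, and missing
    k-tuples in `coeff` are treated as having coefficient 0.
    """
    states = list(product(alphabet, repeat=k - 1))
    # best[s] = (cost, path) of the cheapest prefix ending in state s
    best = {s: (0.0, []) for s in states}
    for xi in x:
        new = {}
        for s, (cost, path) in best.items():
            for b in alphabet:
                step = lam * d(xi, b) + coeff.get(s + (b,), 0.0)
                s2 = (s + (b,))[1:]          # shift in the new symbol
                cand = (cost + step, path + [b])
                if s2 not in new or cand[0] < new[s2][0]:
                    new[s2] = cand
        best = new
    return min(best.values(), key=lambda t: t[0])
```

With all coefficients zero the minimizer is simply $y^n = x^n$ at zero cost; nonzero coefficients bias the search toward reconstructions whose $k^{\rm th}$ order empirical distribution is cheap, which is what lets the choice of coefficients trade rate against distortion. The state space has $|\mathcal{X}|^{k-1}$ entries, so the $o(\log n)$ growth condition on $k$ keeps the per-symbol work polynomial in $n$.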
Shirin Jalali
Andrea Montanari
Tsachy Weissman