Computer Science – Information Theory
Scientific paper
2010-12-24
14 pages, 6 figures
The problem of generating random bits from an uncorrelated random source of unknown probability distribution dates back to von Neumann's 1951 work. Elias (1972) generalized von Neumann's scheme and showed how to achieve optimal efficiency in generating unbiased random bits. A natural question, then, is what happens if the sources are correlated. Both Elias and Samuelson proposed methods for generating unbiased random bits from correlated sources of unknown probability distribution; specifically, they considered finite Markov chains. However, their methods are either inefficient or difficult to implement. Blum (1986) devised an algorithm that generates random bits from degree-2 finite Markov chains in expected linear time; however, his elegant method is still far from optimal in information efficiency. In this paper, we generalize Blum's algorithm to finite Markov chains of arbitrary degree and combine it with Elias's method to generate unbiased bits efficiently. As a result, we obtain the first known algorithm that generates unbiased random bits from an arbitrary finite Markov chain, runs in expected linear time, and achieves the information-theoretic upper bound on efficiency.
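For context, von Neumann's 1951 scheme referenced above can be sketched as follows. This is an illustrative implementation of the classic pairing trick, not the paper's algorithm: non-overlapping pairs of biased i.i.d. bits are mapped 01 → 0 and 10 → 1, and the pairs 00 and 11 are discarded. The output is unbiased because P(01) = P(10) = p(1 − p) for any bias p.

```python
def von_neumann_extractor(biased_bits):
    """Extract unbiased bits from i.i.d. biased bits (von Neumann, 1951).

    Scans non-overlapping pairs: (0, 1) -> 0, (1, 0) -> 1;
    pairs (0, 0) and (1, 1) are discarded.
    """
    out = []
    it = iter(biased_bits)
    # zip(it, it) walks the sequence in non-overlapping pairs.
    for a, b in zip(it, it):
        if a != b:
            out.append(a)
    return out


# Example: pairs (0,1), (1,0), (1,1), (0,0), (0,1)
# keep the unequal pairs, emitting the first bit of each.
print(von_neumann_extractor([0, 1, 1, 0, 1, 1, 0, 0, 0, 1]))
```

Elias's 1972 generalization recovers entropy from the discarded pairs as well, approaching the source entropy rate; this simple version throws that information away.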
Jehoshua Bruck
Hongchao Zhou
Efficient Generation of Random Bits from Finite State Markov Chains