Computer Science – Information Theory
Scientific paper
2011-03-13
v4: substantial corrections; 13 pages
In 1997, Z. Zhang and R. W. Yeung found the first example of a conditional information inequality in four variables that is not "Shannon-type". This linear inequality for entropies is called conditional (or constraint) since it holds only under the condition that certain linear equations are satisfied by the entropies involved. Later, the same authors and other researchers discovered several unconditional information inequalities that do not follow from Shannon's inequalities for entropy. In this paper we show that some non-Shannon-type conditional inequalities are "essentially" conditional, i.e., they cannot be extended to any unconditional inequality. We prove one new essentially conditional information inequality for Shannon's entropy and discuss conditional information inequalities for Kolmogorov complexity.
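For illustration, the LaTeX sketch below shows the general shape of such a conditional inequality, using the 1997 Zhang–Yeung constraint inequality in the form it is commonly quoted in the literature; the abstract does not restate it, so this exact formulation is an assumption and should be checked against the original paper.

% A conditional (constraint) information inequality: a linear inequality
% in entropies that is claimed only on the subspace cut out by linear
% constraints. Commonly quoted form of the Zhang--Yeung (1997)
% conditional inequality (assumed form, not restated in this abstract):
\[
  I(A;B) = I(A;B \mid C) = 0
  \;\Longrightarrow\;
  I(C;D) \le I(C;D \mid A) + I(C;D \mid B).
\]
% Such an inequality is "essentially" conditional if no choice of
% coefficients \lambda_1, \lambda_2 \ge 0 makes the unconditional form
%   I(C;D) \le I(C;D \mid A) + I(C;D \mid B)
%             + \lambda_1 I(A;B) + \lambda_2 I(A;B \mid C)
% valid for all jointly distributed A, B, C, D.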
Tarik Kaced
Andrei Romashchenko