Computer Science – Computational Complexity
Scientific paper
2011-12-09
A discrepancy lower bound for information complexity
This paper provides the first general technique for proving information lower bounds on two-party unbounded-round communication problems. We show that the discrepancy lower bound, which applies to randomized communication complexity, also applies to information complexity. More precisely, if the discrepancy of a two-party function $f$ with respect to a distribution $\mu$ is $Disc_\mu f$, then any two-party randomized protocol computing $f$ must reveal at least $\Omega(\log (1/Disc_\mu f))$ bits of information to the participants. As a corollary, we obtain that any two-party protocol for computing a random function on $\{0,1\}^n\times\{0,1\}^n$ must reveal $\Omega(n)$ bits of information to the participants. In addition, we prove that the discrepancy of the Greater-Than function is $O(1/\sqrt{n})$, which provides a simplified proof of the $\Omega(\log n)$ lower bound on the communication complexity of this well-studied function and, combined with our main result, proves the tight $\Omega(\log n)$ lower bound on its information complexity. The proof of our main result develops a new simulation procedure that may be of independent interest.
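Concretely, the two results combine as follows: since $Disc_\mu(GT_n) = O(1/\sqrt{n})$, the main theorem gives an information lower bound of $\Omega(\log \sqrt{n}) = \Omega(\log n)$, which is tight because randomized protocols for Greater-Than with $O(\log n)$ communication (and hence information) exist.

To make the discrepancy quantity itself concrete, here is a minimal brute-force sketch; it is illustrative only and not from the paper, the helper name disc_uniform is ours, and $\mu$ is taken to be uniform. It computes $Disc_\mu f$ exactly for a random function on toy input sizes, as in the corollary above; $\log_2(1/Disc)$ grows roughly linearly in $n$, matching the $\Omega(n)$ information bound.

# Illustrative sketch (not from the paper): exact discrepancy of a
# 0/1 matrix M under the uniform distribution mu, i.e. the maximum
# over combinatorial rectangles S x T of
#   | mu(S x T, f = 1) - mu(S x T, f = 0) |.
import random
from math import log2

def disc_uniform(M):
    """Exact discrepancy of the 0/1 matrix M under uniform mu."""
    m = len(M)
    # +1 where f = 1, -1 where f = 0, so a rectangle's bias is its sum.
    sign = [[1 if v else -1 for v in row] for row in M]
    best = 0
    for mask in range(1, 1 << m):              # enumerate row subsets S
        rows = [sign[i] for i in range(m) if mask >> i & 1]
        col_sums = [sum(col) for col in zip(*rows)]
        # For a fixed S, the best column set T simply collects all
        # columns whose restricted sums share a sign, so T need not
        # be enumerated separately.
        pos = sum(c for c in col_sums if c > 0)
        neg = -sum(c for c in col_sums if c < 0)
        best = max(best, pos, neg)
    return best / (m * m)

random.seed(0)
for n in (2, 3, 4):                            # m = 2^n; n = 4 already takes seconds
    m = 2 ** n
    f = [[random.randint(0, 1) for _ in range(m)] for _ in range(m)]
    d = disc_uniform(f)
    print(f"n={n}  Disc={d:.4f}  log2(1/Disc)={log2(1/d):.2f}")

A random function is used here rather than Greater-Than deliberately: under the uniform distribution the discrepancy of $GT$ stays constant (the rectangle of pairs with $x \geq 2^{n-1} > y$ already has bias $1/4$), so exhibiting the $O(1/\sqrt{n})$ decay requires the tailored hard distribution $\mu$ from the paper.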
Mark Braverman
Omri Weinstein