ISIT 1991
Conference paper

Improved bounds on interactive communication

Abstract

(X, Y) is a pair of random variables. Person Px knows X, Person Py knows Y, and both know the joint probability distribution of (X, Y). Using a predetermined protocol, they exchange messages over a binary, error-free channel in order for Py to learn X. Px may or may not learn Y. Ĉm(X|Y) is the number of information bits that must be transmitted (by both persons) in the worst case if only m messages are allowed. Ĉ∞(X|Y) is the corresponding number of bits when there is no restriction on the number of messages exchanged. It is known that one-message communication may require exponentially more bits than the minimum possible: for some (X, Y) pairs, Ĉ1(X|Y) = 2^(Ĉ∞(X|Y)−1). Yet just two messages suffice to reduce communication to almost the minimum: for all (X, Y) pairs, Ĉ2(X|Y) ≤ 4Ĉ∞(X|Y) + 3. It was further shown that for any positive ϵ there are (X, Y) pairs satisfying Ĉ2(X|Y) ≥ (2 − ϵ)Ĉ∞(X|Y). An obvious open problem is whether there is an m such that m messages are asymptotically optimal: Ĉm(X|Y) ≤ Ĉ∞(X|Y) + o(Ĉ∞(X|Y)) for all (X, Y) pairs. We improve the above bounds, thereby stepping towards a resolution of the open problem. We show that for all (X, Y) pairs, Ĉ2(X|Y) ≤ eĈ∞(X|Y) + 7 and that Ĉ4(X|Y) < 2Ĉ∞(X|Y) + 3.2 log Ĉ∞(X|Y) + 4. The proofs improve known techniques for both lower and upper bounds.
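
To make the model concrete, here is a minimal sketch, not taken from the paper, of a standard instance from the interactive-communication literature: Py holds a pair of distinct n-bit strings, Px holds X, which is one of the two, and Py must learn X. The function name, the parameter n, and the bit accounting below are all illustrative assumptions; the point is only that two messages use roughly log n + 1 bits while a one-message protocol must essentially send all n bits of X, the kind of exponential gap the abstract cites.

```python
# Hypothetical illustration (not from the paper): Py holds a pair (a, b) of distinct
# n-bit strings, Px holds X, which is one of the two, and Py must learn X.
# Two messages suffice: Py names a bit position where a and b differ (~log2 n bits),
# and Px replies with X's bit at that position (1 bit).
import math
import random

def two_message_protocol(x, pair, n):
    """Simulate the two-message exchange; return (bits exchanged, value Py deduces)."""
    a, b = pair
    i = (a ^ b).bit_length() - 1          # message 1 (Py -> Px): an index where a, b differ
    x_bit = (x >> i) & 1                  # message 2 (Px -> Py): X's bit at that index
    learned = a if ((a >> i) & 1) == x_bit else b
    bits = math.ceil(math.log2(n)) + 1    # index costs ~log2(n) bits, reply costs 1 bit
    return bits, learned

n = 32
a = random.getrandbits(n)
b = a ^ (1 << random.randrange(n))        # flip one bit so the pair is distinct
x = random.choice([a, b])                 # Px's input is one element of Py's pair
bits, learned = two_message_protocol(x, (a, b), n)
assert learned == x
print(f"two messages used about {bits} bits; one message needs about {n} bits")
```
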
