Two applications of information complexity
Abstract
We show the following new lower bounds in two concrete complexity models: (1) In the two-party communication complexity model, we show that the tribes function on n inputs has two-sided error randomized complexity Ω(n), while its nondeterministic and co-nondeterministic complexities are both Θ(√n). This separation between randomized and nondeterministic complexity is the best possible, and it settles an open problem posed by Kushilevitz and Nisan, and also by Beame and Lawry. (2) In the Boolean decision tree model, we show that the recursive majority-of-three function on 3^h inputs has randomized complexity Ω((7/3)^h). The deterministic complexity of this function is Θ(3^h), and its nondeterministic complexity is Θ(2^h). Our lower bound on the randomized complexity is a substantial improvement over any lower bound that can be obtained for this problem via the techniques of Saks and Wigderson, Heiman and Wigderson, and Heiman, Newman, and Wigderson. Recursive majority is an important function for which the class of natural algorithms known as directional algorithms does not achieve the best randomized decision tree upper bound. These lower bounds are obtained using generalizations of information complexity, which quantifies the minimum amount of information about the inputs that every correct algorithm in a given model of computation must reveal.
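For readers unfamiliar with the two functions, the following sketch gives illustrative definitions (not the paper's formal statements; the balanced √n-by-√n block structure for tribes and the integer encoding are assumptions made here for concreteness):

```python
# Illustrative definitions of the tribes and recursive majority-of-three
# functions. This is a sketch for intuition, not the paper's formal setup.

def tribes(bits, block_size):
    """Tribes: an AND of ORs over disjoint blocks of the input bits.
    In the balanced version, block_size is roughly sqrt(len(bits))."""
    blocks = [bits[i:i + block_size] for i in range(0, len(bits), block_size)]
    # The function is 1 iff every block contains at least one 1.
    return int(all(any(block) for block in blocks))

def recursive_majority(bits):
    """Recursive majority-of-three on 3^h input bits: split the input
    into three equal thirds, recurse, and take the majority."""
    if len(bits) == 1:
        return bits[0]
    third = len(bits) // 3
    children = [recursive_majority(bits[i * third:(i + 1) * third])
                for i in range(3)]
    return int(sum(children) >= 2)
```

For example, on 9 inputs (h = 2), recursive majority evaluates three majorities of three bits each and then takes the majority of those three results.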