Publication
IEEE Trans. Inf. Theory
Paper

Inference under Information Constraints I: Lower Bounds from Chi-Square Contraction

Abstract

Multiple players are each given one independent sample, about which they can only provide limited information to a central referee. Each player is allowed to describe its observed sample to the referee using a channel from a family of channels $\mathcal {W}$, which can be instantiated to capture, among others, both the communication- and privacy-constrained settings. The referee uses the players' messages to solve an inference problem on the unknown distribution that generated the samples. We derive lower bounds for the sample complexity of learning and testing discrete distributions in this information-constrained setting. Underlying our bounds is a characterization of the contraction in chi-square distance between the observed distributions of the samples when information constraints are imposed. This contraction is captured in a local neighborhood in terms of chi-square and decoupled chi-square fluctuations of a given channel, two quantities we introduce. The former captures the average distance between distributions of the channel output for two product distributions on the input, and the latter for a product distribution and a mixture of product distributions on the input. Our bounds are tight for both public- and private-coin protocols. Interestingly, the sample complexity of testing is order-wise higher when restricted to private-coin protocols.
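For readers unfamiliar with the distance measure the bounds are built on, the chi-square distance between two discrete distributions can be sketched as follows. This is a minimal illustration of the standard definition $\chi^2(p \,\|\, q) = \sum_i (p_i - q_i)^2 / q_i$, not code or notation from the paper; the function name is ours.

```python
def chi_square_distance(p, q):
    """Chi-square distance chi^2(p || q) between two discrete
    distributions given as probability vectors on the same support.
    Terms with q_i = 0 are skipped for simplicity (a convention,
    not a claim about the paper's setting)."""
    assert len(p) == len(q)
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q) if qi > 0)

# Example: a skewed distribution versus the uniform one on 4 symbols.
uniform = [0.25] * 4
skewed = [0.4, 0.3, 0.2, 0.1]
print(chi_square_distance(skewed, uniform))  # -> 0.2
```

The paper's contraction results concern how such distances between output distributions shrink after the samples pass through constrained channels, which is what drives the sample-complexity lower bounds.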
