On optimum conventional quantization for source coding with side information at the decoder
Abstract
Let X and Y denote two jointly memoryless sources with finite alphabets. Suppose that X is to be encoded in a lossy manner with Y available as side information only at the decoder. A common approach to this lossy source coding problem is to apply conventional vector quantization followed by Slepian-Wolf coding. In this paper we are interested in the rate-distortion performance achievable asymptotically by this approach. Given an arbitrary single-letter distortion measure d, it is shown that the best rate achievable asymptotically under the constraint that X is recovered with distortion no greater than D ≥ 0 is R̂_WZ(D) = min_X̂ [I(X; X̂) - I(Y; X̂)], where the minimum is taken over all auxiliary random variables X̂ such that Ed(X, X̂) ≤ D and X̂ → X → Y is a Markov chain. An extended Blahut-Arimoto algorithm is then proposed to calculate R̂_WZ(D) for any (X, Y) and any distortion measure, and the convergence of the algorithm is also proved. Interestingly, it is observed that the random variable X̂ achieving R̂_WZ(D) is, in general, different from the random variable X̂′ achieving the classical rate-distortion function R(D) of X at distortion D. In particular, it is shown that in the case of binary sources and the Hamming distortion measure, the random variable X̂ achieving R̂_WZ(D) is the same as the random variable X̂′ achieving R(D) if and only if the channel p_{Y|X} from X to Y is symmetric. Thus, the design of conventional quantizers in the presence of decoder side information should differ from the case of no side information. © 2007 IEEE.
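The minimization defining R̂_WZ(D) can be explored numerically for small alphabets. The sketch below is not the paper's extended Blahut-Arimoto algorithm; it is a simple brute-force grid search over binary test channels q(x̂|x), assuming binary X, X̂, and Y with Hamming distortion. The function name `wz_quantizer_rate` and the grid parameterization are this sketch's own conventions, not from the paper.

```python
import numpy as np

def mutual_info(joint):
    """Mutual information (in bits) of a 2-D joint probability table."""
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float((joint[mask] * np.log2(joint[mask] / (px * py)[mask])).sum())

def wz_quantizer_rate(p_x, p_y_given_x, d, D, grid=201):
    """Brute-force R̂_WZ(D) = min_q [I(X;X̂) - I(Y;X̂)] over binary test
    channels q(x̂|x) subject to E d(X,X̂) <= D, with X̂ -> X -> Y Markov."""
    best = np.inf
    ts = np.linspace(0.0, 1.0, grid)
    for a in ts:            # a = q(x̂=1 | x=0)
        for b in ts:        # b = q(x̂=1 | x=1)
            q = np.array([[1 - a, a], [1 - b, b]])   # rows: x, cols: x̂
            joint_x_xh = p_x[:, None] * q            # p(x, x̂)
            if (joint_x_xh * d).sum() > D + 1e-12:   # distortion constraint
                continue
            # Markov chain X̂ -> X -> Y: p(y, x̂) = sum_x p(x) p(y|x) q(x̂|x)
            joint_y_xh = np.einsum('x,xy,xh->yh', p_x, p_y_given_x, q)
            best = min(best, mutual_info(joint_x_xh) - mutual_info(joint_y_xh))
    return best
```

As a sanity check, at D = 0 the only feasible choice is X̂ = X, so the value reduces to I(X;X) − I(Y;X) = H(X|Y); for a uniform binary X observed through a binary symmetric channel with crossover 0.1, this is the binary entropy h(0.1) ≈ 0.469 bits. At D = 0.5 a constant X̂ is feasible and the rate drops to zero.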