Nonasymptotic lossy compression

Victoria Kostina
Graduate Student, Princeton University
Given on: April 1st, 2013

Abstract

The fundamental problem of lossy compression is to represent an object at the lowest rate possible while satisfying a constraint on the fidelity of reproduction. Traditional (asymptotic) information theory describes the optimum tradeoff between rate and fidelity that is achievable in the limit of infinite length of the source block to be compressed. Since limited delay is a key design requirement in many modern compression and transmission applications, we drop the assumption of unbounded blocklength and analyze the optimum tradeoff among rate, fidelity, and blocklength that is achievable regardless of the complexity of the code. We have found that the key random variable governing the nonasymptotic fundamental limit of lossy compression is the so-called d-tilted information in x, which quantifies the number of bits required to represent source outcome x within distortion d. As the blocklength increases indefinitely, by the law of large numbers the d-tilted information in most source outcomes concentrates near its mean, which equals Shannon's rate-distortion function. At finite blocklength, however, the whole distribution of d-tilted information matters. Interestingly, in the seemingly unrelated problem of channel coding under input cost constraints, the maximum nonasymptotically achievable channel coding rate can be bounded in terms of the distribution of a random variable termed the b-tilted information in channel input x, which parallels the d-tilted information of lossy compression. Finally, we have analyzed the nonasymptotic fundamental limits of lossy joint source-channel coding, showing that separate design of source and channel codes, known to be optimal asymptotically, fails to achieve the nonasymptotic fundamental limit. Nonasymptotic information theory thus forces us to unlearn the lessons instilled by traditional asymptotic thinking.
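For context, here is a minimal sketch of the central definition, following the standard formulation for stationary memoryless sources in the finite-blocklength literature; the notation (Y*, lambda*, V(d), Q^{-1}) is supplied for illustration and is not part of the abstract itself. The d-tilted information in a source outcome x is

\[
\jmath_X(x, d) \;=\; -\log \mathbb{E}\!\left[ \exp\bigl( \lambda^\star d - \lambda^\star \mathsf{d}(x, Y^\star) \bigr) \right],
\qquad \lambda^\star = -R'(d),
\]

where Y* is distributed according to the reproduction distribution that achieves the rate-distortion function R(d), and the expectation is with respect to Y* alone, unconditioned on x. Its mean recovers the asymptotic limit, \( \mathbb{E}[\jmath_X(X, d)] = R(d) \), while its variance, the rate-dispersion function \( V(d) = \mathrm{Var}[\jmath_X(X, d)] \), captures the finite-blocklength penalty: under regularity conditions,

\[
R(n, d, \epsilon) \;=\; R(d) + \sqrt{\frac{V(d)}{n}}\, Q^{-1}(\epsilon) + O\!\left(\frac{\log n}{n}\right),
\]

where \( R(n, d, \epsilon) \) is the minimum achievable rate at blocklength n under excess-distortion probability \( \epsilon \), and \( Q^{-1} \) is the inverse of the Gaussian complementary CDF.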

Bio

Victoria is completing her Ph.D. at Princeton University, where she works with Prof. Sergio Verdú on nonasymptotic information-theoretic limits. She received a Bachelor's degree from the Moscow Institute of Physics and Technology and a Master's degree from the University of Ottawa.