## Lossy compression of sparse sources

Sparse signal representations play an important role in lossy data
compression and other applications. We focus on sparse memoryless
sources with unimodal densities, characterizing their rate-distortion
behavior or, alternatively, their entropy. A first approach yields an
upper bound on the operational distortion-rate function for a class of
quantizers based on magnitude classification. We then argue that the
geometric mean is a useful measure of sparseness, since it leads to a
lower bound on entropy that is the continuous counterpart to a
discrete entropy bound by Wyner. Together with the source variance,
the geometric mean also yields an entropy upper bound via the
maximum-entropy approach. As an application, we show how the
geometric mean of a source generalizes the concept of coding gain from
(vector) transform coding to scalar sources.
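
To illustrate why the geometric mean is a natural sparseness measure, the sketch below compares two unit-variance sources: a Gaussian (non-sparse reference) and a Laplacian (a classic sparse, peaked unimodal density). The sparser source has the smaller geometric mean of magnitudes at equal variance. This Monte Carlo setup is our own illustrative choice, not an experiment from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two unit-variance memoryless sources: Gaussian as the non-sparse
# reference, Laplacian as a sparse unimodal density.
gauss = rng.normal(0.0, 1.0, n)
laplace = rng.laplace(0.0, 1.0 / np.sqrt(2.0), n)  # Var = 2*b^2 = 1

def geometric_mean(x):
    """Geometric mean of |x|, computed as exp(E[log |x|])."""
    return np.exp(np.mean(np.log(np.abs(x))))

gm_gauss = geometric_mean(gauss)      # ~0.53 analytically
gm_laplace = geometric_mean(laplace)  # ~0.40 analytically

print(f"geometric mean, Gaussian : {gm_gauss:.3f}")
print(f"geometric mean, Laplacian: {gm_laplace:.3f}")
```

At equal variance, the more peaked (sparser) Laplacian concentrates probability near zero, which drives `E[log |x|]` down and hence shrinks the geometric mean relative to the Gaussian.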