The sparsity and compressibility of finite-dimensional signals are of great
interest in fields such as compressed sensing. The notion of compressibility
has also been extended to infinite sequences of i.i.d. or ergodic random
variables, based on the observed error of their nonlinear k-term
approximation. In this work, we use an entropy measure to study the
compressibility of continuous-domain innovation processes (also known as white
noise). Specifically, we define this measure as the entropy limit of the
doubly quantized (in time and amplitude) process. This provides a tool to
compare the compressibility of
various innovation processes. It also allows us to identify an analogue of the
concept of "entropy dimension" originally defined by R\'enyi for random
variables. Particular attention is given to stable and impulsive Poisson
innovation processes. Here, our results identify Poisson innovations as the
more compressible of the two, with an entropy measure far below that of stable
innovations. While this finding departs from prior results on the
compressibility of fat-tailed distributions, our entropy measure ranks
stable innovations according to their tail decay.
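
As a point of reference for the entropy dimension mentioned above, a minimal
sketch of R\'enyi's original definition for a random variable $X$ (the
notation here is illustrative and not necessarily the paper's):
\[
  d(X) \;=\; \lim_{m \to \infty} \frac{H\!\left( \lfloor m X \rfloor / m \right)}{\log m},
\]
where $H(\cdot)$ is the Shannon entropy of the amplitude-quantized variable
$\lfloor m X \rfloor / m$ and the limit is assumed to exist. Under mild
conditions, $d(X) = 1$ for absolutely continuous $X$ and $d(X) = 0$ for
discrete $X$; the paper's measure plays an analogous role for
continuous-domain innovation processes after quantizing in both time and
amplitude.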