This paper derives fundamental limits on the compressive
classification of Gaussian mixture source models. In particular, we offer an
asymptotic characterization of (an upper bound to) the misclassification
probability of the optimal Maximum-A-Posteriori (MAP) classifier, expressed in
terms of quantities that are dual to the concepts of
diversity gain and coding gain in multi-antenna communications. The diversity
order, which determines the rate at which the probability of misclassification
decays in the low-noise regime, depends on the geometry of the source, the
geometry of the measurement system, and their interplay. The measurement gain,
which represents the counterpart of the coding gain, likewise depends on
geometrical quantities.
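As an illustrative sketch (the precise expression appears in the paper, not in
this abstract; the notation $d$, $g_m$, and $\sigma^2$ is assumed here), the
upper bound mirrors the high-SNR error law of multi-antenna communications and
would take the form
\[
\bar{P}_{\mathrm{err}} \;\lesssim\; \left( g_m \, \sigma^{-2} \right)^{-d}
\qquad \text{as } \sigma^2 \to 0,
\]
where the diversity order $d$ sets the rate of decay in the noise variance
$\sigma^2$ and the measurement gain $g_m$ shifts the error curve, exactly as
the coding gain $G_c$ does in the multi-antenna error law
$P_e \approx (G_c \cdot \mathrm{SNR})^{-G_d}$.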
It is argued that the diversity order and the measurement gain also provide an
optimization criterion for dictionary learning in compressive classification
applications.

Comment: 5 pages, 3 figures, submitted to the 2013 IEEE International
Symposium on Information Theory (ISIT 2013)