Modern technologies are producing a wealth of data with complex structures.
For instance, in two-dimensional digital imaging, flow cytometry, and
electroencephalography, matrix-type covariates frequently arise when
measurements are obtained for each combination of two underlying variables. To
address scientific questions arising from those data, new regression methods
that take matrices as covariates are needed, and sparsity or other forms of
regularization are crucial due to the ultrahigh dimensionality and complex
structure of the matrix data. The popular lasso and related regularization
methods hinge upon the sparsity of the true signal in terms of the number of
its nonzero coefficients. For matrix data, however, the true signal often has,
or can be well approximated by, a low-rank structure. Sparsity therefore
frequently takes the form of a low-rank matrix parameter, which may seriously
violate the assumptions of the classical lasso. In this article,
we propose a class of regularized matrix regression methods based on spectral
regularization. A highly efficient and scalable estimation algorithm is
developed, and a degrees of freedom formula is derived to facilitate model
selection along the regularization path. Superior performance of the proposed
method is demonstrated on both synthetic and real examples.
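
As an illustration, spectral regularization penalizes the singular values of the coefficient matrix (for example through the nuclear norm), and such an estimator can be computed by proximal gradient descent with singular-value soft-thresholding. The Python/NumPy sketch below is a minimal illustration under these assumptions, not the algorithm developed in the article; the names matrix_lasso and svt, the penalty weight lam, and the step-size choice are introduced here only for the example.

import numpy as np

def svt(M, tau):
    # Singular-value soft-thresholding: proximal operator of tau * ||M||_* .
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def matrix_lasso(X, y, lam, n_iter=500):
    # Fit y_i ~ <B, X_i> for the objective (1/2n) ||y - <X, B>||^2 + lam * ||B||_* .
    # X has shape (n, p, q); y has shape (n,).
    n, p, q = X.shape
    B = np.zeros((p, q))
    # Conservative step size: inverse Lipschitz constant of the squared-loss gradient.
    step = n / np.linalg.norm(X.reshape(n, -1), 2) ** 2
    for _ in range(n_iter):
        resid = np.einsum('ipq,pq->i', X, B) - y      # residuals of the linear fit
        grad = np.einsum('i,ipq->pq', resid, X) / n   # gradient of the squared loss
        B = svt(B - step * grad, step * lam)          # gradient step + spectral prox
    return B

# Toy usage: recover a rank-1 coefficient matrix from noisy linear measurements.
rng = np.random.default_rng(0)
p, q, n = 8, 8, 200
B_true = np.outer(rng.standard_normal(p), rng.standard_normal(q))
X = rng.standard_normal((n, p, q))
y = np.einsum('ipq,pq->i', X, B_true) + 0.1 * rng.standard_normal(n)
B_hat = matrix_lasso(X, y, lam=0.1)
print('relative error:', np.linalg.norm(B_hat - B_true) / np.linalg.norm(B_true))

The singular-value soft-thresholding step is what distinguishes this from an entrywise lasso-type penalty: it shrinks the spectrum of the coefficient matrix, encouraging low rank rather than entrywise zeros.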