Massive datasets are now generated in fields such as computer vision, medical imaging, and astronomy, and their large scale and high dimensionality hamper the application of classical statistical models. To address these computational challenges, one efficient approach is subsampling, which draws subsamples from the original large dataset according to a carefully designed, task-specific probability distribution to form an informative sketch. The computational cost is then reduced by applying the original algorithm to the substantially smaller sketch. Previous studies of subsampling have focused on non-regularized regression, such as ordinary least squares and logistic regression, from the perspectives of computational efficiency and theoretical guarantees. In this article, we introduce a randomized algorithm under the subsampling scheme for Elastic-net regression, which offers novel insight into the L1-norm regularized regression problem. To enable a consistency analysis, a smooth approximation technique based on the alpha absolute function is first employed and theoretically verified. Concentration bounds and asymptotic normality for the proposed randomized algorithm are then established under mild conditions. Moreover, an optimal subsampling probability is constructed according to A-optimality. The effectiveness of the proposed algorithm is demonstrated on both synthetic and real datasets.

Comment: 28 pages, 7 figures
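The general subsample-and-reweight scheme described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it uses simple row-norm-based subsampling probabilities as a hypothetical stand-in for the A-optimal probabilities, and scikit-learn's `ElasticNet` as the solver. The synthetic data, sketch size `r`, and penalty settings are all assumptions for the demo.

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)

# Synthetic full dataset (stand-in for a large-scale problem).
n, d = 100_000, 10
X = rng.standard_normal((n, d))
beta = np.zeros(d)
beta[:3] = [2.0, -1.5, 1.0]  # sparse ground truth
y = X @ beta + rng.standard_normal(n)

# Row-norm-based subsampling probabilities: a simple, hypothetical
# stand-in for the task-specific (A-optimal) probabilities.
norms = np.linalg.norm(X, axis=1)
probs = norms / norms.sum()

# Draw a small sketch and reweight each row by 1/(r * p_i) so the
# subsampled objective is approximately unbiased for the full one.
r = 2_000
idx = rng.choice(n, size=r, replace=True, p=probs)
weights = 1.0 / (r * probs[idx])

# Fit Elastic-net on the sketch only.
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
model.fit(X[idx], y[idx], sample_weight=weights)
print(model.coef_.round(2))
```

The sketch of size 2,000 replaces a fit over 100,000 rows; the nonzero coefficients are recovered up to the usual shrinkage induced by the penalty.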