In this paper, we focus on the high-dimensional double sparse structure,
where the parameter of interest simultaneously encourages group-wise sparsity
and element-wise sparsity in each group. By combining the Gilbert-Varshamov
bound and its variants, we develop a novel lower bound technique for the metric
entropy of the parameter space, specifically tailored for the double sparse
structure over ℓ_u(ℓ_q)-balls with u, q ∈ [0,1]. We prove lower
bounds on the estimation error using an information-theoretic approach,
leveraging our proposed lower bound technique and Fano's inequality. To
complement the lower bounds, we establish matching upper bounds through a
direct analysis of constrained least-squares estimators together with results
from empirical process theory. A significant finding of our study is the discovery
of a phase transition phenomenon in the minimax rates for u, q ∈ (0,1].
Furthermore, we extend the theoretical results to the double sparse regression
model and determine its minimax rate for estimation error. To tackle double
sparse linear regression, we develop the DSIHT (Double Sparse Iterative Hard
Thresholding) algorithm, demonstrating its optimality in the minimax sense.
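
The paper does not spell out the DSIHT iteration here, but iterative hard thresholding with a double sparse projection can be sketched as below. The projection keeps at most s active groups and at most k nonzeros per active group (the exact-sparsity special case of the double sparse structure); the names `s`, `k`, the energy-based group selection, the step size, and the iteration count are all illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def double_sparse_threshold(beta, groups, s, k):
    """Project beta onto vectors with at most s active groups and at most
    k nonzeros per active group. A hypothetical projection step; the
    paper's DSIHT operator may differ in detail."""
    out = np.zeros_like(beta)
    energies = np.empty(len(groups))
    kept = []
    for g, idx in enumerate(groups):
        v = beta[idx]
        top = np.argsort(np.abs(v))[-k:]       # top-k entries within group g
        kept.append(np.asarray(idx)[top])
        energies[g] = np.sum(v[top] ** 2)      # energy of the retained entries
    for g in np.argsort(energies)[-s:]:        # s groups with largest energy
        out[kept[g]] = beta[kept[g]]
    return out

def dsiht(X, y, groups, s, k, n_iter=300):
    """Iterative hard thresholding with the double sparse projection.
    The step size 1/sigma_max(X)^2 and iteration count are illustrative."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2     # 2-norm = largest singular value
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)            # least-squares gradient
        beta = double_sparse_threshold(beta - step * grad, groups, s, k)
    return beta
```

On a well-conditioned noiseless problem whose true coefficient vector lies in the projection set, this iteration recovers both the active group and the active entries within it.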
Finally, numerical experiments illustrate the superiority of our method.

Comment: 49 pages, 6 figures