Pivotal Estimation via Self-Normalization for High-Dimensional Linear Models with Error in Variables
We propose a new estimator for the high-dimensional linear regression model
with observation error in the design where the number of coefficients is
potentially larger than the sample size. The main novelty of our procedure is
that the choice of penalty parameters is pivotal. The estimator is based on
applying a self-normalization to the constraints that characterize the
estimator. Importantly, we show how to cast the computation of the estimator as
the solution of a convex program with second-order cone constraints. This
allows the use of algorithms with theoretical guarantees and reliable
implementation. Under sparsity assumptions, we derive ℓ2-rates of
convergence and show that consistency can be achieved even if the number of
regressors exceeds the sample size. We further provide a simple-to-implement
rule to threshold the estimator that yields a provably sparse estimator with
similar ℓ1- and ℓ2-rates of convergence. The thresholds are
data-driven and component-dependent. Finally, we also study the rates of
convergence of estimators that refit the data based on a selected support with
possible model selection mistakes. In addition to our finite-sample theoretical
results, which allow for non-i.i.d. data, we also present simulations comparing
the performance of the proposed estimators.
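The component-dependent thresholding step can be sketched as follows. This is a minimal illustration, not the paper's procedure: the coefficient values and the per-component thresholds below are made up, whereas in the paper the thresholds are derived from a data-driven rule.

```python
# Hypothetical sketch of component-dependent hard thresholding.
# beta_hat: a pilot estimate of the coefficients (made-up values);
# tau: per-component thresholds (data-driven in the paper,
#      fixed numbers here purely for illustration).
beta_hat = [0.9, 0.01, -0.5, 0.002]
tau = [0.05, 0.05, 0.1, 0.1]

# Keep a coefficient only if it exceeds its own threshold in magnitude.
beta_sparse = [b if abs(b) > t else 0.0 for b, t in zip(beta_hat, tau)]
print(beta_sparse)  # [0.9, 0.0, -0.5, 0.0]
```

Because each component has its own threshold, coefficients estimated with more noise can be screened more aggressively than well-estimated ones.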