Adaptive complexity regularization for linear inverse problems
We tackle the problem of building adaptive estimation procedures for
ill-posed inverse problems. For general regularization methods depending on
tuning parameters, we construct a penalized method that selects the optimal
smoothing sequence without prior knowledge of the regularity of the function to
be estimated. We provide for such estimators oracle inequalities and optimal
rates of convergence. This penalized approach is applied to Tikhonov
regularization and to regularization by projection.

Comment: Published at http://dx.doi.org/10.1214/07-EJS115 in the Electronic
Journal of Statistics (http://www.i-journals.org/ejs/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
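The Tikhonov case mentioned in the abstract can be illustrated on a toy sequence-space problem. The sketch below is not the paper's penalized, data-driven selection rule (which requires no access to the truth); it only shows the Tikhonov estimator itself and an oracle choice of the smoothing parameter over a grid, under an assumed synthetic forward operator:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Synthetic ill-posed linear model y = A x + noise: random orthonormal
# bases and an exponentially decaying spectrum make A ill-conditioned.
U, _ = np.linalg.qr(rng.normal(size=(n, n)))
V, _ = np.linalg.qr(rng.normal(size=(n, n)))
s = np.exp(-0.3 * np.arange(n))
A = U @ np.diag(s) @ V.T

# A "regular" truth whose coefficients decay in the singular basis
# (a source condition), so that regularization can actually help.
coef = np.arange(1, n + 1) ** -1.5
x_true = V @ coef
y = A @ x_true + 0.01 * rng.normal(size=n)

def tikhonov(A, y, lam):
    """x_lam = argmin_x ||A x - y||^2 + lam * ||x||^2."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

# Oracle grid search over the smoothing parameter, for illustration only:
# the paper's penalized criterion selects lambda from the data alone.
lams = np.logspace(-8, 1, 40)
errs = [np.linalg.norm(tikhonov(A, y, lam) - x_true) for lam in lams]
lam_best = lams[int(np.argmin(errs))]
```

The grid search makes the bias-variance trade-off visible: tiny `lam` amplifies noise through the small singular values, large `lam` oversmooths, and the minimizer sits in between.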
On rate optimality for ill-posed inverse problems in econometrics
In this paper, we clarify the relations between the existing sets of
regularity conditions for convergence rates of nonparametric indirect
regression (NPIR) and nonparametric instrumental variables (NPIV) regression
models. We establish minimax risk lower bounds in mean integrated squared error
loss for the NPIR and the NPIV models under two basic regularity conditions
that allow for both mildly ill-posed and severely ill-posed cases. We show that
both a simple projection estimator for the NPIR model, and a sieve minimum
distance estimator for the NPIV model, can achieve the minimax risk lower
bounds, and are rate-optimal uniformly over a large class of structure
functions, allowing for mildly ill-posed and severely ill-posed cases.

Comment: 27 pages
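The "simple projection estimator" and the mildly vs. severely ill-posed distinction can be sketched in an assumed toy sequence-space model (known singular basis, which is a simplification of the NPIR setting in the abstract): polynomial decay of the singular values is the mildly ill-posed case, exponential decay the severely ill-posed one.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
k = np.arange(1, n + 1)

# Mildly ill-posed: polynomially decaying singular values.
# (Severely ill-posed would be e.g. s = np.exp(-0.1 * k).)
s = k ** -1.0
b = k ** -2.0                                   # coefficients of the truth
noisy = s * b + 0.01 * rng.normal(size=n) / np.sqrt(n)

def projection_estimate(noisy, s, K):
    """Invert the operator on the first K frequencies, zero out the rest."""
    est = np.zeros_like(noisy)
    est[:K] = noisy[:K] / s[:K]
    return est

# Squared-error risk = squared bias (truncated tail) + variance (amplified
# noise on the kept frequencies); the cutoff K trades one against the other.
risks = [np.sum((projection_estimate(noisy, s, K) - b) ** 2)
         for K in range(1, n)]
K_best = 1 + int(np.argmin(risks))
```

Under exponential decay of `s`, the variance term explodes much faster in `K`, which is why the severely ill-posed case admits only logarithmic rates.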
Necessary conditions for variational regularization schemes
We study variational regularization methods in a general framework, more
precisely those methods that use a discrepancy and a regularization functional.
While several sets of sufficient conditions are known to obtain a
regularization method, we start with an investigation of the converse question:
What would necessary conditions for a variational method to provide a
regularization method look like? To this end, we formalize the notion of a
variational scheme and compare three different instances of
variational methods. Then we focus on the data space model and investigate the
role and interplay of the topological structure, the convergence notion and the
discrepancy functional. In particular, we deduce necessary conditions for the
discrepancy functional to fulfill usual continuity assumptions. The results are
applied to discrepancy functionals given by Bregman distances, in particular to
the Kullback-Leibler divergence.

Comment: To appear in Inverse Problems
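The connection between Bregman distances and the Kullback-Leibler divergence named in the abstract is standard: the (generalized) KL divergence is the Bregman distance generated by the negative entropy. A minimal sketch, with hypothetical helper names `bregman` and `kl_divergence`:

```python
import numpy as np

def bregman(phi, grad_phi, y, z):
    """Bregman distance D_phi(y, z) = phi(y) - phi(z) - <grad phi(z), y - z>."""
    return phi(y) - phi(z) - np.dot(grad_phi(z), y - z)

def kl_divergence(y, z, eps=1e-12):
    """Generalized KL divergence sum_i y_i log(y_i / z_i) - y_i + z_i
    for nonnegative data; eps guards against log(0)."""
    y = np.asarray(y, dtype=float)
    z = np.asarray(z, dtype=float)
    terms = np.where(y > 0, y * np.log((y + eps) / (z + eps)), 0.0)
    return float(np.sum(terms - y + z))

# Negative entropy phi(x) = sum_i x_i log x_i - x_i, with gradient log x.
neg_entropy = lambda x: float(np.sum(x * np.log(x) - x))
grad_neg_entropy = np.log

y = np.array([0.2, 0.5, 0.3])
z = np.array([0.3, 0.4, 0.3])
d_bregman = bregman(neg_entropy, grad_neg_entropy, y, z)
d_kl = kl_divergence(y, z)   # agrees with d_bregman up to eps
```

Because the negative entropy is strictly convex on the positive orthant, this Bregman distance is nonnegative and vanishes only at `y = z`, which is exactly the discrepancy-functional behavior the paper's necessary conditions are probing.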