5,181 research outputs found
Spectral Representation Learning for Conditional Moment Models
Many problems in causal inference and economics can be formulated in the
framework of conditional moment models, which characterize the target function
through a collection of conditional moment restrictions. For nonparametric
conditional moment models, efficient estimation often relies on preimposed
conditions on various measures of ill-posedness of the hypothesis space, which
are hard to validate when flexible models are used. In this work, we address
this issue by proposing a procedure that automatically learns representations
with controlled measures of ill-posedness. Our method approximates a linear
representation defined by the spectral decomposition of a conditional
expectation operator, which can be used for kernelized estimators and is known
to facilitate minimax optimal estimation in certain settings. We show this
representation can be efficiently estimated from data, and establish L2
consistency for the resulting estimator. We evaluate the proposed method on
proximal causal inference tasks, exhibiting promising performance on
high-dimensional, semi-synthetic data.
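The spectral representation described above can be illustrated with a minimal sketch: restrict the conditional expectation operator f(X) ↦ E[f(X) | Z] to two finite feature bases, whiten both bases, and take an SVD. All names here are illustrative, and fixed feature maps stand in for the flexible learned representations of the paper.

```python
import numpy as np

def spectral_representation(X_feats, Z_feats, k):
    """Estimate the top-k singular functions of the conditional expectation
    operator f(X) -> E[f(X) | Z], restricted to finite feature bases.

    X_feats: (n, p) features phi(X_i); Z_feats: (n, q) features psi(Z_i).
    Returns (A, B, s): the learned representations are phi(x) @ A and
    psi(z) @ B, with estimated singular values s.
    """
    n = X_feats.shape[0]
    C_zx = Z_feats.T @ X_feats / n                   # ~ E[psi(Z) phi(X)^T]
    G_x = X_feats.T @ X_feats / n                    # Gram matrix of phi
    G_z = Z_feats.T @ Z_feats / n                    # Gram matrix of psi
    # Whiten both bases so the SVD is taken in L2 geometry
    # (a small ridge keeps the Cholesky factors well conditioned).
    Lx_inv = np.linalg.inv(np.linalg.cholesky(G_x + 1e-8 * np.eye(G_x.shape[0])))
    Lz_inv = np.linalg.inv(np.linalg.cholesky(G_z + 1e-8 * np.eye(G_z.shape[0])))
    U, s, Vt = np.linalg.svd(Lz_inv @ C_zx @ Lx_inv.T)
    A = Lx_inv.T @ Vt[:k].T                          # singular functions of X
    B = Lz_inv.T @ U[:, :k]                          # singular functions of Z
    return A, B, s[:k]
```

The decaying singular values s are exactly what quantifies the ill-posedness that the method keeps under control.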
Estimation of nonparametric conditional moment models with possibly nonsmooth moments
This paper studies nonparametric estimation of conditional moment models in which the residual functions could be nonsmooth with respect to the unknown functions of endogenous variables. It is a problem of nonparametric nonlinear instrumental variables (IV) estimation, and a difficult nonlinear ill-posed inverse problem with an unknown operator. We first propose a penalized sieve minimum distance (SMD) estimator of the unknown functions that are identified via the conditional moment models. We then establish its consistency and convergence rate (in strong metric), allowing for possibly non-compact function parameter spaces, possibly non-compact finite or infinite dimensional sieves with flexible lower semicompact or convex penalty, or finite dimensional linear sieves without penalty. Under relatively low-level sufficient conditions, and for both mildly and severely ill-posed problems, we show that the convergence rates for the nonlinear ill-posed inverse problems coincide with the known minimax optimal rates for the nonparametric mean IV regression. We illustrate the theory by two important applications: root-n asymptotic normality of the plug-in penalized SMD estimator of a weighted average derivative of a nonparametric nonlinear IV regression, and the convergence rate of a nonparametric additive quantile IV regression. We also present a simulation study and an empirical estimation of a system of nonparametric quantile IV Engel curves.
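The penalized SMD idea can be sketched for one nonsmooth case the abstract covers, quantile IV, where the residual ρ(Y, h(X)) = 1{Y ≤ h(X)} − τ is nonsmooth in h. The sketch below uses a finite linear sieve, a ridge penalty, and a derivative-free optimizer; these are illustrative choices, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

def penalized_smd_quantile_iv(Y, X_basis, Z_basis, tau, lam):
    """Penalized sieve minimum distance for a quantile IV model (sketch).

    h(x) = X_basis @ b; residual rho = 1{Y <= h(X)} - tau is nonsmooth in b.
    The conditional moment E[rho | Z] is approximated by projecting rho onto
    the instrument sieve, and the criterion adds a ridge penalty lam * ||b||^2.
    """
    n = len(Y)
    P = Z_basis @ np.linalg.pinv(Z_basis)    # projection onto instrument sieve

    def criterion(b):
        rho = (Y <= X_basis @ b).astype(float) - tau
        m_hat = P @ rho                      # fitted conditional moment
        return m_hat @ m_hat / n + lam * (b @ b)

    b0 = np.linalg.pinv(X_basis) @ Y         # start from a least-squares fit
    # Nelder-Mead since the criterion is not differentiable in b.
    res = minimize(criterion, b0, method="Nelder-Mead")
    return res.x
```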
Adaptive, Rate-Optimal Testing in Instrumental Variables Models
This paper proposes simple, data-driven, optimal rate-adaptive inferences on a structural function in semi-nonparametric conditional moment restrictions. We consider two types of hypothesis tests based on leave-one-out sieve estimators. A structure-space test (ST) uses a quadratic distance between the structural functions of endogenous variables, while an image-space test (IT) uses a quadratic distance of the conditional moment from zero. For both tests, we analyze their respective classes of nonparametric alternative models that are separated from the null hypothesis by the minimax rate of testing. That is, the sum of the type I and type II errors of the test, uniformly over the class of nonparametric alternative models, cannot be improved by any other test. Our new minimax rate of the ST differs from the known minimax rate of estimation in nonparametric instrumental variables (NPIV) models. We propose computationally simple and novel exponential-scan, data-driven choices of sieve regularization parameters and adjusted chi-squared critical values. The resulting tests attain the minimax rate of testing, and hence optimally adapt to the unknown smoothness of functions and are robust to the unknown degree of ill-posedness (endogeneity). Data-driven confidence sets are easily obtained by inverting the adaptive ST. Monte Carlo studies demonstrate that our adaptive ST has good size and power properties in finite samples for testing monotonicity or equality restrictions in NPIV models. Empirical applications to nonparametric multi-product demands with endogenous prices are presented.
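The image-space idea can be sketched for a single, fixed sieve dimension: project the residuals onto the instrument sieve and compare a studentized quadratic form of the sample moments with a chi-squared critical value. The adaptive tests of the abstract instead scan over sieve dimensions with adjusted critical values; this fixed-dimension version is only an illustration.

```python
import numpy as np
from scipy import stats

def image_space_test(residuals, z_basis, alpha=0.05):
    """Fixed-dimension sketch of an image-space (IT) style moment test.

    Under the null E[residual | Z] = 0, the studentized quadratic form of
    the k projected sample moments is asymptotically chi-squared with k
    degrees of freedom.
    """
    n, k = z_basis.shape
    m = z_basis.T @ residuals / n                    # sample moments
    zr = z_basis * residuals[:, None]
    V = zr.T @ zr / n                                # robust moment variance
    stat = n * m @ np.linalg.solve(V, m)
    crit = stats.chi2.ppf(1 - alpha, df=k)
    return stat, crit, bool(stat > crit)
```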
A Note on Minimax Testing and Confidence Intervals in Moment Inequality Models
This note uses a simple example to show how moment inequality models used in
the empirical economics literature lead to general minimax relative efficiency
comparisons. The main point is that such models involve inference on a low
dimensional parameter, which leads naturally to a definition of "distance"
that, in full generality, would be arbitrary in minimax testing problems. This
definition of distance is justified by the fact that it leads to a duality
between minimaxity of confidence intervals and tests, which does not hold for
other definitions of distance. Thus, the use of moment inequalities for
inference in a low dimensional parametric model places additional structure on
the testing problem, which leads to stronger conclusions regarding minimax
relative efficiency than would otherwise be possible.
Optimal Uniform Convergence Rates for Sieve Nonparametric Instrumental Variables Regression
We study the problem of nonparametric regression when the regressor is
endogenous, which is an important nonparametric instrumental variables (NPIV)
regression in econometrics and a difficult ill-posed inverse problem with
unknown operator in statistics. We first establish a general upper bound on the
sup-norm (uniform) convergence rate of a sieve estimator, allowing for
endogenous regressors and weakly dependent data. This result leads to the
optimal sup-norm convergence rates for spline and wavelet least squares
regression estimators under weakly dependent data and heavy-tailed error terms.
This upper bound also yields the sup-norm convergence rates for sieve NPIV
estimators under i.i.d. data: the rates coincide with the known optimal
L2-norm rates for severely ill-posed problems, and are a power of log(n)
slower than the optimal L2-norm rates for mildly ill-posed problems. We then
establish the minimax risk lower bound in sup-norm loss, which coincides with
our upper bounds on sup-norm rates for the spline and wavelet sieve NPIV
estimators. This sup-norm rate optimality provides another justification for
the wide application of sieve NPIV estimators. Useful results on
weakly-dependent random matrices are also provided.
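The sieve NPIV estimator at the center of this abstract is series two-stage least squares, and a minimal sketch is short. Basis choices and names below are illustrative; the paper's results concern spline and wavelet sieves in particular.

```python
import numpy as np

def sieve_npiv(y, psi_x, b_z):
    """Sieve NPIV estimator (sketch): h_hat(x) = psi(x) @ c, where c
    minimizes || P_b (y - psi_x @ c) ||^2 and P_b is the projection onto
    the instrument sieve b(Z) -- i.e., series 2SLS.
    """
    P = b_z @ np.linalg.pinv(b_z)          # projection onto instrument space
    A = psi_x.T @ P @ psi_x
    return np.linalg.solve(A, psi_x.T @ P @ y)
```

With an endogenous regressor, this estimator removes the bias that plain series least squares would suffer, at a rate governed by the ill-posedness discussed above.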
Robust State Space Filtering under Incremental Model Perturbations Subject to a Relative Entropy Tolerance
This paper considers robust filtering for a nominal Gaussian state-space
model, when a relative entropy tolerance is applied to each time increment of a
dynamical model. The problem is formulated as a dynamic minimax game where the
maximizer adopts a myopic strategy. This game is shown to admit a saddle point
whose structure is characterized by applying and extending results presented
earlier in [1] for static least-squares estimation. The resulting minimax
filter takes the form of a risk-sensitive filter with a time varying risk
sensitivity parameter, which depends on the tolerance bound applied to the
model dynamics and observations at the corresponding time index. The
least-favorable model is constructed and used to evaluate the performance of
alternative filters. Simulations comparing the proposed risk-sensitive filter
to a standard Kalman filter show a significant performance advantage when
applied to the least-favorable model, and only a small performance loss for the
nominal model.
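The structure of the resulting filter can be sketched in one recursion step: a standard Kalman measurement and time update, except that the posterior covariance is inflated through a risk-sensitivity transformation before prediction. The isotropic inflation and names below are illustrative assumptions; the paper derives the time-varying parameter from the per-step relative entropy tolerance.

```python
import numpy as np

def risk_sensitive_kf_step(x, P, y, F, H, Q, R, theta):
    """One step of a risk-sensitive Kalman filter (sketch).

    theta = 0 recovers the standard Kalman filter; theta > 0 inflates the
    posterior covariance via (P^{-1} - theta I)^{-1}, which requires theta
    small enough that P^{-1} - theta I stays positive definite.
    """
    # Measurement update.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_post = x + K @ (y - H @ x)
    P_post = (np.eye(len(x)) - K @ H) @ P
    # Risk-sensitive covariance inflation.
    P_rs = np.linalg.inv(np.linalg.inv(P_post) - theta * np.eye(len(x)))
    # Time update.
    x_pred = F @ x_post
    P_pred = F @ P_rs @ F.T + Q
    return x_pred, P_pred, x_post
```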
- …