Inference on counterfactual distributions
In this paper we develop procedures for performing inference in regression models about how potential policy interventions affect the entire marginal distribution of an outcome of interest. These policy interventions consist of either changes in the distribution of covariates related to the outcome holding the conditional distribution of the outcome given covariates fixed, or changes in the conditional distribution of the outcome given covariates holding the marginal distribution of the covariates fixed. Under either of these assumptions, we obtain uniformly consistent estimates and functional central limit theorems for the counterfactual and status quo marginal distributions of the outcome as well as other function-valued effects of the policy, including, for example, the effects of the policy on the marginal distribution function, quantile function, and other related functionals. We construct simultaneous confidence sets for these functions; these sets take into account the sampling variation in the estimation of the relationship between the outcome and covariates. Our procedures rely on, and our theory covers, all main regression approaches for modeling and estimating conditional distributions, focusing especially on classical, quantile, duration, and distribution regressions. Our procedures are general and accommodate both simple unitary changes in the values of a given covariate as well as changes in the distribution of the covariates or the conditional distribution of the outcome given covariates of general form. We apply the procedures to examine the effects of labor market institutions on the U.S. wage distribution.
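As a rough illustration of the object the abstract describes (not the authors' estimator), the sketch below builds a counterfactual outcome distribution by estimating the conditional distribution F(y | x) threshold by threshold and averaging it over a counterfactual covariate sample; the per-threshold logistic fit, the toy data, and the name `counterfactual_cdf` are illustrative assumptions.

```python
# Sketch only: counterfactual CDF  F_cf(y) = ∫ F_{Y|X}(y | x) dG(x), where the
# conditional distribution is held fixed and the covariate distribution is
# switched from the observed sample X to a counterfactual sample X_cf.
import numpy as np
from sklearn.linear_model import LogisticRegression

def counterfactual_cdf(y_grid, Y, X, X_cf):
    """Threshold-by-threshold estimate of the counterfactual CDF on y_grid."""
    F_cf = np.empty(len(y_grid))
    for j, y in enumerate(y_grid):
        indicator = (Y <= y).astype(int)            # 1{Y <= y}
        if indicator.min() == indicator.max():      # threshold outside the data range
            F_cf[j] = float(indicator[0])
            continue
        fit = LogisticRegression(max_iter=1000).fit(X, indicator)
        # average the fitted conditional CDF over the counterfactual covariates
        F_cf[j] = fit.predict_proba(X_cf)[:, 1].mean()
    return np.maximum.accumulate(F_cf)              # enforce monotonicity in y

# toy usage: shift the first covariate by 0.5 and inspect the counterfactual CDF
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
Y = X @ np.array([1.0, -0.5]) + rng.normal(size=500)
X_cf = X + np.array([0.5, 0.0])
print(counterfactual_cdf(np.linspace(-3.0, 3.0, 7), Y, X, X_cf))
```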
Functional delta residuals and applications to functional effect sizes
Given a functional central limit theorem (fCLT) and a parameter transformation, we
use the functional delta method to construct random processes, called
functional delta residuals, which asymptotically have the same covariance
structure as the transformed limit process. Moreover, we prove a multiplier
bootstrap fCLT theorem for these transformed residuals and show how this can be
used to construct simultaneous confidence bands for transformed functional
parameters. As motivation for this methodology, we provide the formal
application of these residuals to a functional version of the effect size
parameter Cohen's d, a problem appearing in current brain imaging
applications. The performance and necessity of such residuals are illustrated in
a simulation experiment for the covering rate of simultaneous confidence bands
for the functional Cohen's d parameter.
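As a rough sketch of the kind of construction the abstract describes, and under strong simplifying assumptions (i.i.d. functional observations on a common grid, a first-order delta expansion of the map (mean, sd) -> mean/sd, Gaussian multipliers), a simultaneous confidence band for a functional Cohen's d could be assembled as follows; the function `cohens_d_band` and all of its details are illustrative, not the paper's implementation.

```python
# Sketch only: multiplier-bootstrap simultaneous confidence band for a
# functional Cohen's d, d(s) = mean(s) / sd(s), built from delta-method residuals.
import numpy as np

def cohens_d_band(X, alpha=0.05, n_boot=2000, seed=0):
    """X: (n_subjects, n_locations) array of functional observations on a grid."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    mu, sigma = X.mean(axis=0), X.std(axis=0, ddof=1)
    d_hat = mu / sigma
    Z = (X - mu) / sigma
    R = Z - 0.5 * d_hat * (Z ** 2 - 1)               # first-order delta residuals
    se = R.std(axis=0, ddof=1) / np.sqrt(n)          # pointwise standard error of d_hat
    R_std = (R - R.mean(axis=0)) / R.std(axis=0, ddof=1)
    # Gaussian multiplier bootstrap of the sup of the standardized residual process
    sups = np.abs(rng.standard_normal((n_boot, n)) @ R_std / np.sqrt(n)).max(axis=1)
    q = np.quantile(sups, 1 - alpha)
    return d_hat - q * se, d_hat + q * se            # simultaneous band for d(s)
```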
Quantile and Probability Curves Without Crossing
This paper proposes a method to address the longstanding problem of lack of
monotonicity in estimation of conditional and structural quantile functions,
also known as the quantile crossing problem. The method consists in sorting or
monotone rearranging the original estimated non-monotone curve into a monotone
rearranged curve. We show that the rearranged curve is closer to the true
quantile curve in finite samples than the original curve, establish a
functional delta method for rearrangement-related operators, and derive
functional limit theory for the entire rearranged curve and its functionals. We
also establish validity of the bootstrap for estimating the limit law of the
entire rearranged curve and its functionals. Our limit results are generic
in that they apply to every estimator of a monotone econometric function,
provided that the estimator satisfies a functional central limit theorem and
the function satisfies some smoothness conditions. Consequently, our results
apply to estimation of other econometric functions with monotonicity
restrictions, such as demand, production, distribution, and structural
distribution functions. We illustrate the results with an application to
estimation of structural quantile functions using data on Vietnam veteran
status and earnings.
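The rearrangement operation at the core of the paper is elementary to carry out on a grid; a minimal, illustrative sketch (not the authors' code) is:

```python
# Sketch only: monotone rearrangement of an estimated quantile curve on a grid
# of probability indices u; the rearranged curve is the sorted original curve.
import numpy as np

def rearrange(q_values):
    """Monotone rearrangement of quantile-curve values evaluated on a u-grid."""
    return np.sort(q_values)

u = np.linspace(0.01, 0.99, 99)
q_hat = u + 0.3 * np.sin(8 * np.pi * u)   # toy non-monotone "estimated quantile curve"
q_mono = rearrange(q_hat)
assert np.all(np.diff(q_mono) >= 0)       # no quantile crossing after rearrangement
```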
Inference on Counterfactual Distributions
Counterfactual distributions are important ingredients for policy analysis
and decomposition analysis in empirical economics. In this article we develop
modeling and inference tools for counterfactual distributions based on
regression methods. The counterfactual scenarios that we consider consist of
ceteris paribus changes in either the distribution of covariates related to the
outcome of interest or the conditional distribution of the outcome given
covariates. For either of these scenarios we derive joint functional central
limit theorems and bootstrap validity results for regression-based estimators
of the status quo and counterfactual outcome distributions. These results allow
us to construct simultaneous confidence sets for function-valued effects of the
counterfactual changes, including the effects on the entire distribution and
quantile functions of the outcome as well as on related functionals. These
confidence sets can be used to test functional hypotheses such as no-effect,
positive effect, or stochastic dominance. Our theory applies to general
counterfactual changes and covers the main regression methods including
classical, quantile, duration, and distribution regressions. We illustrate the
results with an empirical application to wage decompositions using data for the
United States.
As a part of developing the main results, we introduce distribution
regression as a comprehensive and flexible tool for modeling and estimating the
\textit{entire} conditional distribution. We show that distribution regression
encompasses the Cox duration regression and represents a useful alternative to
quantile regression. We establish functional central limit theorems and
bootstrap validity results for the empirical distribution regression process
and various related functionals.
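A hedged sketch of the distribution regression idea follows, assuming a threshold-by-threshold binary GLM fit with statsmodels; the complementary log-log link reproduces the Cox-type duration specification mentioned above, while a logit link gives the standard case. The helper names and the final monotone rearrangement of the fitted CDF are illustrative choices, not the paper's code.

```python
# Sketch only: distribution regression fits P(Y <= y | X) separately at each
# threshold y on a grid, which traces out the entire conditional distribution.
import numpy as np
import statsmodels.api as sm

def distribution_regression(Y, X, y_grid, link=None):
    """Return (len(y_grid), n_coef) threshold-specific GLM coefficients."""
    link = link or sm.families.links.CLogLog()        # cloglog ~ Cox-type duration model
    Xc = sm.add_constant(X)
    coefs = []
    for y in y_grid:                                  # assumes y_grid lies inside the support of Y
        glm = sm.GLM((Y <= y).astype(float), Xc, family=sm.families.Binomial(link=link))
        coefs.append(glm.fit().params)
    return np.vstack(coefs)

def conditional_cdf(coefs, x, link=None):
    """Fitted conditional CDF y -> F(y | x) at a single covariate value x."""
    link = link or sm.families.links.CLogLog()
    eta = coefs @ np.concatenate(([1.0], np.atleast_1d(x)))
    return np.maximum.accumulate(link.inverse(eta))   # rearranged to be monotone in y
```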
Delta method in large deviations and moderate deviations for estimators
The delta method is a popular and elementary tool for deriving the limiting
distributions of transformed statistics, but asymptotic distributions alone do
not yield the desired accuracy of approximation for tail probabilities. Large
and moderate deviation theory can achieve
this goal. Motivated by the delta method in weak convergence, a general delta
method in large deviations is proposed. The new method can be widely applied to
deriving the moderate deviations of estimators and is illustrated by examples
including the Wilcoxon statistic, the Kaplan--Meier estimator, the empirical
quantile processes, and the empirical copula function. We also improve the
existing moderate deviation results for M-estimators and L-statistics by
the new method. Some applications of moderate deviations to statistical
hypothesis testing are provided.
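For reference, the weak-convergence delta method whose large- and moderate-deviation analogue the paper develops can be sketched as follows (a statement of the standard result, not a quotation from the paper):

```latex
% Standard functional delta method (weak convergence version).
% If $r_n(\hat\theta_n - \theta) \rightsquigarrow Z$ in a normed space $\mathbb{D}$
% and $\phi : \mathbb{D} \to \mathbb{E}$ is Hadamard differentiable at $\theta$
% with derivative $\phi'_{\theta}$, then
\[
  r_n\bigl(\phi(\hat\theta_n) - \phi(\theta)\bigr) \;\rightsquigarrow\; \phi'_{\theta}(Z).
\]
% The paper transfers this scheme to large and moderate deviation principles
% under suitable conditions on $\phi$.
```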
Program Evaluation and Causal Inference with High-Dimensional Data
In this paper, we provide efficient estimators and honest confidence bands
for a variety of treatment effects including local average (LATE) and local
quantile treatment effects (LQTE) in data-rich environments. We can handle very
many control variables, endogenous receipt of treatment, heterogeneous
treatment effects, and function-valued outcomes. Our framework covers the
special case of exogenous receipt of treatment, either conditional on controls
or unconditionally as in randomized control trials. In the latter case, our
approach produces efficient estimators and honest bands for (functional)
average treatment effects (ATE) and quantile treatment effects (QTE). To make
informative inference possible, we assume that key reduced form predictive
relationships are approximately sparse. This assumption allows the use of
regularization and selection methods to estimate those relations, and we
provide methods for post-regularization and post-selection inference that are
uniformly valid (honest) across a wide range of models. We show that a key
ingredient enabling honest inference is the use of orthogonal or doubly robust
moment conditions in estimating certain reduced form functional parameters. We
illustrate the use of the proposed methods with an application to estimating
the effect of 401(k) eligibility and participation on accumulated assets.
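For the exogenous-treatment special case mentioned in the abstract, the role of an orthogonal (doubly robust) moment condition can be illustrated with a short cross-fitting sketch for the unconditional ATE; the nuisance learners, the propensity clipping, and the fold scheme below are my own simplifications, not the paper's procedure.

```python
# Sketch only: cross-fitted AIPW (orthogonal / doubly robust) estimate of the
# average treatment effect E[Y(1) - Y(0)] under unconfoundedness.
import numpy as np
from sklearn.linear_model import LassoCV, LogisticRegressionCV
from sklearn.model_selection import KFold

def aipw_ate(Y, D, X, n_splits=5, seed=0):
    """Return the ATE estimate and its standard error from the orthogonal score."""
    psi = np.empty_like(Y, dtype=float)
    for train, test in KFold(n_splits, shuffle=True, random_state=seed).split(X):
        # nuisance fits: propensity score and the two outcome regressions
        ps = LogisticRegressionCV(max_iter=2000).fit(X[train], D[train])
        m1 = LassoCV().fit(X[train][D[train] == 1], Y[train][D[train] == 1])
        m0 = LassoCV().fit(X[train][D[train] == 0], Y[train][D[train] == 0])
        p = np.clip(ps.predict_proba(X[test])[:, 1], 0.01, 0.99)
        mu1, mu0 = m1.predict(X[test]), m0.predict(X[test])
        # orthogonal score: regression adjustment plus inverse-propensity residuals
        psi[test] = (mu1 - mu0
                     + D[test] * (Y[test] - mu1) / p
                     - (1 - D[test]) * (Y[test] - mu0) / (1 - p))
    return psi.mean(), psi.std(ddof=1) / np.sqrt(len(Y))
```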
Quantile and probability curves without crossing
The most common approach to estimating conditional quantile curves is to fit a curve, typically linear, pointwise for each quantile. Linear functional forms, coupled with pointwise fitting, are used for a number of reasons including parsimony of the resulting approximations and good computational properties. The resulting fits, however, may not respect a logical monotonicity requirement that the quantile curve be increasing as a function of probability. This paper studies the natural monotonization of these empirical curves induced by sampling from the estimated non-monotone model, and then taking the resulting conditional quantile curves, which are monotone in the probability index by construction.
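A hedged sketch of this simulation-based monotonization, assuming the fitted (possibly non-monotone) conditional quantile model is available as a hypothetical callable `q_hat(u, x)`:

```python
# Sketch only: sample outcomes from the fitted model by plugging uniform draws
# into u -> q_hat(u, x), then read off empirical conditional quantiles, which
# are monotone in the probability index by construction.
import numpy as np

def monotonized_quantiles(q_hat, x, u_grid, n_sim=10_000, seed=0):
    """Monotone conditional quantile curve implied by sampling from the fit."""
    rng = np.random.default_rng(seed)
    y_sim = q_hat(rng.uniform(size=n_sim), x)   # draws of Y given X = x from the fitted model
    return np.quantile(y_sim, u_grid)
```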
