
    Machine Learning for Set-Identified Linear Models

    This paper provides estimation and inference methods for an identified set where the selection among a very large number of covariates is based on modern machine learning tools. I characterize the boundary of the identified set (i.e., its support function) using a semiparametric moment condition. Combining Neyman orthogonality and sample-splitting ideas, I construct a root-N consistent, uniformly asymptotically Gaussian estimator of the support function and propose a weighted bootstrap procedure to conduct inference about the identified set. I provide a general method for constructing a Neyman-orthogonal moment condition for the support function. Applying my method to Lee (2008)'s endogenous selection model, I provide the asymptotic theory for the sharp (i.e., tightest possible) bounds on the Average Treatment Effect in the presence of high-dimensional covariates. Furthermore, I relax the conventional monotonicity assumption and allow the sign of the treatment effect on selection (e.g., employment) to be determined by covariates. Using the JobCorps data set, with its very rich baseline characteristics, I substantially tighten the bounds on the JobCorps effect on wages under the weakened monotonicity assumption.
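
    A minimal sketch of the two objects named above, in generic notation rather than the paper's exact definitions: the support function of a convex identified set B for the parameter vector, and the Neyman-orthogonality requirement on a moment function \psi(W; \sigma, \eta) whose nuisance component \eta is estimated by machine learning:

        \[
        \sigma_B(q) \;=\; \sup_{b \in B} q'b ,
        \qquad
        E\big[\psi\big(W;\, \sigma_B(q),\, \eta_0\big)\big] = 0 ,
        \qquad
        \frac{\partial}{\partial \eta}\, E\big[\psi\big(W;\, \sigma_B(q),\, \eta\big)\big]\Big|_{\eta=\eta_0} = 0 .
        \]

    The last condition is the orthogonality property: first-order insensitivity of the moment condition to estimation error in \eta, which, together with sample splitting, is what delivers the root-N consistent, uniformly Gaussian estimator of the support function described above.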

    Simple Inference on Functionals of Set-Identified Parameters Defined by Linear Moments

    This paper considers uniformly valid (over a class of data-generating processes) inference for linear functionals of partially identified parameters in cases where the identified set is defined by linear (in the parameter) moment inequalities. We propose a bootstrap procedure for constructing uniformly valid confidence sets for a linear functional of a partially identified parameter. The proposed method amounts to bootstrapping the value functions of a linear optimization problem, and it subsumes subvector inference as a special case. In other words, this paper gives conditions under which "naively" bootstrapping a linear program can be used to construct a confidence set with uniformly correct coverage for a partially identified linear functional. Unlike other proposed subvector inference procedures, our procedure does not require the researcher to repeatedly invert a hypothesis test and is extremely computationally efficient. In addition to the new procedure, the paper also discusses connections between the literature on optimization and the literature on subvector inference in partially identified models.
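
    The following Python sketch illustrates the "naive bootstrap of a linear program" idea under simplifying assumptions of my own: the identified set is {theta : E[A_i] theta <= E[b_i]} with per-observation arrays (A_i, b_i), the functional of interest is c'theta, theta is confined to a known box so every LP is bounded and feasible, and a plain nonparametric percentile bootstrap is used. The paper's exact resampling scheme and regularity conditions are not reproduced here.

        # Hedged sketch: bootstrap the value functions of a linear program to get a
        # percentile interval for a partially identified linear functional c'theta.
        # A has shape (n, m, k) and b has shape (n, m); feasibility of the sample LPs is assumed.
        import numpy as np
        from scipy.optimize import linprog

        def lp_bounds(A_bar, b_bar, c, box=(-10.0, 10.0)):
            """Smallest and largest c'theta over {theta in box : A_bar @ theta <= b_bar}."""
            k = len(c)
            lo = linprog(np.asarray(c), A_ub=A_bar, b_ub=b_bar, bounds=[box] * k)
            hi = linprog(-np.asarray(c), A_ub=A_bar, b_ub=b_bar, bounds=[box] * k)
            return lo.fun, -hi.fun

        def naive_bootstrap_ci(A, b, c, alpha=0.05, n_boot=999, seed=0):
            """Percentile interval built from the bootstrapped lower and upper LP values."""
            rng = np.random.default_rng(seed)
            n = A.shape[0]
            lows, highs = [], []
            for _ in range(n_boot):
                idx = rng.integers(0, n, size=n)  # nonparametric resample of observations
                lo, hi = lp_bounds(A[idx].mean(axis=0), b[idx].mean(axis=0), c)
                lows.append(lo)
                highs.append(hi)
            return np.quantile(lows, alpha / 2), np.quantile(highs, 1 - alpha / 2)

    Because only LP value functions are re-solved, no test inversion over a grid of candidate values is needed, which is the computational point made in the abstract.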

    Asymptotically Efficient Estimation of Weighted Average Derivatives with an Interval Censored Variable

    This paper studies the identification and estimation of weighted average derivatives of conditional location functionals, including the conditional mean and conditional quantiles, in settings where either the outcome variable or a regressor is interval-valued. Building on Manski and Tamer (2002), who study nonparametric bounds for mean regression with interval data, we characterize the identified set of weighted average derivatives of regression functions. Since the weighted average derivatives do not rely on parametric specifications for the regression functions, the identified set is well-defined without any parametric assumptions. Under general conditions, the identified set is compact and convex and hence admits a characterization by its support function. Using this characterization, we derive the semiparametric efficiency bound for the support function when the outcome variable is interval-valued. We illustrate efficient estimation by constructing an efficient estimator of the support function for the case of mean regression with an interval-censored outcome.
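
    As a stylized illustration in generic notation (the paper's precise conditions are not reproduced here): for a conditional mean m(x) = E[Y | X = x] observed only through brackets Y_L <= Y <= Y_U, the weighted average derivative, the set of values it can take, and its support function are

        \[
        \delta(m) \;=\; E\big[w(X)\, \nabla_x m(X)\big],
        \qquad
        \Theta_0 \;=\; \big\{ \delta(m) \;:\; E[Y_L \mid X] \,\le\, m(X) \,\le\, E[Y_U \mid X] \big\},
        \qquad
        \sigma_{\Theta_0}(q) \;=\; \sup_{\delta \in \Theta_0} q'\delta .
        \]

    Compactness and convexity of \Theta_0 are what make the support function a complete description of the identified set, and the efficiency bound mentioned above is a bound for estimating \sigma_{\Theta_0}(q).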

    Accuracy of simulations for stochastic dynamic models

    This paper provides a general framework for the simulation of stochastic dynamic models. Our analysis rests upon a continuity property of invariant distributions and a generalized law of large numbers. We then establish that the simulated moments from numerical approximations converge to their exact values as the approximation errors of the computed solutions converge to zero. These asymptotic results are of further interest for the comparative study of dynamic solutions, model estimation, and the derivation of error bounds for the simulated moments.
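
    A toy Python illustration of the convergence result (my own example, not the paper's framework): the model is x_{t+1} = rho * x_t + eps_t with standard normal shocks, whose invariant distribution has second moment 1 / (1 - rho^2); the "computed solution" is a perturbed coefficient rho + err, and the simulated second moment approaches the exact value (up to Monte Carlo noise) as err shrinks.

        # Toy check: simulated moments from an approximate law of motion converge to the
        # exact invariant moment as the approximation error goes to zero.
        import numpy as np

        def simulated_second_moment(rho_approx, T=100_000, burn_in=1_000, seed=0):
            """Long-run sample second moment of a path simulated with the approximate coefficient."""
            rng = np.random.default_rng(seed)
            x = 0.0
            total = 0.0
            for t in range(T + burn_in):
                x = rho_approx * x + rng.standard_normal()
                if t >= burn_in:
                    total += x * x
            return total / T

        rho = 0.9
        exact = 1.0 / (1.0 - rho**2)
        for err in (0.05, 0.01, 0.001):
            gap = abs(simulated_second_moment(rho + err) - exact)
            print(f"approximation error {err}: |simulated moment - exact moment| = {gap:.4f}")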

    Point Decisions for Interval-Identified Parameters

    This paper focuses on a situation where the decision-maker prefers to make a point decision when the object of interest is interval-identified. Such a situation frequently arises when the interval-identified parameter is closely related to an optimal policy decision. To obtain a reasonable decision, this paper slices asymptotic normal experiments into subclasses corresponding to localized interval lengths and finds a local asymptotic minimax decision for each subclass. This paper then suggests a decision based on the subclass minimax decisions and explains the sense in which the decision is reasonable. One remarkable aspect of this solution is that its optimality remains intact even when the order of the interval bounds is misspecified. A small-sample simulation study illustrates the solution's usefulness.
    Keywords: Partial Identification, Inequality Restrictions, Local Asymptotic Minimax Estimation, Semiparametric Efficiency
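
    A simplified, non-asymptotic illustration of why such a decision can be insensitive to the ordering of the bounds (this is not the paper's local asymptotic minimax argument): if the parameter is only known to lie in the interval [b_L, b_U] and loss is squared error, the worst-case loss of a point decision d is

        \[
        \sup_{\theta \in [b_L,\, b_U]} (d - \theta)^2
        \;=\;
        \max\big\{ (d - b_L)^2,\; (d - b_U)^2 \big\},
        \]

    which is minimized by the midpoint d^* = (b_L + b_U)/2, with minimax risk \big((b_U - b_L)/2\big)^2. Because the midpoint is symmetric in the two bounds, the same decision results even if their order is misspecified.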

    ISIPTA'07: Proceedings of the Fifth International Symposium on Imprecise Probability: Theories and Applications


    Alternative models for moment inequalities

    Behavioral choice models generate inequalities which, when combined with additional assumptions, can be used as a basis for estimation. This paper considers two sets of such assumptions and uses them in two empirical examples. The second example examines the structure of payments resulting from the upstream interactions in a vertical market. We then mimic the empirical setting for this example in a numerical analysis which computes actual equilibria, examines how their characteristics vary with the market setting, and compares them to the empirical results. The final section uses the numerical results in a Monte Carlo analysis of the robustness of the two estimation approaches to their underlying assumptions.
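
    As a hedged sketch of the kind of inequality such models generate (generic notation, not the specific models compared in the paper): if agent i chooses d_i to maximize expected profit \pi given its information set \mathcal{J}_i, then for any feasible alternative choice d',

        \[
        E\big[\, \pi(d_i, d_{-i}, x_i) - \pi(d', d_{-i}, x_i) \;\big|\; \mathcal{J}_i \,\big] \;\ge\; 0 ,
        \]

    and sample analogues of these revealed-preference inequalities, combined with auxiliary assumptions about expectational and measurement error, supply the moment inequalities used for estimation.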