Testing for optimal monetary policy via moment inequalities
The specification of an optimizing model of the monetary transmission mechanism requires selecting a policy regime, commonly commitment or discretion. In this paper we propose a new procedure for testing optimal monetary policy, relying on moment inequalities that nest commitment and discretion as two special cases. The approach is based on the derivation of bounds for inflation that are consistent with optimal policy under either policy regime. We derive testable implications that allow for specification tests and discrimination between the two alternative regimes. The proposed procedure is implemented to examine the conduct of monetary policy in the United States economy.
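The general flavor of moment-inequality testing can be sketched numerically. The snippet below computes a generic max-t statistic for a set of inequality restrictions; it is an illustration of the idea only, not the paper's procedure (which nests commitment and discretion), and the data are hypothetical.

```python
import numpy as np

def max_t_stat(m):
    """Max studentized statistic for H0: E[m_j] <= 0 for every moment j.

    m is an (n, k) array of moment-function evaluations. Large values of
    the statistic are evidence against the inequalities; the paper's
    actual procedure is considerably richer than this generic sketch.
    """
    n = m.shape[0]
    t = m.mean(axis=0) / (m.std(axis=0, ddof=1) / np.sqrt(n))
    return max(0.0, t.max())  # zero when all inequalities hold in sample

# Toy data satisfying the inequalities: every moment mean is negative
rng = np.random.default_rng(0)
stat = max_t_stat(rng.normal(loc=-1.0, scale=1.0, size=(500, 3)))
```

Because all sample moment means are clearly negative here, the statistic is zero and the inequalities are not rejected.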
Generalized empirical likelihood tests in time series models with potential identification failure
We introduce test statistics based on generalized empirical likelihood methods that can be used to test simple hypotheses involving the unknown parameter vector in moment condition time series models. The test statistics generalize those in Guggenberger and Smith (2005) from the i.i.d. to the time series context and are alternatives to those in Kleibergen (2001) and Otsu (2003). The main feature of these tests is that their empirical null rejection probabilities are not affected much by the strength or weakness of identification. More precisely, we show that the statistics are asymptotically distributed as chi-square under both classical asymptotic theory and the weak instrument asymptotics of Stock and Wright (2000). A Monte Carlo study reveals that the finite-sample performance of the suggested tests is very competitive.
Keywords: Generalized Empirical Likelihood, Nonlinear Moment Conditions, Similar Tests, Size Distortion, Weak Identification
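For intuition, the simplest special case of an empirical-likelihood ratio statistic — a scalar i.i.d. mean restriction — can be computed in a few lines. This is a sketch only; the paper's GEL statistics handle vectors of moment conditions and time-series dependence, which this code does not.

```python
import numpy as np

def el_ratio(x, mu0, iters=100, tol=1e-10):
    """Empirical-likelihood ratio statistic for H0: E[X] = mu0.

    Solves the one-dimensional dual problem for the Lagrange multiplier
    lam by Newton's method, then returns 2 * sum log(1 + lam*(x - mu0)),
    which is asymptotically chi-square(1) under H0 in the i.i.d. case.
    """
    g = x - mu0
    lam = 0.0
    for _ in range(iters):  # Newton iterations on the dual first-order condition
        w = 1.0 + lam * g
        score = np.sum(g / w)
        hess = -np.sum((g / w) ** 2)  # strictly negative
        step = score / hess
        lam -= step
        if abs(step) < tol:
            break
    return 2.0 * np.sum(np.log(1.0 + lam * g))

# Under a true null the statistic behaves like a chi-square(1) draw
rng = np.random.default_rng(1)
stat = el_ratio(rng.normal(0.0, 1.0, size=200), mu0=0.0)
```

The statistic is nonnegative by construction and stays moderate when the hypothesized mean is correct, as here.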
Bootstrap for Interval Endpoints Defined by Moment Inequalities
This paper analyzes the finite-sample and asymptotic properties of several bootstrap and m out of n bootstrap methods for constructing confidence interval (CI) endpoints in models defined by moment inequalities. In particular, we consider using these methods directly to construct CI endpoints. By considering two very simple models, the paper shows that neither the bootstrap nor the m out of n bootstrap is valid in finite samples or in a uniform asymptotic sense in general when applied directly to construct CI endpoints. In contrast, other results in the literature show that other ways of applying the bootstrap, m out of n bootstrap, and subsampling do lead to uniformly asymptotically valid confidence sets in moment inequality models. Thus, the uniform asymptotic validity of resampling methods in moment inequality models depends on the way in which the resampling methods are employed.
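The mechanics of the m out of n bootstrap are easy to state: resample m < n observations with replacement and recompute the statistic. The sketch below illustrates only those mechanics on a toy endpoint estimator (the sample minimum); the paper's point is precisely that validity depends on how such replicates are used, not on the resampling itself.

```python
import numpy as np

def m_out_of_n_bootstrap(x, stat, m, reps=999, seed=0):
    """Draw `reps` m-out-of-n bootstrap replicates of `stat` from sample x.

    With m = n this is the ordinary nonparametric bootstrap; m < n gives
    the m out of n scheme. Both resample with replacement.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    return np.array([stat(x[rng.integers(0, n, size=m)]) for _ in range(reps)])

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, size=400)
# The sample minimum as a simple interval-endpoint estimator
reps = m_out_of_n_bootstrap(x, np.min, m=40)
```

Every replicate is drawn from the original sample, so each bootstrap minimum lies between the sample minimum and maximum.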
Empirical exchange rate models and currency risk: some evidence from density forecasts
A large literature in exchange rate economics has investigated the forecasting performance of empirical exchange rate models using conventional point forecast accuracy criteria. However, in the context of managing exchange rate risk, interest centers on more than just point forecasts. This paper provides a formal evaluation of recent exchange rate models based on the term structure of forward exchange rates, which previous research has shown to be satisfactory in point forecasting, in terms of density forecasting performance. The economic value of the exchange rate density forecasts is investigated in the context of an application to a simple risk management exercise.
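Density forecasts are commonly evaluated through probability integral transforms (PITs): if the forecast density is correct, the PITs of the realizations are i.i.d. Uniform(0,1). A minimal numerical sketch of that check follows; the paper's evaluation and its risk-management application are more involved than this.

```python
import numpy as np
from math import erf, sqrt

def ks_uniform(u):
    """Kolmogorov-Smirnov distance of a sample from Uniform(0,1)."""
    u = np.sort(np.asarray(u))
    n = len(u)
    grid = np.arange(1, n + 1) / n
    return max(np.max(grid - u), np.max(u - (grid - 1.0 / n)))

# If the forecast density N(0,1) is correct, the PITs Phi(x) are U(0,1)
rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, size=1000)
pit = np.array([0.5 * (1.0 + erf(xi / sqrt(2.0))) for xi in x])
d = ks_uniform(pit)  # small distance: no evidence against the forecast
```

A badly misspecified forecast density would push the PITs away from uniformity and inflate this distance.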
A Generalized Argmax Theorem with Applications
The argmax theorem is a useful result for deriving the limiting distribution of estimators in many applications. The conclusion of the argmax theorem states that the argmax of a sequence of stochastic processes converges in distribution to the argmax of a limiting stochastic process. This paper generalizes the argmax theorem to allow the maximization to take place over a sequence of subsets of the domain. If the sequence of subsets converges to a limiting subset, then the conclusion of the argmax theorem continues to hold. We demonstrate the usefulness of this generalization in three applications: estimating a structural break, estimating a parameter on the boundary of the parameter space, and estimating a weakly identified parameter. The generalized argmax theorem simplifies the proofs for existing results and can be used to prove new results in these literatures.
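As a concrete instance of the structural-break application, the least-squares break-point estimator is an argmax of a criterion over a sample-size-dependent set of admissible break dates. The sketch below is a hypothetical numerical illustration of that setup, not code from the paper.

```python
import numpy as np

def break_estimator(y):
    """Least-squares break-point estimator as an argmax over a grid.

    For each candidate break k, the criterion is the negative sum of
    squared residuals from fitting separate means before and after k.
    The maximization runs over a trimmed, sample-size-dependent set of
    admissible breaks -- the kind of sequence of subsets covered by a
    generalized argmax theorem.
    """
    n = len(y)
    best_k, best_crit = None, -np.inf
    for k in range(10, n - 10):  # trimmed admissible set
        ssr = ((y[:k] - y[:k].mean()) ** 2).sum() + ((y[k:] - y[k:].mean()) ** 2).sum()
        if -ssr > best_crit:
            best_k, best_crit = k, -ssr
    return best_k

# Mean shifts from 0 to 2 at observation 150
rng = np.random.default_rng(4)
y = np.concatenate([rng.normal(0.0, 1.0, 150), rng.normal(2.0, 1.0, 150)])
k_hat = break_estimator(y)
```

With a two-standard-deviation mean shift, the estimated break lands close to the true date of 150.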
Empirical Models of Auctions
Many important economic questions arising in auctions can be answered only with knowledge of the underlying primitive distributions governing bidder demand and information. An active literature has developed aiming to estimate these primitives by exploiting restrictions from economic theory as part of the econometric model used to interpret auction data. We review some highlights of this recent literature, focusing on identification and empirical applications. We describe three insights that underlie much of the recent methodological progress in this area and discuss some of the ways these insights have been extended to richer models allowing more convincing empirical applications. We discuss several recent empirical studies using these methods to address a range of important economic questions.
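One of the central identification insights in this literature is the Guerre, Perrigne and Vuong (2000) inversion for symmetric first-price auctions with independent private values: each bidder's value satisfies v = b + G(b)/((n-1)g(b)), where G and g are the CDF and density of equilibrium bids. The following is a rough nonparametric sketch of that inversion; the bandwidth rule and simulated data are illustrative assumptions, not part of the survey.

```python
import numpy as np

def gpv_values(bids, n_bidders, bw=None):
    """Recover private values from first-price auction bids via the GPV inversion.

    Uses the empirical CDF for G and a Gaussian kernel estimate for g.
    Bandwidth is a crude Silverman rule of thumb; edge bias is ignored.
    """
    b = np.asarray(bids, dtype=float)
    n = len(b)
    if bw is None:
        bw = 1.06 * b.std(ddof=1) * n ** (-0.2)
    G = np.array([(b <= bi).mean() for bi in b])  # empirical bid CDF
    g = np.array([np.exp(-0.5 * ((bi - b) / bw) ** 2).sum()
                  / (n * bw * np.sqrt(2.0 * np.pi)) for bi in b])  # kernel density
    return b + G / ((n_bidders - 1) * g)

# With values ~ U(0,1) and 2 bidders, equilibrium bids are v/2
rng = np.random.default_rng(5)
v_true = rng.uniform(0.0, 1.0, size=500)
bids = v_true / 2.0
v_hat = gpv_values(bids, n_bidders=2)
```

Recovered values always lie above the bids (the markup term is nonnegative) and, away from the boundary, track the true values closely.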
Estimation and Inference with Weak Instruments and Near Exogeneity
Empirical economic studies, such as those in labor economics and empirical economic growth, are often confronted by the joint problem of weak instruments and near exogeneity. This dissertation presents new evidence and solutions on estimation and inference with weak instruments and near exogeneity. Chapter 1 reexamines the effect of institutions on economic performance in Acemoglu, Johnson and Robinson (2001), where the measurement of current institutions is instrumented by European settler mortality rates. Since many economists argue that the settler mortality rates can possibly affect economic performance through other channels, I reexamine the effect of institutions by allowing for near exogeneity. I provide evidence that the effect of institutions is not significant in many regression specifications when the settler mortality rates are used as the main instrument. Chapter 2 studies estimation and inference with weak instruments and near exogeneity in a linear simultaneous equations model. I show that near exogeneity can exaggerate the asymptotic bias of the TSLS and the LIML estimators. When critical values from chi-square distributions are used, the Anderson-Rubin and Kleibergen tests have a large size distortion under near exogeneity. I propose delete-d jackknife based Anderson-Rubin and Kleibergen tests that automatically reduce the size distortion in finite samples without the need for any pretest of exogeneity. Chapter 3 extends estimation and inference with weak identification and near exogeneity to a GMM framework with instrumental variables, which allows nonlinear and nondifferentiable moment conditions. I examine asymptotic results for the one-step GMM estimator, the two-step efficient GMM estimator and the continuously updating estimator with weak identification and near exogeneity. Near exogeneity can produce relatively large bias for all of these estimators. The Anderson-Rubin type and the Kleibergen type tests under near exogeneity converge in distribution to nonstandard distributions, which creates large size distortion when critical values from chi-square distributions are used. The delete-d jackknife based approach can reduce this size distortion.
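The Anderson-Rubin statistic at the heart of this analysis is easy to state in its textbook homoskedastic form: regress the restricted residual on the instruments and compare the fit to an F distribution. The sketch below assumes a single endogenous regressor and exogenous instruments; the dissertation's delete-d jackknife modifications are not reproduced here.

```python
import numpy as np

def anderson_rubin(y, Y, Z, beta0):
    """Anderson-Rubin statistic for H0: beta = beta0 in y = Y*beta + u.

    Projects the restricted residual e = y - Y*beta0 onto the n-by-k
    instrument matrix Z. Under H0 with exogenous instruments and
    homoskedastic errors the statistic is F(k, n-k)-distributed whatever
    the instrument strength; under near exogeneity this approximation
    breaks down.
    """
    e = y - Y * beta0
    n, k = Z.shape
    Pe = Z @ np.linalg.solve(Z.T @ Z, Z.T @ e)  # projection of e on col(Z)
    num = e @ Pe
    den = e @ e - num
    return (n - k) * num / (k * den)

rng = np.random.default_rng(6)
n, k = 500, 3
Z = rng.normal(size=(n, k))
Y = Z @ np.array([0.2, 0.1, 0.0]) + rng.normal(size=n)  # weak-ish first stage
y = 1.0 * Y + rng.normal(size=n)
ar = anderson_rubin(y, Y, Z, beta0=1.0)  # true beta, so H0 holds
```

Because the null is imposed at the true coefficient, the statistic stays in the usual range of an F(3, 497) draw even though the first stage is weak.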