96 research outputs found

    Estimation of linear dynamic panel data models with time-invariant regressors (working paper)

    This is the final version of the article. This paper can be downloaded without charge from the European Central Bank, the Social Science Research Network electronic library or from RePEc: Research Papers in Economics via the links in this record. We propose a two-stage estimation procedure to identify the effects of time-invariant regressors in a dynamic version of the Hausman-Taylor model. We first estimate the coefficients of the time-varying regressors and subsequently regress the first-stage residuals on the time-invariant regressors, providing analytical standard error adjustments for the second-stage coefficients. The two-stage approach is more robust against misspecification than GMM estimators that obtain all parameter estimates simultaneously. In addition, it allows exploiting the advantages of estimators relying on transformations to eliminate the unit-specific heterogeneity. We analytically demonstrate under which conditions the one-stage and two-stage GMM estimators are equivalent. Monte Carlo results highlight the advantages of the two-stage approach in finite samples. Finally, the approach is illustrated with the estimation of a dynamic gravity equation for U.S. outward foreign direct investment.
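
    A minimal numpy sketch of the two-stage idea: stage 1 estimates the time-varying coefficient by the within (fixed-effects) transformation, which wipes out both the unit heterogeneity and the time-invariant regressor; stage 2 regresses the unit means of the first-stage residuals on the time-invariant regressor. All names and the data-generating process are illustrative; unlike the paper, the sketch is static, treats the time-invariant regressor as exogenous, and omits the analytical standard-error adjustments.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 6

# Illustrative panel: y_it = 1.0*x_it + 0.5*z_i + a_i + e_it,
# with z_i time-invariant and a_i unobserved unit heterogeneity.
x = rng.normal(size=(N, T))
z = rng.normal(size=N)
a = rng.normal(size=N)
y = 1.0 * x + 0.5 * z[:, None] + a[:, None] + rng.normal(scale=0.3, size=(N, T))

# Stage 1: within estimate of the time-varying coefficient;
# demeaning by unit removes a_i and z_i.
xd = x - x.mean(axis=1, keepdims=True)
yd = y - y.mean(axis=1, keepdims=True)
beta_hat = (xd * yd).sum() / (xd ** 2).sum()

# Stage 2: regress unit means of first-stage residuals on the
# time-invariant regressor to recover its coefficient.
u_bar = (y - beta_hat * x).mean(axis=1)
Z = np.column_stack([np.ones(N), z])
gamma_hat, *_ = np.linalg.lstsq(Z, u_bar, rcond=None)
```

    In this toy design both estimates are close to their true values (1.0 and 0.5); the paper's contribution lies in handling endogenous regressors and correcting the second-stage standard errors for the first-stage estimation noise.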

    Quasi–maximum likelihood estimation of linear dynamic short-T panel-data models

    This is the author accepted manuscript. The final version is available from SAGE Publications via the DOI in this record. In this article, I describe the xtdpdqml command for the quasi–maximum likelihood estimation of linear dynamic panel-data models when the time horizon is short and the number of cross-sectional units is large. Based on the theoretical groundwork by Bhargava and Sargan (1983, Econometrica 51: 1635–1659) and Hsiao, Pesaran, and Tahmiscioglu (2002, Journal of Econometrics 109: 107–150), the marginal distribution of the initial observations is modeled as a function of the observed variables to circumvent a short-T dynamic panel-data bias. Both random-effects and fixed-effects versions are available. Copyright 2016 by StataCorp LP.
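
    The short-T bias that motivates the command can be seen in a few lines of numpy: the within (fixed-effects) estimator of the autoregressive parameter is severely biased downward when T is small (the Nickell bias). This sketch illustrates the problem the QML approach circumvents, not the QML estimator itself; all names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
N, T, gamma = 2000, 5, 0.5

# Dynamic panel y_it = gamma*y_i,t-1 + a_i + e_it with a short time horizon.
a = rng.normal(size=N)
y = np.zeros((N, T + 1))
y[:, 0] = a / (1 - gamma) + rng.normal(size=N)   # start near the stationary mean
for t in range(1, T + 1):
    y[:, t] = gamma * y[:, t - 1] + a + rng.normal(size=N)

# Within estimator of gamma: demean by unit, then regress y_t on y_{t-1}.
ylag, ycur = y[:, :-1], y[:, 1:]
yld = ylag - ylag.mean(axis=1, keepdims=True)
ycd = ycur - ycur.mean(axis=1, keepdims=True)
gamma_fe = (yld * ycd).sum() / (yld ** 2).sum()
```

    With T = 5 the within estimate falls well below the true value of 0.5, even with thousands of cross-sectional units; the bias does not vanish as N grows.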

    Review of A. Colin Cameron and Pravin K. Trivedi’s Microeconometrics Using Stata, Second Edition

    This is the author accepted manuscript. The final version is available from SAGE Publications via the DOI in this record. In this article, I review Microeconometrics Using Stata, Second Edition, by A. Colin Cameron and Pravin K. Trivedi (2022, Stata Press).

    ardl: Estimating autoregressive distributed lag and equilibrium correction models

    This is the final version. Available on open access from SAGE Publications via the DOI in this record. We present a command, ardl, for the estimation of autoregressive distributed lag (ARDL) models in a time-series context. The ardl command can be used to fit an ARDL model with the optimal number of autoregressive and distributed lags based on the Akaike or Bayesian (Schwarz) information criterion. The regression results can be displayed in the ARDL levels form or in the error-correction representation of the model. The latter separates long-run and short-run effects and is available in two different parameterizations of the long-run (cointegrating) relationship. The popular bounds-testing procedure for the existence of a long-run levels relationship is implemented as a postestimation feature. Comprehensive critical values and approximate p-values obtained from response-surface regressions facilitate statistical inference.
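
    The core mechanics — fitting an ARDL(p, q) by OLS, selecting lag orders by an information criterion, and recovering the long-run multiplier — can be sketched in plain numpy. This is a toy illustration, not the ardl Stata command; the simulated process, lag grid, and names are all assumptions, and the bounds test is not included.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000

# Simulate an ARDL(1,1): y_t = 0.5*y_{t-1} + 0.3*x_t + 0.1*x_{t-1} + e_t,
# so the long-run multiplier is (0.3 + 0.1) / (1 - 0.5) = 0.8.
x, y = np.zeros(T), np.zeros(T)
for t in range(1, T):
    x[t] = 0.7 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.3 * x[t] + 0.1 * x[t - 1] + rng.normal(scale=0.1)

def fit_ardl(y, x, p, q):
    """OLS fit of ARDL(p, q); returns coefficients and the BIC."""
    m = max(p, q)
    rows = [[1.0] + [y[t - i] for i in range(1, p + 1)]
                  + [x[t - j] for j in range(0, q + 1)]
            for t in range(m, len(y))]
    X, Y = np.asarray(rows), y[m:]
    b, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ b
    n = len(Y)
    bic = n * np.log(resid @ resid / n) + X.shape[1] * np.log(n)
    return b, bic, p, q

# Select lag orders by BIC over a small grid, then form the long-run multiplier:
# sum of x coefficients over (1 - sum of y-lag coefficients).
fits = [fit_ardl(y, x, p, q) for p in (1, 2) for q in (0, 1, 2)]
b, bic, p, q = min(fits, key=lambda f: f[1])
long_run = b[1 + p:].sum() / (1.0 - b[1:1 + p].sum())
```

    The selected specification recovers a long-run multiplier close to the true 0.8; the error-correction reparameterizations and the bounds-test critical values are what the command adds on top of this basic arithmetic.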

    Instrumental-variable estimation of large-T panel-data models with common factors

    This is the final version. Available on open access from SAGE Publications via the DOI in this record. In this article, we introduce the xtivdfreg command, which implements a general instrumental-variables (IV) approach for fitting panel-data models with many time-series observations, T, and unobserved common factors or interactive effects, as developed by Norkute et al. (2021, Journal of Econometrics 220: 416–446) and Cui et al. (2020a, ISER Discussion Paper 1101). The underlying idea of this approach is to project out the common factors from exogenous covariates using principal-components analysis and to run IV regression in two stages, using defactored covariates as instruments. The resulting two-stage IV estimator is valid for models with homogeneous or heterogeneous slope coefficients and has several advantages relative to existing popular approaches. In addition, the xtivdfreg command extends the two-stage IV approach in two major ways. First, the algorithm accommodates estimation of unbalanced panels. Second, the algorithm permits a flexible specification of instruments. We show that when one imposes zero factors, the xtivdfreg command can replicate the results of the popular Stata ivregress command. Notably, unlike ivregress, xtivdfreg permits estimation of the two-way error-components panel-data model with heterogeneous slope coefficients. Australian Research Council (ARC)
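
    A compressed numpy sketch of the defactoring idea: when a common factor enters both the regressor and the error, pooled OLS is biased, but projecting the principal-components estimate of the factor out of the covariates yields valid instruments. This simplifies the two-stage procedure of Norkute et al. to a single factor and a single defactoring step; the data-generating process and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, beta = 200, 50, 1.0

# One common factor f_t hits both the regressor and the error with the same
# loadings, so pooled OLS of y on x is biased upward.
f = rng.normal(size=T)
lam = rng.normal(size=N)
x = lam[:, None] * f[None, :] + rng.normal(size=(N, T))
e = lam[:, None] * f[None, :] + rng.normal(size=(N, T))
y = beta * x + e

# Estimate the factor from x by principal components (leading right singular
# vector) and project it out; the defactored x serves as the instrument.
_, _, Vt = np.linalg.svd(x, full_matrices=False)
F = Vt[0]                          # estimated factor (length T, unit norm)
M = np.eye(T) - np.outer(F, F)     # annihilator of the estimated factor space
x_tilde = x @ M

beta_ols = (x * y).sum() / (x * x).sum()
beta_iv = (x_tilde * y).sum() / (x_tilde * x).sum()
```

    Here beta_ols is noticeably above 1 while beta_iv is close to the true value; the full method also defactors the outcome side in a second stage and covers heterogeneous slopes and unbalanced panels.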

    Instrument approval by the Sargan test and its consequences for coefficient estimation

    This is the author accepted manuscript. The final version is available on open access from Elsevier via the DOI in this record. Empirical econometric findings are often vindicated by supplementing them with the p-values of Sargan-Hansen tests for overidentifying restrictions, provided these exceed a chosen small nominal significance level. It is illustrated here that the probability that such tests reject instrument validity may often barely exceed small levels, even when instruments are seriously invalid, whereas even minor invalidity of instruments can severely undermine inference on regression coefficients by instrumental variable estimators. These uncomfortable patterns may be aggravated when particular valid or invalid instruments are relatively weak or strong.
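
    The phenomenon is easy to reproduce in a small Monte Carlo: with a mildly invalid instrument, the Sargan test rejects only rarely at the 5% level, yet the 2SLS coefficient is systematically biased. This sketch is an illustration in the spirit of the paper, not a replication of its designs; sample size, the degree of invalidity, and all names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, delta, reps = 200, 0.1, 1000     # delta: small violation of instrument validity

biases, rejections = [], []
for _ in range(reps):
    z1, z2 = rng.normal(size=n), rng.normal(size=n)
    v = rng.normal(size=n)
    e = 0.5 * v + rng.normal(scale=0.5, size=n)  # structural error correlated with v
    u = e + delta * z2                           # z2 is mildly invalid
    x = z1 + z2 + v
    y = 1.0 * x + u

    Z = np.column_stack([z1, z2])
    Pz = Z @ np.linalg.solve(Z.T @ Z, Z.T)       # projection onto the instruments
    xhat = Pz @ x
    b = (xhat @ y) / (xhat @ x)                  # 2SLS slope estimate
    uhat = y - b * x
    J = n * (uhat @ Pz @ uhat) / (uhat @ uhat)   # Sargan statistic, 1 overid. restriction
    biases.append(b - 1.0)
    rejections.append(J > 3.84)                  # chi-squared(1), 5% level

avg_bias = float(np.mean(biases))
rej_rate = float(np.mean(rejections))
```

    The average coefficient bias is clearly nonzero while the test's rejection frequency stays far below any comfortable power level — exactly the uncomfortable pattern the abstract describes.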

    kinkyreg: Instrument-free inference for linear regression models with endogenous regressors

    This is the final version. Available on open access from SAGE Publications via the DOI in this record. In models with endogenous regressors, a standard regression approach is to exploit just-identifying or overidentifying orthogonality conditions by using instrumental variables. In just-identified models, the identifying orthogonality assumptions cannot be tested without the imposition of other nontestable assumptions. While formal testing of overidentifying restrictions is possible, its interpretation still hinges on the validity of an initial set of untestable just-identifying orthogonality conditions. We present the kinkyreg command for kinky least-squares inference, which adopts an alternative approach to identification. By exploiting nonorthogonality conditions in the form of bounds on the admissible degree of endogeneity, feasible test procedures can be constructed that do not require instrumental variables. The kinky least-squares confidence bands can be more informative than confidence intervals obtained from instrumental-variables estimation, especially when the instruments are weak. Moreover, the approach facilitates a sensitivity analysis for standard instrumental-variables inference. In particular, it allows the user to assess the validity of previously untestable just-identifying exclusion restrictions. Further instrument-free tests include linear hypotheses, functional form, heteroskedasticity, and serial correlation tests.
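
    In the simple one-regressor case, the idea can be sketched directly: for an assumed endogeneity correlation rho, OLS can be bias-corrected, and sweeping rho over an admissible interval traces out a band of point estimates. This is a minimal illustration of the principle, assuming a single regressor and known bounds; it omits the sampling-variance component of the kinky least-squares confidence bands and uses made-up parameter values throughout.

```python
import numpy as np

rng = np.random.default_rng(4)
n, beta, rho_true = 5000, 1.0, 0.3

# Endogenous regressor: corr(x, u) = 0.3 by construction, both unit variance.
x = rng.normal(size=n)
u = rho_true * x + np.sqrt(1 - rho_true ** 2) * rng.normal(size=n)
y = beta * x + u

b_ols = (x @ y) / (x @ x)          # plim = beta + rho_true * sigma_u / sigma_x
s_x = x.std()
s_e = (y - b_ols * x).std()        # plim = sigma_u * sqrt(1 - rho^2)

def kls(rho):
    """Bias-corrected point estimate under the assumption corr(x, u) = rho."""
    return b_ols - rho * s_e / (s_x * np.sqrt(1 - rho ** 2))

# Sweep rho over the admissible bound |rho| <= 0.5 to get a band of estimates.
grid = np.linspace(-0.5, 0.5, 101)
band = [min(kls(r) for r in grid), max(kls(r) for r in grid)]
```

    Evaluated at the true correlation, the corrected estimate recovers beta, and the band over the assumed bounds covers it; the command additionally accounts for estimation uncertainty and supports multiple regressors and diagnostic tests.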

    Response Surface Regressions for Critical Value Bounds and Approximate p‐values in Equilibrium Correction Models

    This is the final version. Available on open access from Wiley via the DOI in this record. We consider the popular ‘bounds test’ for the existence of a level relationship in conditional equilibrium correction models. By estimating response surface models based on about 95 billion simulated F‐statistics and 57 billion t‐statistics, we improve upon and substantially extend the set of available critical values, covering the full range of possible sample sizes and lag orders, and allowing for any number of long‐run forcing variables. By computing approximate p‐values, we find that the bounds test can be easily oversized by more than 5 percentage points in small samples when using asymptotic critical values.

    One-Loop Quantum Energy Densities of Domain Wall Field Configurations

    We discuss a simple procedure for computing one-loop quantum energies of any static field configuration that depends non-trivially on only a single spatial coordinate. We specifically focus on domain-wall-type field configurations that connect two distinct minima of the effective potential, and may or may not be solutions of the classical field equations. We avoid the conventional summation of zero-point energies, and instead exploit the relation between functional determinants and solutions of associated differential equations. This approach allows ultraviolet divergences to be easily isolated and extracted using any convenient regularization scheme. Two examples are considered: two-dimensional $\phi^4$ theory, and three-dimensional scalar electrodynamics with spontaneous symmetry breaking at the one-loop level. Comment: RevTeX, 29 pages, 1 figure, minor corrections, references added.
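
    The relation between functional determinants and solutions of differential equations that the abstract refers to is, in its simplest one-dimensional form, the Gelfand–Yaglom result: for an operator $-d^2/dx^2 + V(x)$ on $[0, L]$ with Dirichlet boundary conditions, the determinant is proportional to $y(L)$, where $y'' = V y$, $y(0) = 0$, $y'(0) = 1$. A minimal numerical sketch (the interval length, potentials, and step counts are illustrative choices, not taken from the paper):

```python
import numpy as np

def gelfand_yaglom(V, L=5.0, steps=4000):
    """Solve y'' = V(x) y with y(0)=0, y'(0)=1 by RK4; det(-d^2/dx^2 + V)
    with Dirichlet boundary conditions on [0, L] is proportional to y(L)."""
    h = L / steps

    def f(s, x):
        return np.array([s[1], V(x) * s[0]])

    s, x = np.array([0.0, 1.0]), 0.0
    for _ in range(steps):
        k1 = f(s, x)
        k2 = f(s + 0.5 * h * k1, x + 0.5 * h)
        k3 = f(s + 0.5 * h * k2, x + 0.5 * h)
        k4 = f(s + h * k3, x + h)
        s = s + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        x += h
    return s[0]

# Ratio of determinants: an attractive well against the free massive operator.
# For V = m^2 = 1 the exact answer is y(L) = sinh(L), a useful sanity check.
y_free = gelfand_yaglom(lambda x: 1.0)
y_well = gelfand_yaglom(lambda x: 1.0 - 1.0 / np.cosh(x - 2.5) ** 2)
det_ratio = y_well / y_free
```

    Only the ratio of determinants is physically meaningful; the attractive well lowers the spectrum, so the ratio comes out below one. In the field-theory setting this ratio (suitably regularized) is what enters the one-loop energy.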

    Derivative expansion and gauge independence of the false vacuum decay rate in various gauges

    In theories with radiative symmetry breaking, the calculation of the false vacuum decay rate requires the inclusion of higher-order terms in the derivative expansion of the effective action. I show here that, in the case of covariant gauges, the presence of infrared singularities forbids a consistent calculation that keeps only the lowest-order terms. The situation is remedied, however, in the case of $R_\xi$ gauges. Using the Nielsen identities, I show that the final result is gauge independent for generic values of the gauge parameter $v$ that are not anomalously small. Comment: Some comments and references added.