
    Improved HAR Inference

    Employing power kernels suggested in earlier work by the authors (2003), this paper shows how to refine methods of robust inference on the mean in a time series that rely on families of untruncated kernel estimates of the long-run parameters. The new methods improve the size properties of heteroskedasticity and autocorrelation robust (HAR) tests in comparison with conventional methods that employ consistent HAC estimates, and they raise test power in comparison with other tests based on untruncated kernel estimates. Large power parameter (rho) asymptotic expansions of the nonstandard limit theory are developed in terms of the usual limiting chi-squared distribution, and corresponding large sample size and large rho asymptotic expansions of the finite sample distribution of Wald tests are developed to justify the new approach. Exact finite sample distributions are given using operational techniques. The paper further shows that the optimal rho that minimizes a weighted sum of type I and type II errors has an expansion rate of at most O(T^{1/2}), and can even be O(1) for certain loss functions, and is therefore slower than the O(T^{2/3}) rate that minimizes the asymptotic mean squared error of the corresponding long run variance estimator. A new plug-in procedure for implementing the optimal rho is suggested. Simulations show that the new plug-in procedure works well in finite samples.
    Keywords: Asymptotic expansion, consistent HAC estimation, data-determined kernel estimation, exact distribution, HAR inference, large rho asymptotics, long run variance, loss function, power parameter, sharp origin kernel

    Nonstationary Discrete Choice: A Corrigendum and Addendum

    We correct the limit theory presented in an earlier paper by Hu and Phillips (Journal of Econometrics, 2004) for nonstationary time series discrete choice models with multiple choices and thresholds. The new limit theory shows that, in contrast to the binary choice model with nonstationary regressors and a zero threshold where there are dual rates of convergence (n^{1/4} and n^{3/4}), all parameters including the thresholds converge at the rate n^{3/4}. The presence of non-zero thresholds therefore materially affects rates of convergence. Dual rates of convergence reappear when stationary variables are present in the system. Some simulation evidence is provided, showing how the magnitude of the thresholds affects finite sample performance. A new finding is that predicted probabilities and marginal effect estimates have finite sample distributions that manifest a pile-up, or increasing density, towards the limits of the domain of definition.
    Keywords: Brownian motion, Brownian local time, Discrete choices, Integrated processes, Pile-up problem, Threshold parameters

    A New Approach to Robust Inference in Cointegration

    A new approach to robust testing in cointegrated systems is proposed using nonparametric HAC estimators without truncation. While such HAC estimates are inconsistent, they still produce asymptotically pivotal tests and, as in conventional regression settings, can improve testing and inference. The present contribution makes use of steep origin kernels which are obtained by exponentiating traditional quadratic kernels. Simulations indicate that tests based on these methods have improved size properties relative to conventional tests and better power properties than other tests that use Bartlett or other traditional kernels with no truncation.
    Keywords: Cointegration, HAC estimation, long-run covariance matrix, robust inference, steep origin kernel, fully modified estimation
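The steep origin kernel idea described above can be visualized directly: exponentiating a quadratic parent kernel concentrates weight near the origin as the exponent grows. The sketch below uses the simple quadratic k(x) = 1 - x^2 as an illustrative parent (the specific parent kernel is an assumption here, not taken from the abstract) and evaluates the untruncated weights at lags x = j/T.

```python
import numpy as np

def steep_origin_weights(T, rho):
    """Weights k(j/T)**rho for j = 0..T-1, where k(x) = 1 - x^2
    (an illustrative quadratic parent kernel). No truncation or
    bandwidth is used: every lag up to T-1 receives some weight,
    and rho alone controls the downweighting of distant lags."""
    x = np.arange(T) / T
    parent = np.clip(1.0 - x**2, 0.0, None)  # quadratic kernel on [0, 1]
    return parent ** rho                      # exponentiation steepens the origin

w1 = steep_origin_weights(100, rho=1)   # ordinary untruncated quadratic kernel
w16 = steep_origin_weights(100, rho=16) # steep origin version
# As rho grows, mid-range lags (e.g. j/T = 0.5) are sharply downweighted,
# mimicking the effect of a shrinking bandwidth.
```

Comparing `w1` and `w16` at lag 50 shows the downweighting: the parent weight 0.75 falls to roughly 0.75^16 ≈ 0.01 after exponentiation.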

    Consistent HAC Estimation and Robust Regression Testing Using Sharp Origin Kernels with No Truncation

    A new family of kernels is suggested for use in heteroskedasticity and autocorrelation consistent (HAC) and long run variance (LRV) estimation and robust regression testing. The kernels are constructed by taking powers of the Bartlett kernel and are intended to be used with no truncation (or bandwidth) parameter. As the power parameter (rho) increases, the kernels become very sharp at the origin and increasingly downweight values away from the origin, thereby achieving effects similar to a bandwidth parameter. Sharp origin kernels can be used in regression testing in much the same way as conventional kernels with no truncation, as suggested in the work of Kiefer and Vogelsang (2002a, 2002b). A unified representation of HAC limit theory for untruncated kernels is provided using a new proof based on Mercer's theorem that allows for kernels which may or may not be differentiable at the origin. This new representation helps to explain earlier findings like the dominance of the Bartlett kernel over quadratic kernels in test power and yields new findings about the asymptotic properties of tests with sharp origin kernels. Analysis and simulations indicate that sharp origin kernels lead to tests with improved size properties relative to conventional tests and better power properties than other tests using Bartlett and other conventional kernels without truncation. If rho is passed to infinity with the sample size (T), the new kernels provide consistent HAC and LRV estimates as well as continued robust regression testing. Optimal choice of rho based on minimizing the asymptotic mean squared error of estimation is considered, leading to a rate of convergence of the kernel estimate of T^{1/3}, analogous to that of a conventional truncated Bartlett kernel estimate with an optimal choice of bandwidth. A data-based version of the consistent sharp origin kernel is obtained which is easily implementable in practical work.
    Within this new framework, untruncated kernel estimation can be regarded as a form of conventional kernel estimation in which the usual bandwidth parameter is replaced by a power parameter that serves to control the degree of downweighting. Simulations show that in regression testing with the sharp origin kernel, the power properties are better than those with simple untruncated kernels (where rho = 1) and at least as good as those with truncated kernels. Size is generally more accurate with sharp origin kernels than with truncated kernels. In practice a simple fixed choice of the exponent parameter around rho = 16 for the sharp origin kernel produces favorable results for both size and power in regression testing with sample sizes that are typical in econometric applications.
    Keywords: Consistent HAC estimation, Data determined kernel estimation, Long run variance, Mercer's theorem, Power parameter, Sharp origin kernel
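The sharp origin kernel estimator described in this abstract has a compact form: the usual LRV sum of weighted autocovariances, with weights (1 - j/T)^rho running over all lags and no bandwidth. A minimal sketch, using the abstract's suggested fixed choice rho = 16:

```python
import numpy as np

def sharp_origin_lrv(u, rho=16):
    """Untruncated long-run variance estimate using the sharp origin
    kernel k_rho(x) = (1 - |x|)**rho, a power of the Bartlett kernel.
    There is no bandwidth: weights run over all lags j = 0..T-1 and
    rho controls the downweighting (minimal sketch of the estimator
    described in the abstract)."""
    u = np.asarray(u, dtype=float)
    T = len(u)
    u = u - u.mean()
    lrv = u @ u / T                       # lag-0 autocovariance
    for j in range(1, T):
        w = (1.0 - j / T) ** rho          # sharp origin weight at lag j
        gamma_j = u[j:] @ u[:-j] / T      # sample autocovariance at lag j
        lrv += 2.0 * w * gamma_j
    return lrv

rng = np.random.default_rng(0)
est = sharp_origin_lrv(rng.standard_normal(500), rho=16)  # true LRV is 1
```

Because integer powers of the Bartlett kernel remain positive semidefinite, the estimate is nonnegative by construction, which is one practical advantage of this kernel family.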

    The Rise in House Prices in China: Bubbles or Fundamentals?

    The dramatic rise of house prices in many cities of China has attracted considerable attention in both government and academic circles. There is much debate over whether the increase in house prices is driven by market fundamentals or by speculation. Following Levin and Wright (1997a, 1997b), we decompose house prices in China into fundamental and non-fundamental components. We also consider potential nonlinear feedback from the historical growth rate of house prices to current house prices and propose a semiparametric approach to estimating the speculative components of the model. We demonstrate that the non-fundamental part contributes a relatively small proportion of the rise in house prices in China.

    Optimal Bandwidth Selection in Heteroskedasticity-Autocorrelation Robust Testing

    In time series regressions with nonparametrically autocorrelated errors, it is now standard empirical practice to use kernel-based robust standard errors that involve some smoothing function over the sample autocorrelations. The underlying smoothing parameter b, which can be defined as the ratio of the bandwidth (or truncation lag) to the sample size, is a tuning parameter that plays a key role in determining the asymptotic properties of the standard errors and associated semiparametric tests. Small-b asymptotics involve standard limit theory such as standard normal or chi-squared limits, whereas fixed-b asymptotics typically lead to nonstandard limit distributions involving Brownian bridge functionals. The present paper shows that the nonstandard fixed-b limit distributions of such nonparametrically studentized tests provide more accurate approximations to the finite sample distributions than the standard small-b limit distribution. In particular, using asymptotic expansions of both the finite sample distribution and the nonstandard limit distribution, we confirm that the second-order corrected critical value based on the expansion of the nonstandard limiting distribution is also second-order correct under the standard small-b asymptotics. We further show that, for typical economic time series, the optimal bandwidth that minimizes a weighted average of type I and type II errors is larger by an order of magnitude than the bandwidth that minimizes the asymptotic mean squared error of the corresponding long-run variance estimator. A plug-in procedure for implementing this optimal bandwidth is suggested and simulations confirm that the new plug-in procedure works well in finite samples.
    Keywords: Asymptotic expansion, Bandwidth choice, Kernel method, Long-run variance, Loss function, Nonstandard asymptotics, Robust standard error, Type I and Type II errors
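The smoothing ratio b = M/T described above can be made concrete with a standard Bartlett (Newey-West) long-run variance estimate in which the bandwidth is set as M = bT. A minimal sketch (the estimator itself is standard; the particular values of b below are illustrative only):

```python
import numpy as np

def bartlett_lrv(u, b):
    """Bartlett-kernel long-run variance estimate with bandwidth
    parameterized as M = b*T, so b is the bandwidth-to-sample-size
    ratio discussed in the abstract (illustrative sketch)."""
    u = np.asarray(u, dtype=float)
    T = len(u)
    M = max(1, int(b * T))
    u = u - u.mean()
    lrv = u @ u / T
    for j in range(1, M):
        w = 1.0 - j / M                       # Bartlett weight
        lrv += 2.0 * w * (u[j:] @ u[:-j]) / T
    return lrv

rng = np.random.default_rng(1)
x = rng.standard_normal(400)
small_b = bartlett_lrv(x, b=0.05)  # small-b regime: chi-squared limit theory
fixed_b = bartlett_lrv(x, b=0.5)   # fixed-b regime: Brownian bridge limit theory
```

The same estimator is used in both regimes; what changes is whether b shrinks to zero with T (small-b) or stays fixed (fixed-b), and hence which limit theory approximates the test statistic.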

    Business Cycles, Trend Elimination, and the HP Filter

    We analyze trend elimination methods and business cycle estimation by data filtering of the type introduced by Whittaker (1923) and popularized in economics in a particular form by Hodrick and Prescott (1980/1997; HP). A limit theory is developed for the HP filter for various classes of stochastic trend, trend break, and trend stationary data. Properties of the filtered series are shown to depend closely on the choice of the smoothing parameter (λ). For instance, when λ = O(n^4), where n is the sample size, and the HP filter is applied to an I(1) process, the filter does not remove the stochastic trend in the limit as n → ∞. Instead, the filter produces a smoothed Gaussian limit process that is differentiable to the fourth order. The residual 'cyclical' process has the random wandering non-differentiable characteristics of Brownian motion, thereby explaining the frequently observed 'spurious cycle' effect of the HP filter. On the other hand, when λ = o(n), the filter reproduces the limit Brownian motion and eliminates the stochastic trend, giving a zero 'cyclical' process. Simulations reveal that the λ = O(n^4) limit theory provides a good approximation to the actual HP filter for sample sizes common in practical work. When it is used as a trend removal device, the HP filter therefore typically fails to eliminate stochastic trends, contrary to what is now standard belief in applied macroeconomics. The findings are related to recent public debates about the long run effects of the global financial crisis.
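The HP filter discussed above has a simple closed form: the trend solves min_tau Σ(y_t - τ_t)^2 + λ Σ(Δ²τ_t)^2, giving τ = (I + λD'D)^{-1}y with D the second-difference matrix. A minimal dense-matrix sketch (production code would use sparse solvers):

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott decomposition: trend tau minimizes
    sum (y_t - tau_t)^2 + lam * sum (second differences of tau)^2,
    solved in closed form as tau = (I + lam * D'D)^{-1} y, where D
    is the (T-2) x T second-difference matrix."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    D = np.zeros((T - 2, T))
    for i in range(T - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]   # second-difference stencil
    trend = np.linalg.solve(np.eye(T) + lam * (D.T @ D), y)
    cycle = y - trend
    return trend, cycle

# A linear trend has zero second differences, so the penalty term
# vanishes at tau = y: the HP trend reproduces a linear series
# exactly and the 'cycle' is zero for any lambda.
t = np.arange(50, dtype=float)
trend, cycle = hp_filter(2.0 + 0.5 * t, lam=1600.0)
```

The λ = O(n^4) result in the abstract concerns how this λ must scale with the sample size for the filter's limit behavior, which the fixed-λ convention of applied work (e.g. λ = 1600 for quarterly data) implicitly ignores.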

    The KPSS Test with Seasonal Dummies

    It is shown that the KPSS test for stationarity may be applied without change to regressions with seasonal dummies. In particular, the limit distribution of the KPSS statistic is the same under both the null and alternative hypotheses whether or not seasonal dummies are used.
    Keywords: KPSS test, Seasonal dummies, Stationarity test, Unit root
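The procedure the abstract validates can be sketched directly: regress on seasonal dummies (equivalently, subtract seasonal means), then compute the usual KPSS statistic η = T^{-2} Σ S_t² / lrv from the partial sums S_t of the residuals. A minimal sketch, assuming quarterly dummies (s = 4) and a short Bartlett lag window:

```python
import numpy as np

def kpss_seasonal(y, s=4, lags=4):
    """KPSS level-stationarity statistic computed on residuals from a
    regression on seasonal dummies (i.e. seasonal demeaning). Per the
    abstract, the limit distribution is unchanged by the dummies.
    eta = T^{-2} * sum_t S_t^2 / lrv, with S_t the partial sums of the
    residuals and lrv a Bartlett long-run variance estimate."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    e = y.copy()
    for m in range(s):                  # residuals from seasonal-dummy regression
        idx = np.arange(m, T, s)
        e[idx] -= y[idx].mean()
    S = np.cumsum(e)                    # partial sum process
    lrv = e @ e / T
    for j in range(1, lags + 1):
        w = 1.0 - j / (lags + 1)        # Bartlett weight
        lrv += 2.0 * w * (e[j:] @ e[:-j]) / T
    return (S @ S) / (T**2 * lrv)

rng = np.random.default_rng(2)
stat = kpss_seasonal(rng.standard_normal(200), s=4, lags=4)
# For stationary data the statistic should typically fall below the
# 5% critical value of about 0.463 for the level-stationary case.
```

The point of the paper is that the critical values for the ordinary level-stationary KPSS test can be used unchanged here, despite the seasonal demeaning.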

    Discrete choice modeling with nonstationary panels applied to exchange rate regime choice

    This paper develops a regression limit theory for discrete choice nonstationary panels with large cross section (N) and time series (T) dimensions. Some results emerging from this theory are directly applicable in the wider context of M-estimation. This includes an extension of work by Wooldridge (1994, "Estimation and Inference for Dependent Processes," in Engle, R.F., McFadden, D.L. (Eds.), Handbook of Econometrics, vol. 4, North-Holland, Amsterdam) on the limit theory of local extremum estimators to multi-indexed processes in nonlinear nonstationary panel data models. It is shown that the maximum likelihood (ML) estimator is consistent without an incidental parameters problem and has a limit theory with a fast rate of convergence N^{1/2}T^{3/4} (in the stationary case, the rate is N^{1/2}T^{1/2}) for the regression coefficients and thresholds, and a normal limit distribution. In contrast, the limit distribution is known to be mixed normal in time series modeling, as shown in Park and Phillips (2000, "Nonstationary binary choice," Econometrica 68, 1249-1280; hereafter PP) and Phillips, Jin, and Hu (2007, "Nonstationary discrete choice: A corrigendum and addendum," Journal of Econometrics 141(2), 1115-1130; hereafter PJH). The approach is applied to exchange rate regime choice by monetary authorities, and we provide an analysis of the empirical phenomenon known as "fear of floating".