Bootstrapping Spurious Regression
The bootstrap is shown to be inconsistent in spurious regression. The failure of the bootstrap is spectacular in that the bootstrap effectively turns a spurious regression into a cointegrating regression. In particular, the serial correlation coefficient of the residuals in the bootstrap regression does not converge to unity, so the bootstrap is not even first order consistent. The block bootstrap serial correlation coefficient does converge to unity and is therefore first order consistent, but has a slower rate of convergence and a different limit distribution from that of the sample data serial correlation coefficient. The analysis covers spurious regressions involving both deterministic trends and stochastic trends. The results reinforce earlier warnings about routine use of the bootstrap with dependent data.
Keywords: Asymptotic theory, Bootstrap, Brownian motion, Cointegration, LK representation, Nonstationarity, Residual diagnostics, Unit root
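The mechanism is easy to see in simulation. The sketch below is not from the paper and all tuning choices are illustrative: it regresses one independent random walk on another, so the regression is spurious, and compares the residual serial correlation in the sample (close to unity) with that produced by an i.i.d. residual bootstrap (close to zero).

```python
# Sketch (illustrative, not the paper's design): an i.i.d. residual bootstrap
# of a spurious regression yields residuals whose serial correlation stays far
# from the near-unity value seen in the original sample.
import numpy as np

rng = np.random.default_rng(0)
n = 500

def serial_corr(u):
    """First-order serial correlation coefficient of a series."""
    return (u[1:] @ u[:-1]) / (u[:-1] @ u[:-1])

# Two independent random walks: any regression between them is spurious.
y = np.cumsum(rng.standard_normal(n))
x = np.cumsum(rng.standard_normal(n))

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta
print("sample residual serial correlation:", serial_corr(u))  # close to 1

# i.i.d. bootstrap: resample residuals with replacement, regenerate y*.
boot = []
for _ in range(200):
    u_star = rng.choice(u, size=n, replace=True)
    y_star = X @ beta + u_star
    b_star = np.linalg.lstsq(X, y_star, rcond=None)[0]
    boot.append(serial_corr(y_star - X @ b_star))
print("mean bootstrap serial correlation:", np.mean(boot))  # near 0, not 1
```

Resampling the residuals independently destroys their dependence, which is exactly the failure the abstract describes: the bootstrap sample behaves like a cointegrating regression rather than a spurious one.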
Challenges of Trending Time Series Econometrics
We discuss some challenges presented by trending data in time series econometrics. To the empirical economist there is little guidance from theory about the source of trend behavior and even less guidance about practical formulations. Moreover, recent proximity theorems reveal that trends are more elusive to model empirically than stationary processes, with the upshot that optimal forecasts are also harder to estimate when the data involve trends. These limitations are implicitly acknowledged in much practical modeling and forecasting work, where adaptive methods are often used to help keep models on track as trends evolve. The paper discusses these broader issues and limitations of econometrics and offers some thoughts on new practical possibilities for data analysis in the absence of good theory models for trends. In particular, a new concept of coordinate cointegration is introduced and some new econometric methodology is suggested for analyzing trends and comovement and for producing forecasts in a general way that is agnostic about the specific nature of the trend process. Some simulation exercises are conducted and some long historical series on prices and yields on long securities are used to illustrate the methods.
Keywords: Coordinate instrumental variables, coordinate reduced rank regression, coordinate trend functions, limitations of econometrics, nonstationarity, trend
Regression with Slowly Varying Regressors
Slowly varying regressors are asymptotically collinear in linear regression. Usual regression formulae for asymptotic standard errors remain valid but rates of convergence are affected and the limit distribution of the regression coefficients is shown to be one dimensional. Some asymptotic representations of partial sums of slowly varying functions and central limit theorems with slowly varying weights are given that assist in the development of a regression theory. Multivariate regression and polynomial regression with slowly varying functions are considered and shown to be equivalent, up to standardization, to regression on a polynomial in a logarithmic trend. The theory involves second, third and higher order forms of slow variation. Some applications to trend regression are discussed.
Keywords: Asymptotic expansion, collinearity, Karamata representation, slow variation, smooth variation, trend regression
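The asymptotic collinearity is easy to verify numerically. The following check is my own construction, not an example from the paper: the sample correlation between two slowly varying regressors, such as log t and log log t, approaches one as the sample grows.

```python
# Sketch (my own example): slowly varying regressors log t and log log t
# become nearly collinear as the sample size grows, which is the phenomenon
# behind the one-dimensional limit distribution described in the abstract.
import numpy as np

for n in [10**2, 10**4, 10**6]:
    t = np.arange(2, n + 2, dtype=float)  # start at 2 so log log t is defined
    z1, z2 = np.log(t), np.log(np.log(t))
    corr = np.corrcoef(z1, z2)[0, 1]
    print(f"n = {n:>8}: corr(log t, log log t) = {corr:.6f}")  # -> 1
```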
Long Memory and Long Run Variation
May 2008
A commonly used defining property of long memory time series is the power law decay of the autocovariance function. Some alternative methods of deriving this property are considered, working from the alternative definition in terms of a fractional pole in the spectrum at the origin. The methods considered involve the use of (i) Fourier transforms of generalized functions, (ii) asymptotic expansions of Fourier integrals with singularities, (iii) direct evaluation using hypergeometric function algebra, and (iv) conversion to a simple gamma integral. The paper is largely pedagogical but some novel methods and results involving complete asymptotic series representations are presented. The formulae are useful in many ways including the calculation of long run variation matrices for multivariate time series with long memory and the econometric estimation of such models.
Keywords: Asymptotic expansion, Autocovariance function, Fractional pole, Fourier integral, Generalized function, Long memory, Long range dependence, Singularity
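For instance, an ARFIMA(0, d, 0) process has autocovariances in closed gamma-function form, and the power law decay gamma(k) ~ c k^(2d-1) can be checked numerically. The sketch below uses that standard formula, not any particular derivation from the paper:

```python
# Sketch: numerical check of the power-law decay gamma(k) ~ c * k**(2d - 1)
# for ARFIMA(0, d, 0), using the exact gamma-function formula for its
# autocovariances (unit innovation variance assumed).
import numpy as np
from scipy.special import gammaln

d = 0.3  # memory parameter, 0 < d < 1/2

def acov(k, d):
    """Exact ARFIMA(0,d,0) autocovariance at lag k."""
    return np.exp(gammaln(1 - 2 * d) + gammaln(k + d)
                  - gammaln(d) - gammaln(1 - d) - gammaln(k + 1 - d))

# Leading constant in the power-law approximation.
c = np.exp(gammaln(1 - 2 * d) - gammaln(d) - gammaln(1 - d))

for k in [10, 100, 1000, 10000]:
    print(k, acov(k, d) / (c * k ** (2 * d - 1)))  # ratio -> 1
```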
Trending Time Series and Macroeconomic Activity: Some Present and Future Challenges
Some challenges for econometric research on trending time series are discussed in relation to some perceived needs of macroeconomics and macroeconomic policy making.
Keywords: Breaks, growth, policy intervention, production, trend mechanisms, unit roots
Unit Root Model Selection
Some limit properties for information based model selection criteria are given in the context of unit root evaluation and various assumptions about initial conditions. Allowing for a nonparametric short memory component, standard information criteria are shown to be weakly consistent for a unit root provided the penalty coefficient C_n → ∞ and C_n/n → 0 as n → ∞. Strong consistency holds when C_n/(log log n)^3 → ∞ under conventional assumptions on initial conditions and under a slightly stronger condition when initial conditions are infinitely distant in the unit root model. The limit distribution of the AIC criterion is obtained.
Keywords: AIC, Consistency, Model selection, Nonparametric, Unit root
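As a concrete illustration, consider a minimal setup of my own (not the paper's framework): an information criterion with penalty C_n = log n, which satisfies both rate conditions, choosing between an imposed unit root and a freely estimated AR(1) coefficient.

```python
# Sketch (minimal setup, assumptions mine): information-criterion selection
# between a unit root model (rho = 1 imposed, zero fitted parameters) and a
# stationary AR(1), with penalty C_n = log n, so that C_n -> infinity and
# C_n / n -> 0 as the consistency conditions require.
import numpy as np

rng = np.random.default_rng(1)
n = 400
y = np.cumsum(rng.standard_normal(n))  # true model: unit root

# Unrestricted AR(1): estimate rho by least squares (one fitted parameter).
rho_hat = (y[1:] @ y[:-1]) / (y[:-1] @ y[:-1])
sig2_ar = np.mean((y[1:] - rho_hat * y[:-1]) ** 2)

# Restricted model: rho = 1, i.e. the differences are the errors.
sig2_ur = np.mean(np.diff(y) ** 2)

C_n = np.log(n)
ic_ar = np.log(sig2_ar) + C_n * 1 / n   # one parameter penalized
ic_ur = np.log(sig2_ur) + C_n * 0 / n   # no parameters penalized
print("select:", "unit root" if ic_ur <= ic_ar else "stationary AR(1)")
```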
Econometric Analysis of Fisher's Equation
Fisher's equation for the determination of the real rate of interest is studied from a fresh econometric perspective. Some new methods of data description for nonstationary time series are introduced. The methods provide a nonparametric mechanism for modelling the spatial densities of a time series that displays random wandering characteristics, like interest rates and inflation. Hazard rate functionals are also constructed, an asymptotic theory is given and the techniques are illustrated in some empirical applications to real interest rates for the US. The paper ends by calculating Gaussian semiparametric estimates of long range dependence in US real interest rates, using a new asymptotic theory that covers the nonstationary case. The empirical results indicate that the real rate of interest in the US is (fractionally) nonstationary over 1934-1997 and over the more recent subperiods 1961-1985 and 1961-1997. Unit root nonstationarity and short memory stationarity are both strongly rejected for all these periods.
Keywords: Fractional integration, hazard rate, long range dependence, real rate of interest, semiparametric estimation, sojourn time, spatial density
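In the spirit of these data descriptions, a kernel-smoothed spatial density records the relative time a wandering series spends near each level, and a crude hazard functional can be built from it. The sketch below is illustrative only: the Gaussian kernel, the rule-of-thumb bandwidth, and the survivor construction are my assumptions, not the paper's estimators.

```python
# Illustrative sketch (assumptions mine): kernel estimate of the spatial
# density of a wandering series, i.e. the relative time it spends near each
# level, plus a crude hazard functional built from the same estimates.
import numpy as np

rng = np.random.default_rng(2)
x = np.cumsum(rng.standard_normal(1000))  # stand-in for an interest rate path

h = 1.06 * x.std() * len(x) ** (-0.2)     # rule-of-thumb bandwidth (assumed)
grid = np.linspace(x.min(), x.max(), 200)

# Gaussian kernel estimate of the sojourn density at each grid level.
dens = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2).mean(axis=1)
dens /= h * np.sqrt(2 * np.pi)

# Hazard functional: density over the estimated survivor function.
surv = np.array([(x > a).mean() for a in grid])
hazard = np.where(surv > 0.01, dens / np.maximum(surv, 0.01), np.nan)

i = int(np.argmax(dens))
print("modal level:", grid[i], "hazard there:", hazard[i])
```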
Folklore Theorems, Implicit Maps and New Unit Root Limit Theory
The delta method and continuous mapping theorem are among the most extensively used tools in asymptotic derivations in econometrics. Extensions of these methods are provided for sequences of functions, which are commonly encountered in applications, and where the usual methods sometimes fail. Important examples of failure arise in the use of simulation based estimation methods such as indirect inference. The paper explores the application of these methods to the indirect inference estimator (IIE) in first order autoregressive estimation. The IIE uses a binding function that is sample size dependent. Its limit theory relies on a sequence-based delta method in the stationary case and a sequence-based implicit continuous mapping theorem in unit root and local to unity cases. The new limit theory shows that the IIE achieves much more than bias correction. It changes the limit theory of the maximum likelihood estimator (MLE) when the autoregressive coefficient is in the locality of unity, reducing the bias and the variance of the MLE without affecting the limit theory of the MLE in the stationary case. Thus, in spite of the fact that the IIE is a continuously differentiable function of the MLE, the limit distribution of the IIE is not simply a scale multiple of the MLE but depends implicitly on the full binding function mapping. The unit root case therefore represents an important example of the failure of the delta method and shows the need for an implicit mapping extension of the continuous mapping theorem.
Keywords: Binding function, Delta method, Exact bias, Implicit continuous maps, Indirect inference, Maximum likelihood
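Schematically, and as a generic indirect inference recipe rather than the paper's construction, the binding function can be approximated by Monte Carlo on a grid of autoregressive coefficients and then inverted at the observed estimate; the grid and replication counts below are my own choices.

```python
# Sketch (generic indirect inference, assumptions mine): approximate the
# binding function b_n(rho) = E[OLS estimate | rho] for an AR(1) by Monte
# Carlo on a grid, then invert it at the observed estimate.
import numpy as np

rng = np.random.default_rng(3)
n = 100

def ols_ar1(y):
    """Least squares AR(1) coefficient (downward biased near unity)."""
    return (y[1:] @ y[:-1]) / (y[:-1] @ y[:-1])

def simulate_ar1(rho, n, rng):
    e = rng.standard_normal(n)
    y = np.empty(n)
    y[0] = e[0]
    for t in range(1, n):
        y[t] = rho * y[t - 1] + e[t]
    return y

# Observed data and its biased estimate.
y_obs = simulate_ar1(0.95, n, rng)
rho_hat = ols_ar1(y_obs)

# Binding function on a grid, by Monte Carlo.
grid = np.linspace(0.5, 1.0, 51)
b = np.array([np.mean([ols_ar1(simulate_ar1(r, n, rng)) for _ in range(300)])
              for r in grid])

# Indirect inference estimate: invert the (monotone) binding function.
rho_ii = grid[np.argmin(np.abs(b - rho_hat))]
print(f"OLS: {rho_hat:.3f}  indirect inference: {rho_ii:.3f}")
```

Note that the binding function here depends on the sample size n, which is precisely the feature that makes the limit theory sequence-based rather than a routine application of the delta method.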
Unit Root Log Periodogram Regression
Log periodogram (LP) regression is shown to be consistent and to have a mixed normal limit distribution when the memory parameter d = 1. Gaussian errors are not required. Tests of d = 1 based on LP regression are consistent against d < 1 alternatives. A test based on a modified LP regression that is consistent in both directions is provided.
Keywords: Discrete Fourier transform, fractional Brownian motion, fractional integration, log periodogram regression, long memory parameter, nonstationarity, semiparametric estimation and testing, unit root
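The estimator in question is the familiar log periodogram regression: regress the log periodogram at the first m Fourier frequencies on -2 log(lambda_j) and read the memory parameter off the slope. A minimal sketch applied to a unit root (d = 1) series follows; the bandwidth m = n**0.5 is a conventional choice of my own, not a recommendation from the paper.

```python
# Sketch: log periodogram regression applied to a random walk (d = 1).
# Bandwidth choice m = sqrt(n) is an assumption, not from the paper.
import numpy as np

rng = np.random.default_rng(4)
n = 2048
y = np.cumsum(rng.standard_normal(n))  # d = 1 process

# Periodogram at the first m fundamental frequencies lambda_j = 2*pi*j/n.
m = int(n ** 0.5)
j = np.arange(1, m + 1)
lam = 2 * np.pi * j / n
w = np.fft.fft(y)[1:m + 1] / np.sqrt(2 * np.pi * n)  # discrete Fourier transform
I = np.abs(w) ** 2

# OLS of log I(lambda_j) on -2 log lambda_j: the slope estimates d.
X = np.column_stack([np.ones(m), -2 * np.log(lam)])
d_hat = np.linalg.lstsq(X, np.log(I), rcond=None)[0][1]
print(f"d_hat = {d_hat:.3f}  (true d = 1)")
```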
Meritocracy Voting: Measuring the Unmeasurable
Learned societies commonly carry out selection processes to add new fellows to an existing fellowship. Criteria vary across societies but are typically based on subjective judgments concerning the merit of individuals who are nominated for fellowships. These subjective assessments may be made by existing fellows as they vote in elections to determine the new fellows or they may be decided by a selection committee of fellows and officers of the society who determine merit after reviewing nominations and written assessments. Human judgment inevitably plays a central role in these determinations and, notwithstanding its limitations, is usually regarded as being a necessary ingredient in making an overall assessment of qualifications for fellowship. The present paper suggests a mechanism by which these merit assessments may be complemented with a quantitative rule that incorporates both subjective and objective elements. The goal of 'measuring merit' may be elusive but quantitative assessment rules can help to widen the effective electorate (for instance, by including the decisions of editors, the judgments of independent referees, and received opinion about research) and mitigate distortions that can arise from cluster effects, invisible college coalition voting and inner sanctum bias. The rule considered here is designed to assist the selection process by explicitly taking into account subjective assessments of individual candidates for election as well as direct quantitative measures of quality obtained from bibliometric data. The methodology has application to a wide arena of quality assessment and professional ranking exercises.
Keywords: Bibliometric data, Election, Fellowship, Measurement, Meritocracy, Peer review, Quantification, Subjective assessment, Voting