A limit theory is developed for mildly explosive autoregression under both weakly and strongly dependent innovation errors. The asymptotic behaviour of the sample moments is affected by the memory of the innovation process, both through the form of the limiting distribution and, in the case of long-range dependence, through the rate of convergence. However, this effect does not carry over to least squares regression theory, as it cancels in the interaction between the sample moments. As a result, the Cauchy regression theory of Phillips and Magdalinos (2007a) is invariant to the dependence structure of the innovation sequence.
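The Cauchy limit theory referred to here can be illustrated by simulation. The sketch below, which is an assumption-laden illustration rather than a reproduction of the paper's setup, simulates a mildly explosive AR(1) with coefficient ρ_n = 1 + c/n^α for α ∈ (0, 1) and i.i.d. Gaussian innovations, and computes the centred and normalised least squares statistic (k_n ρ_n^n / 2c)(ρ̂ − ρ_n) with k_n = n^α, which under the Phillips–Magdalinos theory is asymptotically standard Cauchy. The function names, parameter values, and the choice of Gaussian errors are all illustrative assumptions.

```python
import numpy as np


def simulate_mildly_explosive(n, c, alpha, rng):
    """Simulate y_t = rho_n * y_{t-1} + u_t, y_0 = 0, with i.i.d. N(0,1)
    innovations, where rho_n = 1 + c / n**alpha is mildly explosive."""
    rho_n = 1.0 + c / n**alpha
    u = rng.standard_normal(n)
    y = np.empty(n + 1)
    y[0] = 0.0
    for t in range(1, n + 1):
        y[t] = rho_n * y[t - 1] + u[t - 1]
    return y, rho_n


def normalized_ls_stat(y, rho_n, c, n, alpha):
    """Centred, normalised least squares statistic; asymptotically
    standard Cauchy under the mildly explosive limit theory."""
    ylag, ylead = y[:-1], y[1:]
    rho_hat = (ylag @ ylead) / (ylag @ ylag)   # least squares estimator
    k_n = n**alpha
    return (k_n * rho_n**n / (2.0 * c)) * (rho_hat - rho_n)


# Monte Carlo: the statistic should look heavy-tailed with median near 0,
# consistent with a standard Cauchy limit (illustrative parameter choices).
rng = np.random.default_rng(0)
n, c, alpha, reps = 500, 1.0, 0.7, 2000
stats = np.array([
    normalized_ls_stat(*simulate_mildly_explosive(n, c, alpha, rng)[:1][0:1][0:1] and
                       simulate_mildly_explosive(n, c, alpha, rng),
                       c=c, n=n, alpha=alpha)
    for _ in range(reps)
])
print("median:", np.median(stats))
print("IQR:", np.subtract(*np.percentile(stats, [75, 25])))
```

(For a standard Cauchy distribution the median is 0 and the interquartile range is 2, so the printed summaries give a quick informal check of the limiting distribution.)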