
    Contemporaneous Aggregation of GARCH Processes

    We study the impact of contemporaneous aggregation of large cross-sections of GARCH processes and of dynamic GARCH factor models. The results crucially depend on the shape of the cross-sectional distribution of the GARCH coefficients and on the cross-sectional dependence properties of the rescaled innovation. The aggregate maintains the core nonlinearity of a volatility model, namely lack of correlation in the levels but autocorrelation in the squares, when the rescaled innovation is common across units. The nonlinearity is, however, lost at the aggregate level when the rescaled innovation is orthogonal across units. This is not a consequence of the usual result that purely idiosyncratic risk vanishes in importance since, under appropriate conditions, such risk is simply not fully diversifiable in arbitrarily large portfolios. Non-GARCH memory properties arise at the aggregate level. Strict stationarity, ergodicity and finite kurtosis might fail for the aggregate even though the micro GARCH processes satisfy these properties. Under no conditions does aggregation of GARCH induce long memory conditional heteroskedasticity.
    Keywords: contemporaneous aggregation, GARCH, conditionally heteroskedastic factor models, common and idiosyncratic risk, nonlinearity, memory
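
    The mechanism described in this abstract can be illustrated with a small simulation. The sketch below aggregates N GARCH(1,1) units whose coefficients are drawn from assumed uniform cross-sectional distributions (the distributions, the sample sizes and the GARCH(1,1) restriction are illustrative choices, not the paper's setup) and compares the autocorrelation of the squared aggregate when the rescaled innovation is common across units versus orthogonal.

```python
# Illustrative sketch (not the paper's exact setup): aggregate N GARCH(1,1) units with
# coefficients drawn from a cross-sectional distribution, comparing a common rescaled
# innovation with unit-specific (orthogonal) ones.
import numpy as np

rng = np.random.default_rng(0)
N, T = 500, 4000                      # units and time periods (illustrative values)
omega = rng.uniform(0.05, 0.15, N)    # assumed cross-sectional coefficient distributions
alpha = rng.uniform(0.05, 0.15, N)
beta = rng.uniform(0.70, 0.80, N)

def aggregate(common_shock: bool) -> np.ndarray:
    """Return the cross-sectional average x_bar_t of N GARCH(1,1) processes."""
    h = omega / (1.0 - alpha - beta)          # start at the unconditional variance
    x_bar = np.empty(T)
    for t in range(T):
        if common_shock:
            z = np.full(N, rng.standard_normal())   # one rescaled innovation shared by all units
        else:
            z = rng.standard_normal(N)              # orthogonal idiosyncratic innovations
        x = np.sqrt(h) * z
        x_bar[t] = x.mean()
        h = omega + alpha * x**2 + beta * h         # GARCH(1,1) variance recursion
    return x_bar

def acf_sq(x, lag=1):
    s = x**2
    return np.corrcoef(s[:-lag], s[lag:])[0, 1]

for common in (True, False):
    xb = aggregate(common)
    print(f"common shock={common}: lag-1 ACF of squares = {acf_sq(xb):.3f}")
# With a common rescaled innovation the squares of the aggregate stay autocorrelated;
# with orthogonal innovations that autocorrelation largely washes out, in line with the
# abstract's claim that the nonlinearity is lost at the aggregate level.
```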

    (Fractional) Beta Convergence

    Unit roots in output, an exponential 2 per cent rate of convergence and no change in the underlying dynamics of output seem to be three stylized facts that cannot go together. This paper extends the Solow-Swan growth model to allow for cross-sectional heterogeneity. In this framework, aggregate shocks might vanish at a hyperbolic rather than an exponential rate. This implies that the level of output can exhibit long memory and that standard tests fail to reject the null of a unit root despite mean reversion. Exploiting the secular time series properties of GDP, we conclude that traditional approaches to testing for uniform (conditional and unconditional) convergence are suitable only as a first-step approximation. We show both theoretically and empirically how the uniform 2 per cent rate of convergence repeatedly found in the empirical literature is the outcome of an underlying parameter of fractional integration strictly between 1/2 and 1. This is consistent with both time series and cross-sectional evidence recently produced.
    Keywords: growth model, convergence, long memory, aggregation
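
    A minimal numerical sketch of the aggregation argument, under the assumption that each unit converges at an AR(1) speed drawn from a Beta distribution (an illustrative heterogeneity distribution, not the paper's calibration): the cross-sectional average impulse response decays hyperbolically and eventually dominates a uniform exponential 2 per cent benchmark.

```python
# Illustrative sketch of the aggregation mechanism behind (fractional) beta convergence:
# with heterogeneous AR(1) convergence speeds rho_i drawn from a Beta distribution
# (an assumption for illustration only), the average impulse response decays
# hyperbolically rather than at a single exponential 2 per cent rate.
import numpy as np

rng = np.random.default_rng(1)
rho = rng.beta(8.0, 1.0, size=100_000)   # cross-section of AR(1) coefficients in (0, 1)

horizons = np.array([1, 5, 10, 20, 40, 80])
avg_irf = np.array([np.mean(rho**k) for k in horizons])   # aggregate impulse response
exp_irf = 0.98 ** horizons                                 # uniform 2 per cent rate

for k, a, e in zip(horizons, avg_irf, exp_irf):
    print(f"horizon {k:3d}: aggregate IRF {a:.3f}   uniform-2% IRF {e:.3f}")
# The aggregate response dominates the exponential benchmark at long horizons: shocks to
# the aggregate die out hyperbolically, which is why unit-root tests can fail to reject
# despite mean reversion in the underlying units.
```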

    Pseudo-Maximum Likelihood Estimation of ARCH(∞) Models

    Strong consistency and asymptotic normality of the Gaussian pseudo-maximum likelihood estimate of the parameters in a wide class of ARCH(∞) processes are established. We require the ARCH weights to decay at least hyperbolically, with a faster rate needed for the central limit theorem than for the law of large numbers. Various rates are illustrated in examples of particular parameterizations in which our conditions are shown to be satisfied.
    Keywords: ARCH(∞) models, pseudo-maximum likelihood estimation, asymptotic inference
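
    As a rough illustration of the estimator's mechanics, the sketch below sets up a Gaussian pseudo-maximum likelihood objective for a truncated ARCH(∞) recursion with hyperbolically decaying weights psi_j proportional to j^(-(1+d)). The parameterisation, the truncation lag and the optimiser are assumptions made for the example, not the paper's specification.

```python
# Minimal sketch of Gaussian pseudo-maximum likelihood for a truncated ARCH(infinity)
# model sigma_t^2 = omega + sum_j psi_j * x_{t-j}^2, with psi_j = c * j^(-(1+d)).
import numpy as np
from scipy.optimize import minimize

def arch_inf_sigma2(x, omega, c, d, J=200):
    """Conditional variances from the truncated ARCH(inf) recursion (J is an assumed lag cut-off)."""
    T = len(x)
    j = np.arange(1, J + 1)
    psi = c * j ** (-(1.0 + d))                 # hyperbolically decaying ARCH weights
    sigma2 = np.full(T, np.var(x))              # presample variances set to the sample variance
    x2 = x**2
    for t in range(1, T):
        lags = x2[max(0, t - J):t][::-1]        # x_{t-1}^2, x_{t-2}^2, ...
        sigma2[t] = omega + psi[:len(lags)] @ lags
    return sigma2

def neg_gaussian_pml(params, x):
    omega, c, d = np.exp(params)                # positivity via a log-parameterisation
    s2 = arch_inf_sigma2(x, omega, c, d)
    return 0.5 * np.sum(np.log(s2) + x**2 / s2)

# Usage on a placeholder series (replace with the return series of interest):
rng = np.random.default_rng(2)
x = rng.standard_normal(3000) * 0.5
fit = minimize(neg_gaussian_pml, x0=np.log([0.1, 0.1, 1.0]), args=(x,), method="Nelder-Mead")
print("estimated (omega, c, d):", np.exp(fit.x))
```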

    Whittle estimation of EGARCH and other exponential volatility models

    The strong consistency and asymptotic normality of the Whittle estimate of the parameters in a class of exponential volatility processes are established. Our main focus here is the EGARCH model of [Nelson, D., 1991. Conditional heteroskedasticity in asset returns: a new approach. Econometrica 59, 347–370] and other one-shock models such as the GJR model of [Glosten, L., Jagannathan, R., Runkle, D., 1993. On the relation between the expected value and the volatility of the nominal excess returns on stocks. Journal of Finance 48, 1779–1801], but two-shock models, such as the SV model of [Taylor, S., 1986. Modelling Financial Time Series. Wiley, Chichester (UK)], are also covered by our assumptions. The variable of interest might not have a finite fractional moment of any order, so, in particular, finite variance is not imposed. We allow for a wide range of degrees of persistence of shocks to conditional variance, covering both short and long memory.
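
    The following sketch shows the mechanics of a Whittle estimator applied to log squared returns: minimise the sum of log f(lambda_j; theta) + I(lambda_j)/f(lambda_j; theta) over the Fourier frequencies, where I is the periodogram. The spectral density used here, AR(1) log-volatility plus white observation noise, is a simplified stand-in chosen for the example rather than the exponential volatility models studied in the paper.

```python
# Hedged sketch of a Whittle estimator for log squared returns, with an assumed
# AR(1)-plus-noise spectral density as the parametric model.
import numpy as np
from scipy.optimize import minimize

def periodogram(y):
    y = y - y.mean()
    T = len(y)
    lam = 2 * np.pi * np.arange(1, T // 2 + 1) / T                 # Fourier frequencies
    I = np.abs(np.fft.rfft(y)[1:T // 2 + 1]) ** 2 / (2 * np.pi * T)
    return lam, I

def spectral_density(lam, phi, s2_u, s2_e):
    """Spectrum of log x_t^2 when log-volatility is AR(1) plus iid noise (an assumption)."""
    ar1 = s2_u / (2 * np.pi * np.abs(1.0 - phi * np.exp(1j * lam)) ** 2)
    return ar1 + s2_e / (2 * np.pi)

def whittle_objective(params, lam, I):
    phi = np.tanh(params[0])                  # keep |phi| < 1
    s2_u, s2_e = np.exp(params[1:])           # keep the variances positive
    f = spectral_density(lam, phi, s2_u, s2_e)
    return np.sum(np.log(f) + I / f)

# Usage on a placeholder series (replace with the return series of interest):
rng = np.random.default_rng(3)
x = rng.standard_normal(4000) * 0.01
lam, I = periodogram(np.log(x**2 + 1e-12))
fit = minimize(whittle_objective, x0=[0.5, 0.0, 0.0], args=(lam, I), method="Nelder-Mead")
print("phi, var_u, var_e:", np.tanh(fit.x[0]), *np.exp(fit.x[1:]))
```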

    Aggregation and memory of models of changing volatility

    In this paper we study the effect of contemporaneous aggregation of an arbitrarily large number of processes featuring dynamic conditional heteroskedasticity with short memory, when heterogeneity across units is allowed for. We look at the memory properties of the limit aggregate. General necessary conditions for long memory are derived. More specific results for certain stochastic volatility models are also developed, providing some examples of how long memory volatility can be obtained by aggregation.
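
    The sketch below illustrates long memory volatility arising by aggregation, under assumed ingredients (heterogeneous AR(1) log-volatilities with Beta-distributed persistence and a common rescaled innovation, so that the volatility dynamics survive averaging): the autocorrelation of the squared aggregate decays far more slowly than that of any individual short-memory unit.

```python
# Illustrative sketch, not the paper's exact model: aggregate many short-memory
# stochastic volatility units with heterogeneous log-volatility persistence.
import numpy as np

rng = np.random.default_rng(4)
N, T = 400, 8000
phi = np.minimum(rng.beta(12.0, 1.0, N), 0.998)          # heterogeneous, capped persistence
h = 0.1 * rng.standard_normal(N) / np.sqrt(1 - phi**2)   # start log-volatilities at stationarity

x_bar = np.empty(T)
for t in range(T):
    h = phi * h + 0.1 * rng.standard_normal(N)           # short-memory SV dynamics per unit
    z = rng.standard_normal()                            # common rescaled innovation
    x_bar[t] = np.mean(np.exp(h / 2)) * z                # cross-sectional average return

s = x_bar**2 - np.mean(x_bar**2)
for lag in (1, 10, 50, 200):
    r = np.corrcoef(s[:-lag], s[lag:])[0, 1]
    print(f"lag {lag:4d}: ACF of squared aggregate = {r:.3f}")
# Individual units with moderate phi have squared-return autocorrelations that die out
# geometrically; the aggregate's decay is governed by the units with persistence near one
# and is therefore much slower, the long-memory-by-aggregation effect in the abstract.
```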