
    The Multistep Beveridge-Nelson Decomposition

    The Beveridge-Nelson decomposition defines the trend component in terms of the eventual forecast function, as the value the series would take if it were on its long-run path. The paper introduces the multistep Beveridge-Nelson decomposition, in which the forecast function is obtained by the direct autoregressive approach, optimising the predictive ability of the AR model at forecast horizons greater than one. We compare our proposal with the standard Beveridge-Nelson decomposition, for which the forecast function is obtained by iterating the one-step-ahead predictions via the chain rule. We illustrate that the multistep Beveridge-Nelson trend is more efficient than the standard one in the presence of model misspecification, and we subsequently assess the predictive validity of the extracted transitory component with respect to future growth.
    Keywords: Trend and Cycle; Forecasting; Filtering.
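    For an AR(1) model in first differences the standard Beveridge-Nelson trend has a closed form, which makes the definition concrete: the trend is the current value plus all expected future growth in excess of the long-run mean. The sketch below is illustrative only (the function name and fixed parameters are ours, not the paper's); the multistep variant would instead fit a separate AR model at each forecast horizon.

```python
import numpy as np

def bn_trend_ar1(y, phi, mu):
    """Beveridge-Nelson decomposition for an AR(1) in first differences:
    dy_t = mu + phi*(dy_{t-1} - mu) + e_t.
    The BN trend is tau_t = y_t + phi/(1-phi)*(dy_t - mu), i.e. the current
    level plus the sum of expected future changes beyond the drift."""
    dy = np.diff(y)
    trend = y[1:] + (phi / (1.0 - phi)) * (dy - mu)
    cycle = y[1:] - trend          # transitory component, trend + cycle = y
    return trend, cycle
```

    With phi = 0 the series is a pure random walk with drift and the cycle is identically zero, which is the degenerate case the decomposition nests.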

    Some Reflections on Trend-Cycle Decompositions with Correlated Components

    This paper discusses a few interpretative issues arising from trend-cycle decompositions with correlated components. We determine the conditions under which correlated components may originate from: underestimation of the cyclical component; a cycle in growth rates, rather than in the levels; the hysteresis phenomenon; permanent-transitory decompositions, where the permanent component has richer dynamics than a pure random walk. Moreover, the consequences for smoothing and signal extraction are discussed: in particular, we establish that a negative correlation implies that future observations carry most of the information needed to assess the cyclical stance. As a result, the components will be subject to large revisions. The overall conclusion is that the characterisation of economic fluctuations in macroeconomic time series largely remains an open issue.
    Keywords: Kalman Filter and Smoother, Signal Extraction, Frequency Domain Estimation, Hysteresis, Permanent-Transitory Decomposition, Revisions.

    Trend Estimation

    Trend estimation deals with the characterisation of the underlying, or long-run, evolution of a time series. Despite being a very pervasive theme in time series analysis since its inception, it still raises considerable controversy. The difficulties, or better, the challenges, lie in the identification of the sources of the trend dynamics, and in the definition of the time horizon that defines the long run. The prevalent view in the literature considers the trend as a genuinely latent component, i.e. as the component of the evolution of a series that is persistent and cannot be ascribed to observable factors. As a matter of fact, the univariate approaches reviewed here assume that the trend is either a deterministic or a random function of time.
    Keywords: Time series models; unobserved components.
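    The "deterministic function of time" view mentioned above is the simplest to make operational: the trend is a fixed polynomial in time fitted by least squares (the random-function view instead models the trend as, e.g., a random walk). A minimal sketch, with names of our choosing:

```python
import numpy as np

def deterministic_trend(y, degree=1):
    """Fit a polynomial trend in time by ordinary least squares,
    i.e. treat the trend as a deterministic function of t."""
    t = np.arange(len(y))
    coeffs = np.polyfit(t, y, degree)   # least-squares polynomial fit
    return np.polyval(coeffs, t)        # fitted trend at each t
```

    A series that is exactly linear in time is reproduced without error, whereas a stochastic-trend series would leave persistent residuals, which is one informal diagnostic for choosing between the two views.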

    On the Model-Based Interpretation of Filters and the Reliability of Trend-Cycle Estimates

    The paper is concerned with a class of trend-cycle filters, encompassing popular ones such as the Hodrick-Prescott filter, that are derived using the Wiener-Kolmogorov signal extraction theory under maintained models that prove unrealistic in applied time series analysis. As the maintained model is misspecified, inference about the unobserved components, and in particular their first two conditional moments given the observations, is not delivered by the Kalman filter and smoother or the Wiener-Kolmogorov filter for the maintained model. The paper proposes a model-based framework according to which the same class of filters is adapted to the particular time series under investigation; via a suitable decomposition of the innovation process, it is shown that any linear time series with an ARIMA representation can be broken down into orthogonal trend and cycle components, for which the class of filters is optimal. Finite sample inferences are provided by the Kalman filter and smoother for the relevant state space representation of the decomposition. In this framework it is possible to discuss two aspects of the reliability of the signals' estimates: the mean square error of the final estimates and the extent of the revisions. The paper discusses and illustrates how the uncertainty is related to features of the series and the design parameters of the filter, the role of smoothness priors, and the fundamental trade-off between the uncertainty and the magnitude of the revisions as new observations become available.
    Keywords: Signal Extraction, Revisions, Kalman Filter and Smoother.
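    The Hodrick-Prescott filter discussed above has an equivalent penalised least-squares form, which is what the Wiener-Kolmogorov derivation optimises under the maintained model: the trend minimises the fit error plus a penalty on its second differences. A minimal dense-matrix sketch, assuming NumPy (function and parameter names are ours):

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott filter as penalised least squares:
    tau = argmin ||y - tau||^2 + lam * ||D tau||^2,
    where D is the (n-2) x n second-difference operator.
    Closed form: tau = (I + lam * D'D)^{-1} y."""
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i], D[i, i + 1], D[i, i + 2] = 1.0, -2.0, 1.0
    trend = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
    return trend, y - trend            # trend and cycle
```

    Because the penalty acts on second differences, any exactly linear series passes through untouched for every value of lam, which is the zero-frequency behaviour the paper's model-based analysis builds on.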

    Seasonality, Forecast Extensions and Business Cycle Uncertainty

    Seasonality is one of the most important features of economic time series. The possibility to abstract from seasonality for the assessment of economic conditions is a widely debated issue. In this paper we propose a strategy for assessing the role of seasonal adjustment on business cycle measurement. In particular, we provide a method for quantifying the contribution to the unreliability of the estimated cycles extracted by popular filters, such as Baxter and King and Hodrick-Prescott. The main conclusion is that the contribution is larger around the turning points of the series and at the extremes of the sample period; moreover, it is much more sizeable for highpass filters, like the Hodrick-Prescott filter, which retain to a great extent the high frequency fluctuations in a time series, the latter being the ones that are more affected by seasonal adjustment. If a bandpass component is considered, the effect is smaller. Finally, we discuss the role of forecast extensions and the prediction of the cycle. For the time series of industrial production considered in the illustration, it is not possible to provide a reliable estimate of the cycle at the end of the sample.
    Keywords: Linear filters; Unobserved Components; Seasonal Adjustment; Reliability.
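    The contrast drawn above between highpass and bandpass behaviour comes from the filter weights. The Baxter-King filter is the ideal bandpass filter truncated at K leads and lags, with the weights adjusted to sum to zero; a sketch of the weight construction (parameter names are ours), assuming the usual quarterly conventions of cycles between 6 and 32 periods and K = 12:

```python
import numpy as np

def bk_weights(low=6, high=32, K=12):
    """Symmetric Baxter-King bandpass weights b_{-K..K}: the truncated
    ideal filter for cycles of `low` to `high` periods, demeaned so the
    weights sum to zero (zero gain at frequency zero removes the trend)."""
    w1, w2 = 2 * np.pi / high, 2 * np.pi / low
    j = np.arange(1, K + 1)
    b = np.concatenate(([(w2 - w1) / np.pi],
                        (np.sin(j * w2) - np.sin(j * w1)) / (np.pi * j)))
    b = np.concatenate((b[:0:-1], b))  # mirror to get b_{-K}, ..., b_K
    return b - b.mean()                # impose sum-to-zero constraint
```

    The cycle estimate is the moving average of the series with these weights, which is why K observations are lost at each end of the sample, precisely where the paper finds the estimates least reliable.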

    Temporal Disaggregation by State Space Methods: Dynamic Regression Methods Revisited

    The paper documents and illustrates state space methods that implement time series disaggregation by regression methods, with dynamics that depend on a single autoregressive parameter. The most popular techniques for the distribution of economic flow variables, such as Chow-Lin, Fernandez and Litterman, are encompassed by this unifying framework. The state space methodology offers the generality that is required to address a variety of inferential issues, such as the role of initial conditions, which are relevant for the properties of the maximum likelihood estimates and for the derivation of encompassing representations that nest exactly the traditional disaggregation models, and the definition of a suitable set of real time diagnostics on the quality of the disaggregation and revision histories that support model selection. The exact treatment of temporal disaggregation by dynamic regression models, when the latter are formulated in the logarithms, rather than the levels, of an economic variable, is also provided. The properties of the profile and marginal likelihood are investigated and the problems with estimating the Litterman model are illustrated. In the light of the nonstationary nature of the economic time series usually entertained in practice, the suggested strategy is to fit an autoregressive distributed lag model, which, under a reparameterisation and suitable initial conditions, nests both the Chow-Lin and the Fernandez model, thereby incorporating our uncertainty about the presence of cointegration between the aggregated series and the indicators.
    Keywords: Autoregressive Distributed Lag Models, COMFAC, Augmented Kalman Filter and Smoother, Marginal Likelihood, Logarithmic Transformation.
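    The Chow-Lin method referenced above distributes a low-frequency flow across high-frequency periods by GLS on temporally aggregated indicators, with AR(1) residuals. A minimal sketch with the AR(1) parameter taken as known (in the state-space treatment the paper advocates, it is estimated, e.g. by marginal likelihood; names and defaults are ours):

```python
import numpy as np

def chow_lin(y_low, X, rho, s=3):
    """Chow-Lin distribution of a low-frequency flow y_low over
    high-frequency periods using indicators X (n x k, n = len(y_low)*s),
    with AR(1) residuals of fixed parameter rho and s subperiods per period."""
    n = X.shape[0]
    C = np.kron(np.eye(len(y_low)), np.ones(s))   # flow aggregation matrix
    lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    V = rho ** lags / (1.0 - rho ** 2)            # AR(1) covariance matrix
    W = np.linalg.inv(C @ V @ C.T)
    Xa = C @ X                                    # aggregated regressors
    beta = np.linalg.solve(Xa.T @ W @ Xa, Xa.T @ W @ y_low)   # GLS estimate
    resid = y_low - Xa @ beta
    return X @ beta + V @ C.T @ W @ resid         # BLU disaggregated series
```

    By construction the disaggregated series aggregates back to y_low exactly, which is the binding constraint any distribution method must satisfy.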

    Forecasting and Signal Extraction with Misspecified Models

    The paper illustrates and compares estimation methods alternative to maximum likelihood, including multistep estimation and leave-one-out cross-validation, for the purposes of signal extraction, and in particular the separation of the trend from the cycle in economic time series, and long-range forecasting, in the presence of a misspecified, but simply parameterised, model. Our workhorse models are two popular unobserved components models, namely the local level and the local linear model. The paper introduces a metric for assessing the accuracy of the unobserved components estimates and concludes that cross-validation is not a suitable estimation criterion for the purpose considered, whereas multistep estimation can be valuable. Finally, we propose a local likelihood estimator in the frequency domain that provides a simple and alternative way of making operative the notion of emphasising the long-run properties of a time series.
    Keywords: Business cycles, Unobserved components models, Cross-validation, Smoothing, Hodrick-Prescott filter, Multistep estimation.
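    The local level workhorse model above admits a very compact Kalman filter: maximum likelihood minimises (a function of) the one-step prediction errors it produces, while multistep estimation targets multi-horizon errors instead. A minimal sketch of the filter (the initialisation choice is ours, not the paper's):

```python
import numpy as np

def local_level_filter(y, var_eps, var_eta):
    """Kalman filter for the local level model
    y_t = mu_t + eps_t,  mu_t = mu_{t-1} + eta_t,
    returning filtered level estimates and one-step prediction errors."""
    a, p = y[0], var_eps + var_eta   # initialise at the first observation
    levels, errors = [a], [0.0]
    for obs in y[1:]:
        p = p + var_eta              # predict: level variance grows
        v = obs - a                  # one-step prediction error (innovation)
        k = p / (p + var_eps)        # Kalman gain
        a = a + k * v                # update filtered level
        p = (1.0 - k) * p            # update its variance
        levels.append(a)
        errors.append(v)
    return np.array(levels), np.array(errors)
```

    The signal-to-noise ratio var_eta/var_eps governs the gain: as it grows the filtered level tracks the observations, and as it shrinks the filter smooths heavily, which is exactly the trade-off the alternative estimation criteria resolve differently.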
    • ā€¦
    corecore