
    Extrapolation for Time-Series and Cross-Sectional Data

    Extrapolation methods are reliable, objective, inexpensive, quick, and easily automated. As a result, they are widely used, especially for inventory and production forecasts, for operational planning up to two years ahead, and for long-term forecasts in some situations, such as population forecasting. This paper provides principles for selecting and preparing data, making seasonal adjustments, extrapolating, assessing uncertainty, and identifying when to use extrapolation. The principles are based on received wisdom (i.e., experts’ commonly held opinions) and on empirical studies. Some of the more important principles are:
    • In selecting and preparing data, use all relevant data and adjust the data for important events that occurred in the past.
    • Make seasonal adjustments only when seasonal effects are expected and only if there is good evidence by which to measure them.
    • In extrapolating, use simple functional forms. Weight the most recent data heavily if there are small measurement errors, stable series, and short forecast horizons. Domain knowledge and forecasting expertise can help to select effective extrapolation procedures. When there is uncertainty, be conservative in forecasting trends. Update extrapolation models as new data are received.
    • To assess uncertainty, make empirical estimates to establish prediction intervals.
    • Use pure extrapolation when many forecasts are required, little is known about the situation, the situation is stable, and expert forecasts might be biased.
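    Two of the principles above — weight recent data heavily, and build prediction intervals from empirical error estimates — can be sketched with simple exponential smoothing. This is an illustrative example, not code from the paper: the function names, the smoothing parameter alpha, and the use of in-sample one-step errors for the interval are all assumptions about one reasonable way to apply the principles.

    ```python
    def ses_forecast(series, alpha=0.5):
        """Simple exponential smoothing: the implied weights on past
        observations are alpha, alpha*(1-alpha), alpha*(1-alpha)**2, ...
        going back in time, so a larger alpha weights recent data more
        heavily. Returns the final level (the flat-line forecast) and
        the in-sample one-step-ahead errors."""
        level = series[0]
        errors = []
        for y in series[1:]:
            errors.append(y - level)  # one-step-ahead forecast error
            level = alpha * y + (1 - alpha) * level
        return level, errors

    def empirical_interval(forecast, errors, coverage=0.8):
        """Empirical prediction interval: instead of assuming a
        distribution, take quantiles of the observed past errors."""
        errs = sorted(errors)
        lo_idx = int(((1 - coverage) / 2) * (len(errs) - 1))
        hi_idx = int((1 - (1 - coverage) / 2) * (len(errs) - 1))
        return forecast + errs[lo_idx], forecast + errs[hi_idx]
    ```

    With a stable series and short horizon a high alpha (heavy recent weighting) tracks the data closely; with noisy data a lower alpha is the conservative choice the paper recommends.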

    Econometric Forecasting

    Several principles are useful for econometric forecasters: keep the model simple, use all the data you can get, and use theory (not the data) as a guide to selecting causal variables. But theory gives little guidance on dynamics, that is, on which lagged values of the selected variables to use. Early econometric models failed in comparison with extrapolative methods because they paid too little attention to dynamic structure. In a fairly simple way, the vector autoregression (VAR) approach that first appeared in the 1980s resolved the problem by shifting emphasis towards dynamics and away from collecting many causal variables. The VAR approach also resolves the question of how to make long-term forecasts where the causal variables themselves must be forecast. When the analyst does not need to forecast causal variables or can use other sources, he or she can use a single equation with the same dynamic structure. Ordinary least squares is a perfectly adequate estimation method. Evidence supports estimating the initial equation in levels, whether the variables are stationary or not. We recommend a general-to-specific model-building strategy: start with a large number of lags in the initial estimation; simplifying by reducing the number of lags pays off. Evidence on the value of further simplification is mixed. If there is no cointegration among variables, then error-correction models (ECMs) will do worse than equations in levels. But ECMs are only sometimes an improvement even when cointegration is present.
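    The two workhorse ideas in this abstract — estimate each equation in levels by ordinary least squares, and let the VAR supply its own forecasts of the causal variables at longer horizons — can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names are invented, and lag-length selection (the general-to-specific step) is left to the caller via the parameter p.

    ```python
    import numpy as np

    def fit_var_ols(data, p):
        """Estimate a VAR(p) in levels by equation-by-equation OLS.
        data: (T, k) array of observations. Returns an intercept
        vector (k,) and a list of p coefficient matrices, each (k, k)."""
        T, k = data.shape
        rows, targets = [], []
        for t in range(p, T):
            # Regressors: a constant plus p lags of every variable.
            lags = np.concatenate([data[t - i] for i in range(1, p + 1)])
            rows.append(np.concatenate([[1.0], lags]))
            targets.append(data[t])
        X, Y = np.array(rows), np.array(targets)
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)  # OLS, all equations at once
        intercept = B[0]
        coefs = [B[1 + i * k: 1 + (i + 1) * k].T for i in range(p)]
        return intercept, coefs

    def var_forecast(history, intercept, coefs, steps):
        """Iterated multi-step forecast: beyond one step ahead, the
        model's own forecasts stand in for the unavailable future
        values of the causal variables."""
        hist = [np.asarray(x, dtype=float) for x in history]
        for _ in range(steps):
            nxt = intercept.copy()
            for i, A in enumerate(coefs, start=1):
                nxt = nxt + A @ hist[-i]
            hist.append(nxt)
        return hist[len(history):]
    ```

    Because every right-hand-side variable is a lag, nothing exogenous needs to be forecast separately, which is exactly how the VAR approach sidesteps the long-horizon problem the abstract describes.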