
    Forecasting the term structure of government bond yields

    Despite powerful advances in yield curve modeling in the last twenty years, comparatively little attention has been paid to the key practical problem of forecasting the yield curve. In this paper we do so. We use neither the no-arbitrage approach, which focuses on accurately fitting the cross section of interest rates at any given time but neglects time-series dynamics, nor the equilibrium approach, which focuses on time-series dynamics (primarily those of the instantaneous rate) but pays comparatively little attention to fitting the entire cross section at any given time and has been shown to forecast poorly. Instead, we use variations on the Nelson-Siegel exponential components framework to model the entire yield curve, period-by-period, as a three-dimensional parameter evolving dynamically. We show that the three time-varying parameters may be interpreted as factors corresponding to level, slope and curvature, and that they may be estimated with high efficiency. We propose and estimate autoregressive models for the factors, and we show that our models are consistent with a variety of stylized facts regarding the yield curve. We use our models to produce term-structure forecasts at both short and long horizons, with encouraging results. In particular, our forecasts appear much more accurate at long horizons than various standard benchmark forecasts. JEL Code: G1, E4, C
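
    The abstract describes a two-step procedure: fit the three Nelson-Siegel factors to each period's cross section of yields, then forecast each factor with an autoregression. A minimal sketch of that idea in Python follows (the function names, the AR(1) specification, and the fixed decay parameter lam=0.0609, a value commonly used when maturities are measured in months, are illustrative assumptions, not the paper's exact estimation choices):

    import numpy as np

    def ns_loadings(maturities, lam=0.0609):
        """Nelson-Siegel loadings for the level, slope and curvature factors."""
        x = lam * np.asarray(maturities, dtype=float)
        slope = (1 - np.exp(-x)) / x
        return np.column_stack([np.ones_like(x), slope, slope - np.exp(-x)])

    def fit_factors(yields, maturities, lam=0.0609):
        """OLS fit of (level, slope, curvature), one cross section per row of yields."""
        X = ns_loadings(maturities, lam)
        beta, *_ = np.linalg.lstsq(X, np.asarray(yields).T, rcond=None)
        return beta.T  # shape (T, 3): one factor vector per period

    def forecast_curve(yields, maturities, h=12, lam=0.0609):
        """Fit an AR(1) to each factor and iterate it h steps ahead."""
        F = fit_factors(yields, maturities, lam)
        f = F[-1].copy()
        for i in range(3):                       # f_t = c + phi * f_{t-1} + e_t
            phi, c = np.polyfit(F[:-1, i], F[1:, i], 1)
            for _ in range(h):
                f[i] = c + phi * f[i]
        return ns_loadings(maturities, lam) @ f  # forecast yields by maturity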

    Evaluating point and density forecasts of DSGE models [Version: 13 March 2012]

    This paper investigates the accuracy of point and density forecasts of four DSGE models for inflation, output growth and the federal funds rate. Model parameters are estimated and forecasts are derived successively from historical U.S. data vintages synchronized with the Fed's Greenbook projections. Point forecasts of some models are of similar accuracy to the forecasts of nonstructural large-dataset methods. Despite their common underlying New Keynesian modeling philosophy, forecasts of different DSGE models turn out to be quite distinct. Weighted forecasts are more precise than forecasts from individual models. The accuracy of a simple average of DSGE model forecasts is comparable to Greenbook projections at medium-term horizons. Comparing density forecasts of DSGE models with the actual distribution of observations shows that the models overestimate the uncertainty around their point forecasts.
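
    Two of the abstract's ingredients lend themselves to a short illustration: equal-weight averaging of model forecasts, and checking density calibration against realized outcomes via the probability integral transform (PIT). The sketch below assumes Gaussian predictive densities for simplicity; the names and the equal weights are assumptions, not the paper's actual weighting scheme:

    import numpy as np
    from scipy.stats import norm

    def average_forecast(model_forecasts):
        """Equal-weight pool of point forecasts (one row per DSGE model)."""
        return np.asarray(model_forecasts).mean(axis=0)

    def pit_values(realizations, means, sds):
        """PIT under Gaussian density forecasts: a well-calibrated forecaster
        yields PITs roughly uniform on [0, 1]; PITs bunched in the middle
        are the signature of overestimated uncertainty."""
        return norm.cdf(realizations, loc=means, scale=sds)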

    Bayesian and Non-Bayesian Approaches to Scientific Modeling and Inference in Economics and Econometrics

    After brief remarks on the history of modeling and inference techniques in economics and econometrics, attention is focused on the emergence of economic science in the 20th century. First, the broad objectives of science and the Pearson-Jeffreys "unity of science" principle will be reviewed. Second, key Bayesian and non-Bayesian practical scientific inference and decision methods will be compared using applied examples from economics, econometrics and business. Third, issues and controversies about how to model the behavior of economic units and systems will be reviewed, and the structural econometric modeling, time series analysis (SEMTSA) approach will be described and illustrated using a macroeconomic modeling and forecasting problem involving analyses of data for 18 industrialized countries since the 1950s. Point and turning-point forecasting results will be summarized. Last, a few remarks will be made about the future of scientific inference and modeling techniques in economics and econometrics.

    Prediction of time series by statistical learning: general losses and fast rates

    We establish rates of convergence in time series forecasting using the statistical learning approach based on oracle inequalities. A series of papers extends the oracle inequalities obtained for iid observations to time series under weak dependence conditions. Given a family of predictors and $n$ observations, oracle inequalities state that a predictor forecasts the series as well as the best predictor in the family up to a remainder term $\Delta_n$. Using the PAC-Bayesian approach, we establish oracle inequalities with optimal rates of convergence under weak dependence conditions. We extend previous results for the absolute loss function to any Lipschitz loss function, with rates $\Delta_n \sim \sqrt{c(\Theta)/n}$, where $c(\Theta)$ measures the complexity of the model. We apply the method with quantile loss functions to forecast the French GDP. Under additional conditions on the loss functions (satisfied by the quadratic loss function) and on the time series, we refine the rates of convergence to $\Delta_n \sim c(\Theta)/n$. We achieve these fast rates for uniformly mixing processes for the first time. These rates are known to be optimal in the iid case and for individual sequences. In particular, we generalize the results of Dalalyan and Tsybakov on sparse regression estimation to the case of autoregression.
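
    In the notation of the abstract, the oracle inequalities have roughly the following shape (a schematic statement; the constants, the weak-dependence conditions and the exact definition of the risk $R$ are suppressed):

    % \hat{f} is the estimated predictor, \Theta the family of predictors.
    \[
      R(\hat{f}) \;\le\; \inf_{\theta \in \Theta} R(f_\theta) \;+\; \Delta_n,
      \qquad
      \Delta_n \sim \sqrt{\frac{c(\Theta)}{n}} \ \text{(Lipschitz losses)},
      \qquad
      \Delta_n \sim \frac{c(\Theta)}{n} \ \text{(fast rate, e.g. quadratic loss)}.
    \]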

    Use and Communication of Probabilistic Forecasts

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest often seem to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.
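
    The abstract's advice, reducing a predictive distribution to a single percentile for General Assessors and to the probability of an adverse event for Risk Avoiders, is easy to make concrete. A hypothetical helper (all names and the 90th-percentile default are illustrative assumptions):

    import numpy as np

    def summarize_forecast(draws, adverse_threshold, pct=90):
        """Condense a predictive sample into two user-facing numbers:
        one percentile, and the probability of an adverse outcome."""
        draws = np.asarray(draws)
        return {
            f"p{pct}": np.percentile(draws, pct),
            "p_adverse": float(np.mean(draws > adverse_threshold)),
        }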

    Challenges in macro-finance modeling

    This article discusses various challenges in the specification and implementation of "macro-finance" models in which macroeconomic variables and term structure variables are modeled together in a no-arbitrage framework. The author classifies macro-finance models into pure latent-factor models ("internal basis models") and models that have observed macroeconomic variables as state variables ("external basis models") and examines the underlying assumptions behind these models. Particular attention is paid to the issue of unspanned short-run fluctuations in macroeconomic variables and their potentially adverse effect on the specification of external basis models. The author also discusses the challenge of addressing features such as structural breaks and time-varying inflation uncertainty. Empirical difficulties in the estimation and evaluation of macro-finance models are also discussed in detail. Keywords: Econometric models; Macroeconomics