
    Spline methods for extracting interest rate curves from coupon bond prices

    Cubic splines have long been used to extract the discount, yield, and forward rate curves from coupon bond data. McCulloch used regression splines to estimate the discount function, and, more recently, Fisher, Nychka, and Zervos used smoothing splines, with the roughness penalty selected by generalized cross-validation, to estimate the forward rate curve. I propose estimating the forward rate curve with a smoothing spline whose roughness penalty can vary across maturities. This method is tested against the methods of McCulloch and of Fisher, Nychka, and Zervos using monthly bond data from 1970 through 1995.
    Keywords: Econometric models; Financial markets; Prices; Statistics
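    The key ingredient, a roughness penalty that varies with maturity, can be sketched in a few lines. The following is a minimal discrete illustration, not the paper's exact estimator: noisy "forward rate" observations on a maturity grid are smoothed by penalizing squared second differences, with a heavier penalty at long maturities. All parameter values are illustrative.

```python
import numpy as np

def varying_penalty_smoother(y, lam):
    """Solve min_f ||y - f||^2 + sum_i lam_i * (second difference of f at i)^2."""
    n = len(y)
    # Second-difference operator D: (n-2) x n
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    # First-order condition: (I + D' diag(lam) D) f = y
    A = np.eye(n) + D.T @ np.diag(lam) @ D
    return np.linalg.solve(A, y)

maturities = np.linspace(0.25, 30.0, 60)
rng = np.random.default_rng(0)
# Synthetic noisy forward-rate observations (illustrative shape)
y = 0.05 + 0.02 * (1 - np.exp(-maturities / 5)) + 0.002 * rng.standard_normal(60)
# Penalize roughness more heavily at long maturities, where data are sparse.
lam = np.interp(maturities[1:-1], [0.25, 30.0], [1.0, 200.0])
f_hat = varying_penalty_smoother(y, lam)
```

    By construction the fitted curve is never rougher (in the penalized sense) than the raw observations; raising the long-maturity penalties flattens the back end of the curve without oversmoothing the short end.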

    Conditional forecasts in dynamic multivariate models

    In the existing literature, conditional forecasts in the vector autoregressive (VAR) framework have not commonly been presented with probability distributions or error bands. This paper develops Bayesian methods for computing such distributions or bands. It broadens the class of conditional forecasts to which the methods can be applied. The methods work for both structural and reduced-form VAR models and, in contrast to common practice, account for parameter uncertainty in small samples. Empirical examples under the flat prior and under the reference prior of Sims and Zha (1998) illustrate the use of these methods.
    Keywords: Econometric models; Forecasting; Time-series analysis
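    The basic simulation idea can be sketched for a known-parameter VAR(1); the paper's Bayesian methods additionally average over parameter uncertainty. In this minimal "hard conditioning" illustration (all numbers are assumed), the first structural shock is backed out each period so that variable 1 follows a fixed path, while the remaining shock is drawn freely, yielding a distribution and hence error bands for variable 2.

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[0.8, 0.1],
              [0.05, 0.7]])                     # VAR(1) coefficients (assumed)
L = np.linalg.cholesky(np.array([[1.0, 0.3],
                                 [0.3, 1.0]]))  # shock impact matrix (assumed)
H, n_draws = 8, 2000
target = np.full(H, 1.0)                        # condition: variable 1 fixed at 1.0

paths = np.empty((n_draws, H, 2))
for d in range(n_draws):
    y = np.zeros(2)
    for h in range(H):
        mean = A @ y
        # Back out the first structural shock so variable 1 hits the target;
        # draw the second shock freely (a "hard" conditional forecast).
        e1 = (target[h] - mean[0]) / L[0, 0]
        e2 = rng.standard_normal()
        y = mean + L @ np.array([e1, e2])
        paths[d, h] = y

# Pointwise 68% error bands for variable 2 at each horizon.
bands = np.percentile(paths[:, :, 1], [16, 84], axis=0)
```

    Because `L` is lower triangular, the conditioned variable hits its target exactly in every draw, and all remaining uncertainty shows up in the bands for the unconditioned variable.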

    Likelihood-preserving normalization in multiple equation models

    Causal analysis in multiple equation models often revolves around the evaluation of the effects of an exogenous shift in a structural equation. When taking into account the uncertainty implied by the shape of the likelihood, we argue that how normalization is implemented matters for inferential conclusions around the maximum likelihood (ML) estimates of such effects. We develop a general method that eliminates the distortion of finite-sample inferences about these ML estimates after normalization. We show that our likelihood-preserving normalization always maintains coherent economic interpretations while an arbitrary implementation of normalization can lead to ill-determined inferential results.
    Keywords: Time-series analysis; Supply and demand; Demand for money; Money supply

    A Gibbs simulator for restricted VAR models

    Many economic applications call for simultaneous-equations VAR modeling. We show that the existing importance sampler can be prohibitively inefficient for this type of model. We develop a Gibbs simulator that works for both simultaneous and recursive VAR models with a much broader range of linear restrictions than those in the existing literature. We show that the required computation is of the SUR type, so our method can be implemented cheaply even for large systems of multiple equations.
    Keywords: Econometric models; Vector autoregression; Monetary policy; Time-series analysis
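    The core mechanics of Gibbs sampling, alternating draws from each full conditional distribution, can be illustrated on a single Bayesian linear regression under a flat prior; the paper's simulator applies the same alternation equation by equation in a restricted VAR. This sketch is a generic textbook illustration, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

XtX, Xty = X.T @ X, X.T @ y
sigma2, draws = 1.0, []
for it in range(2000):
    # beta | sigma2 ~ N((X'X)^{-1} X'y, sigma2 (X'X)^{-1})  (flat prior)
    V = sigma2 * np.linalg.inv(XtX)
    beta = np.linalg.solve(XtX, Xty) + np.linalg.cholesky(V) @ rng.standard_normal(k)
    # sigma2 | beta: SSR / sigma2 ~ chi-square(n), i.e. an inverse-gamma draw
    resid = y - X @ beta
    sigma2 = (resid @ resid) / rng.chisquare(n)
    if it >= 500:                 # discard burn-in draws
        draws.append(beta)
beta_hat = np.mean(draws, axis=0)
```

    Each conditional draw is cheap (one linear solve and one chi-square draw), which is the sense in which an SUR-type computation scales to large systems.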

    Confronting Model Misspecification in Macroeconomics

    We estimate a Markov-switching mixture of two familiar macroeconomic models: a richly parameterized DSGE model and a corresponding BVAR model. We show that the Markov-switching mixture model dominates both individual models and improves the fit considerably. Our estimation indicates that the DSGE model plays an important role only in the late 1970s and the early 1980s. We show how to use the mixture model as a data filter for estimation of the DSGE model when the BVAR model is not identified. Moreover, we show how to compute the impulse responses to the same type of shock shared by the DSGE and BVAR models when the shock is identified in the BVAR model. Our exercises demonstrate the importance of integrating model uncertainty and parameter uncertainty to address potential model misspecification in macroeconomics.

    Normalization, probability distribution, and impulse responses

    When impulse responses in dynamic multivariate models such as identified VARs are given economic interpretations, it is important that reliable statistical inferences be provided. Before probability assessments are provided, however, the model must be normalized. Contrary to the conventional wisdom, this paper argues that normalization, a rule of reversing signs of coefficients in equations in a particular way, could considerably affect the shape of the likelihood and thus probability bands for impulse responses. A new concept called ML distance normalization is introduced to avoid distorting the shape of the likelihood. Moreover, this paper develops a Monte Carlo simulation technique for implementing ML distance normalization.
    Keywords: Econometric models; Monetary policy
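    A distance-style normalization can be sketched in a few lines: each Monte Carlo draw of an equation's coefficient vector is mapped to whichever sign representative, b or -b, lies closer to the ML point estimate, so the simulated distribution is not split across the two observationally equivalent signs. The setup below is illustrative, not the paper's exact procedure.

```python
import numpy as np

def normalize_draws(draws, b_ml):
    """Flip the sign of each draw if -b is closer to the ML estimate than b."""
    flip = (np.linalg.norm(draws - b_ml, axis=1)
            > np.linalg.norm(-draws - b_ml, axis=1))
    out = draws.copy()
    out[flip] *= -1.0
    return out

rng = np.random.default_rng(3)
b_ml = np.array([1.0, -0.5])                       # assumed ML estimate
raw = b_ml + 0.3 * rng.standard_normal((1000, 2))  # simulated coefficient draws
raw[::2] *= -1.0        # signs are arbitrary before normalization
normalized = normalize_draws(raw, b_ml)
```

    A naive rule such as "make the first coefficient positive" would instead cut through the middle of the distribution whenever the ML estimate sits near that sign boundary, which is exactly the distortion the distance criterion avoids.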

    Closing the question on the continuation of turn-of-the-month effects: evidence from the S&P 500 Index futures contract

    Prior research documents unusually high returns on the last trading day of the month and over the next three consecutive trading days, a phenomenon known as the turn-of-the-month (TOTM) effect. According to Siegel (1998), why these anomalies occur is not well understood, and whether they will remain significant in the future is an open question. In this paper, we examine the S&P 500 futures contract for evidence that turn-of-the-month effects have continued. Transaction costs are low for index futures, and the absence of short-sale restrictions makes index futures an attractive venue for testing the continuation of market anomalies because of the low cost of arbitrage. We find that TOTM effects for S&P 500 futures disappear after 1990, and this result carries over to the S&P 500 spot market. We conjecture that a shift over time in individual investors' preference from direct stock purchases to indirect purchases through mutual funds is related to the disappearance of the TOTM effect in more recent return data. We argue that turn-of-the-month return patterns for both spot and futures prices are dynamic, related to market microstructure, and therefore subject to change without notice. Financial economists should be careful when making out-of-sample inferences from observed in-sample return regularities.
    Keywords: Financial markets; Futures
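    The test behind such results is a simple comparison of mean returns inside and outside the TOTM window. The sketch below uses synthetic data, not S&P 500 futures, and a plain Welch two-sample t statistic; the paper's actual tests may differ.

```python
import numpy as np

rng = np.random.default_rng(4)
n_months, days_per_month = 120, 21
# Synthetic daily returns with no built-in TOTM effect
returns = 0.0005 + 0.01 * rng.standard_normal(n_months * days_per_month)
day_of_month = np.tile(np.arange(days_per_month), n_months)
# TOTM window: last trading day of the month plus the first three of the next.
totm = (day_of_month >= days_per_month - 1) | (day_of_month < 3)
r1, r0 = returns[totm], returns[~totm]
# Welch two-sample t statistic for equal means (unequal variances)
t_stat = (r1.mean() - r0.mean()) / np.sqrt(
    r1.var(ddof=1) / len(r1) + r0.var(ddof=1) / len(r0))
```

    On real pre- and post-1990 subsamples, running this comparison separately in each period is what reveals whether the effect has disappeared.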

    Evaluating Wall Street Journal survey forecasters: a multivariate approach

    This paper proposes a methodology for assessing the joint performance of multivariate forecasts of economic variables. The methodology is illustrated by comparing the rankings of forecasters by the Wall Street Journal with the authors’ alternative rankings. The results show that the methodology can provide useful insights as to the certainty of forecasts as well as the extent to which various forecasts are similar or different.
    Keywords: Forecasting
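    One standard way to score joint forecasts, shown here only as a hedged illustration (the paper's exact criterion may differ), is the Mahalanobis distance of each forecaster's error vector under an estimated error covariance, which rewards getting the variables right jointly rather than one at a time. All names and numbers below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
actual = np.array([2.5, 4.0, 6.5])      # e.g., GDP growth, unemployment, T-bill rate
forecasts = actual + rng.standard_normal((10, 3))   # 10 hypothetical forecasters
errors = forecasts - actual

cov = np.cov(errors, rowvar=False)      # cross-variable error covariance
cov_inv = np.linalg.inv(cov)
# Squared Mahalanobis distance e_i' cov^{-1} e_i for each forecaster i
scores = np.einsum('ij,jk,ik->i', errors, cov_inv, errors)
ranking = np.argsort(scores)            # best (smallest score) first
```

    Unlike ranking on each variable separately and averaging, this score penalizes error combinations that are implausible given how the variables co-move.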

    Sources of the Great Moderation: shocks, frictions, or monetary policy?

    We study the sources of the Great Moderation by estimating a variety of medium-scale dynamic stochastic general equilibrium (DSGE) models that incorporate regime switches in shock variances and the inflation target. The best-fit model—the one with two regimes in shock variances—gives quantitatively different dynamics compared with the benchmark constant-parameter model. Our estimates show that three kinds of shocks accounted for most of the Great Moderation and business-cycle fluctuations: capital depreciation shocks, neutral technology shocks, and wage markup shocks. In contrast to the existing literature, we find that changes in the inflation target or shocks in the investment-specific technology played little role in macroeconomic volatility. Moreover, our estimates indicate considerably fewer nominal rigidities than the literature suggests.
    Keywords: Econometric models
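    The regime-switching ingredient can be sketched in isolation: an AR(1) process whose innovation variance follows a two-state Markov chain, producing the alternation between high- and low-volatility episodes that Great Moderation-style models exploit. All parameter values are illustrative, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
P = np.array([[0.95, 0.05],      # regime transition probabilities (rows sum to 1)
              [0.05, 0.95]])
sigmas = np.array([2.0, 0.5])    # high-variance regime vs low-variance regime
T = 2000

# Simulate the Markov chain of volatility regimes
states = np.empty(T, dtype=int)
states[0] = 0
for t in range(1, T):
    states[t] = rng.choice(2, p=P[states[t - 1]])

# AR(1) with regime-dependent shock variance
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = 0.9 * y[t - 1] + sigmas[states[t]] * rng.standard_normal()
```

    Estimation then runs in the other direction: given the observed series, the likelihood is integrated over the unobserved regime path (e.g., with a Hamilton filter) to recover the transition probabilities and regime variances.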