
    Large Time-Varying Parameter VARs

    In this paper, we develop methods for estimation and forecasting in large time-varying parameter vector autoregressive models (TVP-VARs). To overcome computational constraints, we draw on ideas from the dynamic model averaging literature, which achieve reductions in the computational burden through the use of forgetting factors. We then extend the TVP-VAR so that its dimension can change over time. For instance, we can have a large TVP-VAR as the forecasting model at some points in time, but a smaller TVP-VAR at others. A final extension lies in the development of a new method for estimating, in a time-varying manner, the parameter(s) of the shrinkage priors commonly used with large VARs. These extensions are operationalized through the use of forgetting factor methods and are, thus, computationally simple. An empirical application involving forecasting inflation, real output and interest rates demonstrates the feasibility and usefulness of our approach.
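As a rough illustration of the forgetting-factor idea, here is a minimal sketch (not the paper's full TVP-VAR implementation; the single-equation setup, the value of the forgetting factor and the simulated data are assumptions for illustration). The usual Kalman state-covariance prediction step is replaced by division by a forgetting factor, so no state innovation covariance needs to be estimated or simulated:

```python
import numpy as np

def forgetting_factor_kf(y, X, lam=0.98, p0=10.0, h=1.0):
    """Kalman filter for y_t = X_t @ beta_t + e_t with random-walk
    coefficients, where the state-covariance prediction step is
    replaced by inflation with a forgetting factor lam:
        P_{t|t-1} = P_{t-1|t-1} / lam
    (lam, p0 and the measurement variance h are illustrative choices)."""
    T, k = X.shape
    beta = np.zeros(k)           # filtered coefficient mean
    P = p0 * np.eye(k)           # filtered coefficient covariance
    betas = np.zeros((T, k))
    for t in range(T):
        x = X[t]
        P_pred = P / lam                      # forgetting-factor prediction
        f = x @ P_pred @ x + h                # one-step forecast variance
        K = P_pred @ x / f                    # Kalman gain
        beta = beta + K * (y[t] - x @ beta)   # update mean
        P = P_pred - np.outer(K, x @ P_pred)  # update covariance
        betas[t] = beta
    return betas

# toy example: intercept drifts from 1 to 2 over the sample
rng = np.random.default_rng(0)
T = 400
X = np.column_stack([np.ones(T), rng.normal(size=T)])
true_b = np.column_stack([np.linspace(1.0, 2.0, T), np.full(T, 0.5)])
y = (X * true_b).sum(axis=1) + 0.1 * rng.normal(size=T)
est = forgetting_factor_kf(y, X)
```

Because the prediction step involves only a division, the filter runs in one pass over the data, which is what makes the approach feasible for large systems.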

    Forecasting in dynamic factor models using Bayesian model averaging

    This paper considers the problem of forecasting in dynamic factor models using Bayesian model averaging. Theoretical justifications for averaging across models, as opposed to selecting a single model, are given. Practical methods for implementing Bayesian model averaging with factor models are described. These methods involve algorithms which simulate from the space defined by all possible models. We discuss how these simulation algorithms can also be used to select the model with the highest marginal likelihood (or highest value of an information criterion) in an efficient manner. We apply these methods to the problem of forecasting GDP and inflation using quarterly U.S. data on 162 time series. For both GDP and inflation, we find that the models which contain factors do out-forecast an AR(p), but only by a relatively small amount and only at short horizons. We attribute these findings to the presence of structural instability and the fact that lags of the dependent variable seem to contain most of the information relevant for forecasting. Relative to the small forecasting gains provided by including factors, the gains provided by using Bayesian model averaging over forecasting methods based on a single model are appreciable.
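The averaging step itself can be sketched in a few lines (a simplified illustration, not the paper's simulation algorithm; the marginal likelihoods and forecasts below are made-up numbers). Posterior model probabilities are proportional to the marginal likelihoods, and the combined forecast is the probability-weighted average of the model-specific forecasts:

```python
import numpy as np

def bma_forecast(log_marg_liks, forecasts):
    """Combine model-specific forecasts with posterior model weights
    proportional to exp(log marginal likelihood), computed stably by
    subtracting the maximum before exponentiating."""
    l = np.asarray(log_marg_liks, dtype=float)
    w = np.exp(l - l.max())
    w /= w.sum()
    return w, w @ np.asarray(forecasts, dtype=float)

# three hypothetical factor models and their log marginal likelihoods
w, f = bma_forecast([-102.3, -100.1, -108.9], [2.1, 1.8, 2.5])
```

Note how the weights decay exponentially in the log marginal likelihood gap, so a model a few log points behind the best contributes almost nothing, yet no model is discarded outright as it would be under model selection.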

    An investigation of thresholds in air pollution-mortality effects

    In this paper we introduce and implement new techniques to investigate threshold effects in air pollution-mortality relationships. Our key interest is in measuring the dose-response relationship above and below a given threshold level, where we allow for a large number of potential explanatory variables to trigger the threshold effect. This is in contrast to existing approaches, which usually focus on a single threshold trigger. We allow for a myriad of threshold effects within a Bayesian statistical framework that accounts for model uncertainty (i.e. uncertainty about which threshold trigger and explanatory variables are appropriate). We apply these techniques in an empirical exercise using daily data from Toronto for 1992-1997. We investigate the existence and nature of threshold effects in the relationship between mortality and ozone (O3), total particulate matter (PM) and an index of other commonly occurring air pollutants. In general, we find the effects of the pollutants we consider on mortality to be statistically indistinguishable from zero, with no evidence of thresholds. The one exception is ozone, for which results present an ambiguous picture. Ozone has no significant effect on mortality when we exclude threshold effects from the analysis. Allowing for thresholds, we find a positive and significant effect for this pollutant when the threshold trigger is the average change in ozone two days ago. However, this significant effect is not observed after controlling for PM.
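The basic single-trigger building block can be sketched as follows (a least-squares analogue for illustration only; the paper's framework is Bayesian and averages over many candidate triggers, and the simulated variables here are assumptions). The slope on the dose variable switches depending on whether a trigger variable exceeds a threshold, which is chosen by grid search:

```python
import numpy as np

def fit_threshold(y, x, z, taus):
    """Least-squares sketch of a single-trigger threshold model: the
    slope on x differs according to whether the trigger z exceeds tau;
    tau is chosen by grid search over the candidate values in taus."""
    best = None
    for tau in taus:
        lo = (z <= tau).astype(float)
        X = np.column_stack([np.ones_like(x), x * lo, x * (1 - lo)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        ssr = np.sum((y - X @ coef) ** 2)
        if best is None or ssr < best[0]:
            best = (ssr, tau, coef)
    return best  # (ssr, tau_hat, [intercept, slope_below, slope_above])

rng = np.random.default_rng(1)
n = 500
x = rng.uniform(0, 1, n)   # pollutant dose
z = rng.uniform(0, 1, n)   # threshold trigger
y = 1.0 + np.where(z > 0.6, 2.0, 0.0) * x + 0.1 * rng.normal(size=n)
ssr, tau_hat, coef = fit_threshold(y, x, z, np.linspace(0.1, 0.9, 81))
```

The paper's contribution is to treat the choice of trigger itself (here fixed to `z`) as uncertain and average over it, rather than conditioning on a single trigger as this sketch does.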

    Semiparametric Bayesian inference in smooth coefficient models

    We describe procedures for Bayesian estimation and testing in cross-sectional, panel data and nonlinear smooth coefficient models. The smooth coefficient model is a generalization of the partially linear or additive model wherein coefficients on linear explanatory variables are treated as unknown functions of an observable covariate. In the approach we describe, points on the regression lines are regarded as unknown parameters and priors are placed on differences between adjacent points to introduce the potential for smoothing the curves. The algorithms we describe are quite simple to implement - for example, estimation, testing and smoothing parameter selection can be carried out analytically in the cross-sectional smooth coefficient model. We apply our methods using data from the National Longitudinal Survey of Youth (NLSY). Using the NLSY data we first explore the relationship between ability and log wages and flexibly model how returns to schooling vary with measured cognitive ability. We also examine a model of female labor supply and use this example to illustrate how the described techniques can be applied in nonlinear settings.
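The idea of placing a prior on differences between adjacent points has a simple penalized least-squares analogue (a sketch under assumed simulated data, not the paper's Bayesian procedure): each observation gets its own coefficient, and squared first differences of coefficients adjacent in the covariate are penalized, mimicking a random-walk prior on the curve:

```python
import numpy as np

def smooth_coef(y, x, z, lam=20.0):
    """Penalized least-squares sketch of a smooth coefficient model:
    each observation i has its own beta_i in y_i = x_i * beta_i + e_i,
    and squared first differences of coefficients adjacent in z are
    penalized with weight lam (an illustrative smoothing parameter)."""
    order = np.argsort(z)
    xs, ys = x[order], y[order]
    n = len(ys)
    D = np.diff(np.eye(n), axis=0)       # first-difference matrix
    A = np.diag(xs ** 2) + lam * D.T @ D
    beta = np.linalg.solve(A, xs * ys)   # posterior-mean-style solution
    return z[order], beta                # coefficient curve over z

rng = np.random.default_rng(2)
n = 300
z = rng.uniform(0, 1, n)
x = rng.normal(1.0, 0.3, n)
beta_true = np.sin(2 * np.pi * z)        # smooth coefficient curve
y = x * beta_true + 0.1 * rng.normal(size=n)
zg, bhat = smooth_coef(y, x, z)
```

The solution is a single linear solve, which is why, as the abstract notes, estimation and smoothing parameter selection can be handled analytically in the cross-sectional case.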

    Re-examining the consumption-wealth relationship: the role of model uncertainty

    This paper discusses the consumption-wealth relationship. Following the recent influential work of Lettau and Ludvigson [e.g. Lettau and Ludvigson (2001), (2004)], we use data on consumption, assets and labor income and a vector error correction framework. Key findings of their work are that consumption does respond to permanent changes in wealth in the expected manner, but that most changes in wealth are transitory and have no effect on consumption. We investigate the robustness of these results to model uncertainty and argue for the use of Bayesian model averaging. We find that there is model uncertainty with regards to the number of cointegrating vectors, the form of deterministic components, lag length and whether the cointegrating residuals affect consumption and income directly. Whether this uncertainty has important empirical implications depends on the researcher's attitude towards the economic theory used by Lettau and Ludvigson. If we work with their model, our findings are very similar to theirs. However, if we work with a broader set of models and let the data speak, we obtain somewhat different results. In the latter case, we find that the exact magnitude of the role of permanent shocks is hard to estimate precisely. Thus, although some support exists for the view that their role is small, we cannot rule out the possibility that they have a substantive role to play.

    The Stiffness of Steel-Wood-Steel Connection Loaded Parallel to the Grain

    In Eurocode 5, the stiffness equation for bolted steel-wood-steel connections is stated as a function of wood density and fastener diameter only. In this research, an experimental study on various configurations of bolted steel-wood-steel (SWS) connections was undertaken to predict the initial stiffness of each connection. In order to validate the Eurocode 5 stiffness equation, tests on 50 timber specimens (40 glued laminated timbers and 10 laminated veneer lumbers (LVL)) with steel plates were undertaken. The number of bolts was kept constant while the connector diameter, timber thickness, and wood density were varied. The results obtained in the experimental tests are compared with those obtained from the Eurocode 5 stiffness equation. The analysis indicates that the stiffness equation specified in Eurocode 5 for bolted SWS connections does not adequately predict the initial stiffness: predicted values are very far from the experimental ones. The ratio of the stiffness equation's prediction to the experimental result ranges from 3.48 to 4.20, with an average of 3.77, meaning the equation overpredicted the experimental stiffness of the connection. Other parameters, such as geometric configuration, need to be incorporated into the Eurocode 5 stiffness equation to improve its agreement with experimental data.
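For context, the density-and-diameter-only equation being tested is the Eurocode 5 slip modulus for dowel-type fasteners as commonly quoted from Table 7.1, K_ser = ρ_m^1.5·d/23, with a doubling allowed for steel-to-timber connections under clause 7.1(3). A minimal sketch (the glulam density, bolt diameter and double-shear layout below are illustrative assumptions, not the paper's specimens):

```python
def ec5_kser(rho_mean, d, n_shear_planes=2, steel_to_timber=True):
    """Eurocode 5 (Table 7.1) slip modulus per bolt:
        K_ser = rho_m**1.5 * d / 23   [N/mm per shear plane]
    where rho_m is the mean timber density in kg/m^3 and d is the
    fastener diameter in mm.  Clause 7.1(3) allows doubling K_ser
    for steel-to-timber connections, as in a steel-wood-steel layout."""
    k = rho_mean ** 1.5 * d / 23.0
    if steel_to_timber:
        k *= 2.0
    return k * n_shear_planes

# e.g. glulam with rho_m ~ 430 kg/m^3 and a 12 mm bolt in double shear
k_pred = ec5_kser(430.0, 12.0)          # predicted stiffness, N/mm
k_exp_implied = k_pred / 3.77           # scale by the paper's mean ratio
```

Dividing the code-predicted stiffness by the paper's average ratio of 3.77 gives a sense of how much lower the measured initial stiffness was for comparable configurations.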