Analysis of variance for Bayesian inference
This paper develops a multi-way analysis of variance for non-Gaussian multivariate distributions and provides a practical simulation algorithm to estimate the corresponding components of variance. It specifically addresses variance in Bayesian predictive distributions, showing that it may be decomposed into the sum of extrinsic variance, arising from posterior uncertainty about parameters, and intrinsic variance, which would exist even if parameters were known. Depending on the application at hand, further decomposition of extrinsic or intrinsic variance (or both) may be useful. The paper shows how to produce simulation-consistent estimates of all of these components, and the method demands little additional effort or computing time beyond that already invested in the posterior simulator. It illustrates the methods using a dynamic stochastic general equilibrium model of the US economy, both before and during the global financial crisis. JEL Classification: C11, C53. Keywords: analysis of variance, Bayesian inference, posterior simulation, predictive distributions
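The decomposition described above follows from the law of total variance, and the simulation-consistent estimates can be formed directly from posterior draws. A minimal sketch, assuming a hypothetical Gaussian predictive distribution whose mean and standard deviation are drawn from an illustrative posterior (the model and parameter names are not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative posterior draws for a predictive y | mu, sigma ~ N(mu, sigma^2).
M = 10_000
mu_draws = rng.normal(0.05, 0.02, M)   # posterior draws of the predictive mean
sigma_draws = rng.gamma(2.0, 0.5, M)   # posterior draws of the predictive s.d.

# Intrinsic variance: average of conditional variances -- the part that
# would remain even if the parameters were known.
intrinsic = np.mean(sigma_draws**2)

# Extrinsic variance: variance of the conditional means -- the part
# arising from posterior uncertainty about the parameters.
extrinsic = np.var(mu_draws)

# By the law of total variance, the predictive variance is their sum.
total = intrinsic + extrinsic
```

Because both terms are simple averages over the same posterior draws, the estimate costs essentially nothing beyond the posterior simulation itself, which is the point the abstract emphasizes.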
Hierarchical Markov normal mixture models with applications to financial asset returns
With the aim of constructing predictive distributions for daily returns, we introduce a new Markov normal mixture model in which the components are themselves normal mixtures. We derive the restrictions on the autocovariances and linear representation of integer powers of the time series in terms of the number of components in the mixture and the roots of the Markov process. We use the model's prior predictive distribution to study its implications for some interesting functions of returns. We apply the model to construct predictive distributions of daily S&P 500 returns, dollar-pound returns, and one- and ten-year bonds. We compare the performance of the model with ARCH and stochastic volatility models using predictive likelihoods. The model's performance is about the same as its competitors for the bond returns, better than its competitors for the S&P 500 returns, and much better for the dollar-pound returns. Validation exercises identify some potential improvements. JEL Classification: C53, G12, C11, C14. Keywords: asset returns, Bayesian, forecasting, MCMC, mixture models
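To make the model class concrete, here is a minimal simulation sketch of the general idea: a hidden two-state Markov chain in which each state emits from its own two-component normal mixture, so the Markov components are themselves normal mixtures. All parameter values are illustrative, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(3)

# Transition matrix of the hidden Markov chain (illustrative values).
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])

# Per-state mixture weights, means, and standard deviations:
# row s gives the normal mixture emitted while the chain is in state s.
weights = np.array([[0.7, 0.3], [0.5, 0.5]])
means   = np.array([[0.0, 0.1], [0.0, -0.2]])
sds     = np.array([[0.5, 1.0], [1.5, 3.0]])

T = 1000
s = 0
returns = np.empty(T)
for t in range(T):
    s = rng.choice(2, p=P[s])        # Markov transition between states
    k = rng.choice(2, p=weights[s])  # mixture component within the state
    returns[t] = rng.normal(means[s, k], sds[s, k])
```

Persistent states with different emission variances generate volatility clustering, which is why this family competes with ARCH and stochastic volatility models for daily returns.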
Optimal Prediction Pools
A prediction model is any statement of a probability distribution for an outcome not yet observed. This study considers the properties of weighted linear combinations of n prediction models, or linear pools, evaluated using the conventional log predictive scoring rule. The log score is a concave function of the weights and, in general, an optimal linear combination will include several models with positive weights despite the fact that exactly one model has limiting posterior probability one. The paper derives several interesting formal results: for example, a prediction model with positive weight in a pool may have zero weight if some other models are deleted from that pool. The results are illustrated using S&P 500 returns with prediction models from the ARCH, stochastic volatility and Markov mixture families. In this example models that are clearly inferior by the usual scoring criteria have positive weights in optimal linear pools, and these pools substantially outperform their best components. JEL Classification: C11, C53. Keywords: forecasting, GARCH, log scoring, Markov mixture, model combination, S&P 500 returns, stochastic volatility
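Because the log score is concave in the weights, finding the optimal pool is a simple optimization over the simplex. A sketch for two hypothetical prediction models (the densities and data below are stand-ins, not the paper's models):

```python
import numpy as np
from scipy.stats import norm, t as student_t

rng = np.random.default_rng(1)
T = 500
y = rng.standard_t(df=5, size=T)  # stand-in for observed daily returns

# Predictive densities of two illustrative models evaluated at each outcome.
p1 = norm.pdf(y, scale=1.2)       # model 1: a misscaled normal
p2 = student_t.pdf(y, df=5)       # model 2: a Student-t

# Average log score of the pool (1 - w) * p1 + w * p2; concavity in w
# means a simple grid search over [0, 1] finds the optimum.
grid = np.linspace(0.0, 1.0, 1001)
scores = np.array([np.mean(np.log((1 - w) * p1 + w * p2)) for w in grid])
w_star = grid[np.argmax(scores)]
pool_score = scores.max()
```

The endpoints w = 0 and w = 1 correspond to the individual models, so the optimal pool's log score can never fall below that of its best component; the paper's empirical point is that it often strictly exceeds it.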
Comparing and evaluating Bayesian predictive distributions of asset returns
Bayesian inference in a time series model provides exact, out-of-sample predictive distributions that fully and coherently incorporate parameter uncertainty. This study compares and evaluates Bayesian predictive distributions from alternative models, using as an illustration five alternative models of asset returns applied to daily S&P 500 returns from 1976 through 2005. The comparison exercise uses predictive likelihoods and is inherently Bayesian. The evaluation exercise uses the probability integral transform and is inherently frequentist. The illustration shows that the two approaches can be complementary, each identifying strengths and weaknesses in models that are not evident using the other. JEL Classification: C11, C53. Keywords: forecasting, GARCH, inverse probability transform, Markov mixture, predictive likelihood, S&P 500 returns, stochastic volatility
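The probability integral transform (PIT) evaluation rests on a simple fact: if the predictive CDFs F_t are correct, then u_t = F_t(y_t) are i.i.d. uniform on [0, 1]. A minimal sketch with synthetic data, assuming a static Gaussian predictive rather than any of the paper's five models:

```python
import numpy as np
from scipy.stats import norm, kstest

rng = np.random.default_rng(2)
y = rng.normal(0.0, 1.0, 1000)    # stand-in for realized returns

# PIT under the correctly specified predictive distribution:
u_good = norm.cdf(y)
stat_good, pval_good = kstest(u_good, "uniform")

# PIT under a misspecified predictive (scale too large): the transforms
# pile up near 0.5 instead of spreading uniformly.
u_bad = norm.cdf(y, scale=2.0)
stat_bad, pval_bad = kstest(u_bad, "uniform")
```

The KS statistic for the misspecified model is much larger, which is the frequentist signal of a predictive-distribution failure that the abstract contrasts with the Bayesian predictive-likelihood comparison.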
An Empirical Analysis of Income Dynamics among Men in the PSID: 1968–1989
This study uses data from the Panel Study of Income Dynamics (PSID) to address a number of questions about life-cycle earnings mobility. It develops a dynamic reduced-form model of earnings and marital status that is nonstationary over the life cycle. A Gibbs sampling-data augmentation algorithm facilitates use of the entire sample and provides numerical approximations to the exact posterior distribution of properties of earnings paths. This algorithm copes with the complex distribution of endogenous variables that are observed for short segments of an individual's work history, not including the initial period. The study reaches several firm conclusions about life-cycle earnings mobility. Incorporating non-Gaussian shocks makes it possible to account for transitions between low and higher earnings states, a heretofore unresolved problem. The non-Gaussian distribution substantially increases the lifetime return to postsecondary education, and substantially reduces differences in lifetime wages attributable to race. In a given year, the majority of variance in earnings not accounted for by race, education, and age is due to transitory shocks, but over a lifetime the majority is due to unobserved individual heterogeneity. Consequently, low earnings at early ages are strong predictors of low earnings later in life, even conditioning on observed individual characteristics.
Econometrics: A bird's eye view
As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly over the past few decades. Major advances have taken place in the analysis of cross-sectional data by means of semi-parametric and non-parametric techniques. Heterogeneity of economic relations across individuals, firms and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modelling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given a more satisfactory foundation. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Non-linear econometric techniques are used increasingly in the analysis of cross-section and time series observations. Applications of Bayesian techniques to econometric problems have been given new impetus, largely thanks to advances in computer power and computational techniques. The use of Bayesian techniques has in turn provided investigators with a unifying framework in which the tasks of forecasting, decision making, model evaluation and learning can be considered as parts of the same interactive and iterative process, thus paving the way for establishing the foundations of "real-time econometrics". This paper attempts to provide an overview of some of these developments.