
    On Marginal Likelihood Computation in Change-point Models

    Change-point models are useful for modeling time series subject to structural breaks. For interpretation and forecasting, it is essential to estimate correctly the number of change points in this class of models. In Bayesian inference, the number of change points is typically chosen by the marginal likelihood criterion, computed by Chib's method. This method requires selecting a value in the parameter space at which the computation is done. We explain in detail how to perform Bayesian inference for a change-point dynamic regression model and how to compute its marginal likelihood. Motivated by our results from three empirical illustrations, a simulation study shows that Chib's method is robust with respect to the choice of the parameter value used in the computations, among the posterior mean, mode and quartiles. Furthermore, the performance of the Bayesian information criterion, which is based on maximum likelihood estimates, in selecting the correct model is comparable to that of the marginal likelihood.
    Keywords: BIC, change-point model, Chib's method, marginal likelihood
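
    To make the identity behind Chib's method concrete, here is a minimal Python sketch that applies it to a toy conjugate model (normal data with unknown mean and variance) rather than to the paper's change-point dynamic regression; the priors, data and variable names are illustrative assumptions only. The log marginal likelihood is evaluated as log p(y | theta*) + log p(theta*) - log p(theta* | y) at a high-density point theta* (here the posterior mean), with the posterior ordinate estimated from Gibbs output.

    # Chib's (1995) marginal-likelihood identity on a toy normal model; all
    # settings are illustrative, not those of the paper's change-point model.
    import numpy as np
    from scipy import stats
    from scipy.special import logsumexp

    rng = np.random.default_rng(0)
    y = rng.normal(1.0, 2.0, size=200)                 # synthetic data
    n, m0, v0, a0, b0 = len(y), 0.0, 10.0, 2.0, 2.0    # mu ~ N(m0, v0), sig2 ~ IG(a0, b0)

    def gibbs(n_iter=5000, burn=1000):
        mu, sig2, draws = y.mean(), y.var(), []
        for g in range(n_iter + burn):
            vn = 1.0 / (1.0 / v0 + n / sig2)                          # mu | sig2, y
            mn = vn * (m0 / v0 + y.sum() / sig2)
            mu = rng.normal(mn, np.sqrt(vn))
            a = a0 + n / 2.0                                          # sig2 | mu, y
            b = b0 + 0.5 * np.sum((y - mu) ** 2)
            sig2 = stats.invgamma(a, scale=b).rvs(random_state=rng)
            if g >= burn:
                draws.append((mu, sig2))
        return np.array(draws)

    draws = gibbs()
    mu_star, sig2_star = draws.mean(axis=0)            # evaluation point: posterior means

    # log p(mu* | y): Rao-Blackwellised average of the full conditional over sig2 draws
    vn = 1.0 / (1.0 / v0 + n / draws[:, 1])
    mn = vn * (m0 / v0 + y.sum() / draws[:, 1])
    log_post_mu = logsumexp(stats.norm(mn, np.sqrt(vn)).logpdf(mu_star)) - np.log(len(draws))

    # log p(sig2* | y, mu*): closed form once we condition on mu*
    a = a0 + n / 2.0
    b = b0 + 0.5 * np.sum((y - mu_star) ** 2)
    log_post_sig2 = stats.invgamma(a, scale=b).logpdf(sig2_star)

    log_lik = stats.norm(mu_star, np.sqrt(sig2_star)).logpdf(y).sum()
    log_prior = (stats.norm(m0, np.sqrt(v0)).logpdf(mu_star)
                 + stats.invgamma(a0, scale=b0).logpdf(sig2_star))
    print("Chib estimate of log marginal likelihood:",
          log_lik + log_prior - log_post_mu - log_post_sig2)

    In the same spirit, the robustness check described in the abstract amounts to repeating the last step with theta* set to the posterior mode or quartiles instead of the mean and verifying that the estimate barely changes.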


    Marginal Likelihood for Markov-Switching and Change-Point GARCH Models

    GARCH volatility models with fixed parameters are too restrictive for long time series due to breaks in the volatility process. Flexible alternatives are Markov-switching GARCH and change-point GARCH models. They require estimation by MCMC methods due to the path dependence problem. An unsolved issue is the computation of their marginal likelihood, which is essential for determining the number of regimes or change points. We solve the problem by using particle MCMC, a technique proposed by Andrieu, Doucet, and Holenstein (2010). We examine the performance of this new method on simulated data, and we illustrate its use on several return series.
    JEL classification: C11, C15, C22, C58
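
    The building block of the particle MCMC approach mentioned above is a particle filter that returns an unbiased likelihood estimate, which a Metropolis-Hastings sampler can plug into its acceptance ratio. The sketch below is a rough illustration under assumed parameter values, not the paper's estimated models: a bootstrap particle filter over the latent regime path of a two-regime Markov-switching GARCH(1,1), where each particle carries its own conditional variance, which is how the path dependence is handled.

    # Bootstrap particle filter log-likelihood for a toy 2-regime MS-GARCH(1,1);
    # parameters and the simulated return series are illustrative only.
    import numpy as np

    rng = np.random.default_rng(1)

    def pf_loglik(y, omega, alpha, beta, P, n_part=2000):
        K = len(omega)
        cum = P.cumsum(axis=1)                       # cumulative transition probabilities
        s = rng.integers(0, K, size=n_part)          # initial regimes
        h = np.full(n_part, np.var(y))               # initial conditional variances
        loglik = 0.0
        for t in range(1, len(y)):
            u = rng.random(n_part)                   # propagate regimes s_t | s_{t-1}
            s = np.minimum((u[:, None] > cum[s]).sum(axis=1), K - 1)
            h = omega[s] + alpha[s] * y[t - 1] ** 2 + beta[s] * h   # per-particle GARCH recursion
            logw = -0.5 * (np.log(2 * np.pi * h) + y[t] ** 2 / h)   # N(0, h_t) weights
            m = logw.max()
            w = np.exp(logw - m)
            loglik += m + np.log(w.mean())           # likelihood increment (log of particle average)
            idx = rng.choice(n_part, size=n_part, p=w / w.sum())    # multinomial resampling
            s, h = s[idx], h[idx]
        return loglik

    # Illustrative parameters: a calm regime and a turbulent regime.
    omega = np.array([0.05, 0.30]); alpha = np.array([0.05, 0.15]); beta = np.array([0.90, 0.80])
    P = np.array([[0.98, 0.02], [0.05, 0.95]])
    y = rng.standard_t(df=8, size=500) * 0.8         # stand-in return series
    print("particle-filter log-likelihood estimate:", pf_loglik(y, omega, alpha, beta, P))

    Within particle MCMC, this noisy but unbiased likelihood estimate replaces the intractable exact likelihood in the acceptance probability, which is what makes the marginal-likelihood computation for these models feasible.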

    Detection of trend changes in time series using Bayesian inference

    Change points in time series are perceived as isolated singularities where two regular trends of a given signal do not match. The detection of such transitions is of fundamental interest for understanding the system's internal dynamics. In practice, observational noise makes it difficult to detect such change points in time series. In this work we develop a Bayesian method to estimate the location of the singularities and to produce confidence intervals. We validate the ability and sensitivity of our inference method by estimating change points in synthetic data sets. As an application, we use our algorithm to analyze the annual flow volume of the Nile River at Aswan from 1871 to 1970, where we confirm a well-established significant transition point within the time series.
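
    As a rough illustration of the kind of inference described here, the sketch below computes the exact posterior over the location of a single change in the level of a noisy series, with the segment means integrated out analytically. The Gaussian model, priors, assumed noise variance and the synthetic series are stand-ins for the paper's actual method and the Nile data.

    # Posterior over a single change-point location for a piecewise-constant mean
    # with (assumed) known noise variance; model and data are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)

    def segment_logml(y, m0=0.0, v0=1e4, sigma2=1.0):
        """Log marginal likelihood of one segment with its mean integrated out:
        y_i | theta ~ N(theta, sigma2), theta ~ N(m0, v0)."""
        k, ybar = len(y), y.mean()
        ss = np.sum((y - ybar) ** 2)
        return (-0.5 * k * np.log(2 * np.pi * sigma2)
                - 0.5 * np.log(1 + k * v0 / sigma2)
                - ss / (2 * sigma2)
                - (ybar - m0) ** 2 / (2 * (sigma2 / k + v0)))

    def changepoint_posterior(y, sigma2):
        """Posterior over the change-point index tau under a uniform prior on tau."""
        taus = np.arange(1, len(y))                  # change after observation tau
        logp = np.array([segment_logml(y[:t], sigma2=sigma2)
                         + segment_logml(y[t:], sigma2=sigma2) for t in taus])
        p = np.exp(logp - logp.max())
        return taus, p / p.sum()

    # Synthetic series mimicking an abrupt drop in level (cf. the Nile flow example).
    y = np.concatenate([rng.normal(10.0, 1.2, 30), rng.normal(8.5, 1.2, 70)])
    taus, post = changepoint_posterior(y, sigma2=1.2 ** 2)
    mode = taus[post.argmax()]
    cdf = post.cumsum()
    lo, hi = taus[np.searchsorted(cdf, 0.025)], taus[np.searchsorted(cdf, 0.975)]
    print(f"posterior mode of change point: {mode}, 95% interval: [{lo}, {hi}]")

    The 95% interval is read off the cumulative posterior over tau, which is the analogue of the confidence intervals reported for the Nile flow transition.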

    Marginal Likelihood Estimation with the Cross-Entropy Method

    We consider an adaptive importance sampling approach to estimating the marginal likelihood, a quantity that is fundamental in Bayesian model comparison and Bayesian model averaging. This approach is motivated by the difficulty of obtaining an accurate estimate through existing algorithms that use Markov chain Monte Carlo (MCMC) draws, where the draws are typically costly to obtain and highly correlated in high-dimensional settings. In contrast, we use the cross-entropy (CE) method, a versatile adaptive Monte Carlo algorithm originally developed for rare-event simulation. The main advantage of the importance sampling approach is that random samples can be obtained from a convenient density at little additional cost. As we generate independent draws instead of correlated MCMC draws, the increase in simulation effort is much smaller should one wish to reduce the numerical standard error of the estimator. Moreover, the importance density derived via the CE method is optimal in a well-defined sense. We demonstrate the utility of the proposed approach with two empirical applications involving women's labor market participation and U.S. macroeconomic time series. In both applications the proposed CE method compares favorably to existing estimators.
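
    A minimal sketch of the pipeline this abstract describes, under illustrative assumptions (a toy conjugate linear regression, a Gaussian importance family, arbitrary prior and sample sizes): posterior draws come from a short random-walk Metropolis run, the importance density is fitted to them (for the Gaussian family the cross-entropy step reduces to moment matching), and the marginal likelihood is the average importance weight over independent draws from that density. The conjugate toy model has a closed-form marginal likelihood that serves as a check.

    # CE-style adaptive importance sampling for the marginal likelihood of a toy
    # Bayesian linear regression with known error variance; settings are illustrative.
    import numpy as np
    from scipy import stats
    from scipy.special import logsumexp

    rng = np.random.default_rng(3)
    n, p, sigma2, tau2 = 100, 2, 1.0, 4.0
    X = rng.normal(size=(n, p))
    y = X @ np.array([1.0, -0.5]) + rng.normal(scale=np.sqrt(sigma2), size=n)

    def log_post_unnorm(beta):
        """log p(y | beta) + log p(beta), with beta ~ N(0, tau2 I) and known sigma2."""
        ll = stats.norm(X @ beta, np.sqrt(sigma2)).logpdf(y).sum()
        lp = stats.norm(0.0, np.sqrt(tau2)).logpdf(beta).sum()
        return ll + lp

    # Step 1: short random-walk Metropolis run to obtain (correlated) posterior draws.
    draws, beta, lp = [], np.zeros(p), log_post_unnorm(np.zeros(p))
    for _ in range(4000):
        prop = beta + 0.2 * rng.normal(size=p)
        lp_prop = log_post_unnorm(prop)
        if np.log(rng.random()) < lp_prop - lp:
            beta, lp = prop, lp_prop
        draws.append(beta)
    draws = np.array(draws[1000:])

    # Step 2 (CE step for the Gaussian family): fit the importance density by moment matching.
    g = stats.multivariate_normal(draws.mean(axis=0), np.cov(draws.T))

    # Step 3: independent draws from g; the marginal likelihood is the mean importance weight.
    M = 5000
    theta = g.rvs(M, random_state=rng)
    logw = np.array([log_post_unnorm(t) for t in theta]) - g.logpdf(theta)
    log_ml_is = logsumexp(logw) - np.log(M)

    # Closed-form benchmark for this conjugate toy model: y ~ N(0, sigma2 I + tau2 X X').
    log_ml_exact = stats.multivariate_normal(np.zeros(n), sigma2 * np.eye(n) + tau2 * X @ X.T).logpdf(y)
    print(f"CE-style IS estimate: {log_ml_is:.3f}   exact: {log_ml_exact:.3f}")

    For the Gaussian family, maximizing the likelihood of the posterior draws is exactly the cross-entropy minimization step, which is why simple moment matching suffices in this sketch; richer importance families would require the corresponding weighted updates.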