Bayesian analysis of CCDM Models
Creation of Cold Dark Matter (CCDM), in the context of the Einstein Field Equations, leads to a negative creation pressure, which can be used to explain the accelerated expansion of the Universe. In this work we test six different spatially flat models for matter creation against SN Ia data using statistical tools: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These approaches allow models to be compared on both goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/CDM model; however, neither of these models can be discarded by the current analysis. Three other scenarios are discarded, either for poor fit or for an excess of free parameters.
Comment: 16 pages, 6 figures, 6 tables. Corrected some text and language in new version
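The abstract's model comparison rests on the standard AIC and BIC formulas, AIC = 2k − 2 ln L and BIC = k ln n − 2 ln L, where k is the number of free parameters and n the sample size. A minimal sketch of how such a comparison works, with hypothetical likelihood values and parameter counts (not the paper's actual fits):

```python
import math

def aic(log_likelihood, k):
    """Akaike Information Criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_likelihood

def bic(log_likelihood, k, n):
    """Bayesian Information Criterion: k ln n - 2 ln L."""
    return k * math.log(n) - 2 * log_likelihood

# Hypothetical fits of two models to the same n = 580 supernovae:
# model A has 1 free parameter, model B has 3 (illustrative numbers only).
n = 580
fit_a = {"logL": -562.0, "k": 1}
fit_b = {"logL": -561.2, "k": 3}

delta_aic = aic(fit_b["logL"], fit_b["k"]) - aic(fit_a["logL"], fit_a["k"])
delta_bic = bic(fit_b["logL"], fit_b["k"], n) - bic(fit_a["logL"], fit_a["k"], n)
# Positive deltas mean model B's extra parameters are not worth
# its small gain in likelihood; BIC penalizes the complexity harder.
print(delta_aic, delta_bic)
```

Both criteria "penalize excess complexity" in exactly this sense: the likelihood gain must outweigh the parameter-count term for the richer model to win.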
Bayesian semiparametric GARCH models
This paper aims to investigate a Bayesian sampling approach to parameter estimation in the semiparametric GARCH model with an unknown conditional error density, which we approximate by a mixture of Gaussian densities centered at individual errors and scaled by a common standard deviation. This mixture density has the form of a kernel density estimator of the errors with its bandwidth being the standard deviation. The proposed investigation is motivated by the lack of robustness in GARCH models with any parametric assumption of the error density for the purpose of error-density based inference such as value-at-risk (VaR) estimation. The contribution of the paper is to construct the likelihood and posterior of model and bandwidth parameters under the proposed mixture error density, and to forecast the one-step out-of-sample density of asset returns. The resulting VaR measure therefore would be distribution-free. Applying the semiparametric GARCH(1,1) model to daily stock-index returns in eight stock markets, we find that this semiparametric GARCH model is favoured against the GARCH(1,1) model with Student t errors for five indices, and that the GARCH model underestimates VaR compared to its semiparametric counterpart. We also investigate the use and benefit of localized bandwidths in the proposed mixture density of the errors.
Bayes factors, kernel-form error density, localized bandwidths, Markov chain Monte Carlo, value-at-risk
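The mixture error density described above, Gaussians centered at the individual errors with a common scale, is by construction identical to a Gaussian kernel density estimate whose bandwidth is that common standard deviation. A minimal sketch, with synthetic residuals standing in for the paper's GARCH residuals and an arbitrary bandwidth:

```python
import numpy as np

def mixture_error_density(x, errors, h):
    """Mixture of Gaussians centered at the individual errors with common
    scale h -- identical to a kernel density estimate with bandwidth h."""
    x = np.atleast_1d(x)[:, None]
    return np.mean(
        np.exp(-0.5 * ((x - errors) / h) ** 2) / (h * np.sqrt(2 * np.pi)),
        axis=1,
    )

rng = np.random.default_rng(0)
errors = rng.standard_normal(500)  # stand-in for fitted GARCH residuals
h = 0.3                            # the common std deviation / bandwidth
grid = np.linspace(-4, 4, 81)
dens = mixture_error_density(grid, errors, h)
area = dens.sum() * (grid[1] - grid[0])  # Riemann check: ~1 over a wide grid
```

In the paper the bandwidth h is treated as a parameter with its own posterior (and can be localized per observation); here it is simply fixed for illustration.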
Bayesian Semiparametric Multi-State Models
Multi-state models provide a unified framework for the description of the evolution of discrete phenomena in continuous time. One particular example is Markov processes, which can be characterised by a set of time-constant transition intensities between the states. In this paper, we will extend such parametric approaches to semiparametric models with flexible transition intensities based on Bayesian versions of penalised splines. The transition intensities will be modelled as smooth functions of time and can further be related to parametric as well as nonparametric covariate effects. Covariates with time-varying effects and frailty terms can be included in addition. Inference will be conducted either fully Bayesian using Markov chain Monte Carlo simulation techniques or empirically Bayesian based on a mixed model representation. A counting process representation of semiparametric multi-state models provides the likelihood formula and also forms the basis for model validation via martingale residual processes. As an application, we will consider human sleep data with a discrete set of sleep states such as REM and Non-REM phases. In this case, simple parametric approaches are inappropriate since the dynamics underlying human sleep are strongly varying throughout the night and individual-specific variation has to be accounted for using covariate information and frailty terms.
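The parametric baseline the abstract starts from, a Markov process with time-constant transition intensities, can be sketched as simulation by competing exponential clocks. The intensity matrix Q below uses hypothetical values for a three-state (sleep-like) process, not the paper's fitted intensities:

```python
import numpy as np

# Time-constant transition intensities for three hypothetical states
# (rows: current state; off-diagonals are intensities, diagonals make
# each row sum to zero).
Q = np.array([[-0.5, 0.3, 0.2],
              [0.4, -0.6, 0.2],
              [0.1, 0.4, -0.5]])

def simulate_ctmc(Q, start, horizon, rng):
    """Simulate a continuous-time Markov chain: wait an exponential
    holding time with rate -Q[s, s], then jump with probabilities
    proportional to the off-diagonal intensities of state s."""
    t, state, path = 0.0, start, [(0.0, start)]
    while True:
        rate = -Q[state, state]
        t += rng.exponential(1.0 / rate)
        if t >= horizon:
            return path
        probs = np.maximum(Q[state], 0.0)  # zero out the diagonal
        state = rng.choice(len(probs), p=probs / probs.sum())
        path.append((t, state))

rng = np.random.default_rng(1)
path = simulate_ctmc(Q, start=0, horizon=8.0, rng=rng)
```

The paper's semiparametric extension replaces these constant intensities with smooth penalised-spline functions of time; the counting-process likelihood then accumulates contributions from each observed transition in such a path.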
Frequentist tests for Bayesian models
Analogues of the frequentist chi-square and F tests are proposed for testing
goodness-of-fit and consistency for Bayesian models. Simple examples exhibit
these tests' detection of inconsistency between consecutive experiments with
identical parameters, when the first experiment provides the prior for the
second. In a related analysis, a quantitative measure is derived for judging
the degree of tension between two different experiments with partially
overlapping parameter vectors.
Comment: 8 pages, 4 figures. Section 8 rewritten. Additional references. Accepted by Astronomy & Astrophysics
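A simple, assumed stand-in for the kind of tension measure the abstract describes, for the special case of two Gaussian estimates of a single shared parameter (the paper's measure handles partially overlapping parameter vectors; this one-dimensional version is illustrative only):

```python
import math

def tension(mu1, sigma1, mu2, sigma2):
    """Number of sigmas separating two independent Gaussian estimates
    of the same parameter -- a one-dimensional illustrative measure,
    not the paper's general construction."""
    return abs(mu1 - mu2) / math.sqrt(sigma1 ** 2 + sigma2 ** 2)

# If experiment 1 supplies the prior for experiment 2, a large tension
# value flags inconsistency between the consecutive experiments.
t = tension(mu1=0.70, sigma1=0.02, mu2=0.67, sigma2=0.01)
```

Here t is about 1.3 sigma, which would not signal inconsistency at the usual thresholds.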
Learning Topic Models and Latent Bayesian Networks Under Expansion Constraints
Unsupervised estimation of latent variable models is a fundamental problem
central to numerous applications of machine learning and statistics. This work
presents a principled approach for estimating broad classes of such models,
including probabilistic topic models and latent linear Bayesian networks, using
only second-order observed moments. The sufficient conditions for
identifiability of these models are primarily based on weak expansion
constraints on the topic-word matrix, for topic models, and on the directed
acyclic graph, for Bayesian networks. Because no assumptions are made on the
distribution among the latent variables, the approach can handle arbitrary
correlations among the topics or latent factors. In addition, a tractable
learning method via optimization is proposed and studied in numerical
experiments.
Comment: 38 pages, 6 figures, 2 tables, applications in topic models and Bayesian networks are studied. Simulation section is added
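The raw ingredient the abstract refers to, second-order observed moments, is just the empirical matrix E[x xᵀ] of the observed vectors. A minimal sketch on a toy bag-of-words corpus (the multinomial corpus and the row normalisation are illustrative assumptions; the actual estimation procedure built on top of these moments is far more involved):

```python
import numpy as np

def second_moment(counts):
    """Empirical second-order moment E[x x^T] of per-document word
    frequencies (rows are documents)."""
    x = counts / counts.sum(axis=1, keepdims=True)  # word frequencies
    return x.T @ x / x.shape[0]

rng = np.random.default_rng(2)
# Toy corpus: 200 documents of 50 words over a 3-word vocabulary.
docs = rng.multinomial(50, [0.5, 0.3, 0.2], size=200)
M2 = second_moment(docs)  # 3 x 3 symmetric moment matrix
```

The identifiability results in the paper say when this matrix (under expansion conditions on the topic-word matrix or the DAG) pins down the latent model without any assumption on the latent-variable distribution.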
