
    Approximate Bayesian Model Selection with the Deviance Statistic

    Bayesian model selection poses two main challenges: the specification of parameter priors for all models, and the computation of the resulting Bayes factors between models. There is now a large literature on automatic and objective parameter priors in the linear model. One important class are g-priors, which were recently extended from linear to generalized linear models (GLMs). We show that the resulting Bayes factors can be approximated by test-based Bayes factors (Johnson [Scand. J. Stat. 35 (2008) 354-368]) using the deviance statistics of the models. To estimate the hyperparameter g, we propose empirical and fully Bayes approaches and link the former to minimum Bayes factors and shrinkage estimates from the literature. Furthermore, we describe how to approximate the corresponding posterior distribution of the regression coefficients based on the standard GLM output. We illustrate the approach with the development of a clinical prediction model for 30-day survival in the GUSTO-I trial using logistic regression. Comment: Published at http://dx.doi.org/10.1214/14-STS510 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
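    The deviance-to-Bayes-factor mapping described above lends itself to a short sketch. The closed form below, a test-based Bayes factor (1+g)^(-d/2) * exp(z*g / (2*(1+g))) for a deviance statistic z on d degrees of freedom, together with the empirical Bayes estimate g_hat = max(z/d - 1, 0), follows the general shape of Johnson-style test-based Bayes factors; treat the exact expressions as illustrative assumptions rather than quotations from the paper:

```python
import math

def tbf(z, d, g):
    """Test-based Bayes factor BF_10 approximating a g-prior Bayes
    factor from a deviance statistic z with d degrees of freedom
    (assumed illustrative form, not copied from the paper)."""
    return (1.0 + g) ** (-d / 2.0) * math.exp(0.5 * z * g / (1.0 + g))

def empirical_bayes_g(z, d):
    """Empirical Bayes estimate of g: maximizing log tbf over g >= 0
    gives g_hat = max(z/d - 1, 0)."""
    return max(z / d - 1.0, 0.0)

z, d = 12.0, 3                     # deviance difference and its degrees of freedom
g_hat = empirical_bayes_g(z, d)    # 12/3 - 1 = 3.0
bf = tbf(z, d, g_hat)              # evidence against the null (here > 1)
```

Setting the derivative of log tbf with respect to g to zero reproduces the g_hat formula, which is why the empirical Bayes estimate has this simple thresholded form.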

    Real time change-point detection in a nonlinear quantile model

    Most studies in real time change-point detection either focus on the linear model or use the CUSUM method under classical assumptions on the model errors. This paper considers sequential change-point detection in a nonlinear quantile model. A test statistic based on the CUSUM of the subgradient of the quantile process is proposed and studied. Under the null hypothesis that the model does not change, the asymptotic distribution of the test statistic is determined. Under the alternative hypothesis that the model changes at some unknown observation, the proposed test statistic converges in probability to infinity. These results allow the construction of critical regions for both open-end and closed-end procedures. Monte Carlo simulations investigate the performance of the test statistic, especially for heavy-tailed error distributions, and compare it with the classical CUSUM test statistic.
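    The monitoring idea can be sketched in a few lines: accumulate the subgradient of the quantile check function over incoming residuals and raise an alarm when a normalized CUSUM exceeds a threshold. This is a minimal open-end illustration under assumed simplifications (known quantile residuals, a hand-picked threshold), not the paper's exact statistic or critical values:

```python
import random

def quantile_subgradient(residual, tau):
    # Subgradient of the check function rho_tau at the residual:
    # tau - 1{residual < 0}, which has mean zero at the true tau-quantile.
    return tau - (1.0 if residual < 0 else 0.0)

def cusum_detect(residuals, tau, threshold):
    """Open-end CUSUM monitor on quantile-loss subgradients (sketch):
    flag a change when |partial sum| / sqrt(t) exceeds the threshold."""
    s = 0.0
    for t, r in enumerate(residuals, start=1):
        s += quantile_subgradient(r, tau)
        if abs(s) / t ** 0.5 > threshold:
            return t          # alarm time
    return None               # no change detected

random.seed(0)
tau = 0.5
# residuals centered at the tau-quantile before the change, shifted after
pre  = [random.gauss(0, 1) for _ in range(200)]
post = [random.gauss(3, 1) for _ in range(200)]
alarm = cusum_detect(pre + post, tau, threshold=3.0)  # fires shortly after t = 200
```

Before the change the subgradients average zero, so the normalized partial sum stays small; after the change they drift at roughly a constant rate, which is the convergence-to-infinity behavior the abstract describes.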

    Bayesian Testing in Cointegration Models using the Jeffreys' Prior

    We develop a Bayesian cointegration test statistic that can be used under a Jeffreys' prior. The test statistic is equal to the posterior expectation of the classical score statistic. Under the assumption of a full rank value of the long run multiplier the test statistic is a random variable with a chi-squared distribution. We evaluate whether the value of the test statistic under the restriction of cointegration is a plausible realization from its distribution under the encompassing, full rank model. We provide the posterior simulator that is needed to compute the test statistic. The simulator utilizes the invariance properties of the Jeffreys' prior such that the parameter drawings from a suitably rescaled model can be used. The test statistic can straightforwardly be extended to a more general model setting. For example, we show that structural breaks in the constant or trend and general mixtures of normal disturbances can be modelled, because conditional on some latent parameters all derivations still hold. We apply the Bayesian cointegration statistic to the Danish dataset of Johansen and Juselius (1990) and to four artificial examples to illustrate the use of the statistic as a diagnostic tool.
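    The "posterior expectation of a classical score statistic" construction can be illustrated in a deliberately simple univariate stand-in (a toy sketch under assumed simplifications, not the paper's cointegration-specific simulator): for y_i ~ N(theta, 1) with a flat (Jeffreys-type) prior, the posterior of theta is N(ybar, 1/n), the classical score statistic for H0: theta = 0 evaluated at a parameter value is S(theta) = n*theta^2, and the test statistic is its Monte Carlo posterior mean:

```python
import random

def posterior_expected_score(data, n_draws=5000, seed=1):
    """Posterior expectation of a score statistic, approximated by
    averaging over posterior draws (illustrative univariate toy)."""
    rng = random.Random(seed)
    n = len(data)
    ybar = sum(data) / n
    # Posterior of theta under a flat prior: N(ybar, 1/n)
    draws = [rng.gauss(ybar, 1.0 / n ** 0.5) for _ in range(n_draws)]
    # Score statistic for H0: theta = 0 evaluated at each draw, averaged
    return sum(n * th * th for th in draws) / n_draws

data = [0.2, -0.1, 0.4, 0.0, -0.3]       # ybar = 0.04, consistent with H0
stat = posterior_expected_score(data)    # close to n*ybar^2 + 1 here
```

The exact posterior mean in this toy is n*ybar^2 + 1, i.e. the classical statistic shifted by the posterior variance contribution, which mirrors the abstract's point that the statistic is a plausible draw from a chi-squared-type distribution under the encompassing model.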

    Empirical likelihood test in a posteriori change-point nonlinear model

    In this paper, in order to test whether changes have occurred in a nonlinear parametric regression, we propose a nonparametric method based on the empirical likelihood. First, we test the null hypothesis of no change against the alternative of one change in the regression parameters. Under the null hypothesis, the consistency and the convergence rate of the regression parameter estimators are proved. The asymptotic distribution of the test statistic under the null hypothesis is obtained, which allows the asymptotic critical value to be found. We also prove that the proposed test statistic has asymptotic power equal to 1. These theoretical results yield a simple test statistic that is very useful for applications. The epidemic model, a particular model with two change-points under the alternative hypothesis, is also studied. Monte Carlo simulations show the performance of the proposed test statistic compared to an existing method in the literature.
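    The a-posteriori (retrospective) testing scheme can be sketched as a scan over candidate change-points: for each split, compare the fit before and after and take the maximal contrast. The sketch below uses a simple pooled two-sample mean contrast as a crude stand-in for the empirical-likelihood ratio in a nonlinear regression, so it illustrates the scan structure only, not the paper's statistic:

```python
def changepoint_scan(x, trim=5):
    """Retrospective change-point scan (sketch): for each candidate
    split k, contrast the means of x[:k] and x[k:] with a pooled
    two-sample statistic; the empirical-likelihood version replaces
    this contrast with -2 log of an empirical likelihood ratio."""
    n = len(x)
    best_k, best_stat = None, -1.0
    for k in range(trim, n - trim):
        m1 = sum(x[:k]) / k
        m2 = sum(x[k:]) / (n - k)
        # pooled variance around the segment-wise means
        var = sum((v - (m1 if i < k else m2)) ** 2
                  for i, v in enumerate(x)) / n
        stat = k * (n - k) / n * (m1 - m2) ** 2 / max(var, 1e-12)
        if stat > best_stat:
            best_k, best_stat = k, stat
    return best_k, best_stat

x = [0.0] * 30 + [2.0] * 30            # change in mean at index 30
k_hat, stat = changepoint_scan(x)      # k_hat == 30
```

Comparing the maximal statistic against an asymptotic critical value, as in the abstract, then decides between "no change" and "one change".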

    Algorithmic Statistics

    While Kolmogorov complexity is the accepted absolute measure of information content of an individual finite object, a similarly absolute notion is needed for the relation between an individual data sample and an individual model summarizing the information in the data, for example, a finite set (or probability distribution) where the data sample typically came from. The statistical theory based on such relations between individual objects can be called algorithmic statistics, in contrast to classical statistical theory that deals with relations between probabilistic ensembles. We develop the algorithmic theory of statistic, sufficient statistic, and minimal sufficient statistic. This theory is based on two-part codes consisting of the code for the statistic (the model summarizing the regularity, the meaningful information, in the data) and the model-to-data code. In contrast to the situation in probabilistic statistical theory, the algorithmic relation of (minimal) sufficiency is an absolute relation between the individual model and the individual data sample. We distinguish implicit and explicit descriptions of the models. We give characterizations of algorithmic (Kolmogorov) minimal sufficient statistic for all data samples for both description modes--in the explicit mode under some constraints. We also strengthen and elaborate earlier results on the "Kolmogorov structure function" and "absolutely non-stochastic objects"--those rare objects for which the simplest models that summarize their relevant information (minimal sufficient statistics) are at least as complex as the objects themselves. We demonstrate a close relation between the probabilistic notions and the algorithmic ones. Comment: LaTeX, 22 pages, 1 figure, with correction to the published journal version.
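    The two-part code idea is concrete enough to sketch: the cost of describing data x via a finite-set model S is (roughly) K(S) bits for the model plus log2|S| bits to index x inside S. Kolmogorov complexity is uncomputable, so the sketch below substitutes zlib compressed length as a crude, admittedly loose proxy for K(S); the numbers are only meant to show the trade-off between model complexity and index length:

```python
import math
import zlib

def two_part_code_length(model_bytes, model_size):
    """Two-part code length for describing data via a finite set S:
    (proxy) complexity of S plus log2|S| bits to point at the data
    inside S.  zlib stands in, very crudely, for K(S)."""
    k_model = 8 * len(zlib.compress(model_bytes))  # bits for the model part
    index_bits = math.log2(model_size)             # bits for data-within-model
    return k_model + index_bits

# Two candidate models for the 16-bit string "0101010101010101":
# (a) the singleton set {x}: all information sits in the model part
singleton = two_part_code_length(b"0101010101010101", 1)
# (b) the set of all 16-bit strings: trivial model, 16 index bits
universe = two_part_code_length(b"all 16-bit strings", 2 ** 16)
```

A (minimal) sufficient statistic in the algorithmic sense is a model that minimizes this total while the model part captures only the regularity, leaving the index part to cover the incompressible remainder.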

    Compressed matched filter for non-Gaussian noise

    We consider estimation of a deterministic unknown parameter vector in a linear model with non-Gaussian noise. In the Gaussian case, dimensionality reduction via a linear matched filter provides a simple low dimensional sufficient statistic which can be easily communicated and/or stored for future inference. Such a statistic is usually unknown in the general non-Gaussian case. Instead, we propose a hybrid matched filter coupled with a randomized compressed sensing procedure, which together create a low dimensional statistic. We also derive a complementary algorithm for robust reconstruction given this statistic. Our recovery method is based on the fast iterative shrinkage and thresholding algorithm, which is used for outlier rejection given the compressed data. We demonstrate the advantages of the proposed framework using synthetic simulations.
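    The fast iterative shrinkage-thresholding algorithm (FISTA) underlying the recovery step can be sketched in generic form for the l1-regularized least-squares problem min_x 0.5*||Ax - b||^2 + lam*||x||_1. This is the standard FISTA scheme, not the paper's specific outlier-rejection variant, and the toy identity-matrix problem is chosen only so the soft-threshold solution is known in closed form:

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def soft(v, t):
    # Componentwise soft-thresholding: the proximal operator of t*||.||_1
    return [(abs(vi) - t) * (1.0 if vi > 0 else -1.0) if abs(vi) > t else 0.0
            for vi in v]

def fista(A, b, lam, L, steps=300):
    """FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    Gradient of the smooth part is A^T (Ax - b); step size 1/L,
    where L bounds the largest eigenvalue of A^T A."""
    n = len(A[0])
    x, y, t = [0.0] * n, [0.0] * n, 1.0
    At = [list(col) for col in zip(*A)]
    for _ in range(steps):
        grad = matvec(At, [yi - bi for yi, bi in zip(matvec(A, y), b)])
        x_new = soft([yi - gi / L for yi, gi in zip(y, grad)], lam / L)
        t_new = (1.0 + (1.0 + 4.0 * t * t) ** 0.5) / 2.0
        # Nesterov momentum step on the iterates
        y = [xn + (t - 1.0) / t_new * (xn - xo) for xn, xo in zip(x_new, x)]
        x, t = x_new, t_new
    return x

A = [[1.0, 0.0], [0.0, 1.0]]
b = [3.0, 0.1]
x_hat = fista(A, b, lam=0.5, L=1.0)  # soft-threshold solution [2.5, 0.0]
```

With A equal to the identity the minimizer is exactly soft(b, lam), so the run above converges to [2.5, 0.0]; in the compressed setting A would be the randomized sensing matrix applied after the hybrid matched filter.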