
    Calibration of conditional composite likelihood for Bayesian inference on Gibbs random fields

    Gibbs random fields play an important role in statistics; however, the resulting likelihood is typically unavailable due to an intractable normalizing constant. Composite likelihoods offer a principled means to construct useful approximations. This paper provides a means to calibrate the posterior distribution resulting from using a composite likelihood and illustrates its performance in several examples. Comment: JMLR Workshop and Conference Proceedings, 18th International Conference on Artificial Intelligence and Statistics (AISTATS), San Diego, California, USA, 9-12 May 2015 (Vol. 38, pp. 921-929). arXiv admin note: substantial text overlap with arXiv:1207.575
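    The kind of conditional composite likelihood this abstract builds on can be illustrated with the simplest case: the site-wise pseudolikelihood of an Ising field, where each full conditional is tractable even though the joint normalizing constant is not. This sketch is illustrative, not code from the paper:

```python
import numpy as np

def ising_pseudolikelihood(field, beta):
    """Site-wise conditional composite likelihood (pseudolikelihood) for an
    Ising field with spins in {-1, +1} and interaction parameter beta.
    Each full conditional p(x_i | neighbours) only involves the four
    nearest neighbours, so no intractable normalising constant appears."""
    # sum of the four nearest neighbours, with zero-padded boundary
    pad = np.pad(field, 1)
    nb = (pad[:-2, 1:-1] + pad[2:, 1:-1]
          + pad[1:-1, :-2] + pad[1:-1, 2:])
    # log p(x_i = s | nb) = beta*s*nb - log(exp(beta*nb) + exp(-beta*nb))
    return np.sum(beta * field * nb
                  - np.logaddexp(beta * nb, -beta * nb))
```

    At beta = 0 the sites are independent coin flips, so the pseudolikelihood reduces to -n log 2, which gives a quick sanity check.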

    Hidden Gibbs random fields model selection using Block Likelihood Information Criterion

    Performing model selection between Gibbs random fields is a very challenging task. Indeed, due to the Markovian dependence structure, the normalizing constant of the fields cannot be computed using standard analytical or numerical methods. Furthermore, such unobserved fields cannot be integrated out, and the likelihood evaluation is a doubly intractable problem. This forms a central issue when picking the model that best fits observed data. We introduce a new approximate version of the Bayesian Information Criterion. We partition the lattice into contiguous rectangular blocks and approximate the probability measure of the hidden Gibbs field by the product of Gibbs distributions over the blocks. On that basis, we estimate the likelihood and derive the Block Likelihood Information Criterion (BLIC), which answers model choice questions such as the selection of the dependency structure or the number of latent states. We study the performance of BLIC for these questions. In addition, we present a comparison with ABC algorithms to show that the novel criterion offers a better trade-off between time efficiency and reliable results.
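    The block approximation can be sketched in a few lines: partition the lattice into small rectangular blocks and sum exact block log-probabilities, each of whose normalising constants is enumerable when the blocks are tiny. This is a hypothetical simplification (Ising model, free block boundaries, BIC-style penalty), not the paper's implementation:

```python
import numpy as np
from itertools import product

def block_loglik(field, beta, bh=2, bw=2):
    """Approximate Ising log-likelihood as a sum of exact log-probabilities
    over contiguous rectangular blocks (free boundary within each block).
    Block size (bh, bw) is kept tiny so each block's normalising constant
    can be enumerated exactly."""
    def energy(b):
        # horizontal + vertical nearest-neighbour pair sums within a block
        return (b[:, :-1] * b[:, 1:]).sum() + (b[:-1, :] * b[1:, :]).sum()
    H, W = field.shape
    total = 0.0
    for i in range(0, H, bh):
        for j in range(0, W, bw):
            block = field[i:i + bh, j:j + bw]
            h, w = block.shape
            # exact normalising constant over all 2^(h*w) block configurations
            logZ = np.log(sum(
                np.exp(beta * energy(np.array(cfg).reshape(h, w)))
                for cfg in product([-1, 1], repeat=h * w)))
            total += beta * energy(block) - logZ
    return total

def blic(field, beta_hat, n_params=1, bh=2, bw=2):
    """BIC-style criterion with the block likelihood standing in for the
    intractable full likelihood (a hypothetical simplification of BLIC)."""
    return -2 * block_loglik(field, beta_hat, bh, bw) + n_params * np.log(field.size)
```

    The block factorization trades a small bias at block boundaries for an exactly computable criterion, which is the core of the time-efficiency argument above.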

    Bayesian model selection for exponential random graph models via adjusted pseudolikelihoods

    Models with intractable likelihood functions arise in areas including network analysis and spatial statistics, especially those involving Gibbs random fields. Posterior parameter estimation in these settings is termed a doubly-intractable problem because both the likelihood function and the posterior distribution are intractable. The comparison of Bayesian models is often based on the statistical evidence, the integral of the un-normalised posterior distribution over the model parameters, which is rarely available in closed form. For doubly-intractable models, estimating the evidence adds another layer of difficulty. Consequently, the selection of the model that best describes an observed network among a collection of exponential random graph models for network analysis is a daunting task. Pseudolikelihoods offer a tractable approximation to the likelihood but should be treated with caution because they can lead to unreasonable inference. This paper specifies a method to adjust pseudolikelihoods in order to obtain a reasonable, yet tractable, approximation to the likelihood. This allows implementation of widely used computational methods for evidence estimation and pursuit of Bayesian model selection of exponential random graph models for the analysis of social networks. Empirical comparisons to existing methods show that our procedure yields similar evidence estimates, but at a lower computational cost. Comment: Supplementary material attached. To view attachments, please download and extract the gzipped source file listed under "Other formats".
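    One ingredient of such pseudolikelihood adjustments is a curvature correction: a linear map that makes the pseudolikelihood's Hessian at its mode match that of (an estimate of) the true log-likelihood. The sketch below shows only this piece, under the assumption that both Hessians are available and negative definite; the mode-shift step and the function name are hypothetical:

```python
import numpy as np

def curvature_adjustment(H_pl, H_ml):
    """Given negative-definite Hessians of the log-pseudolikelihood (H_pl)
    and of an estimate of the true log-likelihood (H_ml), both at their
    modes, return the matrix W satisfying W.T @ H_pl @ W == H_ml.
    Evaluating the pseudolikelihood at theta_hat + W (theta - theta_hat)
    then matches the target's curvature at the mode."""
    C_pl = np.linalg.cholesky(-H_pl)   # -H_pl = C_pl @ C_pl.T
    C_ml = np.linalg.cholesky(-H_ml)
    # Solve C_pl.T @ W = C_ml.T, so that
    # W.T @ H_pl @ W = -(C_pl.T @ W).T @ (C_pl.T @ W) = -C_ml @ C_ml.T = H_ml
    return np.linalg.solve(C_pl.T, C_ml.T)
```

    Because the adjusted surrogate has the right local shape, downstream evidence estimators that rely on Gaussian-like behaviour near the mode behave far more reasonably than with the raw pseudolikelihood.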

    Bayesian Nonstationary Spatial Modeling for Very Large Datasets

    With the proliferation of modern high-resolution measuring instruments mounted on satellites, planes, ground-based vehicles and monitoring stations, a need has arisen for statistical methods suitable for the analysis of large spatial datasets observed on large spatial domains. Statistical analyses of such datasets provide two main challenges: First, traditional spatial-statistical techniques are often unable to handle large numbers of observations in a computationally feasible way. Second, for large and heterogeneous spatial domains, it is often not appropriate to assume that a process of interest is stationary over the entire domain. We address the first challenge by using a model combining a low-rank component, which allows for flexible modeling of medium-to-long-range dependence via a set of spatial basis functions, with a tapered remainder component, which allows for modeling of local dependence using a compactly supported covariance function. Addressing the second challenge, we propose two extensions to this model that result in increased flexibility: First, the model is parameterized based on a nonstationary Matérn covariance, where the parameters vary smoothly across space. Second, in our fully Bayesian model, all components and parameters are considered random, including the number, locations, and shapes of the basis functions used in the low-rank component. Using simulated data and a real-world dataset of high-resolution soil measurements, we show that both extensions can result in substantial improvements over the current state-of-the-art. Comment: 16 pages, 2 color figures
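    The "low-rank plus tapered remainder" structure can be sketched in one dimension: a handful of basis functions carry the long-range dependence, and a compactly supported taper zeroes out the remainder covariance beyond a cutoff, making it sparse. The basis shape, taper, and parameter values below are illustrative assumptions, not the paper's specification:

```python
import numpy as np

def wendland_taper(d, range_):
    """Compactly supported Wendland-type taper: exactly zero for d >= range_."""
    t = np.clip(d / range_, 0.0, 1.0)
    return (1 - t) ** 4 * (4 * t + 1)

def lowrank_plus_taper_cov(locs, centers, K, sigma2, taper_range):
    """Covariance matrix of the form S K S^T + tapered remainder:
    Gaussian basis functions (columns of S) capture medium-to-long-range
    dependence; an exponential covariance multiplied by the taper captures
    local dependence with a sparse-friendly, compactly supported matrix."""
    d_basis = np.abs(locs[:, None] - centers[None, :])
    S = np.exp(-0.5 * (d_basis / 0.5) ** 2)          # basis-function matrix
    d = np.abs(locs[:, None] - locs[None, :])
    remainder = sigma2 * np.exp(-d) * wendland_taper(d, taper_range)
    return S @ K @ S.T + remainder
```

    Computationally, the payoff is that the dense part has rank equal to the (small) number of basis functions, while the remainder is sparse, so solves and determinants scale to very large n.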

    Noisy Hamiltonian Monte Carlo for doubly-intractable distributions

    Hamiltonian Monte Carlo (HMC) has been progressively incorporated within the statistician's toolbox as an alternative sampling method in settings where standard Metropolis-Hastings is inefficient. HMC generates a Markov chain on an augmented state space, with transitions based on a deterministic differential flow derived from Hamiltonian mechanics. In practice, the evolution of Hamiltonian systems cannot be solved analytically, requiring numerical integration schemes. Under numerical integration, the resulting approximate solution no longer preserves the measure of the target distribution; therefore an accept-reject step is used to correct the bias. For doubly-intractable distributions -- such as posterior distributions based on Gibbs random fields -- HMC suffers from additional computational difficulties: neither the gradients in the differential flow nor the accept-reject ratio can be computed exactly. In this paper, we study the behaviour of HMC when these quantities are replaced by Monte Carlo estimates.
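    The flow in question is typically simulated with the leapfrog integrator; the "noisy" variant simply substitutes a stochastic gradient estimate wherever the exact gradient of the log-target appears. The sketch below shows that substitution point, with names and signatures of my own choosing:

```python
import numpy as np

def noisy_leapfrog(theta, p, noisy_grad, eps, n_steps, rng):
    """One leapfrog trajectory for HMC where the exact gradient of the
    log-posterior is replaced by a stochastic estimate noisy_grad(theta, rng),
    as needed when the gradient involves an intractable normalising constant.
    Returns the proposed (position, momentum) pair."""
    theta, p = theta.copy(), p.copy()
    p += 0.5 * eps * noisy_grad(theta, rng)        # half step for momentum
    for _ in range(n_steps - 1):
        theta += eps * p                           # full step for position
        p += eps * noisy_grad(theta, rng)          # full step for momentum
    theta += eps * p
    p += 0.5 * eps * noisy_grad(theta, rng)        # final momentum half step
    return theta, -p                               # negate p for reversibility
```

    With an exact gradient the integrator nearly conserves the Hamiltonian, so acceptance rates stay high; the paper's question is what happens to this picture once the gradient (and the acceptance ratio itself) is only available as a Monte Carlo estimate.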

    Constructing A Flexible Likelihood Function For Spectroscopic Inference

    We present a modular, extensible likelihood framework for spectroscopic inference based on synthetic model spectra. The subtraction of an imperfect model from a continuously sampled spectrum introduces covariance between adjacent datapoints (pixels) into the residual spectrum. For the high signal-to-noise data with large spectral range that is commonly employed in stellar astrophysics, that covariant structure can lead to dramatically underestimated parameter uncertainties (and, in some cases, biases). We construct a likelihood function that accounts for the structure of the covariance matrix, utilizing the machinery of Gaussian process kernels. This framework specifically addresses the common problem of mismatches in model spectral line strengths (with respect to data) due to intrinsic model imperfections (e.g., in the atomic/molecular databases or opacity prescriptions) by developing a novel local covariance kernel formalism that identifies and self-consistently downweights pathological spectral line "outliers." By fitting many spectra in a hierarchical manner, these local kernels provide a mechanism to learn about and build data-driven corrections to synthetic spectral libraries. An open-source software implementation of this approach is available at http://iancze.github.io/Starfish, including a sophisticated probabilistic scheme for spectral interpolation when using model libraries that are sparsely sampled in the stellar parameters. We demonstrate some salient features of the framework by fitting the high resolution V-band spectrum of WASP-14, an F5 dwarf with a transiting exoplanet, and the moderate resolution K-band spectrum of Gliese 51, an M5 field dwarf. Comment: Accepted to ApJ. Incorporated referees' comments. New figures 1, 8, 10, 12, and 14. Supplemental website: http://iancze.github.io/Starfish
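    The global-plus-local kernel idea can be sketched as a Gaussian-process log-likelihood on the residual spectrum: a squared-exponential global kernel models correlated residuals across pixels, while local kernels centred on listed wavelengths inflate the variance around pathological lines. Kernel forms, names, and parameters here are assumptions for illustration, not Starfish's actual API:

```python
import numpy as np

def spectral_loglike(resid, wl, noise_var, a_global, l_global, local_lines=()):
    """Gaussian-process log-likelihood for a residual spectrum (data minus
    model). C = diag(noise) + global kernel + sum of local patches; each
    entry of local_lines is (centre_wavelength, amplitude, width) and adds
    a rank-one bump that downweights an outlier line's contribution."""
    d = wl[:, None] - wl[None, :]
    C = np.diag(noise_var) + a_global * np.exp(-0.5 * (d / l_global) ** 2)
    for mu, amp, width in local_lines:
        w = np.exp(-0.5 * ((wl - mu) / width) ** 2)
        C += amp * np.outer(w, w)                  # local covariance patch
    sign, logdet = np.linalg.slogdet(C)
    alpha = np.linalg.solve(C, resid)
    return -0.5 * (resid @ alpha + logdet + len(wl) * np.log(2 * np.pi))
```

    Inflating the variance locally, rather than masking pixels outright, keeps the likelihood differentiable and lets the hierarchical fit learn which lines are systematically wrong across many spectra.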