10,946 research outputs found
Bayesian astrostatistics: a backward look to the future
This perspective chapter briefly surveys: (1) past growth in the use of
Bayesian methods in astrophysics; (2) current misconceptions about both
frequentist and Bayesian statistical inference that hinder wider adoption of
Bayesian methods by astronomers; and (3) multilevel (hierarchical) Bayesian
modeling as a major future direction for research in Bayesian astrostatistics,
exemplified in part by presentations at the first ISI invited session on
astrostatistics, commemorated in this volume. It closes with an intentionally
provocative recommendation for astronomical survey data reporting, motivated by
the multilevel Bayesian perspective on modeling cosmic populations: that
astronomers cease producing catalogs of estimated fluxes and other source
properties from surveys. Instead, summaries of likelihood functions (or
marginal likelihood functions) for source properties should be reported (not
posterior probability density functions), including nontrivial summaries (not
simply upper limits) for candidate objects that do not pass traditional
detection thresholds.

Comment: 27 pp, 4 figures. A lightly revised version of a chapter in
"Astrostatistical Challenges for the New Astronomy" (Joseph M. Hilbe, ed.,
Springer, New York, forthcoming in 2012), the inaugural volume for the
Springer Series in Astrostatistics. Version 2 has minor clarifications and an
additional reference.
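The chapter's closing recommendation, reporting likelihood summaries rather than thresholded catalog entries, can be made concrete with a toy counting experiment (all numbers here are hypothetical, not from the chapter): for a pixel with Poisson counts over a known background, summarize the likelihood over source flux, including for a candidate that would fail a traditional detection cut.

```python
import math

import numpy as np

# Toy illustration (hypothetical numbers): a survey pixel records n counts
# over a known background rate b.  Rather than thresholding on significance,
# summarize the likelihood L(s) = Poisson(n; s + b) over source flux s.
def flux_likelihood(n_counts, background, s_grid):
    """Poisson likelihood of source flux s given n observed counts."""
    mu = s_grid + background
    log_l = n_counts * np.log(mu) - mu - math.lgamma(n_counts + 1)
    return np.exp(log_l - log_l.max())   # normalized so the peak equals 1

n_counts, background = 8, 6.0            # a sub-threshold "candidate"
s_grid = np.linspace(0.0, 15.0, 301)
like = flux_likelihood(n_counts, background, s_grid)

# A compact catalog summary: the maximum-likelihood flux and the interval
# where the likelihood exceeds e^{-1/2} of its peak (roughly 1-sigma).
s_mle = s_grid[np.argmax(like)]
inside = s_grid[like >= math.exp(-0.5)]
print(s_mle, inside.min(), inside.max())
```

Note that the summary is nontrivial even though the peak sits near the background level; a catalog row carrying (s_mle, interval) preserves information that an upper limit alone would discard.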
A generalized Bayesian inference method for constraining the interiors of super Earths and sub-Neptunes
We aim to present a generalized Bayesian inference method for constraining
interiors of super Earths and sub-Neptunes. Our methodology succeeds in
quantifying the degeneracy and correlation of structural parameters for high
dimensional parameter spaces. Specifically, we identify what constraints can be
placed on composition and thickness of core, mantle, ice, ocean, and
atmospheric layers given observations of mass and radius, plus bulk refractory
abundance constraints (Fe, Mg, Si) derived from the host star's photospheric
composition. We employed a full probabilistic Bayesian inference
analysis that formally accounts for observational and model uncertainties.
Using a Markov chain Monte Carlo technique, we computed joint and marginal
posterior probability distributions for all structural parameters of interest.
We included state-of-the-art structural models based on self-consistent
thermodynamics of core, mantle, high-pressure ice, and liquid water.
Furthermore, we tested and compared two different atmospheric models that are
tailored for modeling thick and thin atmospheres, respectively. First, we
validate our method against Neptune. Second, we apply it to synthetic
exoplanets of fixed mass and determine the effect on interior structure and
composition when (1) radius, (2) atmospheric model, (3) data uncertainties, (4)
semi-major axes, (5) atmospheric composition (i.e., a priori assumption of
enriched envelopes versus pure H/He envelopes), and (6) prior distributions are
varied. Our main conclusions are: [...]

Comment: Astronomy & Astrophysics, 597, A37, 17 pages, 11 figures.
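The Markov chain Monte Carlo machinery described above can be sketched with a deliberately tiny stand-in (a one-parameter linear "interior model" and synthetic data, none of it from the paper): a random-walk Metropolis sampler draws the posterior of a core mass fraction given an observed radius with Gaussian uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy forward model (not the paper's): planet radius in Earth
# radii as a decreasing linear function of core mass fraction (cmf) at
# fixed mass.
def model_radius(cmf):
    return 1.5 - 0.4 * cmf

r_obs, r_err = 1.3, 0.05            # synthetic "observed" radius and 1-sigma

def log_post(cmf):
    if not 0.0 <= cmf <= 1.0:       # flat prior on [0, 1]
        return -np.inf
    return -0.5 * ((model_radius(cmf) - r_obs) / r_err) ** 2

# Random-walk Metropolis sampler over the single structural parameter.
cmf, lp = 0.5, log_post(0.5)
samples = []
for _ in range(20000):
    prop = cmf + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        cmf, lp = prop, lp_prop
    samples.append(cmf)
post = np.array(samples[2000:])     # drop burn-in
print(post.mean(), post.std())      # marginal posterior summary
```

The paper's analysis does the same thing in a much higher-dimensional space, where the joint posterior also exposes degeneracies and correlations between layer parameters.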
Estimating model evidence using data assimilation
We review the field of data assimilation (DA) from a Bayesian perspective and
show that, in addition to its by now common application to state estimation,
DA may be used for model selection. An important special case of the latter is
the discrimination between a factual model (which corresponds, to the best of
the modeller's knowledge, to the situation in the actual world in which a
sequence of events has occurred) and a counterfactual model, in which a
particular forcing or process might be absent or just quantitatively different
from the actual world. Three different ensemble-DA methods are reviewed for
this purpose: the ensemble Kalman filter (EnKF), the ensemble four-dimensional
variational smoother (En-4D-Var), and the iterative ensemble Kalman smoother
(IEnKS). An original contextual formulation of model evidence (CME) is
introduced. It is shown how to apply these three methods to compute CME, using
the approximated time-dependent probability distribution functions (pdfs) that
each of them provides in the process of state estimation. The theoretical
formulae so derived are applied to two simplified nonlinear and chaotic
models: (i) the Lorenz three-variable convection model (L63), and (ii) the
Lorenz 40-variable midlatitude atmospheric dynamics model (L95). The numerical
results of these three DA-based methods and those of an integration based on
importance sampling are compared. It is found that better CME estimates are
obtained by using DA, and the IEnKS method appears to be best among the DA
methods. Differences among the performance of the three DA-based methods are
discussed as a function of model properties. Finally, the methodology is
implemented for parameter estimation and for event attribution.
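The factual-versus-counterfactual comparison can be sketched in a linear-Gaussian analogue (a scalar state-space model of our own invention, not the paper's chaotic test cases or ensemble methods): here a Kalman filter gives the model evidence exactly through the prediction-error decomposition, and the model containing the true forcing wins.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy linear-Gaussian analogue of DA-based model evidence: a scalar AR(1)
# state with additive forcing, observed with noise.
A, Q, R = 0.9, 0.1, 0.2             # dynamics, model noise, obs noise

def simulate(forcing, n=200):
    x, ys = 0.0, []
    for _ in range(n):
        x = A * x + forcing + rng.normal(0.0, np.sqrt(Q))
        ys.append(x + rng.normal(0.0, np.sqrt(R)))
    return np.array(ys)

def log_evidence(ys, forcing):
    """log p(y_1:n | model) from the Kalman filter's innovations."""
    m, p, ll = 0.0, 1.0, 0.0
    for y in ys:
        m_pred, p_pred = A * m + forcing, A * A * p + Q   # forecast step
        s = p_pred + R                                    # innovation variance
        ll += -0.5 * (np.log(2 * np.pi * s) + (y - m_pred) ** 2 / s)
        k = p_pred / s                                    # analysis step
        m, p = m_pred + k * (y - m_pred), (1 - k) * p_pred
    return ll

ys = simulate(forcing=1.0)          # the "factual" world includes the forcing
# Positive log evidence ratio: the factual model is preferred.
print(log_evidence(ys, 1.0) - log_evidence(ys, 0.0))
```

The ensemble methods in the paper approximate this same quantity for nonlinear, chaotic dynamics, where the filter pdfs are no longer Gaussian and must be carried by an ensemble.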
Bayesian Methods for Analysis and Adaptive Scheduling of Exoplanet Observations
We describe work in progress by a collaboration of astronomers and
statisticians developing a suite of Bayesian data analysis tools for extrasolar
planet (exoplanet) detection, planetary orbit estimation, and adaptive
scheduling of observations. Our work addresses analysis of stellar reflex
motion data, where a planet is detected by observing the "wobble" of its host
star as it responds to the gravitational tug of the orbiting planet. Newtonian
mechanics specifies an analytical model for the resulting time series, but it
is strongly nonlinear, yielding complex, multimodal likelihood functions; it is
even more complex when multiple planets are present. The parameter spaces range
in size from few-dimensional to dozens of dimensions, depending on the number
of planets in the system, and the type of motion measured (line-of-sight
velocity, or position on the sky). Since orbits are periodic, Bayesian
generalizations of periodogram methods facilitate the analysis. This relies on
the model being linearly separable, enabling partial analytical
marginalization, reducing the dimension of the parameter space. Subsequent
analysis uses adaptive Markov chain Monte Carlo methods and adaptive importance
sampling to perform the integrals required for both inference (planet detection
and orbit measurement), and information-maximizing sequential design (for
adaptive scheduling of observations). We present an overview of our current
techniques and highlight directions being explored by ongoing research.

Comment: 29 pages, 11 figures. An abridged version is accepted for publication
in Statistical Methodology for a special issue on astrostatistics, with
selected (refereed) papers presented at the Astronomical Data Analysis
Conference (ADA VI) held in Monastir, Tunisia, in May 2010. Update corrects
equation (3).
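The key structural point above, that the sinusoidal reflex-motion model is linear in its amplitudes at fixed period, can be sketched with synthetic radial-velocity data (all numbers hypothetical): for each trial period the amplitudes are fit analytically by least squares (profiled here, rather than marginalized as in the Bayesian periodogram), leaving a one-dimensional scan over period.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic single-planet radial-velocity data at irregular observing times.
t = np.sort(rng.uniform(0.0, 100.0, 60))           # times (days)
p_true = 12.5                                      # true orbital period
v = 30.0 * np.sin(2 * np.pi * t / p_true) + rng.normal(0.0, 3.0, t.size)

def best_fit_chi2(period):
    """Profile out the linear parameters (amplitudes, offset) at fixed period."""
    phase = 2 * np.pi * t / period
    X = np.column_stack([np.sin(phase), np.cos(phase), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(X, v, rcond=None)   # analytic amplitude fit
    return np.sum((v - X @ coef) ** 2)

periods = np.linspace(5.0, 50.0, 2000)
chi2 = np.array([best_fit_chi2(p) for p in periods])
print(periods[np.argmin(chi2)])                    # should recover ~p_true
```

Reducing the search to one nonlinear dimension per planet is what makes the multimodal period likelihood tractable; the full analysis then refines and integrates with adaptive MCMC and importance sampling.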
Earth System Modeling 2.0: A Blueprint for Models That Learn From Observations and Targeted High-Resolution Simulations
Climate projections continue to be marred by large uncertainties, which
originate in processes that need to be parameterized, such as clouds,
convection, and ecosystems. But rapid progress is now within reach. New
computational tools and methods from data assimilation and machine learning
make it possible to integrate global observations and local high-resolution
simulations in an Earth system model (ESM) that systematically learns from
both. Here we propose a blueprint for such an ESM. We outline how
parameterization schemes can learn from global observations and targeted
high-resolution simulations, for example, of clouds and convection, through
matching low-order statistics between ESMs, observations, and high-resolution
simulations. We illustrate learning algorithms for ESMs with a simple dynamical
system that shares characteristics of the climate system; and we discuss the
opportunities the proposed framework presents and the challenges that remain to
realize it.

Comment: 32 pages, 3 figures.
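The idea of learning parameterizations by matching low-order statistics, rather than trajectories, admits a minimal sketch (a toy of our own, far simpler than an ESM): recover the damping parameter of an Ornstein-Uhlenbeck process, standing in for an unresolved process, by matching the observed stationary variance.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "parameterization learning": an Ornstein-Uhlenbeck process
# dx = -theta * x dt + sigma dW, integrated with Euler-Maruyama.
theta_true, sigma, dt = 0.5, 1.0, 0.01

def simulate(theta, n=200000):
    x, out = 0.0, np.empty(n)
    for i in range(n):
        x += -theta * x * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        out[i] = x
    return out

# The "observation" is a low-order statistic of the trajectory, not the
# trajectory itself (discard an initial spin-up segment).
obs_var = simulate(theta_true)[50000:].var()

# The OU stationary variance is sigma^2 / (2 theta); matching it to the
# observed variance yields a direct estimate of the parameter.
theta_hat = sigma**2 / (2.0 * obs_var)
print(theta_hat)                    # should be close to theta_true
```

For chaotic systems the matching step cannot be inverted in closed form like this, which is where the ensemble and machine-learning algorithms of the proposed framework come in.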
Imaging of a fluid injection process using geophysical data - A didactic example
In many subsurface industrial applications, fluids are injected into or
withdrawn from a geologic formation. It is of practical interest to quantify
precisely where, when, and by how much the injected fluid alters the state of
the subsurface. Routine geophysical monitoring of such processes attempts to
image the way that geophysical properties, such as seismic velocities or
electrical conductivity, change through time and space and to then make
qualitative inferences as to where the injected fluid has migrated. The more
rigorous formulation of the time-lapse geophysical inverse problem forecasts
how the subsurface evolves during the course of a fluid-injection application.
Using time-lapse geophysical signals as the data to be matched, the model
unknowns to be estimated are the multiphysics forward-modeling parameters
controlling the fluid-injection process. Once the geophysical signature of the
flow process is properly reproduced, subsequent simulations can predict the
fluid migration and alteration in the subsurface. The dynamic nature of
fluid-injection processes renders imaging problems more complex than
conventional geophysical imaging for static targets. This work intends to
clarify the related hydrogeophysical parameter estimation concepts.
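The estimation concept above, fitting flow-model parameters to time-lapse signals rather than imaging each survey separately, can be sketched with a didactic stand-in (hypothetical and far simpler than a real hydrogeophysical setup): fluid injected at the origin diffuses with an unknown diffusivity, noisy "time-lapse data" are recorded at fixed sensors, and the flow parameter is recovered by matching them.

```python
import numpy as np

rng = np.random.default_rng(4)

x_sens = np.array([5.0, 10.0, 20.0])               # sensor positions (m)
times = np.array([1.0, 2.0, 4.0])                  # survey times (days)

def forward(D):
    """Gaussian plume solution of 1-D diffusion from a unit pulse at x=0."""
    t = times[:, None]                             # broadcast times x sensors
    return np.exp(-x_sens**2 / (4 * D * t)) / np.sqrt(4 * np.pi * D * t)

# Synthetic time-lapse "geophysical" data generated from a known diffusivity.
D_true = 20.0
data = forward(D_true) + 1e-4 * rng.standard_normal((3, 3))

# Invert: scan candidate flow parameters and keep the best data fit.  The
# estimated unknown is the forward-modeling parameter D, not a sequence of
# independent property images.
D_grid = np.linspace(5.0, 50.0, 451)
misfit = [np.sum((forward(D) - data) ** 2) for D in D_grid]
print(D_grid[np.argmin(misfit)])                   # should be close to D_true
```

Because the recovered unknown is a parameter of the flow model itself, the same model can then be run forward to forecast where the fluid migrates next, which is the payoff of the coupled formulation.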