Implicit Copulas from Bayesian Regularized Regression Smoothers
We show how to extract the implicit copula of a response vector from a
Bayesian regularized regression smoother with Gaussian disturbances. The copula
can be used to compare smoothers that employ different shrinkage priors and
function bases. We illustrate with three popular choices of shrinkage priors
--- a pairwise prior, the horseshoe prior and a g prior augmented with a point
mass as employed for Bayesian variable selection --- and both univariate and
multivariate function bases. The implicit copulas are high-dimensional, have
flexible dependence structures that are far from that of a Gaussian copula, and
are unavailable in closed form. However, we show how they can be evaluated by
first constructing a Gaussian copula conditional on the regularization
parameters, and then integrating over these. Combined with non-parametric
margins, the regularized smoothers can be used to model the distribution of
non-Gaussian univariate responses conditional on the covariates. Efficient
Markov chain Monte Carlo schemes for evaluating the copula are given for this
case. Using both simulated and real data, we show how such copula smoothing
models can improve the quality of resulting function estimates and predictive
distributions.
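The evaluation strategy described above (a Gaussian copula conditional on the regularization parameters, then an integral over those parameters) can be sketched with a generic Monte Carlo average. This is an illustration, not the paper's MCMC scheme; the function names, the draws of the regularization parameter tau, and the map from tau to a correlation matrix are all assumptions of the sketch.

```python
import numpy as np
from scipy.stats import norm

def gaussian_copula_logpdf(u, R):
    """Log-density of a Gaussian copula with correlation matrix R at u in (0,1)^d."""
    z = norm.ppf(u)                     # map uniforms to standard normal scores
    _, logdet = np.linalg.slogdet(R)
    quad = z @ (np.linalg.inv(R) - np.eye(len(z))) @ z
    return -0.5 * (logdet + quad)

def implicit_copula_logpdf(u, corr_given_tau, tau_draws):
    """log c(u): log-mean-exp of conditional Gaussian copula densities over tau draws.

    corr_given_tau and tau_draws are illustrative stand-ins for the smoother's
    conditional correlation matrix and draws of its regularization parameter.
    """
    logs = np.array([gaussian_copula_logpdf(u, corr_given_tau(t)) for t in tau_draws])
    m = logs.max()                      # log-sum-exp trick for numerical stability
    return m + np.log(np.mean(np.exp(logs - m)))
```

For a ridge-type smoother, for instance, one might let corr_given_tau return the correlation matrix of tau^2 B B' + I for a basis matrix B; averaging over tau draws then yields a copula that is no longer Gaussian.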
Structure Learning in Coupled Dynamical Systems and Dynamic Causal Modelling
Identifying a coupled dynamical system out of many plausible candidates, each
of which could serve as the underlying generator of some observed measurements,
is a profoundly ill-posed problem that commonly arises when modelling
real-world phenomena. In this review, we detail a set of statistical procedures for
inferring the structure of nonlinear coupled dynamical systems (structure
learning), which has proved useful in neuroscience research. A key focus here
is the comparison of competing models of (i.e., hypotheses about) network
architectures and implicit coupling functions in terms of their Bayesian model
evidence. These methods are collectively referred to as dynamic causal
modelling (DCM). We focus on a relatively new approach that is proving
remarkably useful; namely, Bayesian model reduction (BMR), which enables rapid
evaluation and comparison of models that differ in their network architecture.
We illustrate the usefulness of these techniques through modelling
neurovascular coupling (cellular pathways linking neuronal and vascular
systems), whose function is an active focus of research in neurobiology and the
imaging of coupled neuronal systems.
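Under the Gaussian (Laplace) assumptions typical of DCM, Bayesian model reduction scores a reduced model analytically from the full model's prior and posterior alone, with no refitting. A minimal sketch of that Gaussian identity is below; the variable names are illustrative, and the code is not tied to SPM or any particular DCM implementation.

```python
import numpy as np

def bayesian_model_reduction(m0, P0, m, P, m0r, P0r):
    """Gaussian BMR: reduced posterior (mr, Pr) and change in log-evidence dF.

    m0, P0   : prior mean / precision of the full model
    m,  P    : posterior mean / precision of the full model
    m0r, P0r : prior mean / precision of the reduced model
    """
    Pr = P + P0r - P0                                    # reduced posterior precision
    mr = np.linalg.solve(Pr, P @ m + P0r @ m0r - P0 @ m0)
    ld = lambda A: np.linalg.slogdet(A)[1]               # log-determinant
    dF = 0.5 * (ld(P0r) - ld(P0) + ld(P) - ld(Pr)) \
       + 0.5 * (mr @ Pr @ mr - m @ P @ m - m0r @ P0r @ m0r + m0 @ P0 @ m0)
    return mr, Pr, dF
```

Comparing candidate network architectures then amounts to evaluating dF for each reduced prior, for example one that switches off a coupling parameter by shrinking its prior variance toward zero.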
Fast Gibbs sampling for high-dimensional Bayesian inversion
Solving ill-posed inverse problems by Bayesian inference has recently
attracted considerable attention. Compared to deterministic approaches, the
probabilistic representation of the solution by the posterior distribution can
be exploited to explore and quantify its uncertainties. In applications where
the inverse solution is subject to further analysis procedures, this can be a
significant advantage. Alongside theoretical progress, various new
computational techniques make it possible to sample very high-dimensional posterior
distributions: in [Lucka2012], a Markov chain Monte Carlo (MCMC) posterior
sampler was developed for linear inverse problems with ℓ1-type priors. In
this article, we extend this single component Gibbs-type sampler to a wide
range of priors used in Bayesian inversion, such as general priors
with additional hard constraints. Besides a fast computation of the
conditional, single component densities in an explicit, parameterized form, a
fast, robust and exact sampling from these one-dimensional densities is key to
obtaining an efficient algorithm. We demonstrate that a generalization of slice
sampling can utilize their specific structure for this task and illustrate the
performance of the resulting slice-within-Gibbs samplers on several computed
examples. These new samplers allow us to perform sample-based Bayesian
inference in high-dimensional scenarios with certain priors for the first time,
including the inversion of computed tomography (CT) data with the popular
isotropic total variation (TV) prior.
Comment: submitted to "Inverse Problems
- …
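The two ingredients above (single-component Gibbs updates and slice sampling of each one-dimensional conditional) can be illustrated generically. The sketch below is a textbook slice-within-Gibbs sampler in the style of Neal's stepping-out procedure, not the paper's specialized exact sampler for explicitly parameterized conditionals; logpost and all other names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def slice_sample_1d(logpdf, x0, w=1.0, max_steps=50):
    """One slice-sampling update of a univariate log-density (stepping out + shrinkage)."""
    logy = logpdf(x0) + np.log(rng.random())   # vertical level defining the slice
    L = x0 - w * rng.random()                  # random initial bracket of width w
    R = L + w
    j = int(max_steps * rng.random())          # step the bracket out until it
    k = max_steps - 1 - j                      # covers the slice
    while j > 0 and logpdf(L) > logy:
        L -= w; j -= 1
    while k > 0 and logpdf(R) > logy:
        R += w; k -= 1
    while True:                                # shrink the bracket until accepted
        x1 = L + (R - L) * rng.random()
        if logpdf(x1) > logy:
            return x1
        if x1 < x0:
            L = x1
        else:
            R = x1

def slice_within_gibbs(logpost, x, n_iter=1000):
    """Single-component Gibbs sweeps, each conditional updated by slice sampling."""
    x = np.asarray(x, dtype=float).copy()
    samples = np.empty((n_iter, len(x)))
    for t in range(n_iter):
        for i in range(len(x)):
            # full conditional of component i with all other components fixed
            cond = lambda xi, i=i: logpost(np.concatenate([x[:i], [xi], x[i + 1:]]))
            x[i] = slice_sample_1d(cond, x[i])
        samples[t] = x
    return samples
```

In the paper's setting the one-dimensional conditionals are available in explicit, parameterized form, which is what makes each update fast and exact even in very high dimensions; the generic sampler above only conveys the overall sweep structure.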