Digitizing Darwin's Library
This project, which aims to digitally reconstruct Charles Darwin's working library as it stood at the end of his life, will open up whole new dimensions of Darwin's thinking and make them accessible to students of the humanities and the sciences. Over 700 of Darwin's most heavily annotated books are held at Cambridge University Library, and the abundant handwritten notes in these books were painstakingly transcribed in the late 1980s. Now, thanks to high-resolution digital imagery and an international partnership of Cambridge, the Natural History Museum in London, the Biodiversity Heritage Library (a consortium of natural history libraries), and the Darwin Digital Library of Evolution (an online scholarly edition of Darwin's manuscripts based at the American Museum of Natural History), Darwin's transcribed marginalia will be digitally married with scanned books from his own library and with scanned surrogate copies, held by the partnership's libraries, of the exact editions Darwin owned.
Efficient Bayesian inference for multivariate factor stochastic volatility models with leverage
This paper discusses the efficient Bayesian estimation of a multivariate
factor stochastic volatility (Factor MSV) model with leverage. We propose a
novel approach to construct sampling schemes that converge to the
posterior distribution of the latent volatilities and the parameters of
interest of the Factor MSV model based on recent advances in Particle Markov
chain Monte Carlo (PMCMC). Unlike the approaches of Chib et al. (2006) and
Omori et al. (2007), our method does not require approximating the joint
distribution of outcome and volatility innovations by a mixture of bivariate
normal distributions. To sample the free elements of the loading matrix we
employ the interweaving method used in Kastner et al. (2017) in the Particle
Metropolis within Gibbs (PMwG) step. The proposed method is illustrated
empirically using a simulated dataset and a sample of daily US stock returns.
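
To fix ideas, the sketch below simulates a one-factor stochastic volatility model with leverage in Python; the parameter values, loadings, and the assumption of constant idiosyncratic variances are illustrative choices, not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

T, p = 500, 3                            # time points and observed series
mu, phi, sigma_eta = -1.0, 0.95, 0.2     # AR(1) parameters of the factor's log-variance
rho = -0.5                               # leverage: corr(factor shock at t, volatility shock at t+1)
beta = np.array([1.0, 0.8, 0.5])         # factor loadings

h = np.empty(T)                          # log-variance of the latent factor
f = np.empty(T)                          # latent factor
h[0] = mu
for t in range(T):
    # draw the return and volatility innovations jointly with correlation rho
    eps, eta = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]])
    f[t] = np.exp(h[t] / 2) * eps
    if t + 1 < T:
        h[t + 1] = mu + phi * (h[t] - mu) + sigma_eta * eta

# idiosyncratic errors with constant variances, a simplification of the full Factor MSV model
y = f[:, None] * beta + 0.1 * rng.standard_normal((T, p))
print(y.shape)                           # (500, 3) simulated return panel
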
On Scalable Particle Markov Chain Monte Carlo
Particle Markov Chain Monte Carlo (PMCMC) is a general approach to carry out
Bayesian inference in non-linear and non-Gaussian state space models. Our
article shows how to scale up PMCMC in terms of the number of observations and
parameters by expressing the target density of the PMCMC in terms of the basic
uniform or standard normal random numbers, instead of the particles, used in
the sequential Monte Carlo algorithm. Parameters that can be drawn efficiently
conditional on the particles are generated by particle Gibbs. All the other
parameters are drawn by conditioning on the basic uniform or standard normal
random variables; e.g. parameters that are highly correlated with the states,
or parameters whose generation is expensive when conditioning on the states.
The performance of this hybrid sampler is investigated empirically by applying
it to univariate and multivariate stochastic volatility models having both a
large number of parameters and a large number of latent states; the results
show that it is much more efficient than competing PMCMC methods. We also show
that the proposed hybrid sampler is ergodic.
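
The sketch below illustrates, for a toy univariate stochastic volatility model, how a particle-filter log-likelihood estimate can be written as a deterministic function of pre-drawn standard normal and uniform numbers, so that a sampler can condition on those basic random numbers rather than on the particles. It is a simplified illustration of the idea, not the paper's implementation; the model, the resampling scheme, and all names are assumptions, and the data are placeholders.

import numpy as np
from scipy.stats import norm

def pf_loglik(theta, y, z, u):
    """Bootstrap particle filter for a simple AR(1) stochastic volatility model.

    theta = (mu, phi, sigma); z: (T, N) standard normals driving the state draws;
    u: (T,) uniforms driving systematic resampling.  Fixing z and u makes the
    returned log-likelihood estimate a deterministic function of theta.
    """
    mu, phi, sigma = theta
    T, N = z.shape
    x = mu + sigma / np.sqrt(1 - phi**2) * z[0]                # initial particles
    loglik = 0.0
    for t in range(T):
        w = norm.logpdf(y[t], loc=0.0, scale=np.exp(x / 2))    # measurement log-density
        m = w.max()
        W = np.exp(w - m)
        loglik += m + np.log(W.mean())
        if t + 1 < T:
            cdf = np.cumsum(W / W.sum())
            # systematic-style resampling driven by the single uniform u[t]
            idx = np.minimum(np.searchsorted(cdf, (u[t] + np.arange(N)) / N), N - 1)
            x = mu + phi * (x[idx] - mu) + sigma * z[t + 1]    # propagate with fixed normals
    return loglik

rng = np.random.default_rng(1)
T, N = 200, 100
y = rng.standard_normal(T)                                     # placeholder data
z, u = rng.standard_normal((T, N)), rng.uniform(size=T)
print(pf_loglik((0.0, 0.95, 0.2), y, z, u))
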
Mixed Marginal Copula Modeling
This article extends the literature on copulas with discrete or continuous
marginals to the case where some of the marginals are a mixture of discrete and
continuous components. We do so by carefully defining the likelihood as the
density of the observations with respect to a mixed measure. The treatment is
quite general, although we focus on mixtures of Gaussian and Archimedean
copulas. The inference is Bayesian with the estimation carried out by Markov
chain Monte Carlo. We illustrate the methodology and algorithms by applying
them to estimate a multivariate income dynamics model.
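
As a toy illustration of a likelihood defined with respect to a mixed measure, the sketch below evaluates the contribution of one observation under a bivariate Gaussian copula whose second margin mixes a point mass at zero with an exponential tail. The margins and parameter names are illustrative assumptions, not the authors' income-dynamics specification.

import numpy as np
from scipy.stats import norm, expon, multivariate_normal

def loglik_contribution(y1, y2, rho, pi0, lam):
    """Log density of one observation with respect to the mixed measure:
    margin 1 is standard normal; margin 2 has mass pi0 at zero and an Exp(lam) tail."""
    u1 = norm.cdf(y1)
    z1 = norm.ppf(u1)
    if y2 == 0.0:
        # atom at zero: conditional copula C(v | u1) evaluated at v = pi0
        v = norm.ppf(pi0)
        return norm.logpdf(y1) + norm.logcdf((v - rho * z1) / np.sqrt(1 - rho**2))
    # continuous part: Gaussian copula density times both marginal densities
    u2 = pi0 + (1 - pi0) * expon.cdf(y2, scale=1 / lam)
    z2 = norm.ppf(u2)
    cop = multivariate_normal([0, 0], [[1, rho], [rho, 1]]).logpdf([z1, z2]) \
          - norm.logpdf(z1) - norm.logpdf(z2)
    return cop + norm.logpdf(y1) + np.log(1 - pi0) + expon.logpdf(y2, scale=1 / lam)

print(loglik_contribution(0.3, 0.0, rho=0.4, pi0=0.3, lam=1.5))   # observation on the atom
print(loglik_contribution(0.3, 2.0, rho=0.4, pi0=0.3, lam=1.5))   # observation in the continuous part
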
Variational Bayes with Intractable Likelihood
Variational Bayes (VB) is rapidly becoming a popular tool for Bayesian
inference in statistical modeling. However, the existing VB algorithms are
restricted to cases where the likelihood is tractable, which precludes the use
of VB in many interesting situations such as in state space models and in
approximate Bayesian computation (ABC), where application of VB methods was
previously impossible. This paper extends the scope of application of VB to
cases where the likelihood is intractable, but can be estimated unbiasedly. The
proposed VB method therefore makes it possible to carry out Bayesian inference
in many statistical applications, including state space models and ABC. The
method is generic in the sense that it can be applied to almost all statistical
models without requiring too much model-based derivation, which is a drawback
of many existing VB algorithms. We also show how the proposed method can be
used to obtain highly accurate VB approximations of marginal posterior
distributions.
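
The simplified sketch below conveys the flavour of variational Bayes with an estimated likelihood: a Gaussian q(theta) is fitted by score-function stochastic gradients, with log p(y | theta) replaced by the log of a Monte Carlo estimate, so the estimator never needs to be differentiated. This toy ignores the careful treatment of the estimation noise that the paper provides; the model, settings, and names are assumptions.

import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(1.0, np.sqrt(2.0), size=50)   # toy data: marginally y ~ N(theta, 2) with theta = 1

def log_lik_hat(theta, n_mc=200):
    """Log of a Monte Carlo estimate of p(y | theta) for y | x ~ N(x, 1), x ~ N(theta, 1)."""
    x = rng.normal(theta, 1.0, size=(n_mc, 1))
    log_terms = -0.5 * np.log(2 * np.pi) - 0.5 * (y - x) ** 2   # N(y_i; x_s, 1) in logs
    return np.sum(np.log(np.mean(np.exp(log_terms), axis=0)))   # average over the latent draws

m, log_s = 0.0, 0.0          # variational mean and log standard deviation
lr, S = 0.01, 20             # step size and number of q-draws per iteration
for it in range(500):
    s = np.exp(log_s)
    theta = rng.normal(m, s, size=S)
    # h_s = estimated log-likelihood + log prior - log q, up to additive constants
    h = np.array([log_lik_hat(t) for t in theta])
    h += -0.5 * theta**2 + 0.5 * ((theta - m) / s) ** 2 + np.log(s)
    h -= h.mean()            # simple baseline to reduce gradient variance
    grad_m = np.mean(h * (theta - m) / s**2)                    # score of log q w.r.t. m
    grad_log_s = np.mean(h * (((theta - m) / s) ** 2 - 1))      # score of log q w.r.t. log s
    m += lr * grad_m
    log_s += lr * grad_log_s
print(m, np.exp(log_s))      # should drift toward the posterior mean and sd of theta
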
Bayesian Deep Net GLM and GLMM
Deep feedforward neural networks (DFNNs) are a powerful tool for functional
approximation. We describe flexible versions of generalized linear and
generalized linear mixed models incorporating basis functions formed by a DFNN.
Neural networks with random effects have received little attention in the
literature, perhaps because of the computational challenges of incorporating
subject-specific parameters into already complex models.
Efficient computational methods for high-dimensional Bayesian inference are
developed using Gaussian variational approximation, with a parsimonious but
flexible factor parametrization of the covariance matrix. We implement natural
gradient methods for the optimization, exploiting the factor structure of the
variational covariance matrix in computation of the natural gradient. Our
flexible DFNN models and Bayesian inference approach lead to a regression and
classification method that has high prediction accuracy and is able to
quantify the prediction uncertainty in a principled and convenient way. We also
describe how to perform variable selection in our deep learning method. The
proposed methods are illustrated in a wide range of simulated and real-data
examples, and the results compare favourably to a state-of-the-art flexible
regression and classification method in the statistical literature, the
Bayesian additive regression trees (BART) method. User-friendly software
packages in Matlab, R and Python implementing the proposed methods are
available at https://github.com/VBayesLab.
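
The sketch below illustrates the factor parametrization of the variational covariance, Sigma = B B' + diag(d)^2, with a reparameterized one-sample ELBO estimate for a toy one-hidden-layer logistic network. It is an illustrative toy in plain numpy, not code from the released packages at https://github.com/VBayesLab; the network size, prior, and all names are assumptions.

import numpy as np
from scipy.stats import multivariate_normal
from scipy.special import expit

rng = np.random.default_rng(3)
X = rng.standard_normal((200, 2))
y = (rng.uniform(size=200) < expit(X[:, 0] - X[:, 1])).astype(float)   # toy binary labels

p = 2 * 4 + 4 + 4 + 1   # weights and biases of a 2-4-1 feedforward net
k = 2                   # number of factors in the variational covariance

def log_joint(theta):
    """Bernoulli log-likelihood of a one-hidden-layer logistic net plus a
    N(0, 10 I) prior, up to an additive constant."""
    W1, b1 = theta[:8].reshape(2, 4), theta[8:12]
    w2, b2 = theta[12:16], theta[16]
    logits = np.tanh(X @ W1 + b1) @ w2 + b2
    loglik = np.sum(y * logits - np.logaddexp(0.0, logits))
    return loglik - 0.5 * theta @ theta / 10.0

def elbo_one_sample(mu, B, d):
    """One reparameterized draw theta = mu + B z + d * eps and its ELBO term."""
    z, eps = rng.standard_normal(k), rng.standard_normal(p)
    theta = mu + B @ z + d * eps
    cov = B @ B.T + np.diag(d**2)        # the factor-parametrized covariance B B' + D^2
    return log_joint(theta) - multivariate_normal(mu, cov).logpdf(theta)

mu, B, d = np.zeros(p), 0.1 * rng.standard_normal((p, k)), 0.1 * np.ones(p)
print(elbo_one_sample(mu, B, d))         # estimates like this feed a stochastic-gradient optimizer
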