Auxiliary Likelihood-Based Approximate Bayesian Computation in State Space Models
A computationally simple approach to inference in state space models is
proposed, using approximate Bayesian computation (ABC). ABC avoids evaluation
of an intractable likelihood by matching summary statistics for the observed
data with statistics computed from data simulated from the true process, based
on parameter draws from the prior. Draws that produce a 'match' between
observed and simulated summaries are retained, and used to estimate the
inaccessible posterior. With no reduction to a low-dimensional set of
sufficient statistics being possible in the state space setting, we define the
summaries as the maximum of an auxiliary likelihood function, and thereby
exploit the asymptotic sufficiency of this estimator for the auxiliary
parameter vector. We derive conditions under which this approach - including a
computationally efficient version based on the auxiliary score - achieves
Bayesian consistency. To reduce the well-documented inaccuracy of ABC in
multi-parameter settings, we propose the separate treatment of each parameter
dimension using an integrated likelihood technique. Three stochastic volatility
models for which exact Bayesian inference is either computationally
challenging, or infeasible, are used for illustration. We demonstrate that our
approach compares favorably against an extensive set of approximate and exact
comparators. An empirical illustration completes the paper.Comment: This paper is forthcoming at the Journal of Computational and
Graphical Statistics. It also supersedes the earlier arXiv paper "Approximate
Bayesian Computation in State Space Models" (arXiv:1409.8363
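The accept/reject scheme described above can be sketched on a toy model. Everything concrete below (a Gaussian data process, a N(0, 25) prior, the sample mean as summary, the tolerance) is an illustrative assumption, not the paper's auxiliary-likelihood construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=200):
    # hypothetical data-generating process: N(theta, 1) observations
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    # toy low-dimensional summary statistic (sample mean)
    return x.mean()

def abc_rejection(observed, n_draws=5000, tol=0.1):
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.normal(0.0, 5.0)          # parameter draw from the prior
        s_sim = summary(simulate(theta))      # summary of simulated data
        if abs(s_sim - s_obs) < tol:          # a 'match': retain the draw
            accepted.append(theta)
    return np.array(accepted)                 # approximate posterior sample

obs = simulate(1.5)
post = abc_rejection(obs)
```

The retained draws approximate the posterior; shrinking `tol` tightens the approximation at the cost of more rejected simulations.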
Bayesian Cointegrated Vector Autoregression models incorporating Alpha-stable noise for inter-day price movements via Approximate Bayesian Computation
We consider a statistical model for pairs of traded assets, based on a
Cointegrated Vector Auto Regression (CVAR) Model. We extend standard CVAR
models to incorporate estimation of model parameters in the presence of price
series level shifts which are not accurately modeled in the standard Gaussian
error correction model (ECM) framework. This involves developing a novel matrix
variate Bayesian CVAR mixture model comprised of Gaussian errors intra-day and
Alpha-stable errors inter-day in the ECM framework. To achieve this we derive a
novel conjugate posterior model for the Scaled Mixtures of Normals (SMiN CVAR)
representation of Alpha-stable inter-day innovations. These results are
generalized to asymmetric models for the innovation noise at inter-day
boundaries allowing for skewed Alpha-stable models.
Our proposed model and sampling methodology are general, incorporating the
current literature on Gaussian models as a special subclass and also allowing
for price series level shifts either at randomly estimated time points or at
time points known a priori. We focus our analysis on regularly observed
non-Gaussian level shifts that can have a significant effect on estimation
performance in
statistical models failing to account for such level shifts, such as at the
close and open of markets. We compare the estimation accuracy of our model and
estimation approach to standard frequentist and Bayesian procedures for CVAR
models when non-Gaussian price series level shifts are present in the
individual series, such as at inter-day boundaries. We fit a bi-variate
Alpha-stable model to the inter-day jumps and model the effect of such jumps on
estimation of matrix-variate CVAR model parameters using both the
likelihood-based Johansen procedure and Bayesian estimation. We illustrate our
model and the
corresponding estimation procedures we develop on both synthetic and actual
data.

Comment: 30 pages.
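For background on the Alpha-stable innovations used above: symmetric α-stable variates have no closed-form density but are easy to simulate (the property that simulation-based inference exploits). A standard generator is the Chambers-Mallows-Stuck transform; α = 1.5 below is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(5)

def symmetric_stable(alpha, size):
    # Chambers-Mallows-Stuck sampler for symmetric alpha-stable draws
    # (beta = 0, unit scale, valid for alpha != 1)
    v = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    w = rng.exponential(1.0, size)                 # exponential mixing
    return (np.sin(alpha * v) / np.cos(v) ** (1 / alpha)
            * (np.cos(v - alpha * v) / w) ** ((1 - alpha) / alpha))

x = symmetric_stable(1.5, 200_000)
tail_frac = np.mean(np.abs(x) > 10)   # heavy power-law tails vs. Gaussian
```

For α = 1.5 the tails decay like x^(-1.5), so a visible fraction of draws exceeds 10 in absolute value, unlike for Gaussian noise.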
Variational Bayes with Intractable Likelihood
Variational Bayes (VB) is rapidly becoming a popular tool for Bayesian
inference in statistical modeling. However, the existing VB algorithms are
restricted to cases where the likelihood is tractable, which precludes the use
of VB in many interesting situations such as in state space models and in
approximate Bayesian computation (ABC), where application of VB methods was
previously impossible. This paper extends the scope of application of VB to
cases where the likelihood is intractable, but can be estimated unbiasedly. The
proposed VB method therefore makes it possible to carry out Bayesian inference
in many statistical applications, including state space models and ABC. The
method is generic in the sense that it can be applied to almost all statistical
models without requiring too much model-based derivation, which is a drawback
of many existing VB algorithms. We also show how the proposed method can be
used to obtain highly accurate VB approximations of marginal posterior
distributions.

Comment: 40 pages, 6 figures.
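As a rough illustration of the idea (not the paper's algorithm), one can run reparameterization-gradient VB on a toy latent-variable model, replacing the exact likelihood gradient with a simulation-based estimate; the model, prior, and learning rate below are all hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy latent-variable model:  z ~ N(theta, 1),  y ~ N(z, 1),
# so y | theta ~ N(theta, 2). We pretend p(y | theta) is intractable and
# use a Monte Carlo estimate of its gradient instead.
y_obs = 2.0

def grad_log_lik_hat(theta, n_mc=200):
    # z = theta + u with u ~ N(0, 1); differentiating the log of the MC
    # likelihood estimate mean_i exp(-0.5*(y - theta - u_i)^2) in theta
    # gives a self-normalized ratio:
    u = rng.normal(size=n_mc)
    r = y_obs - theta - u
    w = np.exp(-0.5 * r ** 2)
    return np.sum(w * r) / np.sum(w)

# Gaussian variational family q(theta) = N(mu, exp(2*rho)); prior N(0, 10^2)
mu, rho, lr = 0.0, 0.0, 0.02
for _ in range(2000):
    eps = rng.normal()
    sd = np.exp(rho)
    theta = mu + sd * eps                      # reparameterization trick
    g = grad_log_lik_hat(theta) - theta / 100  # est. grad of log joint
    mu += lr * g                               # ELBO ascent in mu
    rho += lr * (g * sd * eps + 1.0)           # +1 from q's entropy term

# for this toy model the exact posterior is N(1.961, 1.961), so (mu, e^rho)
# should settle near (1.96, 1.40)
```

Because the model is actually tractable here, the fit can be checked against the exact Gaussian posterior.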
Stochastic Volatility Filtering with Intractable Likelihoods
This paper is concerned with particle filtering for α-stable
stochastic volatility models. The α-stable distribution provides a
flexible framework for modeling asymmetry and heavy tails, which is useful when
modeling financial returns. An issue with this distributional assumption is the
lack of a closed form for the probability density function. To estimate the
volatility of financial returns in this setting, we develop a novel auxiliary
particle filter. The algorithm we develop can be easily applied to any hidden
Markov model for which the likelihood function is intractable or
computationally expensive. The approximate target distribution of our auxiliary
filter is based on the idea of approximate Bayesian computation (ABC). ABC
methods allow for inference on posterior quantities in situations when the
likelihood of the underlying model is not available in closed form, but
simulating samples from it is possible. The ABC auxiliary particle filter
(ABC-APF) that we propose provides not only a good alternative to state
estimation in stochastic volatility models, but it also improves on the
existing ABC literature. It allows for more flexibility in state estimation
while improving on the accuracy through better proposal distributions in cases
when the optimal importance density of the filter is unavailable in closed
form. We assess the performance of the ABC-APF on a simulated dataset from the
α-stable stochastic volatility model and compare it to other existing ABC
filters.
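A minimal ABC bootstrap particle filter conveys the core mechanism: particles are propagated through the state equation, pseudo-observations are simulated instead of evaluating the intractable observation density, and weights come from a kernel on the distance to the real datum. This is a simplified sketch, not the proposed auxiliary filter itself; Gaussian return noise stands in for α-stable noise and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# toy stochastic volatility model (illustrative parameters):
#   h_t = mu + phi*(h_{t-1} - mu) + sigma*eta_t,   y_t = exp(h_t/2)*eps_t
mu, phi, sigma = 0.0, 0.95, 0.2

def abc_bootstrap_filter(y, n_particles=500, eps=0.3):
    stat_sd = sigma / np.sqrt(1 - phi ** 2)
    h = rng.normal(mu, stat_sd, n_particles)          # stationary init
    means = np.empty(len(y))
    for t, yt in enumerate(y):
        # propagate particles through the state transition
        h = mu + phi * (h - mu) + sigma * rng.normal(size=n_particles)
        # simulate pseudo-observations rather than evaluating the density
        y_sim = np.exp(h / 2) * rng.normal(size=n_particles)
        # ABC weights: Gaussian kernel on the distance to the real datum
        w = np.exp(-0.5 * ((y_sim - yt) / eps) ** 2)
        w /= w.sum()
        means[t] = w @ h                              # filtered mean of h_t
        h = h[rng.choice(n_particles, size=n_particles, p=w)]  # resample
    return means

# simulate a short series and filter it
T, h = 50, 0.0
y = np.empty(T)
for t in range(T):
    h = mu + phi * (h - mu) + sigma * rng.normal()
    y[t] = np.exp(h / 2) * rng.normal()
est = abc_bootstrap_filter(y)
```

Shrinking the kernel bandwidth `eps` sharpens the ABC approximation to the true filtering distribution, at the cost of more weight degeneracy.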
Hyper-g Priors for Generalized Linear Models
We develop an extension of the classical Zellner's g-prior to generalized
linear models. The prior on the hyperparameter g is handled in a flexible way,
so that any continuous proper hyperprior f(g) can be used, giving rise to a
large class of hyper-g priors. Connections with the literature are described in
detail. A fast and accurate integrated Laplace approximation of the marginal
likelihood makes inference in large model spaces feasible. For posterior
parameter estimation we propose an efficient and tuning-free
Metropolis-Hastings sampler. The methodology is illustrated with variable
selection and automatic covariate transformation in the Pima Indians diabetes
data set.

Comment: 30 pages, 12 figures, poster contribution at ISBA 201
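For the Gaussian linear model the fixed-g Bayes factor under Zellner's g-prior has a closed form (Liang et al., 2008), and a hyperprior on g can then be integrated out numerically. The simple grid quadrature below stands in for the paper's integrated Laplace approximation, and the hyperprior f(g) = (a-2)/2 * (1+g)^(-a/2) with a = 3 is one common choice, not necessarily the paper's:

```python
import numpy as np

rng = np.random.default_rng(4)

def log_bf_fixed_g(R2, n, p, g):
    # Bayes factor vs. the intercept-only model under a fixed-g Zellner
    # prior in the Gaussian linear model (closed form, Liang et al. 2008)
    return (0.5 * (n - 1 - p) * np.log1p(g)
            - 0.5 * (n - 1) * np.log1p(g * (1 - R2)))

def log_bf_hyper_g(R2, n, p, a=3.0):
    # integrate the fixed-g Bayes factor against the hyperprior
    # f(g) = (a-2)/2 * (1+g)^(-a/2) by simple quadrature on a grid
    g = np.linspace(1e-3, 1e4, 200001)
    dg = g[1] - g[0]
    vals = (log_bf_fixed_g(R2, n, p, g)
            + np.log((a - 2) / 2) - (a / 2) * np.log1p(g))
    m = vals.max()
    return m + np.log(np.sum(np.exp(vals - m)) * dg)  # log-sum-exp trick

def r_squared(x, y):
    # R^2 of a simple regression with intercept (= squared correlation)
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) ** 2 / ((x @ x) * (y @ y))

# toy check: a genuine predictor vs. a pure-noise predictor
n = 100
x, x_noise = rng.normal(size=n), rng.normal(size=n)
y = 2 * x + rng.normal(size=n)
lb_signal = log_bf_hyper_g(r_squared(x, y), n, 1)
lb_noise = log_bf_hyper_g(r_squared(x_noise, y), n, 1)
```

The informative predictor receives a much larger marginal-likelihood weight than the noise predictor, which is what drives variable selection under hyper-g priors.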
Divide and conquer in ABC: Expectation-Propagation algorithms for likelihood-free inference
ABC algorithms are notoriously expensive in computing time, as they require
simulating many complete artificial datasets from the model. We advocate in
this paper a "divide and conquer" approach to ABC, where we split the
likelihood into n factors, and combine in some way n "local" ABC approximations
of each factor. This has two advantages: (a) such an approach is typically much
faster than standard ABC and (b) it makes it possible to use local summary
statistics (i.e. summary statistics that depend only on the data-points that
correspond to a single factor), rather than global summary statistics (that
depend on the complete dataset). This greatly alleviates the bias introduced by
summary statistics, and even removes it entirely in situations where local
summary statistics are simply the identity function.
We focus on EP (Expectation-Propagation), a convenient and powerful way to
combine n local approximations into a global approximation. Compared to the
EP-ABC approach of Barthelmé and Chopin (2014), we present two variations, one
based on the parallel EP algorithm of Cseke and Heskes (2011), which has the
advantage of being implementable on a parallel architecture, and one version
which bridges the gap between standard EP and parallel EP. We illustrate our
approach with an expensive application of ABC, namely inference on spatial
extremes.

Comment: To appear in the forthcoming Handbook of Approximate Bayesian
Computation (ABC), edited by S. Sisson, Y. Fan, and M. Beaumont.
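The factorized idea can be sketched for i.i.d. Gaussian data with one factor per observation, where the local summary is the identity and each local ABC posterior is summarized by a Gaussian whose prior contribution is divided back out when combining. This is a single-pass, moment-matched simplification, not the full EP recursion of the paper, and all model choices are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

PRIOR_MEAN, PRIOR_VAR = 0.0, 25.0

def local_abc(y_i, n_draws=4000, tol=0.2):
    # ABC on a single factor; the local summary is the datum itself
    # (identity), so no summary-statistic bias is introduced
    theta = rng.normal(PRIOR_MEAN, np.sqrt(PRIOR_VAR), n_draws)  # prior
    y_sim = rng.normal(theta)                 # simulate one datum per draw
    acc = theta[np.abs(y_sim - y_i) < tol]    # local accept/reject
    return acc.mean(), acc.var()              # Gaussian site approximation

def combine_sites(sites):
    # precision-weighted product of the local Gaussians, dividing the
    # prior (which every local run used) back out of each site
    prec = 1.0 / PRIOR_VAR
    mean_prec = PRIOR_MEAN / PRIOR_VAR
    for m, v in sites:
        prec += 1.0 / v - 1.0 / PRIOR_VAR
        mean_prec += m / v - PRIOR_MEAN / PRIOR_VAR
    return mean_prec / prec, 1.0 / prec

y = rng.normal(1.0, 1.0, 20)                  # i.i.d. data, one factor each
post_mean, post_var = combine_sites([local_abc(yi) for yi in y])
```

Each local ABC run only has to match a single datum, so acceptance rates stay workable even though the global tolerance would be prohibitively small.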
Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation
We present a novel family of deep neural architectures, partially
exchangeable networks (PENs), that leverage probabilistic symmetries. By design,
PENs are invariant to block-switch transformations, which characterize the
partial exchangeability properties of conditionally Markovian processes.
Moreover, we show that any block-switch invariant function has a PEN-like
representation. The DeepSets architecture is a special case of PEN and we can
therefore also target fully exchangeable data. We employ PENs to learn summary
statistics in approximate Bayesian computation (ABC). When comparing PENs to
previous deep learning methods for learning summary statistics, our results are
highly competitive for both time-series and static models. Indeed, PENs provide
more reliable posterior samples even when using less training data.

Comment: Forthcoming in the Proceedings of ICML 2019. New comparisons with
several different networks. We now use the Wasserstein distance to produce
comparisons. Code available on GitHub. 16 pages, 5 figures, 21 tables.
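The block-switch invariance can be demonstrated concretely: pool a feature map over consecutive pairs (the Markov transitions) and combine the result with the first state. The feature maps below are arbitrary fixed functions standing in for the trained inner and outer networks of a PEN:

```python
import numpy as np

def pen_summary(x, phi, rho):
    # pool a feature map phi over consecutive pairs (the Markov
    # transitions), then combine with the first state via rho
    pairs = np.stack([x[:-1], x[1:]], axis=1)
    pooled = phi(pairs).sum(axis=0)
    return rho(x[0], pooled)

# toy feature maps standing in for trained networks (hypothetical choices)
phi = lambda p: np.column_stack([p[:, 0] * p[:, 1],
                                 np.sin(p[:, 0] + p[:, 1])])
rho = lambda x0, s: np.concatenate([[x0], s])

# block-switch transformation: swap the two blocks delimited by the
# recurring value 1; the multiset of transitions (and x[0]) is unchanged
x1 = np.array([1.0, 2.0, 3.0, 1.0, 4.0, 5.0, 1.0])
x2 = np.array([1.0, 4.0, 5.0, 1.0, 2.0, 3.0, 1.0])
s1 = pen_summary(x1, phi, rho)
s2 = pen_summary(x2, phi, rho)
```

Because summation over the pooled pairs is order-invariant, the two block-switched sequences map to the same summary, while a sequence with different transitions generally does not.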