
Neural network approximations to posterior densities: an analytical approach

By L.F. Hoogerheide, J.F. Kaashoek and H.K. van Dijk

Abstract

In Hoogerheide, Kaashoek and Van Dijk (2002) the class of neural network sampling methods is introduced to sample from a target (posterior) distribution that may be multi-modal or skew, or exhibit strong correlation among the parameters. In these methods the neural network is used as an importance function in importance sampling (IS) or as a candidate density in the Metropolis-Hastings (MH) algorithm. In this note we suggest an analytical approach to estimate the moments of a certain (target) distribution, where `analytical' refers to the fact that no sampling algorithm such as MH or IS is needed. We show an example in which our analytical approach is feasible, even in a case where a `standard' Gibbs approach would fail or be extremely slow.

Keywords: Markov chain Monte Carlo; Bayesian inference; importance sampling; neural networks
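To illustrate the importance-sampling use of a candidate density mentioned in the abstract, the sketch below computes self-normalised IS estimates of a target's mean and variance. The bimodal target, the Gaussian-mixture candidate (a simple stand-in for a fitted neural-network approximation), and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unnormalised bimodal target, standing in for a posterior:
# mixture of N(-2, 1) and N(2, 1) with weights 0.7 and 0.3
def target_kernel(x):
    return 0.7 * np.exp(-0.5 * (x + 2) ** 2) + 0.3 * np.exp(-0.5 * (x - 2) ** 2)

# Candidate density: an equal-weight two-component Gaussian mixture,
# playing the role of the neural-network importance function
def candidate_pdf(x):
    return 0.5 * (np.exp(-0.5 * (x + 2) ** 2)
                  + np.exp(-0.5 * (x - 2) ** 2)) / np.sqrt(2 * np.pi)

def candidate_draw(n):
    comp = rng.integers(0, 2, size=n)               # pick a mixture component
    return rng.normal(np.where(comp == 0, -2.0, 2.0), 1.0)

# Self-normalised importance sampling: weighted averages approximate
# moments under the target, without needing its normalising constant
n = 100_000
x = candidate_draw(n)
w = target_kernel(x) / candidate_pdf(x)             # importance weights
mean_est = np.sum(w * x) / np.sum(w)                # E[x] under the target
var_est = np.sum(w * (x - mean_est) ** 2) / np.sum(w)
```

Here the true target mean is 0.7·(−2) + 0.3·2 = −0.8 and the variance is 5 − 0.8² = 4.36, so the weighted estimates should land near those values.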


Citations

  1. (1991). “A Bayesian Analysis of the Unit Root in Real Exchange Rates”
  2. (1978). “Bayesian estimates of equation system parameters: an application of integration by Monte Carlo”
  3. (1989). “Bayesian inference in econometric models using Monte Carlo integration”
  4. (1998). “Bayesian Simultaneous Equations Analysis using Reduced Rank Structures”, Econometric Theory
  5. (1984). “Experiments with some alternatives for simple importance sampling in Monte Carlo integration”
  6. (2002). “Functional Approximations to Posterior Densities: A Neural Network Approach to Efficient Sampling”, Econometric Institute report 2002-48
  7. (1980). “Further experience in Bayesian analysis using Monte Carlo integration”
  8. (1987). “Kolmogorov mapping neural network existence theorem”
  9. (1964). Monte Carlo Methods
  10. (1970). “Monte Carlo Sampling Methods using Markov Chains and their Applications”
  11. (1957). “On the representation of continuous functions of many variables by superposition of continuous functions of one variable and addition”
  12. (1984). “Stochastic Relaxation, Gibbs Distributions and the Bayesian Restoration of Images”
  13. (1996). “The Effect of Improper Priors on Gibbs Sampling in Hierarchical Linear Mixed Models”
  14. (1989). “There exists a neural network that does not make avoidable mistakes”
