
    Computing the Gamma function using contour integrals and rational approximations

    Some of the best methods for computing the gamma function are based on numerical evaluation of Hankel's contour integral. For example, Temme evaluates this integral by applying the trapezoid rule along steepest-descent contours. Here we investigate a different approach to the integral: the application of the trapezoid rule on Talbot-type contours, using optimal parameters recently derived by Weideman for computing inverse Laplace transforms. Relatedly, we also investigate quadrature formulas derived from best rational approximations to exp(z) on the negative real axis, following Cody, Meinardus and Varga. The two methods are closely related and both converge geometrically. We find that the new methods are competitive with existing ones, even though they are based on generic tools rather than on specific analysis of the gamma function.
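
    To make the contour-quadrature idea concrete, here is a minimal NumPy sketch of the reciprocal gamma function computed by the trapezoid rule on a Hankel-type contour. It uses a simple parabolic contour s(u) = mu*(1 + i*u)^2 with rule-of-thumb spacing and scale, standing in for the Talbot-type contours and the optimal Weideman parameters referred to in the abstract; the function name and parameter choices below are illustrative, not taken from the paper.

```python
import numpy as np
from math import gamma as ref_gamma

def rgamma_hankel(z, M=16):
    """Approximate 1/Gamma(z) by the trapezoid rule applied to Hankel's
    contour integral
        1/Gamma(z) = (1/(2*pi*i)) * integral_C exp(s) * s**(-z) ds,
    using a parabolic contour s(u) = mu*(1 + i*u)**2 as a simple stand-in
    for the Talbot-type contours discussed in the abstract.  The spacing h
    and scale mu are rule-of-thumb choices, not optimized parameters."""
    h = 3.0 / M                     # node spacing (truncates the contour at |u| = 3)
    mu = np.pi * M / 12.0           # contour scale
    u = h * np.arange(-M, M + 1)
    s = mu * (1.0 + 1j * u) ** 2    # contour points
    ds = 2j * mu * (1.0 + 1j * u)   # ds/du
    return (h / (2j * np.pi)) * np.sum(np.exp(s) * s ** (-z) * ds)

if __name__ == "__main__":
    for z in (0.5, 1.0, 2.5, 4.0):
        approx = 1.0 / rgamma_hankel(z).real
        print(f"Gamma({z}) ~ {approx:.12f}   (math.gamma: {ref_gamma(z):.12f})")
```

    Because the trapezoid rule converges geometrically for integrands of this kind, increasing M buys extra digits roughly in proportion to the number of nodes, which is the behaviour the abstract refers to.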

    Black Box Variational Inference

    Variational inference has become a widely used method to approximate posteriors in complex latent variable models. However, deriving a variational inference algorithm generally requires significant model-specific analysis, and these efforts can hinder and deter us from quickly developing and exploring a variety of models for a problem at hand. In this paper, we present a "black box" variational inference algorithm, one that can be quickly applied to many models with little additional derivation. Our method is based on stochastic optimization of the variational objective, where the noisy gradient is computed from Monte Carlo samples from the variational distribution. We develop a number of methods to reduce the variance of the gradient, always maintaining the criterion that we want to avoid difficult model-based derivations. We evaluate our method against the corresponding black box sampling-based methods. We find that our method reaches better predictive likelihoods much faster than sampling methods. Finally, we demonstrate that Black Box Variational Inference lets us easily explore a wide space of models by quickly constructing and evaluating several models of longitudinal healthcare data.
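
    As a concrete illustration of the score-function ("black box") gradient estimator described above, the following sketch fits a Gaussian variational approximation to a toy conjugate model whose exact posterior is known. The model, step sizes, and the crude constant baseline used to tame the gradient variance are illustrative choices of this sketch, not the models or variance-reduction methods of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative, not from the paper): z ~ N(0, 1), x_i | z ~ N(z, 1).
# The exact posterior is Gaussian, so the BBVI result can be checked.
x = rng.normal(2.0, 1.0, size=50)

def log_joint(z):
    """log p(x, z), up to an additive constant, for an array of z values."""
    return -0.5 * z**2 - 0.5 * np.sum((x[:, None] - z)**2, axis=0)

def log_q(z, m, log_s):
    """Log density of the variational family q(z) = N(m, exp(log_s)^2)."""
    s2 = np.exp(2.0 * log_s)
    return -0.5 * np.log(2.0 * np.pi * s2) - 0.5 * (z - m)**2 / s2

# Black-box (score-function) gradient of the ELBO:
#   grad ELBO = E_q[ grad_lambda log q(z) * (log p(x, z) - log q(z)) ],
# estimated with S Monte Carlo samples from q at each step.
m, log_s = 0.0, 0.0
lr, S = 0.01, 200
for step in range(2000):
    z = m + np.exp(log_s) * rng.normal(size=S)          # samples from q
    w = log_joint(z) - log_q(z, m, log_s)               # learning signal
    w = w - w.mean()                                    # crude baseline to cut variance (stand-in for the paper's control variates)
    score_m = (z - m) / np.exp(2.0 * log_s)             # d log q / d m
    score_ls = (z - m)**2 / np.exp(2.0 * log_s) - 1.0   # d log q / d log_s
    m += lr * np.mean(score_m * w)
    log_s += lr * np.mean(score_ls * w)

n = len(x)
print("BBVI :  mean %.3f   sd %.3f" % (m, np.exp(log_s)))
print("exact:  mean %.3f   sd %.3f" % (n * x.mean() / (n + 1), (1.0 / (n + 1))**0.5))
```

    Note that only log_joint is model-specific; the update loop never differentiates the model, which is the "black box" property the abstract emphasizes.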

    parallelMCMCcombine: An R Package for Bayesian Methods for Big Data and Analytics

    Recent advances in big data and analytics research have provided a wealth of large data sets that are too big to be analyzed in their entirety, due to restrictions on computer memory or storage size. New Bayesian methods have been developed for data sets that are large only because of large sample sizes; these methods partition big data sets into subsets and perform independent Bayesian Markov chain Monte Carlo analyses on the subsets. The methods then combine the independent subset posterior samples to estimate a posterior density given the full data set. These approaches were shown to be effective for Bayesian models including logistic regression models, Gaussian mixture models and hierarchical models. Here, we introduce the R package parallelMCMCcombine, which carries out four of these techniques for combining independent subset posterior samples. We illustrate each of the methods using a Bayesian logistic regression model for simulated data and a Bayesian Gamma model for real data; we also demonstrate features and capabilities of the R package. The package assumes the user has carried out the Bayesian analysis and has produced the independent subposterior samples outside of the package. The methods are primarily suited to models with unknown parameters of fixed dimension that exist in continuous parameter spaces. We envision this tool will allow researchers to explore the various methods for their specific applications, and will assist future progress in this rapidly developing field.
    Comment: for published version see: http://www.plosone.org/article/fetchObject.action?uri=info%3Adoi%2F10.1371%2Fjournal.pone.0108425&representation=PD
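
    To illustrate the general subset-combination idea (not the interface of parallelMCMCcombine itself, which is an R package), here is a NumPy sketch of one widely used combination rule: a precision-weighted "consensus" average of aligned subset draws. The abstract does not state whether this particular rule is among the four implemented in the package, so treat it purely as an illustration of how independent subset posterior samples can be merged.

```python
import numpy as np

def consensus_combine(subset_draws):
    """Precision-weighted ("consensus") combination of subset posterior draws.

    subset_draws: list of arrays, one per data subset, each of shape
                  (n_draws, dim) with draws aligned by index.
    Combined draw g is  (sum_s W_s)^{-1} @ sum_s (W_s @ theta_s[g]),
    where W_s is the inverse sample covariance of subset s's draws.
    """
    weights = [np.linalg.inv(np.atleast_2d(np.cov(d, rowvar=False)))
               for d in subset_draws]
    total = np.linalg.inv(sum(weights))
    n_draws, dim = subset_draws[0].shape
    combined = np.empty((n_draws, dim))
    for g in range(n_draws):
        acc = np.zeros(dim)
        for w, d in zip(weights, subset_draws):
            acc += w @ d[g]
        combined[g] = total @ acc
    return combined

# Tiny demo: two "subset posteriors" that are Gaussian by construction.
rng = np.random.default_rng(0)
s1 = rng.multivariate_normal([1.0, 0.0], [[0.20, 0.05], [0.05, 0.10]], size=5000)
s2 = rng.multivariate_normal([1.4, 0.2], [[0.10, 0.02], [0.02, 0.30]], size=5000)
combined = consensus_combine([s1, s2])
print(combined.mean(axis=0))   # lies between the subset means, weighted by precision
```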

    Fast and scalable non-parametric Bayesian inference for Poisson point processes

    We study the problem of non-parametric Bayesian estimation of the intensity function of a Poisson point process. The observations are n independent realisations of a Poisson point process on the interval [0, T]. We propose two related approaches. In both approaches we model the intensity function as piecewise constant on N bins forming a partition of the interval [0, T]. In the first approach the coefficients of the intensity function are assigned independent gamma priors, leading to a closed-form posterior distribution. On the theoretical side, we prove that as n → ∞, the posterior asymptotically concentrates around the "true", data-generating intensity function at an optimal rate for h-Hölder regular intensity functions (0 < h ≤ 1). In the second approach we employ a gamma Markov chain prior on the coefficients of the intensity function. The posterior distribution is no longer available in closed form, but inference can be performed using a straightforward version of the Gibbs sampler. Both approaches scale well with sample size, but the second is much less sensitive to the choice of N. Practical performance of our methods is first demonstrated via synthetic data examples. We compare our second method with other existing approaches on the UK coal mining disasters data. Furthermore, we apply it to the US mass shootings data and Donald Trump's Twitter data.
    Comment: 45 pages, 22 figures
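
    The first approach above is conjugate, and a few lines of NumPy show why the posterior has closed form: pooling the n realisations, the count in a bin of width T/N is Poisson with mean n*lambda_j*T/N, so an independent gamma prior on each lambda_j yields an independent gamma posterior bin by bin. The Gamma(a0, b0) shape/rate parametrisation and the synthetic intensity below are illustrative assumptions, not the paper's choices.

```python
import numpy as np

def bin_posteriors(event_times, n_real, T, N, a0=1.0, b0=1.0):
    """Closed-form posterior for a piecewise-constant intensity on N bins.

    event_times: pooled event times from n_real independent realisations
                 of the process on [0, T].
    Each bin's intensity gets an (assumed) Gamma(a0, b0) shape/rate prior;
    with count_j events in bin j, the posterior is
        lambda_j | data ~ Gamma(a0 + count_j, b0 + n_real * T / N).
    """
    counts, _ = np.histogram(event_times, bins=N, range=(0.0, T))
    shape = a0 + counts
    rate = b0 + n_real * (T / N)
    return shape, rate, shape / rate          # posterior mean per bin

# Synthetic check: n realisations of a process with intensity 2 + sin(t),
# simulated by thinning a homogeneous process of rate lam_max.
rng = np.random.default_rng(1)
n, T, N, lam_max = 20, 10.0, 25, 3.0
events = []
for _ in range(n):
    t = rng.uniform(0.0, T, size=rng.poisson(lam_max * T))
    events.append(t[rng.uniform(size=t.size) < (2.0 + np.sin(t)) / lam_max])
events = np.concatenate(events)

shape, rate, post_mean = bin_posteriors(events, n, T, N)
print(np.round(post_mean, 2))   # should track 2 + sin(t) at the bin centres
```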