    On Approximations of the Beta Process in Latent Feature Models

    The beta process has recently been widely used as a nonparametric prior for different models in machine learning, including latent feature models. In this paper, we prove the asymptotic consistency of the finite dimensional approximation of the beta process due to Paisley \& Carin (2009). In addition, we derive an almost sure approximation of the beta process. This approximation provides a direct method to efficiently simulate the beta process. A simulated example, illustrating the method and comparing its performance to several existing algorithms, is also included. Comment: 25 pages.
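    As a rough illustration of the kind of finite-dimensional approximation discussed above, the sketch below simulates a truncated beta process and the Bernoulli latent-feature draws it induces. It assumes the commonly cited Paisley \& Carin (2009) parameterization pi_k ~ Beta(a/K, b(K-1)/K); the uniform base measure, the parameter values, and the helper names are illustrative choices, not taken from the paper.

```python
import numpy as np

# Hedged sketch: finite-dimensional approximation of a beta process
# B_K = sum_k pi_k * delta_{omega_k}, assuming the parameterization
# pi_k ~ Beta(a/K, b*(K-1)/K) with atoms omega_k drawn i.i.d. from the
# base measure (Uniform(0, 1) here, purely for illustration).

def finite_beta_process(K, a=1.0, b=1.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    weights = rng.beta(a / K, b * (K - 1) / K, size=K)   # feature probabilities
    atoms = rng.uniform(0.0, 1.0, size=K)                # atom locations
    return atoms, weights

def sample_latent_features(weights, n, rng=None):
    # Bernoulli process draws: binary feature matrix Z with Z[i, k] ~ Bern(pi_k)
    rng = np.random.default_rng() if rng is None else rng
    return rng.binomial(1, weights, size=(n, len(weights)))

rng = np.random.default_rng(0)
atoms, weights = finite_beta_process(K=100, a=2.0, b=1.0, rng=rng)
Z = sample_latent_features(weights, n=10, rng=rng)
print(Z.sum(axis=1))   # number of active features per data point
```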

    Bayesian Model Selection for Beta Autoregressive Processes

    We deal with Bayesian inference for beta autoregressive processes, restricting our attention to the class of conditionally linear processes. These processes are particularly suitable for forecasting purposes, but are difficult to estimate due to the constraints on the parameter space. We provide a full Bayesian approach to estimation and include the parameter restrictions in the inference problem through a suitable specification of the prior distributions. Moreover, in a Bayesian framework, parameter estimation and model choice can be solved simultaneously. In particular, we suggest a Markov chain Monte Carlo (MCMC) procedure based on a Metropolis-Hastings within Gibbs algorithm and solve the model selection problem following a reversible jump MCMC approach.
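    The sketch below shows what a Metropolis-Hastings within Gibbs sampler for a beta autoregression can look like. It uses a hypothetical conditionally linear beta AR(1), y_t | y_{t-1} ~ Beta(phi*mu_t, phi*(1-mu_t)) with mu_t = alpha + beta*y_{t-1} and the stationarity-type constraints alpha > 0, beta >= 0, alpha + beta < 1; this parameterization, the flat priors, and all names are assumptions for illustration, not the paper's specification, and the reversible jump model-selection step is not included.

```python
import numpy as np
from scipy.stats import beta as beta_dist

# Hedged sketch of a Metropolis-within-Gibbs sampler for a *hypothetical*
# conditionally linear beta AR(1): y_t | y_{t-1} ~ Beta(phi*mu_t, phi*(1-mu_t))
# with mu_t = alpha + beta_*y_{t-1}, constrained so that mu_t stays in (0, 1).

def log_likelihood(y, alpha, beta_, phi):
    mu = alpha + beta_ * y[:-1]
    return beta_dist.logpdf(y[1:], phi * mu, phi * (1.0 - mu)).sum()

def mwg_sampler(y, n_iter=2000, step=0.05, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    alpha, beta_, phi = 0.3, 0.3, 10.0
    draws = []
    for _ in range(n_iter):
        # Block 1: random-walk Metropolis on (alpha, beta_), uniform prior on the
        # constrained region, so proposals outside the region are rejected outright.
        a_prop, b_prop = alpha + step * rng.normal(), beta_ + step * rng.normal()
        if a_prop > 0 and b_prop >= 0 and a_prop + b_prop < 1:
            log_ratio = (log_likelihood(y, a_prop, b_prop, phi)
                         - log_likelihood(y, alpha, beta_, phi))
            if np.log(rng.uniform()) < log_ratio:
                alpha, beta_ = a_prop, b_prop
        # Block 2: random-walk Metropolis on log(phi), flat prior on the log scale.
        p_prop = phi * np.exp(step * rng.normal())
        log_ratio = log_likelihood(y, alpha, beta_, p_prop) - log_likelihood(y, alpha, beta_, phi)
        if np.log(rng.uniform()) < log_ratio:
            phi = p_prop
        draws.append((alpha, beta_, phi))
    return np.array(draws)

# Usage: simulate a short series from the toy model and recover the parameters.
rng = np.random.default_rng(1)
y = np.empty(300); y[0] = 0.5
for t in range(1, 300):
    mu = 0.2 + 0.5 * y[t - 1]
    y[t] = rng.beta(15.0 * mu, 15.0 * (1.0 - mu))
print(mwg_sampler(y, rng=rng)[-500:].mean(axis=0))   # posterior means of (alpha, beta, phi)
```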

    Nested Hierarchical Dirichlet Processes

    We develop a nested hierarchical Dirichlet process (nHDP) for hierarchical topic modeling. The nHDP is a generalization of the nested Chinese restaurant process (nCRP) that allows each word to follow its own path to a topic node according to a document-specific distribution on a shared tree. This alleviates the rigid, single-path formulation of the nCRP, allowing a document to more easily express thematic borrowings as a random effect. We derive a stochastic variational inference algorithm for the model, in addition to a greedy subtree selection method for each document, which allows for efficient inference using massive collections of text documents. We demonstrate our algorithm on 1.8 million documents from The New York Times and 3.3 million documents from Wikipedia. Comment: To appear in IEEE Transactions on Pattern Analysis and Machine Intelligence, Special Issue on Bayesian Nonparametrics.
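    To make the tree-path mechanism concrete, the toy sketch below draws paths through a nested CRP tree: at each level the next node is chosen among existing children with probability proportional to their counts, or a new child is opened with probability proportional to a concentration parameter gamma. This is only the nCRP building block that the nHDP generalizes; the data structure and function names are illustrative, and none of the paper's nHDP inference is reproduced.

```python
import numpy as np

# Hedged sketch: drawing a depth-limited path through a nested CRP tree.
# Each node stores a visit count and a dict of children keyed by integer labels.

def sample_ncrp_path(tree, gamma, depth, rng):
    path, node = [], tree
    for _ in range(depth):
        children = node["children"]
        counts = np.array([c["count"] for c in children.values()], dtype=float)
        probs = np.append(counts, gamma) / (counts.sum() + gamma)
        choice = rng.choice(len(probs), p=probs)
        if choice == len(children):                       # open a new node at this level
            key = max(children.keys(), default=-1) + 1
            children[key] = {"count": 0, "children": {}}
        else:
            key = list(children.keys())[choice]
        children[key]["count"] += 1
        path.append(key)
        node = children[key]
    return path

rng = np.random.default_rng(0)
root = {"count": 0, "children": {}}
print([sample_ncrp_path(root, gamma=1.0, depth=3, rng=rng) for _ in range(5)])
```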

    Exact simulation pricing with Gamma processes and their extensions

    Exact path simulation of the underlying state variable is of great practical importance in simulating prices of financial derivatives or their sensitivities when there are no analytical solutions for their pricing formulas. However, in general, the complex dependence structure inherent in most nontrivial stochastic volatility (SV) models makes exact simulation difficult. In this paper, we present a nontrivial SV model that parallels the notable Heston SV model in the sense of admitting exact path simulation as studied by Broadie and Kaya. The instantaneous volatility process of the proposed model is driven by a Gamma process. Extensions to the model, including superposition of independent instantaneous volatility processes, are studied. Numerical results show that the proposed model outperforms the Heston model and two other L\'evy driven SV models in terms of model fit to real option data. The ability to exactly simulate some of the path-dependent derivative prices is emphasized. Moreover, this is the first instance where an infinite-activity volatility process can be applied exactly in such pricing contexts. Comment: Forthcoming in The Journal of Computational Finance.
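    The key driver here, a Gamma process, admits exact simulation on a grid because its increments over disjoint intervals are independent Gamma variates. The sketch below shows only that generic fact; the paper's specific SV model and pricing scheme are not reproduced, and the parameter names are illustrative.

```python
import numpy as np

# Hedged sketch: exact grid simulation of a Gamma (subordinator) process.
# With shape rate a and scale 1/b, the increments satisfy
# G(t + dt) - G(t) ~ Gamma(shape = a*dt, scale = 1/b), independently across steps,
# so the path can be simulated exactly at the grid points.

def gamma_process_path(T=1.0, n_steps=252, a=2.0, b=4.0, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    increments = rng.gamma(shape=a * dt, scale=1.0 / b, size=n_steps)
    return np.insert(np.cumsum(increments), 0, 0.0)   # path starts at G(0) = 0

path = gamma_process_path(rng=np.random.default_rng(2))
print(path[-1])   # value of the subordinator at T; E[G(T)] = a*T/b
```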

    Gamma Processes, Stick-Breaking, and Variational Inference

    While most Bayesian nonparametric models in machine learning have focused on the Dirichlet process, the beta process, or their variants, the gamma process has recently emerged as a useful nonparametric prior in its own right. Current inference schemes for models involving the gamma process are restricted to MCMC-based methods, which limits their scalability. In this paper, we present a variational inference framework for models involving gamma process priors. Our approach is based on a novel stick-breaking constructive definition of the gamma process. We prove correctness of this stick-breaking process by using the characterization of the gamma process as a completely random measure (CRM), and we explicitly derive the rate measure of our construction using Poisson process machinery. We also derive error bounds on the truncation of the infinite process required for variational inference, similar to the truncation analyses for other nonparametric models based on the Dirichlet and beta processes. Our representation is then used to derive a variational inference algorithm for a particular Bayesian nonparametric latent structure formulation known as the infinite Gamma-Poisson model, where the latent variables are drawn from a gamma process prior with Poisson likelihoods. Finally, we present results for our algorithms on nonnegative matrix factorization tasks on document corpora, and show that we compare favorably to both sampling-based techniques and variational approaches based on beta-Bernoulli priors.
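    The paper's own stick-breaking construction is not reproduced here. As a hedged stand-in, the sketch below uses the classical fact that a gamma process with base measure alpha*H can be written as an independent Gamma(alpha, 1) total mass multiplied by a Dirichlet-process stick-breaking (GEM(alpha)) normalized measure, truncated at K atoms in the spirit of the truncation analyses mentioned above. The base measure and truncation level are illustrative assumptions.

```python
import numpy as np

# Hedged sketch (not the paper's construction): represent a gamma process as
# G = T * sum_k p_k * delta_{theta_k}, where T ~ Gamma(alpha, 1) is the total
# mass, the normalized weights p_k follow DP stick-breaking (GEM(alpha)), and
# theta_k ~ H. The infinite sum is truncated at K atoms.

def truncated_gamma_process(alpha=5.0, K=50, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    total_mass = rng.gamma(shape=alpha, scale=1.0)            # T ~ Gamma(alpha, 1)
    v = rng.beta(1.0, alpha, size=K)                          # stick-breaking fractions
    v[-1] = 1.0                                               # close the stick at level K
    sticks = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    atoms = rng.uniform(0.0, 1.0, size=K)                     # theta_k ~ H (Uniform(0,1) here)
    return atoms, total_mass * sticks

atoms, weights = truncated_gamma_process(rng=np.random.default_rng(3))
print(weights.sum())   # equals the drawn Gamma(alpha, 1) total mass
```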

    Resampling Procedures with Empirical Beta Copulas

    The empirical beta copula is a simple but effective smoother of the empirical copula. Because it is a genuine copula, from which, moreover, it is particularly easy to sample, it is reasonable to expect that resampling procedures based on the empirical beta copula are expedient and accurate. In this paper, after reviewing the literature on some bootstrap approximations for the empirical copula process, we first show the asymptotic equivalence of several bootstrapped processes related to the empirical copula and empirical beta copula. Then we investigate the finite-sample properties of resampling schemes based on the empirical (beta) copula by Monte Carlo simulation. More specifically, we consider interval estimation for some functionals such as rank correlation coefficients and dependence parameters of several well-known families of copulas, constructing confidence intervals by several methods and comparing their accuracy and efficiency. We also compute the actual size and power of symmetry tests based on several resampling schemes for the empirical copula and empirical beta copula. Comment: 22 pages, 8 tables.
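    The ease of sampling mentioned above comes from reading the empirical beta copula as a mixture over observations of products of Beta(r, n+1-r) margins: pick an observation index uniformly at random, then draw each coordinate from the beta distribution indexed by that observation's componentwise rank. The sketch below follows that reading under my own notation; the data-generating example and function names are illustrative, not from the paper.

```python
import numpy as np
from scipy.stats import rankdata

# Hedged sketch: drawing m resamples from the empirical beta copula of a data
# matrix X (n observations x d dimensions), viewed as a mixture over rows of
# independent Beta(r_ij, n + 1 - r_ij) coordinates.

def sample_empirical_beta_copula(X, m, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    ranks = np.apply_along_axis(rankdata, 0, X)     # componentwise ranks in 1..n
    idx = rng.integers(0, n, size=m)                # mixture component for each draw
    r = ranks[idx, :]                               # (m, d) matrix of selected ranks
    return rng.beta(r, n + 1 - r)                   # U[i, j] ~ Beta(r_ij, n + 1 - r_ij)

rng = np.random.default_rng(4)
X = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=200)
U = sample_empirical_beta_copula(X, m=1000, rng=rng)
print(np.corrcoef(U, rowvar=False)[0, 1])           # dependence carried over to the resample
```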

    MCMC Bayesian Estimation in FIEGARCH Models

    Bayesian inference for fractionally integrated exponential generalized autoregressive conditional heteroskedastic (FIEGARCH) models using Markov chain Monte Carlo (MCMC) methods is described. A simulation study is presented to assess the performance of the procedure in the presence of long memory in the volatility. Samples from FIEGARCH processes are obtained by considering the generalized error distribution (GED) for the innovation process. Different values for the tail-thickness parameter \nu are considered, covering both scenarios: innovation processes with lighter (\nu > 2) and heavier (\nu < 2) tails than the Gaussian distribution (\nu = 2). A sensitivity analysis is performed by considering different prior density functions and by incorporating (or not) knowledge of the true parameter values when selecting the hyperparameter values.
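    The tail-thickness behavior described above can be illustrated with standardized GED innovations, which the sketch below draws via SciPy's generalized normal family; the use of gennorm as the GED, the unit-variance scaling, and the sample sizes are my assumptions for illustration, and no FIEGARCH recursion or MCMC step is reproduced.

```python
import numpy as np
from scipy.stats import gennorm

# Hedged sketch: standardized GED innovations with tail-thickness nu, using the
# generalized normal distribution (gennorm's shape parameter plays the role of
# nu: nu = 2 recovers the Gaussian, nu < 2 heavier tails, nu > 2 lighter tails).

def ged_innovations(nu, size, rng=None):
    rng = np.random.default_rng() if rng is None else rng
    z = gennorm.rvs(beta=nu, size=size, random_state=rng)
    return z / gennorm.std(beta=nu)                 # rescale to unit variance

rng = np.random.default_rng(5)
for nu in (1.0, 2.0, 4.0):                          # heavy, Gaussian, light tails
    z = ged_innovations(nu, size=100_000, rng=rng)
    print(nu, np.mean(z**4))                        # sample kurtosis shrinks as nu grows
```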