Poisson Latent Feature Calculus for Generalized Indian Buffet Processes
The purpose of this work is to describe a unified, and indeed simple,
mechanism for non-parametric Bayesian analysis, construction and generative
sampling of a large class of latent feature models which one can describe as
generalized notions of Indian Buffet Processes (IBP). This is done via the
Poisson Process Calculus as it now relates to latent feature models. The IBP
was ingeniously devised by Griffiths and Ghahramani (2005), and its
generative scheme is cast in terms of customers sequentially entering an Indian
Buffet restaurant and selecting previously sampled dishes as well as new
dishes. In this metaphor, dishes correspond to latent features, attributes, and
preferences shared by individuals. The IBP and its generalizations represent
an exciting class of models well suited to handle high dimensional statistical
problems now common in this information age. The IBP is based on
conditionally independent Bernoulli random variables, coupled with completely
random measures acting as Bayesian priors, which are used to create sparse
binary matrices. This Bayesian non-parametric view was a key insight due to
Thibaux and Jordan (2007). One way to think of generalizations is to use
more general random variables. Of note in the current literature are models
employing Poisson and Negative-Binomial random variables. However, unlike for
their closely related counterparts, generalized Chinese restaurant processes,
the tools to analyze IBP models in a systematic and general manner are not yet
available. The limitations are both in terms of knowledge about the effects of
different priors and in terms of models based on a wider choice of random
variables. This work will not only provide a thorough description of the
properties of existing models but also provide a simple template to devise and
analyze new models.

Comment: This version provides more details for the multivariate extensions in
section 5. We highlight the case of a simple multinomial distribution and
showcase a multivariate Lévy process prior we call a stable-Beta Dirichlet
process. Section 4.1.1 expanded.
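As a minimal sketch of the restaurant metaphor described in the abstract (the function name and parameter values below are my own, not taken from the paper): customer i takes each previously sampled dish k with probability m_k / i, then tries a Poisson(alpha / i) number of new dishes, yielding a sparse binary feature matrix.

```python
import numpy as np

def sample_ibp(num_customers, alpha, rng=None):
    """Draw a sparse binary feature matrix Z via the IBP generative scheme.

    Customer i takes previously sampled dish k with probability m_k / i,
    where m_k counts the earlier customers who took dish k, and then
    samples Poisson(alpha / i) brand-new dishes.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    counts = []          # m_k for each dish sampled so far
    rows = []
    for i in range(1, num_customers + 1):
        row = []
        for k, m_k in enumerate(counts):
            take = rng.random() < m_k / i    # popularity-weighted choice
            row.append(int(take))
            if take:
                counts[k] += 1
        new_dishes = rng.poisson(alpha / i)  # new dishes thin out as i grows
        row.extend([1] * new_dishes)
        counts.extend([1] * new_dishes)
        rows.append(row)
    Z = np.zeros((num_customers, len(counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row  # columns created later stay 0 for early customers
    return Z

Z = sample_ibp(10, alpha=2.0)
```

The expected total number of dishes grows like alpha times the harmonic sum over customers, i.e. only logarithmically in the number of customers, which is what makes the resulting binary matrices sparse.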
Bayesian Poisson process partition calculus with an application to Bayesian Lévy moving averages
This article develops, and describes how to use, results concerning
disintegrations of Poisson random measures. These results are fashioned as
simple tools that can be tailor-made to address inferential questions arising
in a wide range of Bayesian nonparametric and spatial statistical models. The
Poisson disintegration method is based on the formal statement of two results
concerning a Laplace functional change of measure and a Poisson Palm/Fubini
calculus in terms of random partitions of the integers {1,...,n}. The
techniques are analogous to, but much more general than, techniques for the
Dirichlet process and weighted gamma process developed in [Ann. Statist. 12
(1984) 351-357] and [Ann. Inst. Statist. Math. 41 (1989) 227-245]. In order to
illustrate the flexibility of the approach, large classes of random probability
measures and random hazards or intensities which can be expressed as
functionals of Poisson random measures are described. We describe a unified
posterior analysis of classes of discrete random probability measures which identifies
and exploits features common to all these models. The analysis circumvents many
of the difficult issues involved in Bayesian nonparametric calculus, including
a combinatorial component. This allows one to focus on the unique features of
each process, which are characterized via real-valued functions h. The
applicability of the technique is further illustrated by obtaining explicit
posterior expressions for Lévy-Cox moving average processes within the
general setting of multiplicative intensity models.

Comment: Published at http://dx.doi.org/10.1214/009053605000000336 in the
Annals of Statistics (http://www.imstat.org/aos/) by the Institute of
Mathematical Statistics (http://www.imstat.org).
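The disintegration techniques described above build on the classical Laplace functional of a Poisson random measure; as a standard reference point (a textbook identity, not a restatement of the paper's theorems), for a Poisson random measure N with mean intensity ν and nonnegative measurable f,

```latex
\mathbb{E}\!\left[ e^{-N(f)} \right]
  = \exp\!\left( - \int \bigl( 1 - e^{-f(x)} \bigr)\, \nu(\mathrm{d}x) \right),
\qquad
N(f) := \int f(x)\, N(\mathrm{d}x).
```

Changes of measure of this Laplace functional, combined with Palm/Fubini arguments over partitions of {1,...,n}, are the two ingredients the abstract refers to.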
Bayesian Estimation of a Simple Macroeconomic Model for a Small Open and Partially Dollarized Economy
I describe a simple New Keynesian macroeconomic model for a small open and partially dollarized economy, which closely resembles the Quarterly Projection Model (QPM) developed at the Central Bank of Peru (Vega et al. (2009)). Then I use Bayesian techniques and quarterly data from Peru to estimate a large group of parameters. The empirical findings provide support for some of the parameter values imposed in the original QPM. In contrast, I find that another group of coefficients, e.g., the weights on the forward-looking components in the aggregate demand and the Phillips curve equations, among several others, should be modified to be more consistent with the data. Furthermore, the results validate the operation of different channels of monetary policy transmission, such as the traditional interest rate channel and the exchange rate channel. I also find evidence that in the most recent part of the sample (2004 onwards), the expectations channel has become more prominent, as implied by the estimated values of the forward-looking parameters in the aggregate demand and the Phillips curve equations.

Keywords: Monetary Policy; Partial Dollarization; Bayesian Estimation
Price-Level versus Inflation Targeting with Financial Market Imperfections
This paper compares price-level-path targeting (PT) with inflation targeting (IT) in a sticky-price, dynamic, general equilibrium model augmented with imperfections in both the debt and equity markets. Using a Bayesian approach, we estimate this model for the Canadian economy. We show that the model with both debt and equity market imperfections fits the data better and use it to compare PT versus the estimated current IT regime. We find that in general PT outperforms the current IT regime. However, the gain is lower when financial market imperfections are taken into account.

Keywords: Monetary policy framework; Inflation targets; Economic models
Applying stochastic spike train theory for high-accuracy MEG/EEG
The accuracy of electroencephalography (EEG) and magnetoencephalography (MEG) is challenged by overlapping sources from within the brain. This lack of accuracy is a severe limitation to the possibilities and reliability of modern stimulation protocols in basic research and clinical diagnostics. As a solution, we here introduce a theory of stochastic neuronal spike timing probability densities for describing the large-scale spiking activity in neural networks, and a novel spike density component analysis (SCA) method for isolating specific neural sources. Three studies are conducted based on 564 cases of evoked responses to auditory stimuli from 94 human subjects, each measured with 60 EEG electrodes and 306 MEG sensors. In the first study we show that the large-scale spike timing (but not non-encephalographic artifacts) in MEG/EEG waveforms can be modeled with Gaussian probability density functions with …

Non peer reviewed
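A minimal sketch of the modeling idea named in the abstract, modeling a waveform as a weighted sum of Gaussian probability densities and recovering the per-source weights. This is not the authors' SCA algorithm; the latencies, widths, amplitudes, and noise level below are invented for illustration.

```python
import numpy as np

def gauss(t, mu, sigma):
    # Gaussian probability density over time, one per putative neural source.
    return np.exp(-0.5 * ((t - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

t = np.linspace(0.0, 0.5, 500)        # time axis in seconds (assumed)
latencies = [0.10, 0.20]              # assumed source latencies (s)
widths = [0.015, 0.030]               # assumed temporal spreads (s)
true_amps = np.array([1.0, 0.6])      # assumed source amplitudes

# Simulated evoked response: overlapping Gaussian-density sources plus noise.
basis = np.stack([gauss(t, mu, s) for mu, s in zip(latencies, widths)], axis=1)
waveform = basis @ true_amps + 0.01 * np.random.default_rng(1).normal(size=t.size)

# Isolate each source amplitude by least squares against the Gaussian basis,
# in the spirit of separating overlapping sources in a measured waveform.
est_amps, *_ = np.linalg.lstsq(basis, waveform, rcond=None)
```

Even though the two sources overlap in time, the amplitudes are well recovered because the Gaussian basis functions are only mildly correlated.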
Gaussian beta process
This thesis presents a new framework for constituting a group of dependent completely random measures, unifying and extending methods in the literature. The dependent completely random measures are constructed based on a shared completely random measure, which is extended to the covariate space and further differentiated by the covariate information associated with the data for which the completely random measures serve as priors. As a concrete example of the flexibility provided by the framework, a group of dependent feature learning measures is constructed based on a shared beta process, with Gaussian processes applied to build adaptive dependencies learnt from the data, denoted as the Gaussian beta process. Experimental results are presented for gene-expression series data (time as covariate) as well as digital image data (spatial location as covariate).

Thesis
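A generic sketch of the idea of covariate-dependent feature probabilities built on a shared random measure, in the spirit of the framework above. The finite-K truncation, the squared-exponential kernel, and the lengthscale are illustrative assumptions of mine, not the thesis's exact Gaussian-process construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite approximation to a shared beta process: K candidate features with
# base inclusion probabilities pi_k ~ Beta(alpha / K, 1).
K, alpha = 50, 5.0
base_pi = rng.beta(alpha / K, 1.0, size=K)

# Each feature gets a location in a one-dimensional covariate space; its
# inclusion probability for a data point decays with covariate distance.
feature_loc = rng.uniform(0.0, 1.0, size=K)

def feature_probs(x, lengthscale=0.2):
    # Squared-exponential modulation of the shared weights by covariate x,
    # so nearby covariates induce similar (dependent) feature measures.
    return base_pi * np.exp(-0.5 * ((x - feature_loc) / lengthscale) ** 2)

# Observations with nearby covariates share similar feature probabilities.
p_a = feature_probs(0.30)
p_b = feature_probs(0.35)
z_a = rng.random(K) < p_a   # binary feature indicators for one observation
```

The key property illustrated: all observations draw from the same shared atoms (the K features), while the covariate kernel differentiates their inclusion probabilities, which is the shared-plus-differentiated structure the abstract describes.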