
    Entropy-based parametric estimation of spike train statistics

    We consider the evolution of a network of neurons, focusing on the asymptotic behavior of spike dynamics rather than membrane potential dynamics. In this context, the spike response is not sought as a deterministic response but as a conditional probability: "reading out the code" consists of inferring such a probability. This probability is computed from empirical raster plots using the framework of thermodynamic formalism in ergodic theory. This yields a parametric statistical model in which the probability takes the form of a Gibbs distribution. In this respect, the approach generalizes the seminal and profound work of Schneidman and collaborators. A minimal presentation of the formalism is reviewed here, and a general algorithmic estimation method is proposed, yielding fast convergent implementations. It is also made explicit how several spike observables (entropy, rate, synchronizations, correlations) are obtained in closed form from the parametric estimation. This paradigm not only allows us to estimate the spike statistics given a design choice, but also to compare different models, thus answering comparative questions about the neural code such as: "are correlations (or time synchrony, or a given set of spike patterns, ...) significant with respect to rate coding only?" A numerical validation of the method is proposed, and the perspectives regarding spike-train code analysis are also discussed.
    Comment: 37 pages, 8 figures, submitte
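    As an illustration of the kind of parametric Gibbs model the abstract describes, the sketch below fits a maximum-entropy distribution over binary spike patterns whose observables are firing rates and pairwise correlations (the Ising-type case associated with Schneidman and collaborators). The toy raster, learning rate, and iteration count are illustrative assumptions, not the paper's algorithm, which works in the more general thermodynamic-formalism setting.

    ```python
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy empirical raster (stand-in for recorded data): N neurons, T time bins.
    N, T = 3, 5000
    raster = (rng.random((N, T)) < 0.2).astype(float)

    # Observables: single-cell rates x_i and pairwise terms x_i * x_j.
    pairs = list(itertools.combinations(range(N), 2))

    def features(x):
        return np.concatenate([x, [x[i] * x[j] for i, j in pairs]])

    patterns = np.array(list(itertools.product([0, 1], repeat=N)), dtype=float)
    F = np.array([features(p) for p in patterns])   # features of all 2**N patterns
    emp = np.array([features(raster[:, t]) for t in range(T)]).mean(axis=0)

    def gibbs(theta):
        # Gibbs distribution p(x) proportional to exp(theta . features(x)).
        logp = F @ theta
        p = np.exp(logp - logp.max())
        return p / p.sum()

    # Fit by gradient ascent on the log-likelihood: match model moments
    # of the observables to their empirical averages.
    theta = np.zeros(F.shape[1])
    for _ in range(3000):
        theta += 0.5 * (emp - gibbs(theta) @ F)

    p = gibbs(theta)
    entropy = -np.sum(p * np.log(p))   # a closed-form observable of the fit
    ```

    Once the parameters are estimated, observables such as the entropy above come directly from the fitted distribution, mirroring the closed-form expressions the paper emphasizes.
    
    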

    Sparse Multivariate Modeling: Priors and Applications


    Numerically Stable Approximate Bayesian Methods for Generalized Linear Mixed Models and Linear Model Selection

    Approximate Bayesian inference methods offer fast alternatives to Markov chain Monte Carlo for fitting Bayesian models, sometimes with only a slight loss of accuracy. In this thesis, we consider variable selection for linear models, and zero-inflated mixed models. Variable selection for linear regression models is ubiquitous in applied statistics. We use the popular g-prior (Zellner, 1986) for model selection of linear models with normal priors, where g is a prior hyperparameter. We derive exact expressions for the model selection Bayes factors in terms of special functions depending on the sample size, the number of covariates, and the R-squared of the model. We show that these expressions are accurate, fast to evaluate, and numerically stable. An R package, blma, for Bayesian linear model averaging using these exact expressions has been released on GitHub. We extend the Particle EM method of Rockova (2017) using Particle Variational Approximation and the exact posterior marginal likelihood expressions to derive a computationally efficient algorithm for model selection on data sets with many covariates. Our algorithm performs well relative to existing algorithms, completing in 8 seconds on a model selection problem with a sample size of 600 and 7200 covariates. We also consider zero-inflated models, which have many applications in areas such as manufacturing and public health but pose numerical issues when fitted to data. We apply a variational approximation to zero-inflated Poisson mixed models with Gaussian-distributed random effects, using a combination of variational Bayes (VB) and the Gaussian Variational Approximation (GVA). We also incorporate a novel parameterisation of the covariance of the GVA using the Cholesky factor of the precision matrix, similar to Tan and Nott (2018), to resolve the associated numerical difficulties.
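    For fixed g, the Bayes factor of a linear model against the intercept-only null under Zellner's g-prior has a well-known closed form in the sample size, the number of covariates, and the model's R-squared. A minimal sketch of that standard result follows; the thesis's exact special-function expressions (for hyperpriors on g) are not reproduced, and the example numbers are hypothetical.

    ```python
    import numpy as np

    def log_bf_gprior(n, p, R2, g):
        """Log Bayes factor of a p-covariate linear model against the
        intercept-only null under Zellner's g-prior with fixed g:
        BF = (1+g)^((n-1-p)/2) / (1 + g(1-R^2))^((n-1)/2)."""
        return (0.5 * (n - 1 - p) * np.log1p(g)
                - 0.5 * (n - 1) * np.log1p(g * (1.0 - R2)))

    # Compare two candidate models on the same data (hypothetical numbers),
    # using the common unit-information choice g = n.
    n = 600
    small = log_bf_gprior(n, p=5, R2=0.40, g=n)
    large = log_bf_gprior(n, p=50, R2=0.42, g=n)
    # The 45 extra covariates must buy enough R-squared to beat the
    # dimension penalty in the (1+g)^((n-1-p)/2) term.
    ```

    Because the expression depends on the data only through n, p, and R-squared, scoring every candidate model is cheap once those summaries are available, which is what makes exhaustive or particle-based model search feasible.
    
    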

    Probabilistic Tensor Decomposition of Neural Population Spiking Activity

    The firing of neural populations is coordinated across cells, across time, and across experimental conditions or repeated experimental trials, so a full understanding of the computational significance of neural responses must be based on a separation of these different contributions to structured activity. Tensor decomposition is an approach to untangling the influence of multiple factors in data that is common in many fields. However, despite some recent interest in neuroscience, wider applicability of the approach is hampered by the lack of a full probabilistic treatment allowing principled inference of a decomposition from non-Gaussian spike-count data. Here, we extend the Polya-Gamma (PG) augmentation, previously used in sampling-based Bayesian inference, to implement scalable variational inference in non-conjugate spike-count models. Using this new approach, we develop techniques related to automatic relevance determination to infer the most appropriate tensor rank, as well as to incorporate priors based on known brain anatomy, such as the segregation of cell response properties by brain area. We apply the model to neural recordings taken under conditions of visual-vestibular sensory integration, revealing how the encoding of self- and visual-motion signals is modulated by the sensory information available to the animal.
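    For context, here is a minimal sketch of the non-probabilistic core that the model above extends: a rank-R CP (canonical polyadic) decomposition of a neurons x time x trials spike-count tensor, fitted by alternating least squares. The synthetic data, rank, and iteration count are assumptions for illustration; the paper's contribution (Polya-Gamma-augmented variational inference with automatic rank determination) is not shown here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic spike-count tensor: neurons x time bins x trials, drawn from
    # a rank-2 Poisson rate tensor (a stand-in for real recordings).
    N, T, K, R = 20, 30, 40, 2
    U, V, W = rng.random((N, R)), rng.random((T, R)), rng.random((K, R))
    rate = 4.0 * np.einsum('nr,tr,kr->ntk', U, V, W)
    X = rng.poisson(rate).astype(float)

    def unfold(X, mode):
        # Matricize the tensor along one mode.
        return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

    def khatri_rao(P, Q):
        # Column-wise Kronecker product.
        return (P[:, None, :] * Q[None, :, :]).reshape(-1, P.shape[1])

    # Rank-R CP decomposition by alternating least squares: each factor
    # update solves a linear least-squares problem with the others fixed.
    A, B, C = rng.random((N, R)), rng.random((T, R)), rng.random((K, R))
    err0 = np.linalg.norm(X - np.einsum('nr,tr,kr->ntk', A, B, C))

    for _ in range(50):
        A = unfold(X, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(X, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(X, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))

    recon = np.einsum('nr,tr,kr->ntk', A, B, C)
    err = np.linalg.norm(X - recon)
    ```

    The least-squares objective used here is exactly what a Gaussian noise model implies; the paper's point is that spike counts are not Gaussian, motivating the Polya-Gamma-augmented variational treatment and principled selection of the rank R.
    
    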