An evaluation of different copula models for the short-term noise dependence of spike counts


Correlations between spike counts are often used to analyze neural coding. Traditionally, multivariate Gaussian distributions have been used to model the correlation structure of these spike counts [1]. However, this approximation is not realistic for short time intervals. In this study, as an alternative approach, we introduce dependencies by means of copulas of several families. Copulas are functions that couple marginal cumulative distribution functions into a joint distribution function with the same margins [2]. We can thus use arbitrary marginal distributions, such as Poisson or negative binomial, that are better suited for modeling the noise distributions of spike counts. Furthermore, copulas place a wide range of dependence structures at our disposal and can be used to analyze higher-order interactions. We develop a framework to analyze spike count data by means of such copulas. Methods for parameter inference based on maximum likelihood estimates and for the computation of Shannon entropy are provided. The methods are evaluated on a data set of spike counts, measured simultaneously on 100 ms intervals, of up to three neurons in macaque MT responding to stochastic dot stimuli [3] and of up to six neurons recorded from macaque prefrontal cortex. Parameters are estimated by the inference-for-margins method: first the margin likelihoods are maximized separately, and then the coupling parameters are estimated given the parameterized margins. The resulting parameters are close to the maximum likelihood estimates, with the advantage that the approach remains tractable for moderate dimensions. Goodness of fit is evaluated by cross-validation of the likelihoods.
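The coupling idea can be sketched in code. The following is a minimal illustration, not the authors' exact model: it builds a bivariate spike-count PMF by joining two negative binomial margins with a Gaussian copula via rectangle differences of the copula CDF (Sklar's theorem applied to discrete margins). The margin parameters and the correlation value are hypothetical.

```python
import numpy as np
from scipy.stats import nbinom, norm, multivariate_normal

def gauss_copula(u, v, rho):
    """C(u, v) for a bivariate Gaussian copula with correlation rho."""
    if u <= 0.0 or v <= 0.0:
        return 0.0
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal.cdf([norm.ppf(u), norm.ppf(v)],
                                   mean=[0.0, 0.0], cov=cov)

def joint_pmf(x1, x2, m1, m2, rho):
    """P(X1 = x1, X2 = x2) from rectangle differences of the coupled CDF."""
    F1 = lambda x: m1.cdf(x) if x >= 0 else 0.0
    F2 = lambda x: m2.cdf(x) if x >= 0 else 0.0
    return (gauss_copula(F1(x1), F2(x2), rho)
            - gauss_copula(F1(x1 - 1), F2(x2), rho)
            - gauss_copula(F1(x1), F2(x2 - 1), rho)
            + gauss_copula(F1(x1 - 1), F2(x2 - 1), rho))

# Hypothetical negative binomial margins for counts in a 100 ms window.
m1 = nbinom(5, 0.5)   # mean 5 spikes
m2 = nbinom(4, 0.4)   # mean 6 spikes
rho = 0.3             # hypothetical copula correlation

# The PMF sums to one, and summing out one neuron recovers its margin.
total = sum(joint_pmf(i, j, m1, m2, rho)
            for i in range(60) for j in range(60))
```

Any copula family (e.g. Clayton or Gumbel instead of Gaussian) can be substituted for `gauss_copula` without touching the margins, which is what makes the margin/dependence separation convenient for model comparison.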
The data analysis leads to three significant findings: (1) copula-based distributions provide better fits than discretized multivariate normal distributions; (2) negative binomial margins fit the data better than Poisson margins; and (3) a dependence model that includes only pairwise interactions overestimates the Shannon entropy by at least 19% compared to the model with higher-order interactions.
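The direction of finding (3) follows from a general fact: a model that ignores part of the dependence structure cannot have lower entropy than the true joint, since mutual information is non-negative. A self-contained toy illustration with made-up numbers (not the paper's data) compares a dependent joint count distribution against the independence model with the same margins:

```python
import numpy as np

# Toy 2x2 joint spike-count distribution with positive dependence
# (hypothetical values, for illustration only).
p_joint = np.array([[0.4, 0.1],
                    [0.1, 0.4]])
p1 = p_joint.sum(axis=1)      # margin of neuron 1
p2 = p_joint.sum(axis=0)      # margin of neuron 2
p_indep = np.outer(p1, p2)    # independence model with the same margins

def entropy_bits(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_joint = entropy_bits(p_joint)   # entropy of the dependent model
H_indep = entropy_bits(p_indep)   # entropy when dependence is dropped
# H_indep >= H_joint: dropping dependence overestimates entropy.
```

Here the independence model yields 2 bits against roughly 1.72 bits for the dependent joint; the same mechanism makes a pairwise-only model overestimate entropy relative to one capturing higher-order interactions.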


This paper was published in MPG.PuRe.
