    Investigating large-scale brain dynamics using field potential recordings: Analysis and interpretation

    New technologies to record electrical activity from the brain on a massive scale offer tremendous opportunities for discovery. Electrical measurements of large-scale brain dynamics, termed field potentials, are especially important to understanding and treating the human brain. Here, our goal is to provide best practices on how field potential recordings (EEG, MEG, ECoG and LFP) can be analyzed to identify large-scale brain dynamics, and to highlight critical issues and limitations of interpretation in current work. We focus our discussion of analyses around the broad themes of activation, correlation, communication and coding. We provide best-practice recommendations for the analyses and interpretations using a forward model and an inverse model. The forward model describes how field potentials are generated by the activity of populations of neurons. The inverse model describes how to infer the activity of populations of neurons from field potential recordings. A recurring theme is the challenge of understanding how field potentials reflect neuronal population activity given the complexity of the underlying brain systems.
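    The forward/inverse pairing described in this abstract lends itself to a compact numerical sketch. The following is a hypothetical minimal example, not the paper's code: a linear forward model maps a few neural sources through an assumed-known gain ("lead field") matrix onto sensor recordings, and a minimum-norm (Tikhonov-regularized) inverse estimates the sources back. All dimensions, names, and parameter values are illustrative assumptions.

```python
# Hypothetical sketch of a forward/inverse model pair (not the paper's code);
# the gain matrix, source counts, and noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Forward model: sensor recordings = lead_field @ source_activity + noise
n_sensors, n_sources, n_times = 8, 3, 500
lead_field = rng.normal(size=(n_sensors, n_sources))   # assumed-known gain matrix
sources = rng.normal(size=(n_sources, n_times))        # neuronal population activity
recordings = lead_field @ sources + 0.1 * rng.normal(size=(n_sensors, n_times))

# Inverse model: minimum-norm (Tikhonov-regularized) estimate of the sources
lam = 0.1
G = lead_field
inverse_operator = G.T @ np.linalg.inv(G @ G.T + lam * np.eye(n_sensors))
sources_hat = inverse_operator @ recordings

# With more sensors than sources and modest noise, the estimate tracks the truth
r = np.corrcoef(sources.ravel(), sources_hat.ravel())[0, 1]
```

    The sketch also illustrates the paper's recurring theme: the inverse step is only as good as the assumed forward model, and with fewer sensors than sources the regularized estimate blurs the underlying population activity.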

    Parametric models to relate spike train and LFP dynamics with neural information processing

    Spike trains and local field potentials resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task- or stimulus-specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework to decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Trial-by-trial spike-field correlations in visual response onset times are higher when the unified model is used, matching corresponding values obtained using earlier trial-averaged measures on a previously published data set. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior.
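    The core idea (though not the unified model itself) can be sketched in a few lines: a point-process intensity combining a stimulus-driven bump at an unknown latency with constant background activity, where the latency is recovered by maximum likelihood over a grid of candidates. All rates, widths, and names below are assumptions for the toy.

```python
# Illustrative toy sketch, not the paper's unified spike-field model:
# intensity = constant background + Gaussian stimulus bump at an unknown latency;
# the latency is estimated by maximizing the Bernoulli-bin log-likelihood.
import math
import random

random.seed(1)
dt = 0.001                       # 1 ms bins
n_bins = 2000                    # a 2 s trial
true_latency = 300               # bins after stimulus onset

def intensity(t, latency, gain=100.0, background=5.0, width=50.0):
    # background persists without input; the stimulus adds a Gaussian bump
    return background + gain * math.exp(-0.5 * ((t - latency) / width) ** 2)

# Bernoulli-bin approximation to the point process
spikes = [1 if random.random() < intensity(t, true_latency) * dt else 0
          for t in range(n_bins)]

def log_likelihood(latency):
    ll = 0.0
    for t, y in enumerate(spikes):
        p = intensity(t, latency) * dt
        ll += math.log(p) if y else math.log(1.0 - p)
    return ll

latency_hat = max(range(0, n_bins, 25), key=log_likelihood)
```

    Because the background rate enters every bin's likelihood, leaving it out of the model distorts the fit even though (as the abstract reports) the latency estimate itself can remain stable.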

    Hierarchical Neural Computation in the Mammalian Visual System

    Our visual system can efficiently extract behaviorally relevant information from ambiguous and noisy luminance patterns. Although we know much about the anatomy and physiology of the visual system, it remains unclear how the computations performed by individual visual neurons arise from the underlying neural circuits. In this thesis, I designed novel statistical modeling approaches to study hierarchical neural computation, using electrophysiological recordings from several stages of the mammalian visual system. In Chapter 2, I describe a two-stage nonlinear model that characterizes both the synaptic currents and spike responses of retinal ganglion cells with unprecedented accuracy. I found that excitatory synaptic currents to ganglion cells are well described by excitatory inputs multiplied by divisive suppression, and that spike responses can be explained with the addition of a second stage of spiking nonlinearity and refractoriness. The structure of the model was inspired by known elements of the retinal circuit, and implies that presynaptic inhibition from amacrine cells is an important mechanism underlying ganglion cell computation. In Chapter 3, I describe a hierarchical stimulus-processing model of MT neurons in the context of a naturalistic optic flow stimulus. The model incorporates relevant nonlinear properties of upstream V1 processing and explains MT neuron responses to complex motion stimuli. MT neuron responses are shown to be best predicted from distinct excitatory and suppressive components. The direction-selective suppression can impart selectivity to complex velocity fields and contribute to improved estimation of the three-dimensional velocity of moving objects. In Chapter 4, I present an extended model of MT neurons that includes both the stimulus-processing component and network activity reflected in local field potentials (LFPs). 
    A significant fraction of the trial-to-trial variability of MT neuron responses is predictable from the LFPs in both passive fixation and a motion discrimination task. Moreover, the choice-related variability of MT neuron responses can be explained by their phase preferences in the low-frequency LFP band. These results suggest an important role of network activity in cortical function. Together, these results demonstrate that it is possible to infer the nature of neural computation from physiological recordings using statistical modeling approaches.
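    A phase-preference analysis of the kind this abstract implies can be sketched as follows. This is an assumed illustration, not the thesis code: extract the phase of a slow LFP rhythm via the analytic signal, generate phase-modulated spiking, and recover the preferred phase as the circular mean of spike phases. The 4 Hz rhythm, rates, and noise level are made up.

```python
# Hedged illustration of a phase-preference analysis (assumed, not the thesis code):
# analytic-signal phase of a slow LFP rhythm, plus phase-modulated spiking.
import numpy as np

rng = np.random.default_rng(2)
fs, dur = 1000, 20.0
t = np.arange(0, dur, 1.0 / fs)
lfp = np.sin(2 * np.pi * 4 * t) + 0.3 * rng.normal(size=t.size)  # 4 Hz rhythm + noise

# Analytic signal via the FFT (same construction scipy.signal.hilbert uses)
X = np.fft.fft(lfp)
h = np.zeros(t.size)
h[0] = 1.0
h[1:t.size // 2] = 2.0
h[t.size // 2] = 1.0
phase = np.angle(np.fft.ifft(X * h))

# Spiking is phase-modulated: most likely near the trough (phase ~ +/- pi)
rate = 5.0 + 4.0 * np.cos(phase - np.pi)        # spikes/s
spikes = rng.random(t.size) < rate / fs

# Circular mean of spike phases recovers the neuron's preferred phase
pref = np.angle(np.mean(np.exp(1j * phase[spikes])))
```

    The circular mean is the natural summary here because phases wrap at pi; an ordinary mean of spike phases near the trough would incorrectly average +pi and -pi to zero.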

    Multivariate Multiscale Analysis of Neural Spike Trains

    This dissertation introduces new methodologies for the analysis of neural spike trains. Biological properties of the nervous system, and how they are reflected in neural data, can motivate specific analytic tools. Some of these biological aspects motivate multiscale frameworks, which allow for simultaneous modelling of the local and global behaviour of neurons. Chapter 1 provides the preliminary background on the biology of the nervous system and details the concept of information and randomness in the analysis of neural spike trains. It also provides the reader with a thorough literature review of the current statistical models in the analysis of neural spike trains. The material presented in the next six chapters (2-7) has been the focus of three papers, which have either already been published or are being prepared for publication. It is demonstrated in Chapters 2 and 3 that the multiscale complexity penalized likelihood method, introduced in Kolaczyk and Nowak (2004), is a powerful model for the simultaneous modelling of spike trains with biological properties from different time scales. To detect the periodic spiking activities of neurons, two periodic models from the literature, Bickel et al. (2007, 2008) and Shao and Li (2011), were combined and modified in a multiscale penalized likelihood model. The contributions of these chapters are (1) employing a powerful visualization tool, the inter-spike interval (ISI) plot, (2) combining the multiscale method of Kolaczyk and Nowak (2004) with the periodic models of Bickel et al. (2007, 2008) and Shao and Li (2011) to introduce the so-called additive and multiplicative models for the intensity function of neural spike trains, and introducing a cross-validation scheme to estimate their tuning parameters, (3) providing numerical bootstrap confidence bands for the multiscale estimate of the intensity function, and (4) studying the effect of time scale on the statistical properties of spike counts. 
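    Two of the ingredients named here, inter-spike intervals and bootstrap confidence bands, can be shown in a toy form. The multiscale penalized-likelihood machinery itself is not reproduced; the sketch below simulates a homogeneous spike train, computes its ISIs, and builds a percentile-bootstrap band for the firing rate. The 20 spikes/s rate and all names are assumptions.

```python
# Toy sketch only (the dissertation's multiscale penalized-likelihood method is
# not reproduced): ISIs from a simulated Poisson spike train and a percentile
# bootstrap band for the firing rate.
import random

random.seed(3)
rate = 20.0                                    # spikes/s, assumed for the toy
spike_times, t = [], 0.0
while t < 100.0:                               # 100 s of simulated spiking
    t += random.expovariate(rate)              # exponential ISIs -> Poisson process
    spike_times.append(t)

isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
rate_hat = len(isis) / sum(isis)               # ~ 1 / mean ISI

# Percentile bootstrap for a ~95% band around the estimated rate
boot = []
for _ in range(500):
    resample = [random.choice(isis) for _ in isis]
    boot.append(len(resample) / sum(resample))
boot.sort()
lo, hi = boot[12], boot[487]                   # ~2.5% and ~97.5% quantiles
```

    An ISI plot of the kind the dissertation highlights is simply these intervals drawn against spike index or time; departures from the exponential shape reveal refractoriness or periodic structure that a homogeneous Poisson model misses.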
Motivated by neural integration phenomena, as well as the adjustments for the neural refractory period, Chapters 4 and 5 study the Skellam process and introduce the Skellam Process with Resetting (SPR). Introducing SPR and its application in the analysis of neural spike trains is one of the major contributions of this dissertation. This stochastic process is biologically plausible, and unlike the Poisson process, it does not suffer from limited dependency structure. It also has multivariate generalizations for the simultaneous analysis of multiple spike trains. A computationally efficient recursive algorithm for the estimation of the parameters of SPR is introduced in Chapter 5. Except for the literature review at the beginning of Chapter 4, the rest of the material within these two chapters is original. The specific contributions of Chapters 4 and 5 are (1) introducing the Skellam Process with Resetting as a statistical tool to analyze neural spike trains and studying its properties, including all theorems and lemmas provided in Chapter 4, (2) the two fairly standard definitions of the Skellam process (homogeneous and inhomogeneous) and the proof of their equivalency, (3) deriving the likelihood function based on the observable data (spike trains) and developing a computationally efficient recursive algorithm for parameter estimation, and (4) studying the effect of time scales on the SPR model. The challenging problem of multivariate analysis of the neural spike trains is addressed in Chapter 6. As far as we know, the multivariate models which are available in the literature suffer from limited dependency structures. In particular, modelling negative correlation among spike trains is a challenging problem. To address this issue, the multivariate Skellam distribution, as well as the multivariate Skellam process, which both have flexible dependency structures, are developed. 
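    One reading of the resetting idea behind SPR can be simulated in a few lines. This is my interpretation of the abstract, not the dissertation's formal definition: the membrane state is the difference of two Poisson event streams (capturing neural integration of excitation and inhibition), and reaching a threshold emits a spike and resets the state, mimicking the refractory adjustment. All rates and the threshold are made-up values.

```python
# Minimal simulation of the resetting idea (an assumed reading of the abstract,
# not the dissertation's formal SPR definition): Skellam-like integration of two
# Poisson streams, with a spike and reset at a threshold crossing.
import random

random.seed(4)
dt, T = 0.001, 50.0
lam_up, lam_down = 120.0, 80.0        # excitatory / inhibitory event rates (per s)
threshold = 10                        # net events needed to trigger a spike

state, spikes = 0, []
for step in range(int(T / dt)):
    # Bernoulli thinning approximates the two Poisson streams in each small bin
    if random.random() < lam_up * dt:
        state += 1
    if random.random() < lam_down * dt:
        state -= 1
    if state >= threshold:            # threshold crossing: spike, then reset
        spikes.append(step * dt)
        state = 0
```

    Because at least `threshold` upward events must accumulate after each reset, the model enforces a minimum inter-spike interval, which is one way the construction accommodates the refractory period that a plain Poisson spike model ignores.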
    Chapter 6 also introduces a multivariate version of the Skellam Process with Resetting (MSPR), and a so-called profile-moment likelihood estimation of its parameters. This chapter generalizes the results of Chapters 4 and 5, and therefore, except for the brief literature review provided at the beginning of the chapter, the remainder of the material is original work. In particular, the contributions of this chapter are (1) introducing the multivariate Skellam distribution, (2) introducing two definitions of the multivariate Skellam process, in both homogeneous and inhomogeneous cases, and proving their equivalence, (3) introducing the Multivariate Skellam Process with Resetting (MSPR) to simultaneously model spike trains from an ensemble of neurons, and (4) utilizing the so-called profile-moment likelihood method to compute estimates of the parameters of MSPR. The discussion of the developed methodologies, as well as the "next steps", is outlined in Chapter 7.
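    The negative-correlation capability that motivates the multivariate Skellam construction can be illustrated with a simple shared-component trick. This is a hypothetical construction, not the dissertation's model: two Skellam-distributed counts share one Poisson component with opposite signs, which forces a negative correlation of about -0.6, the kind of dependency that plain multivariate Poisson count models cannot express.

```python
# Hypothetical shared-component construction (not the dissertation's model):
# X = Poisson(2) - S and Y = S - Poisson(2) with shared S ~ Poisson(3) are each
# Skellam distributed, with theoretical correlation -Var(S)/5 = -0.6.
import math
import random

random.seed(5)

def poisson(mu):
    # Knuth's multiplication method; fine for small mu
    limit, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

xs, ys = [], []
for _ in range(20000):
    shared = poisson(3.0)             # common component with opposite signs below
    xs.append(poisson(2.0) - shared)  # Skellam(2, 3)
    ys.append(shared - poisson(2.0))  # Skellam(3, 2)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
corr = cov / ((sum((a - mx) ** 2 for a in xs) / n)
              * (sum((b - my) ** 2 for b in ys) / n)) ** 0.5
```

    Because each variable is a difference of Poisson counts, flipping the sign of the shared component flips the sign of the induced covariance, which is exactly the flexibility a difference-based construction has over purely additive count models.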