
    State-Space Models and Latent Processes in the Statistical Analysis of Neural Data

    This thesis develops and applies statistical methods for the analysis of neural data. In the second chapter we incorporate a latent process into the Generalized Linear Model framework. We develop and apply this framework to estimate the linear filters of an entire population of retinal ganglion cells while accounting for the common noise the cells may share. The model captures both the encoding of the visual stimulus into the neural code and its decoding, gives insight into the underlying architecture of the neural system, and allows us to estimate the common noise that the cells receive. In the third chapter we discuss methods for optimally inferring the synaptic inputs to an electrotonically compact neuron, given intracellular voltage-clamp or current-clamp recordings from the postsynaptic cell. These methods are based on sequential Monte Carlo techniques ("particle filtering"). We demonstrate, on model data, that these methods can accurately recover the time course of excitatory and inhibitory synaptic inputs on a single trial. In the fourth chapter we develop a more general approach to the state-space filtering problem. Our method solves the same recursive set of Markovian filter equations as the particle filter, but replaces all importance sampling steps with a more general Markov chain Monte Carlo (MCMC) step. The algorithm is especially well suited to problems where the model parameters may be misspecified.
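
    As a concrete illustration of the sequential Monte Carlo machinery the later chapters build on, the sketch below implements a generic bootstrap particle filter for a toy one-dimensional linear-Gaussian state-space model. The model, parameter values, and function name are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np

def bootstrap_particle_filter(observations, n_particles=1000,
                              a=0.95, q=0.1, r=0.5, rng=None):
    """Bootstrap particle filter for a toy 1-D state-space model.

    Assumed (illustrative) model, not the one used in the thesis:
        x_t = a * x_{t-1} + N(0, q^2)   (latent state)
        y_t = x_t + N(0, r^2)           (observation)
    Returns the filtered posterior mean E[x_t | y_{1:t}] at each time step.
    """
    rng = np.random.default_rng() if rng is None else rng
    particles = rng.normal(0.0, 1.0, size=n_particles)  # samples from the prior
    means = []
    for y in observations:
        # Propagate particles through the state dynamics (importance sampling
        # from the transition density).
        particles = a * particles + rng.normal(0.0, q, size=n_particles)
        # Weight each particle by the observation likelihood p(y_t | x_t).
        log_w = -0.5 * ((y - particles) / r) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means.append(np.sum(w * particles))
        # Resample to avoid weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
    return np.array(means)
```

    The weight-and-resample step inside the loop is exactly the importance sampling step that the fourth chapter proposes to replace with an MCMC move when the weights become unreliable.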

    Investigating thought disorder in schizophrenia: evidence for pathological activation.

    BACKGROUND: Previous research has yielded evidence for enhanced semantic priming in formal thought-disordered schizophrenia patients, a result that fits well with the hypothesis of disinhibited spreading activation in this population. OBJECTIVE: The current study examined whether hyperpriming among schizophrenia patients reflects greater spreading of activation from a node or the activation of more distant nodes in the semantic network. We also try to shed light on the fate of this activation. METHODS: The present study tested this hypothesis using semantic and identical priming in two different experiments. SOA (stimulus onset asynchrony) was manipulated (240 ms vs. 740 ms) within blocks. It is assumed that in healthy individuals performance relies on a balance between activation and inhibition processes, whereas this balance is disrupted in schizophrenia patients. To examine this hypothesis, we compared formal thought-disordered schizophrenia patients, non-thought-disordered schizophrenia patients, and healthy controls. RESULTS: For thought-disordered schizophrenia patients, we found large positive semantic and identical priming effects (129 ms and 154 ms, respectively) only at the short SOA. SOA and type of priming did not modulate priming effects in the control groups. CONCLUSIONS: These results support the claim that inhibitory processes are deficient in thought-disordered patients. Hyperpriming in the thought-disorder group may be an outcome of hyperactivation followed by rapid decay below baseline threshold.
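
    For readers unfamiliar with the priming measure, the effects reported above (e.g. 129 ms) are reaction-time differences between unrelated-prime and related-prime trials. The snippet below shows that arithmetic on made-up reaction times; the numbers and names are purely illustrative and are not data from the study.

```python
import numpy as np

def priming_effect(rt_unrelated_ms, rt_related_ms):
    """Mean reaction-time advantage (ms) for related over unrelated primes;
    positive values indicate facilitation (priming)."""
    return np.mean(rt_unrelated_ms) - np.mean(rt_related_ms)

# Hypothetical reaction times (ms) at the short (240 ms) SOA; these values
# are invented for illustration only.
rt_unrelated = [812, 790, 845, 801]
rt_related = [668, 655, 702, 671]
print(f"priming effect: {priming_effect(rt_unrelated, rt_related):.0f} ms")
```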

    Robust particle filters via sequential pairwise reparameterized Gibbs sampling

    Sequential Monte Carlo ("particle filtering") methods provide a powerful set of tools for recursive optimal Bayesian filtering in state-space models. However, these methods are based on importance sampling, which is known to be nonrobust in several key scenarios, and therefore standard particle filtering methods can fail in these settings. We present a filtering method which solves the key forward recursion using a reparameterized Gibbs sampling method, thus sidestepping the need for importance sampling. In many cases the resulting filter is much more robust and efficient than standard importance-sampling particle filter implementations. We illustrate the method with an application to a nonlinear, non-Gaussian model from neuroscience.
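
    The sketch below illustrates, under strong simplifying assumptions, the general idea of replacing the importance-sampling step of a particle filter with MCMC moves targeting p(x_t | x_{t-1}, y_t). It uses a plain random-walk Metropolis-Hastings kernel on a toy linear-Gaussian model rather than the paper's sequential pairwise reparameterized Gibbs sampler, and all names and parameters are hypothetical.

```python
import numpy as np

def mcmc_forward_filter(observations, n_particles=500, n_mcmc=20,
                        a=0.95, q=0.1, r=0.5, rng=None):
    """Forward-filter sketch in which importance sampling is replaced by
    Metropolis-Hastings moves targeting p(x_t | x_{t-1}, y_t).

    The 1-D linear-Gaussian model and the random-walk proposal are
    illustrative stand-ins, not the paper's algorithm.
    """
    rng = np.random.default_rng() if rng is None else rng
    particles = rng.normal(0.0, 1.0, size=n_particles)
    means = []
    for y in observations:
        # Pair each new particle with a parent from the previous cloud.
        parents = particles[rng.integers(n_particles, size=n_particles)]
        x = a * parents + rng.normal(0.0, q, size=n_particles)  # initial states

        def log_target(z):
            # log p(z | parent) + log p(y | z), up to an additive constant
            return (-0.5 * ((z - a * parents) / q) ** 2
                    - 0.5 * ((y - z) / r) ** 2)

        for _ in range(n_mcmc):
            prop = x + rng.normal(0.0, q, size=n_particles)  # random-walk proposal
            log_alpha = log_target(prop) - log_target(x)
            accept = np.log(rng.uniform(size=n_particles)) < log_alpha
            x = np.where(accept, prop, x)

        particles = x
        means.append(particles.mean())
    return np.array(means)
```

    Because each particle is updated by an MCMC kernel rather than reweighted, a filter of this general form avoids the weight degeneracy that makes importance sampling fragile, which is the motivation the abstract gives for sidestepping importance sampling.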