
    Exact firing time statistics of neurons driven by discrete inhibitory noise

    Neurons in the intact brain receive a continuous and irregular synaptic bombardment from excitatory and inhibitory pre-synaptic neurons, which determines the firing activity of the stimulated neuron. To investigate the influence of inhibitory stimulation on the firing time statistics, we consider Leaky Integrate-and-Fire neurons subject to inhibitory instantaneous post-synaptic potentials. In particular, we report exact results for the firing rate, the coefficient of variation and the spike train spectrum for various synaptic weight distributions. Our results are not limited to stimulations of infinitesimal amplitude; they apply equally to finite-amplitude post-synaptic potentials, and can therefore capture the effect of rare, large spikes. The methods developed here also reproduce the average firing properties of heterogeneous neuronal populations. Comment: 20 pages, 8 figures, submitted to Scientific Reports
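    As a rough illustration of the setting (not the paper's exact analytical method), the following sketch simulates a leaky integrate-and-fire neuron driven by discrete inhibitory kicks of finite amplitude and estimates the firing rate and coefficient of variation from the simulated interspike intervals. All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not the paper's): membrane time constant tau,
# suprathreshold mean drive mu, inhibitory Poisson kicks of amplitude a
# at rate r_inh, firing threshold vth, reset vr.  Units: ms, dimensionless v.
tau, mu, vth, vr = 20.0, 1.2, 1.0, 0.0
r_inh, a = 0.1, 0.05                        # kicks per ms, kick size
dt, T = 0.01, 20000.0
n = int(T / dt)
kicks = rng.random(n) < r_inh * dt          # Bernoulli approximation of Poisson arrivals

v, t, spikes = vr, 0.0, []
for i in range(n):
    t += dt
    v += dt * (mu - v) / tau                # leaky integration toward mu
    if kicks[i]:
        v -= a                              # instantaneous inhibitory PSP
    if v >= vth:                            # threshold crossing: spike and reset
        spikes.append(t)
        v = vr

isi = np.diff(spikes)
rate = 1.0 / isi.mean()                     # spikes per ms
cv = isi.std() / isi.mean()                 # coefficient of variation
```

    The paper derives these statistics exactly; a Monte Carlo estimate like this is a natural cross-check for such results.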

    Estimation in discretely observed diffusions killed at a threshold

    Parameter estimation in diffusion processes from discrete observations up to a first-hitting time is clearly of practical relevance, but does not seem to have been studied so far. In neuroscience, many models for the membrane potential evolution involve the presence of an upper threshold. Data are modeled as discretely observed diffusions which are killed when the threshold is reached. Statistical inference is often based on a misspecified likelihood that ignores the presence of the threshold, causing severe bias: for example, the bias incurred in the drift parameters of the Ornstein-Uhlenbeck model can be up to 25-100% for biologically relevant parameters. We calculate or approximate the likelihood function of the killed process. When estimating from a single trajectory, considerable bias may still be present, and the distribution of the estimates can be heavily skewed, with huge variance. Parametric bootstrap is effective in correcting the bias. Standard asymptotic results do not apply, but consistency and asymptotic normality may be recovered when multiple trajectories are observed, provided the mean first-passage time through the threshold is finite. Numerical examples illustrate the results, and an experimental data set of intracellular recordings of the membrane potential of a motoneuron is analyzed. Comment: 29 pages, 5 figures
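    A minimal numerical sketch of the bias mechanism, under assumed parameter values: an Ornstein-Uhlenbeck path is killed at a threshold but only observed on a coarse grid, and a naive AR(1) fit that ignores the killing is applied to the surviving observations. This is not the paper's corrected likelihood, only an illustration of the naive approach it improves upon.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical Ornstein-Uhlenbeck model dX = (mu - X/tau) dt + sigma dW,
# killed at the threshold S, observed every delta time units.
tau, mu, sigma, S = 1.0, 0.8, 0.5, 1.0
x0, h, delta, max_t = 0.0, 0.001, 0.25, 50.0

def killed_path(rng):
    """Fine-grid Euler path killed at the first crossing of S; returns
    the coarse observations recorded before the killing time."""
    n_fine = int(max_t / h)
    obs_every = int(delta / h)
    noise = sigma * np.sqrt(h) * rng.normal(size=n_fine)
    x, obs = x0, [x0]
    for i in range(n_fine):
        x += (mu - x / tau) * h + noise[i]
        if x >= S:                      # killed, possibly between observations
            break
        if (i + 1) % obs_every == 0:
            obs.append(x)
    return np.asarray(obs)

# Naive AR(1) least-squares fit that ignores the killing:
# x_{k+1} = a + b x_k + error, pooled over many killed trajectories.
prev, curr = [], []
for _ in range(200):
    o = killed_path(rng)
    if o.size >= 2:
        prev.append(o[:-1])
        curr.append(o[1:])
xp, xc = np.concatenate(prev), np.concatenate(curr)
B = np.column_stack([np.ones_like(xp), xp])
(a_hat, b_hat), *_ = np.linalg.lstsq(B, xc, rcond=None)
# Implied estimate of the stationary mean mu*tau (= 0.8 here); censoring
# the upward excursions that reach S tends to bias such naive estimates.
mean_hat = a_hat / (1.0 - b_hat)
```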

    A copula-based method to build diffusion models with prescribed marginal and serial dependence

    This paper investigates the probabilistic properties that determine the existence of space-time transformations between diffusion processes. We prove that two diffusions are related by a monotone space-time transformation if and only if they share the same serial dependence. The serial dependence of a diffusion process is studied by means of its copula density, and the effect of monotone and non-monotone space-time transformations on the copula density is discussed. This provides a methodology for building diffusion models by freely combining prescribed marginal behaviors and temporal dependence structures. Explicit expressions of copula densities are provided for tractable models. A possible application in neuroscience is sketched as a proof of concept.
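    The invariance at the heart of the result, that a monotone transformation changes the marginal but not the copula (serial dependence), can be checked numerically: rank-based dependence measures are unaffected by a strictly increasing transformation of an Ornstein-Uhlenbeck path. A sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

# Exact OU transitions: X_{t+dt} = rho * X_t + sqrt(1 - rho^2) * noise,
# with rho = exp(-dt/tau).  Parameters are illustrative.
tau, dt, n = 1.0, 0.1, 5000
rho = np.exp(-dt / tau)
x = np.empty(n)
x[0] = 0.0
for i in range(1, n):
    x[i] = rho * x[i - 1] + np.sqrt(1 - rho**2) * rng.normal()

def spearman(a, b):
    """Rank correlation: depends only on the copula of (a, b)."""
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

# A strictly increasing transformation changes the marginal (here to
# lognormal) but leaves the ranks, hence the serial dependence, intact.
y = np.exp(x)
s_x = spearman(x[:-1], x[1:])
s_y = spearman(y[:-1], y[1:])
```

    Because `exp` preserves the ordering of observations exactly, `s_x` and `s_y` coincide to machine precision.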

    First passage times of two-dimensional correlated processes: analytical results for the Wiener process and a numerical method for diffusion processes

    Given a two-dimensional correlated diffusion process, we determine the joint density of the first passage times of the process to some constant boundaries. This quantity depends on the joint density of the first passage time of the first crossing component and of the position of the second crossing component before its crossing time. First we show that these densities are solutions of a system of Volterra-Fredholm integral equations of the first kind. Then we propose a numerical algorithm to solve it and describe how to use the algorithm to approximate the joint density of the first passage times. The convergence of the method is proved theoretically for bivariate diffusion processes. We derive explicit expressions for these and other quantities of interest in the case of a bivariate Wiener process, correcting previous misprints appearing in the literature. Finally, we illustrate the application of the method through a set of examples. Comment: 18 pages, 3 figures
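    A hedged Monte Carlo sketch of the object under study (not the paper's integral-equation method): for a bivariate Wiener process with correlated components, the first-passage times to constant boundaries can be estimated by simulation, and their dependence inspected. Parameters are illustrative, and the discrete grid slightly overestimates passage times.

```python
import numpy as np

rng = np.random.default_rng(3)

# Correlation r between components, boundaries b1 and b2, Euler grid.
r, b1, b2 = 0.6, 1.0, 1.0
n_paths, n_steps, dt = 1000, 4000, 0.005     # horizon T = 20

z1 = rng.normal(size=(n_paths, n_steps))
z2 = rng.normal(size=(n_paths, n_steps))
w1 = np.cumsum(np.sqrt(dt) * z1, axis=1)
w2 = np.cumsum(np.sqrt(dt) * (r * z1 + np.sqrt(1 - r**2) * z2), axis=1)

def first_passage(w, b):
    """First grid time at which each path exceeds b (nan if never)."""
    above = w >= b
    crossed = above.any(axis=1)
    idx = above.argmax(axis=1)
    return np.where(crossed, (idx + 1) * dt, np.nan)

t1, t2 = first_passage(w1, b1), first_passage(w2, b2)
both = ~np.isnan(t1) & ~np.isnan(t2)

def spearman(a, b):
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

# Positively correlated components yield positively dependent passage times.
rho_fpt = spearman(t1[both], t2[both])
```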

    Computational study of resting state network dynamics

    The aim of this thesis is to show, through a simulation with The Virtual Brain software, the most important properties of brain dynamics during the resting state, i.e., when one is not engaged in any specific task and is not subject to any particular stimulus. We begin by explaining what the resting state is through a brief historical review of its discovery, then survey some experimental methods used in the analysis of brain activity, and highlight the difference between structural and functional connectivity. Next, the concepts of dynamical systems are briefly summarized, a theory indispensable for understanding a complex system such as the brain. In the following chapter, through a 'bottom-up' approach, the main structures of the nervous system are illustrated from a biological perspective, from the neuron to the cerebral cortex. All of this is also explained from the dynamical-systems point of view, presenting the pioneering Hodgkin-Huxley model and then the concept of population dynamics. After this preliminary part, the simulation is described in detail. First, more information is given about The Virtual Brain software, the resting-state network model used in the simulation is defined, and the 'connectome' employed is described. The results of the analysis carried out on the collected data are then presented, showing how criticality and noise play a key role in the emergence of this background activity of the brain. These results are then compared with the most important recent research in this field, which confirms the findings of our work. Finally, we briefly report the consequences that a full understanding of the resting-state phenomenon and the possibility of virtualizing brain activity would have in the medical and clinical fields.

    Aspects of Signal Processing in Noisy Neurons

    In recent years it has become widely recognized that statistical influences, often called noise, need not hinder the processing of signals but can in fact support it. This effect has become known as stochastic resonance. It is plausible that evolution has found ways to exploit this phenomenon to optimize information processing in the nervous system. Using the spike-generating leaky integrate-and-fire neuron as an example, this dissertation investigates whether the coding of periodic signals in neurons is improved by the noise that is present in the nervous system in any case. The investigation uses the methods of the theory of point processes. The distribution of the intervals between any two successive spikes emitted by the neuron is determined numerically from an integral-equation approach, and the temporal ordering of the spike trains relative to the periodic signal is described as a Markov chain. In addition, several approximate models for the interspike-interval distribution, which permit further analytical investigation, are presented and their reliability is tested. As a central result, it is shown that two kinds of noise-induced resonance occur in the model neuron: first, classical stochastic resonance, i.e., an optimal signal-to-noise ratio of the evoked spike train at a particular amplitude of the input noise. In addition, there is a resonance with respect to the frequency of the input signal, or stimulus. Stimuli in a certain frequency range are encoded in spike trains with pronounced temporal structure, whereas stimuli outside the preferred frequency band evoke temporally more homogeneous spike trains. For this twofold resonance the term stochastic double resonance is introduced. The effect is traced back to elementary mechanisms, and its dependence on the properties of the stimulus is investigated comprehensively. It turns out that the neuron's stimulus response obeys simple scaling laws. In particular, the optimal scaled noise amplitude is a universal parameter of the model, which appears to be independent of the stimulus. The optimal stimulus frequency, by contrast, depends linearly on the scaled stimulus amplitude, with the constant of proportionality determined by the DC component of the stimulus (base current). While large base currents nearly decouple frequency and amplitude, so that stimuli of arbitrary amplitude are encoded in temporally well-structured spike trains, small base currents make it possible to select the optimal frequency band by varying the stimulus amplitude.
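    The basic noise benefit described above can be sketched with a minimal leaky integrate-and-fire simulation (illustrative parameters, not those of the dissertation): a subthreshold periodic stimulus evokes no spikes without noise, while added noise produces spikes whose phase locking to the stimulus can be quantified by the vector strength.

```python
import numpy as np

# Subthreshold periodic drive: without noise the leaky integrate-and-fire
# neuron stays silent; with noise it fires, preferentially near the
# stimulus peaks.  Units: ms; all parameter values are illustrative.
tau, vth, vr = 10.0, 1.0, 0.0
mu, amp, freq = 0.8, 0.1, 0.02            # peak drive 0.9 < vth
dt, T = 0.05, 20000.0

def spike_times(noise_sigma, seed):
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    xi = noise_sigma * np.sqrt(dt) * rng.normal(size=n)
    v, t, out = vr, 0.0, []
    for i in range(n):
        t += dt
        drive = mu + amp * np.sin(2 * np.pi * freq * t)
        v += dt * (drive - v) / tau + xi[i]
        if v >= vth:                      # spike and reset
            out.append(t)
            v = vr
    return np.array(out)

silent = spike_times(0.0, 4)              # deterministic: never fires
noisy = spike_times(0.1, 5)               # noise-induced spikes
# Vector strength: 1 for perfect phase locking, near 0 for none.
vs = np.abs(np.exp(2j * np.pi * freq * noisy).mean())
```

    Sweeping `noise_sigma` and the stimulus frequency over grids is the natural next step to exhibit the two resonances discussed in the text.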

    The Interplay of Architecture and Correlated Variability in Neuronal Networks

    This much is certain: neurons are coupled, and they exhibit covariations in their output. The extent of each, however, does not have a single answer. Moreover, the strength of neuronal correlations in particular has been a subject of hot debate within the neuroscience community over the past decade, as advancing recording techniques have made available a wealth of new, sometimes seemingly conflicting, datasets. The impact of connectivity and the resulting correlations on the ability of animals to perform necessary tasks is even less well understood. To answer the relevant questions in these categories, novel approaches must be developed. This work focuses on three somewhat distinct, but inseparably coupled, crucial avenues of research within the broader field of computational neuroscience. First, there is a need for tools which can be applied, both by experimentalists and theorists, to understand how networks transform their inputs. In turn, these tools will allow neuroscientists to tease apart the structure which underlies network activity. The Generalized Thinning and Shift framework, presented in Chapter 4, addresses this need. Next, taking for granted a general understanding of network architecture as well as some grasp of the behavior of its individual units, we must be able to reverse the activity-to-structure relationship and understand instead how network structure determines dynamics. We achieve this in Chapters 5 through 7, where we present an application of linear response theory yielding an explicit approximation of correlations in integrate-and-fire neuronal networks. This approximation reveals the explicit relationship between correlations, structure, and marginal dynamics. Finally, we must strive to understand the functional impact of network dynamics and architecture on the tasks that a neural network performs. This need motivates our analysis of a biophysically detailed model of the blow fly visual system in Chapter 8.
    Our hope is that the work presented here represents significant advances in multiple directions within the field of computational neuroscience.
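    The thinning idea underlying frameworks of this kind can be illustrated with the classical result that independently thinning a mother Poisson process yields daughter spike trains whose spike-count correlation equals the copy probability p, in any counting window. A sketch with hypothetical parameters (this is not the Generalized Thinning and Shift framework itself, only its simplest ingredient):

```python
import numpy as np

rng = np.random.default_rng(5)

# Every spike of a "mother" Poisson process (rate rate_m over [0, T]) is
# copied independently into each of two daughter trains with probability p.
T, rate_m, p = 5000.0, 10.0, 0.5
n_mother = rng.poisson(rate_m * T)
mother = np.sort(rng.uniform(0.0, T, n_mother))
keep1 = rng.random(n_mother) < p
keep2 = rng.random(n_mother) < p
train1, train2 = mother[keep1], mother[keep2]

# Spike-count correlation in windows of width 0.1: shared mother spikes
# contribute covariance rate_m * W * p^2 against variance rate_m * W * p,
# so the correlation coefficient is p regardless of the window size.
bins = np.arange(0.0, T + 0.1, 0.1)
c1, _ = np.histogram(train1, bins)
c2, _ = np.histogram(train2, bins)
rho = np.corrcoef(c1, c2)[0, 1]           # close to p = 0.5
```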