9 research outputs found

    Continuous testing for Poisson process intensities: A new perspective on scanning statistics

    We propose a novel continuous testing framework to test the intensities of Poisson processes. This framework allows a rigorous definition of the complete testing procedure, from an infinite number of hypotheses to joint error rates. Our work extends traditional procedures based on scanning windows by controlling the family-wise error rate and the false discovery rate in a non-asymptotic manner and in a continuous way. The decision rule is based on a p-value process that can be estimated by a Monte Carlo procedure. We also propose new test statistics based on kernels. Our method is applied in neuroscience and genomics through the standard test of homogeneity and the two-sample test.
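The abstract mentions a p-value process estimated by Monte Carlo. As a rough illustration of that idea for a single, fixed scan-window width (a sketch under assumptions of this example, not the authors' continuous procedure), one can rank an observed scan statistic among statistics simulated under a homogeneous Poisson process; conditionally on the number of events, such a process has uniformly distributed event times.

```python
import numpy as np

rng = np.random.default_rng(0)

def scan_statistic(events, window):
    """Maximum number of events falling in any window of fixed width
    that starts at an observed event time."""
    events = np.sort(np.asarray(events))
    if events.size == 0:
        return 0
    right = np.searchsorted(events, events + window, side="right")
    return int((right - np.arange(events.size)).max())

def mc_pvalue(events, T, window, n_sim=999):
    """Monte Carlo p-value for homogeneity on [0, T]: conditionally on
    the event count, simulate uniform event times and rank the
    observed scan statistic among the simulated ones."""
    obs = scan_statistic(events, window)
    n = len(events)
    exceed = sum(
        scan_statistic(rng.uniform(0.0, T, size=n), window) >= obs
        for _ in range(n_sim)
    )
    return (exceed + 1) / (n_sim + 1)  # standard +1 correction
```

A tight cluster of events yields a large scan statistic and hence a small p-value, while evenly spread events do not.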

    Surrogate data methods based on a shuffling of the trials for synchrony detection: the centering issue

    We investigate, from a statistical point of view, several distribution-free dependence detection procedures, all based on a shuffling of the trials. The mathematical justification of such procedures lies in the bootstrap principle and its approximation properties. In particular, we show that such a shuffling has mainly to be done on centered quantities (that is, quantities with zero mean under independence) to construct correct p-values, meaning that the corresponding tests control their false positive (FP) rate. Building on this study, we introduce a method, named permutation UE, which consists of a multiple testing procedure based on a permutation of the experimental trials and a delayed coincidence count. Each single test of this procedure achieves the prescribed level, so that the corresponding multiple testing procedure controls the false discovery rate (FDR), with as few assumptions as possible on the underlying distribution, essentially only independence and identical distribution across trials. The mathematical meaning of this assumption is discussed, and we argue in particular that it does not mean what is commonly referred to in neuroscience as cross-trial stationarity. Simulations show, moreover, that permutation UE outperforms the trial shuffling of Pipa and Grün (2003) and the MTGAUE method of Tuleau-Malot et al. (2014) in terms of single levels and FDR, for a comparable number of false negatives. An application to real data is also provided.
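The centering principle above can be illustrated with a toy trial-permutation test (a sketch with assumed names and a simplified coincidence definition, not the authors' permutation UE procedure): coincidence counts are centered by their mean over mismatched trial pairs, which estimates the expectation under independence, before the trial pairing is shuffled.

```python
import numpy as np

rng = np.random.default_rng(1)

def coincidence_count(train_a, train_b, delta=0.01):
    """Number of spike pairs (a, b) with |a - b| <= delta."""
    a = np.asarray(train_a)[:, None]
    b = np.asarray(train_b)[None, :]
    return int(np.sum(np.abs(a - b) <= delta))

def permutation_pvalue(trials_a, trials_b, delta=0.01, n_perm=500):
    """Trial-permutation test of independence between two neurons.
    Counts are centered (mean over mismatched trial pairs subtracted)
    so that the permutation null is valid, then the trial pairing of
    neuron B is shuffled to build the null distribution."""
    n = len(trials_a)
    C = np.array([[coincidence_count(a, b, delta) for b in trials_b]
                  for a in trials_a], dtype=float)
    centered = C - C.mean(axis=1, keepdims=True)
    obs = np.trace(centered)              # statistic for the true pairing
    null = np.array([
        centered[np.arange(n), rng.permutation(n)].sum()
        for _ in range(n_perm)
    ])
    return (np.sum(null >= obs) + 1) / (n_perm + 1)
```

Perfectly synchronized trials give a small p-value; independent trials give a p-value that is roughly uniform on (0, 1].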

    Microscopic approach of a time elapsed neural model

    Spike trains are the main components of information processing in the brain. Several point processes have been investigated in the literature to model spike trains, and more macroscopic approaches based on partial differential equation models have also been studied. The main aim of the present article is to build a bridge between several point process models (Poisson, Wold, Hawkes) that have been shown to statistically fit real spike train data and the age-structured partial differential equations introduced by Pakdaman, Perthame and Salort.
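Of the point process models named here, the Hawkes process is the one with self-interaction: past spikes transiently raise the firing intensity. A minimal simulation by Ogata's thinning algorithm (the exponential kernel and parameter names are assumptions of this sketch, not taken from the article) makes that mechanism concrete.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_hawkes(mu, alpha, beta, T):
    """Simulate a Hawkes process on [0, T] by Ogata's thinning.
    Intensity: lambda(t) = mu + sum over past spikes t_i of
    alpha * exp(-beta * (t - t_i)). Between events the intensity is
    non-increasing, so its current value is a valid upper bound for
    rejection sampling. The process is stable when alpha < beta."""
    events, t = [], 0.0
    while True:
        lam_bar = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        t += rng.exponential(1.0 / lam_bar)       # candidate next event
        if t >= T:
            break
        lam_t = mu + sum(alpha * np.exp(-beta * (t - ti)) for ti in events)
        if rng.uniform() <= lam_t / lam_bar:      # accept with prob lam/lam_bar
            events.append(t)
    return np.array(events)
```

With mu = 1, alpha = 0.5, beta = 1, the expected count on [0, T] is mu * T / (1 - alpha / beta), i.e. about twice that of a plain Poisson process with rate mu.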

    Spike detection algorithms for neuroprosthetic applications: model development, implementation, and performance evaluation

    Neurological disorders account for 6.3% of the global burden of disease, making them one of the priorities of global health. These disorders are treated with drugs, but some patients may prove resistant to them. Neuroengineering proposes innovative solutions for the treatment and rehabilitation of these pathologies, among them neuroprostheses, capable of replacing a damaged brain area or of artificially reconnecting two disconnected areas by bypassing the lesion that caused the damage. Among these, the device developed at the University of Kansas (KUMC) has proven effective in experiments on mice with a focal lesion in the motor area. The operation of this device is based on the implantation of micro-electrodes in two brain regions disconnected by a lesion. These create a bridge capable of reconnecting the two disconnected areas by recording events (spikes) in one of the two areas and then delivering current to the second. In this type of device, correct identification of the spikes is extremely important. My thesis work is part of the collaboration between the Rehab Technologies Lab (IIT, Genova), where I carried out my internship, and KUMC on the project for the development of innovative neuroprostheses for motor recovery after brain damage. Specifically, my thesis work focuses on Spike Detection (SD), one of whose fundamental problems is the lack of a ground truth, that is, of a priori knowledge of the location of the spikes in the recording.
    In the context described above, the objectives of this thesis are: to provide a ground truth, to study and adapt a set of SD algorithms already present in the literature, to modify a high-performance algorithm previously developed at IIT, and to compare the performance of all the SD algorithms.
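A common baseline in the family of spike detection algorithms this thesis compares (a generic sketch, not the IIT algorithm mentioned above) is a hard amplitude threshold set at a multiple of a robust noise estimate, with a refractory period so that one spike is not counted twice.

```python
import numpy as np

def detect_spikes(signal, fs, k=5.0, refractory_ms=1.0):
    """Hard-threshold spike detector (illustrative baseline).
    The noise standard deviation is estimated from the median absolute
    deviation, which is robust to the spikes themselves; a refractory
    period suppresses repeated crossings of the same spike."""
    signal = np.asarray(signal, dtype=float)
    sigma = np.median(np.abs(signal)) / 0.6745    # robust noise estimate
    crossings = np.flatnonzero(np.abs(signal) > k * sigma)
    refractory = int(refractory_ms * 1e-3 * fs)   # samples per refractory period
    spikes, last = [], -refractory - 1
    for idx in crossings:
        if idx - last > refractory:
            spikes.append(idx)
            last = idx
    return np.array(spikes, dtype=int)
```

The absence of a ground truth, which the thesis addresses, is exactly why such detectors are usually evaluated on synthetic traces with spikes inserted at known positions.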

    Impact of Spike Train Autostructure on Probability Distribution of Joint Spike Events

    The discussion of whether temporally coordinated spiking activity really exists, and whether it is relevant, has been heated over the past few years. To investigate this issue, several approaches have been taken to determine whether synchronized events occur significantly above chance, that is, whether they occur more often than expected if the neurons fire independently. Most investigations ignore or destroy the autostructure of the spiking activity of individual cells or assume Poissonian spiking as a model. Such methods that ignore the autostructure can significantly bias the coincidence statistics. Here, we study the influence of the autostructure on the probability distribution of coincident spiking events between tuples of mutually independent non-Poisson renewal processes. In particular, we consider two types of renewal processes that have been suggested as appropriate models of experimental spike trains: a gamma and a log-normal process. For a gamma process, we characterize the shape of the distribution analytically with the Fano factor (FFc). In addition, we perform Monte Carlo estimations to derive the full shape of the distribution and the probability of false positives if a different process type is assumed than was actually present. We also determine how manipulations of such spike trains (here, dithering), used for the generation of surrogate data, change the distribution of coincident events and influence the significance estimation. We find, first, that the width of the coincidence count distribution and its FFc depend critically, and in a nontrivial way, on the detailed structure of the spike trains as characterized by the coefficient of variation CV. Second, the dependence of the FFc on the CV is complex and mostly nonmonotonic. Third, spike dithering, even if as small as a fraction of the interspike interval, can falsify the inference on coordinated firing.
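The Monte Carlo estimations described above can be sketched generically: simulate pairs of independent gamma renewal trains (where the shape parameter sets the CV of the inter-spike intervals) and estimate the Fano factor of their coincidence count. The parameters and the simple coincidence definition below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(2)

def gamma_renewal_train(rate, shape, T):
    """Spike train from a gamma renewal process on [0, T]; the CV of
    the inter-spike intervals is 1/sqrt(shape), and shape = 1 recovers
    the Poisson process."""
    scale = 1.0 / (rate * shape)                    # mean ISI = 1/rate
    isis = rng.gamma(shape, scale, size=int(3 * rate * T) + 10)
    spikes = np.cumsum(isis)
    return spikes[spikes < T]

def coincidence_fano(rate, shape, T, delta, n_rep=300):
    """Monte Carlo estimate of the Fano factor (variance/mean) of the
    coincidence count between two independent gamma renewal trains,
    with a coincidence defined as |t_a - t_b| <= delta."""
    counts = np.empty(n_rep)
    for r in range(n_rep):
        a = gamma_renewal_train(rate, shape, T)
        b = gamma_renewal_train(rate, shape, T)
        counts[r] = np.sum(np.abs(a[:, None] - b[None, :]) <= delta)
    return counts.var() / counts.mean()
```

Varying `shape` (and hence the CV) in such a simulation is one way to see the nonmonotonic dependence the paper reports.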

    Disaggregation by State Inference: A Probabilistic Framework for Non-Intrusive Load Monitoring

    Non-intrusive load monitoring (NILM), the problem of disaggregating whole-home power measurements into single-appliance measurements, has received increasing attention from the academic community because of its energy-saving potential; however, the majority of NILM approaches are variants of either event-based or event-less disaggregation. Event-based approaches are able to capture much information about the transient behavior of appliances but suffer from error-propagation problems, whereas event-less approaches are less prone to error propagation but can only incorporate transient information to a small degree. On top of that, inference techniques for event-less approaches are either computationally expensive, do not allow trading off computation time for approximation accuracy, or are prone to local minima. This work makes three contributions: first, an automated way to infer ground truth from single-appliance readings is introduced; second, an augmentation for event-less approaches is introduced that captures side-channel as well as transient information about change-points; third, an inference technique is presented that allows controlling the trade-off between computational expense and accuracy. Ultimately, this work puts the NILM problem into a probabilistic framework that allows closing feedback loops between the different stages of event-based NILM approaches, effectively bridging event-less and event-based approaches. The performance of the inference technique is evaluated on a synthetic data set and compared to state-of-the-art approaches. The hypothesis that incorporating transient information increases disaggregation performance is then tested on a real-life data set.
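Disaggregation by state inference can be illustrated in a much-reduced form by Viterbi decoding of a single appliance's on/off state from a noisy power signal. The Gaussian emission model, the two-state "sticky" transition matrix, and all names below are assumptions of this sketch, not the framework proposed in the work.

```python
import numpy as np

def viterbi_disaggregate(power, levels, sigma, p_stay=0.95):
    """Most likely hidden state sequence for one appliance given a
    noisy power signal: Gaussian emissions around each state's power
    level, and a sticky transition model that favors staying put."""
    power = np.asarray(power, dtype=float)
    levels = np.asarray(levels, dtype=float)
    K, T = levels.size, power.size
    log_trans = np.full((K, K), np.log((1.0 - p_stay) / (K - 1)))
    np.fill_diagonal(log_trans, np.log(p_stay))
    # log-likelihood of each observation under each state (up to a constant)
    log_emit = -0.5 * ((power[:, None] - levels[None, :]) / sigma) ** 2
    delta = log_emit[0].copy()                  # uniform initial state
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        scores = delta[:, None] + log_trans     # scores[i, j]: state i -> j
        back[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + log_emit[t]
    states = np.empty(T, dtype=int)
    states[-1] = int(delta.argmax())
    for t in range(T - 1, 0, -1):               # backtrack best path
        states[t - 1] = back[t, states[t]]
    return states
```

The sticky transitions are what make this event-less: state changes are inferred jointly over the whole signal rather than from individually detected events.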

    29th Annual Computational Neuroscience Meeting: CNS*2020

    Meeting abstracts. This publication was funded by OCNS. The Supplement Editors declare that they have no competing interests. Virtual | 18-22 July 2020