
    Sequential optimal design of neurophysiology experiments

    For well over 200 years, scientists and doctors have been poking and prodding brains in every which way in an effort to understand how they work. The earliest pokes were quite crude, often involving permanent forms of brain damage. Though neural injury continues to be an active area of research within neuroscience, technology has given neuroscientists a number of tools for stimulating and observing the brain in very subtle ways. Nonetheless, the basic experimental paradigm remains the same: poke the brain and see what happens. For example, neuroscientists studying the visual or auditory system can easily generate any image or sound they can imagine to see how an organism or neuron will respond. Since neuroscientists can now easily design more pokes than they could ever deliver, a fundamental question is "What pokes should they actually use?" The complexity of the brain means that only a small number of the pokes scientists can deliver will produce any information about the brain. One of the fundamental challenges of experimental neuroscience is finding the right stimulus parameters to produce an informative response in the system being studied. This thesis addresses this problem by developing algorithms to sequentially optimize neurophysiology experiments. Every experiment we conduct contains information about how the brain works. Before conducting the next experiment, we should use what we have already learned to decide which experiment to perform next. In particular, we should design the experiment that will reveal the most information about the brain. At a high level, neuroscientists already perform this type of sequential, optimal experimental design; for example, crude experiments which knock out entire regions of the brain have given rise to modern experimental techniques which probe the responses of individual neurons using finely tuned stimuli. The goal of this thesis is to develop automated and rigorous methods for optimizing neurophysiology experiments efficiently and at a much finer time scale. In particular, we present methods for near-instantaneous optimization of the stimulus being used to drive a neuron.
    Ph.D. thesis. Committee Co-Chairs: Robert Butera, Liam Paninski; Committee Members: Charles Isbell, Chris Rozell, Garrett Stanley, Bran Vidakovic.
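    To make the idea of information-maximising stimulus selection concrete, here is a minimal sketch, not the thesis's own algorithm, assuming a toy single-parameter Bernoulli neuron model: the posterior over the parameter is kept on a grid, and the next stimulus is the candidate that maximises the mutual information between the response and the parameter.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def binary_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log(p) + (1 - p) * np.log(1 - p))

def next_stimulus(theta_grid, posterior, candidates):
    """Choose the stimulus whose binary response carries the most information about theta."""
    best_x, best_gain = None, -np.inf
    for x in candidates:
        p_spike_given_theta = sigmoid(theta_grid * x)        # spike probability under each theta
        p_spike = np.sum(posterior * p_spike_given_theta)    # predictive spike probability
        # Mutual information = marginal response entropy - expected conditional entropy
        gain = binary_entropy(p_spike) - np.sum(posterior * binary_entropy(p_spike_given_theta))
        if gain > best_gain:
            best_x, best_gain = x, gain
    return best_x

def update_posterior(theta_grid, posterior, x, spiked):
    lik = sigmoid(theta_grid * x)
    posterior = posterior * (lik if spiked else 1.0 - lik)
    return posterior / posterior.sum()

# Toy run: true parameter theta = 1.5, uniform prior on a grid
rng = np.random.default_rng(0)
theta_grid = np.linspace(-3.0, 3.0, 301)
posterior = np.ones_like(theta_grid) / theta_grid.size
for _ in range(50):
    x = next_stimulus(theta_grid, posterior, candidates=np.linspace(-2.0, 2.0, 41))
    spiked = rng.random() < sigmoid(1.5 * x)
    posterior = update_posterior(theta_grid, posterior, x, spiked)
print("posterior mean of theta:", np.sum(theta_grid * posterior))
```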

    Bayesian sequential experimental design for binary response data with application to electromyographic experiments

    We develop a sequential Monte Carlo approach for Bayesian analysis of the experimental design for binary response data. Our work is motivated by surface electromyographic (SEMG) experiments, which can be used to provide information about the functionality of subjects' motor units. In these experiments a series of stimuli is applied to a motor unit, and for each stimulus it is recorded whether or not the motor unit fires. The aim is to learn how the probability of firing depends on the applied stimulus (the so-called stimulus-response curve) and to estimate associated excitability parameters; one such excitability parameter is the stimulus level at which the motor unit has a 50% chance of firing. Within such an experiment we are able to choose the next stimulus level based on the past observations. We show how sequential Monte Carlo can be used to analyse such data in an online manner. We then use the current estimate of the posterior distribution to choose the next stimulus level. The aim is to select a stimulus level that minimises the expected loss, where the loss is defined on estimates of target quantiles of the stimulus-response curve. Through simulation we show that this approach is more efficient than existing sequential design methods for choosing the stimulus values. If applied in practice, it could more than halve the length of SEMG experiments.
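    The sketch below illustrates the general flavour of such a sequential Monte Carlo update for binary-response data; it is an assumed toy setup, not the authors' method. Particles represent the posterior over the parameters (alpha, beta) of a logistic stimulus-response curve P(fire | s) = sigmoid(beta * (s - alpha)), and the simple rule of probing at the current estimate of alpha stands in for the paper's loss-based design criterion.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Particle approximation of the posterior over (alpha, beta)
N = 2000
particles = np.column_stack([rng.normal(5.0, 2.0, N),            # alpha: 50%-firing level
                             np.abs(rng.normal(1.0, 0.5, N))])   # beta: slope (kept positive)
weights = np.full(N, 1.0 / N)

def smc_update(particles, weights, stimulus, fired):
    p_fire = sigmoid(particles[:, 1] * (stimulus - particles[:, 0]))
    weights = weights * (p_fire if fired else 1.0 - p_fire)       # Bernoulli likelihood
    weights = weights / weights.sum()
    # Resample (with a little jitter) when the effective sample size collapses
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(weights):
        idx = rng.choice(len(weights), size=len(weights), p=weights)
        particles = particles[idx] + rng.normal(0.0, 0.02, particles.shape)
        particles[:, 1] = np.abs(particles[:, 1])
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights

# Simulated experiment with a hypothetical "true" motor unit
true_alpha, true_beta = 6.0, 2.0
stimulus = 5.0
for trial in range(60):
    fired = rng.random() < sigmoid(true_beta * (stimulus - true_alpha))
    particles, weights = smc_update(particles, weights, stimulus, fired)
    stimulus = np.sum(weights * particles[:, 0])   # simple rule: probe at the current alpha estimate
print("estimated 50% firing level:", stimulus)
```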

    A unified approach to linking experimental, statistical and computational analysis of spike train data

    A fundamental issue in neuroscience is how to identify the multiple biophysical mechanisms through which neurons generate observed patterns of spiking activity. In previous work, we proposed a method for linking observed patterns of spiking activity to specific biophysical mechanisms based on a state space modeling framework and a sequential Monte Carlo, or particle filter, estimation algorithm. We have shown, in simulation, that this approach is able to identify a space of simple biophysical models that were consistent with observed spiking data (and included the model that generated the data), but have yet to demonstrate the application of the method to identify realistic currents from real spike train data. Here, we apply the particle filter to spiking data recorded from rat layer V cortical neurons, and correctly identify the dynamics of a slow, intrinsic current. The underlying intrinsic current is successfully identified in four distinct neurons, even though the cells exhibit two distinct classes of spiking activity: regular spiking and bursting. This approach – linking statistical, computational, and experimental neuroscience – provides an effective technique to constrain detailed biophysical models to specific mechanisms consistent with observed spike train data.
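    As a rough illustration of the estimation machinery, here is a generic bootstrap particle filter for an assumed toy state-space spiking model (not the authors' biophysical model): a latent slow current follows an AR(1) process and modulates the spike probability in each time bin.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(T=500, a=0.98, sigma=0.1, base=-2.5, gain=2.0):
    """Simulate a latent AR(1) 'slow current' x_t and Bernoulli spikes driven by it."""
    x = np.zeros(T)
    spikes = np.zeros(T, dtype=int)
    for t in range(1, T):
        x[t] = a * x[t - 1] + sigma * rng.normal()
        p = 1.0 / (1.0 + np.exp(-(base + gain * x[t])))
        spikes[t] = rng.random() < p
    return x, spikes

def particle_filter(spikes, N=1000, a=0.98, sigma=0.1, base=-2.5, gain=2.0):
    """Bootstrap particle filter: propagate, weight by the spike likelihood, resample."""
    T = len(spikes)
    particles = np.zeros(N)
    estimate = np.zeros(T)
    for t in range(1, T):
        particles = a * particles + sigma * rng.normal(size=N)   # propagate the latent state
        p = 1.0 / (1.0 + np.exp(-(base + gain * particles)))     # spike probability per particle
        w = p if spikes[t] else 1.0 - p                          # Bernoulli likelihood
        w = w / w.sum()
        estimate[t] = np.sum(w * particles)                      # filtered posterior mean
        particles = particles[rng.choice(N, size=N, p=w)]        # resample
    return estimate

x_true, spikes = simulate()
x_hat = particle_filter(spikes)
print("correlation with the true latent current:", np.corrcoef(x_true[1:], x_hat[1:])[0, 1])
```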

    Counterbalancing for serial order carryover effects in experimental condition orders

    Reactions of neural, psychological, and social systems are rarely, if ever, independent of previous inputs and states. The potential for serial order carryover effects from one condition to the next in a sequence of experimental trials makes counterbalancing of condition order an essential part of experimental design. Here, a method is proposed for generating counterbalanced sequences for repeated-measures designs, including those with multiple observations of each condition on one participant and self-adjacencies of conditions. Condition ordering is reframed as a graph theory problem: experimental conditions are represented as vertices in a graph, and directed edges between them represent temporal relationships between conditions. A counterbalanced trial order results from traversing an Euler circuit through such a graph in which each edge is traversed exactly once. This method can be generalized to counterbalance for higher-order serial order carryover effects as well as to create intentional serial order biases. Modern graph theory provides tools for finding other types of paths through such graph representations, providing a tool for generating experimental condition sequences with useful properties.
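    The core construction can be sketched in a few lines (using networkx as an assumed helper, not the authors' published code): conditions are vertices, every ordered pair of conditions, including self-adjacencies, is a directed edge, and an Euler circuit through this graph yields a trial order in which each condition follows every condition (itself included) exactly once, i.e. first-order counterbalancing.

```python
import networkx as nx

def counterbalanced_order(conditions):
    """Return a trial order in which every ordered pair of conditions
    (including self-adjacencies) occurs exactly once as consecutive trials."""
    g = nx.DiGraph()
    # One directed edge per ordered pair of conditions, self-loops included
    g.add_edges_from((a, b) for a in conditions for b in conditions)
    circuit = list(nx.eulerian_circuit(g))     # traverses each edge exactly once
    # Read the vertex sequence off the ordered edge sequence
    return [circuit[0][0]] + [v for _, v in circuit]

order = counterbalanced_order(["A", "B", "C"])
print(order)   # 10 trials; each condition-to-condition transition appears exactly once
```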

    Dual enhancement mechanisms for overnight motor memory consolidation

    Our brains are constantly processing past events [1]. These offline processes consolidate memories, leading in the case of motor skill memories to an enhancement in performance between training sessions. A similar magnitude of enhancement develops over a night of sleep following an implicit task, in which a sequence of movements is acquired unintentionally, or following an explicit task, in which the same sequence is acquired intentionally [2]. What remains poorly understood, however, is whether these similar offline improvements are supported by similar circuits, or through distinct circuits. We set out to distinguish between these possibilities by applying transcranial magnetic stimulation over the primary motor cortex (M1) or the inferior parietal lobule (IPL) immediately after learning in either the explicit or implicit task. These brain areas have both been implicated in encoding aspects of a motor sequence and subsequently supporting offline improvements over sleep [3-5]. Here we show that offline improvements following the explicit task are dependent on a circuit that includes M1 but not IPL. In contrast, offline improvements following the implicit task are dependent on a circuit that includes IPL but not M1. Our work establishes the critical contribution made by M1 and IPL circuits to offline memory processing, and reveals that distinct circuits support similar offline improvements.

    Monitoring cortical excitability during repetitive transcranial magnetic stimulation in children with ADHD: a single-blind, sham-controlled TMS-EEG study

    Background: Repetitive transcranial magnetic stimulation (rTMS) allows non-invasive stimulation of the human brain. However, no suitable marker has yet been established to monitor the immediate rTMS effects on cortical areas in children. Objective: TMS-evoked EEG potentials (TEPs) could present a well-suited marker for real-time monitoring. Monitoring is particularly important in children, where few data about rTMS effects and safety are currently available. Methods: In a single-blind, sham-controlled study, twenty-five school-aged children with ADHD received subthreshold 1 Hz-rTMS to the primary motor cortex. The TMS-evoked N100 was measured by 64-channel EEG before, during, and after rTMS, and compared to sham stimulation as an intraindividual control condition. Results: TMS-evoked N100 amplitude decreased during 1 Hz-rTMS and, at the group level, reached a stable plateau after approximately 500 pulses. N100 amplitude to supra-threshold single pulses post rTMS confirmed the amplitude reduction in comparison to the pre-rTMS level, while sham stimulation had no influence. EEG source analysis indicated that the TMS-evoked N100 change reflected rTMS effects in the stimulated motor cortex. Amplitude changes in TMS-evoked N100 and MEPs (pre versus post 1 Hz-rTMS) correlated significantly, but this correlation was also found for pre versus post sham stimulation. Conclusion: The TMS-evoked N100 represents a promising candidate marker to monitor rTMS effects on cortical excitability in children with ADHD. TMS-evoked N100 can be employed to monitor real-time effects of TMS at subthreshold intensities. Though TMS-evoked N100 was a more sensitive parameter for rTMS-specific changes than MEPs in our sample, further studies are necessary to demonstrate whether clinical rTMS effects can be predicted from rTMS-induced changes in TMS-evoked N100 amplitude and to clarify the relationship between rTMS-induced changes in TMS-evoked N100 and MEP amplitudes. The TMS-evoked N100 amplitude reduction after 1 Hz-rTMS could either reflect a globally decreased cortical response to the TMS pulse or a specific decrease in inhibition.

    Isoperimetric Partitioning: A New Algorithm for Graph Partitioning

    Temporal structure in skilled, fluent action exists at several nested levels. At the largest scale considered here, short sequences of actions that are planned collectively in prefrontal cortex appear to be queued for performance by a cyclic competitive process that operates in concert with a parallel analog representation that implicitly specifies the relative priority of elements of the sequence. At an intermediate scale, single acts, like reaching to grasp, depend on coordinated scaling of the rates at which many muscles shorten or lengthen in parallel. To ensure success of acts such as catching an approaching ball, such parallel rate scaling, which appears to be one function of the basal ganglia, must be coupled to perceptual variables such as time-to-contact. At a finer scale, within each act, desired rate scaling can be realized only if precisely timed muscle activations first accelerate and then decelerate the limbs, to ensure that muscle length changes do not under- or over-shoot the amounts needed for precise acts. Each context of action may require a different timed muscle activation pattern than similar contexts. Because context differences that require different treatment cannot be known in advance, a formidable adaptive engine, the cerebellum, is needed to amplify differences within, and continuously search, a vast parallel signal flow, in order to discover contextual "leading indicators" of when to generate distinctive patterns of analog signals. From some parts of the cerebellum, such signals control muscles. But a recent model shows how the lateral cerebellum may serve the competitive queuing system (frontal cortex) as a repository of quickly accessed long-term sequence memories. Thus different parts of the cerebellum may use the same adaptive engine design to serve the lowest and highest of the three levels of temporal structure treated. If so, no one-to-one mapping exists between levels of temporal structure and major parts of the brain. Finally, recent data cast doubt on network-delay models of cerebellar adaptive timing. National Institute of Mental Health (R01 DC02582).
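    A toy sketch of the competitive-queuing idea mentioned above (an illustrative assumption, not code from the cited model): a parallel activation gradient encodes the relative priority of planned actions, and a repeated choose-the-strongest / suppress-the-winner cycle reads the sequence out in order.

```python
import numpy as np

def competitive_queuing(activations, labels):
    """Read out a planned sequence from a parallel activation gradient."""
    activations = np.array(activations, dtype=float)
    order = []
    while np.any(activations > 0):
        winner = int(np.argmax(activations))   # cyclic competition: the strongest item wins
        order.append(labels[winner])
        activations[winner] = 0.0              # the winner is suppressed after performance
    return order

# Higher activation = earlier in the planned sequence
print(competitive_queuing([0.9, 0.7, 0.5, 0.3], ["reach", "grasp", "lift", "place"]))
# -> ['reach', 'grasp', 'lift', 'place']
```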