
    Multisensory Interactions Influence Neuronal Spike Train Dynamics in the Posterior Parietal Cortex

    Abstract: Although significant progress has been made in understanding multisensory interactions at the behavioral level, their underlying neural mechanisms remain relatively poorly understood in cortical areas, particularly during the control of action. In recent experiments in which animals reached to and actively maintained their arm position at multiple spatial locations while receiving either proprioceptive or visual-proprioceptive position feedback, multisensory interactions were shown to be associated with reduced spiking (i.e., subadditivity) as well as reduced intra-trial and across-trial spiking variability in the superior parietal lobule (SPL). To further explore the nature of such interaction-induced changes in spiking variability, we quantified the spike train dynamics of 231 of these neurons. Neurons were classified as Poisson, bursty, refractory, or oscillatory (in the 13–30 Hz “beta band”) based on their spike train power spectra and autocorrelograms. No neurons were classified as Poisson-like in either the proprioceptive or visual-proprioceptive condition. Instead, oscillatory spiking was most commonly observed, with many neurons exhibiting these oscillations under only one set of feedback conditions. The results suggest that the SPL may belong to a putative beta-synchronized network for arm position maintenance and that position estimation may be subserved by different subsets of neurons within this network depending on available sensory information. In addition, the nature of the observed spiking variability suggests that models of multisensory interactions in the SPL should account for both Poisson-like and non-Poisson variability. The article is published at http://journals.plos.org/plosone/article?id=10.1371/journal.pone.016678
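    The spectrum-based classification described in this abstract can be illustrated with a toy sketch. The Welch estimator, the thresholds, and the bands other than the stated 13–30 Hz beta band are my own assumptions for illustration, not the study's actual criteria:

    ```python
    import numpy as np
    from scipy.signal import welch

    def classify_spike_spectrum(spikes, fs=1000.0, beta=(13.0, 30.0)):
        """Heuristic spike-train classification from the power spectrum.

        spikes : 1-D array of spike counts in 1/fs-second bins.
        Returns 'oscillatory', 'refractory', 'bursty', or 'poisson'.
        All thresholds are illustrative, not those used in the study.
        """
        freqs, psd = welch(spikes, fs=fs, nperseg=1024)
        baseline = psd[freqs > 100].mean()          # high-frequency asymptote
        low = psd[(freqs > 0) & (freqs < 10)].mean()
        band_peak = psd[(freqs >= beta[0]) & (freqs <= beta[1])].max()
        if band_peak > 2.0 * baseline:              # narrow peak in the beta band
            return 'oscillatory'
        if low < 0.5 * baseline:                    # low-frequency dip (refractoriness)
            return 'refractory'
        if low > 2.0 * baseline:                    # excess slow power (bursting)
            return 'bursty'
        return 'poisson'                            # approximately flat spectrum
    ```

    A homogeneous Poisson train has an approximately flat spike-train spectrum, while beta-rhythmic firing concentrates power near the oscillation frequency, which is what the decision rule above exploits.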

    Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations

    Kayser S, Philiastides MG, Kayser C. Sounds facilitate visual motion discrimination via the enhancement of late occipital visual representations. Neuroimage. 2017;148:31-41

    Neural Mechanisms of Sensory Integration: Frequency Domain Analysis of Spike and Field Potential Activity During Arm Position Maintenance with and Without Visual Feedback

    Abstract: Understanding where our bodies are in space is imperative for motor control, particularly for actions such as goal-directed reaching. Multisensory integration is crucial for reducing uncertainty in arm position estimates. This dissertation examines time- and frequency-domain correlates of visual-proprioceptive integration during an arm-position maintenance task. Neural recordings were obtained from two different cortical areas as non-human primates performed a center-out reaching task in a virtual reality environment. Following a reach, animals maintained the end-point position of their arm under unimodal (proprioception only) and bimodal (proprioception and vision) conditions. In both areas, time-domain and multi-taper spectral analysis methods were used to quantify changes in spiking, local field potential (LFP), and spike-field coherence during arm-position maintenance, and individual neurons were classified based on the spectra of their spiking patterns. A large proportion of cells in the SPL exhibited sensory condition-specific oscillatory spiking in the beta (13–30 Hz) frequency band. Cells in the IPL typically showed a more diverse mix of oscillatory and refractory spiking patterns in response to changing sensory conditions. Contrary to the assumptions made in many modelling studies, none of the cells in the SPL or IPL exhibited Poisson spiking statistics. Evoked LFPs in both areas exhibited greater effects of target location than of visual condition, though the evoked responses in the preferred reach direction were generally suppressed in the bimodal condition relative to the unimodal condition. Significant effects of target location on evoked responses were observed during the movement period of the task as well.
In the frequency domain, LFP power in both cortical areas was enhanced in the beta band during the position estimation epoch of the task, indicating that LFP beta oscillations may be important for maintaining the ongoing state. This was particularly evident at the population level, with a clear increase in alpha and beta power. Differences in spectral power between conditions also became apparent at the population level, with power during bimodal trials being suppressed relative to unimodal trials. The spike-field coherence yielded inconclusive results in both the SPL and IPL, with no clear correlation between the incidence of beta oscillations and significant beta coherence. Doctoral Dissertation, Biomedical Engineering.
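The multi-taper spectral analysis mentioned in this abstract has a compact core: average periodograms computed with orthogonal DPSS (Slepian) tapers, trading a controlled bandwidth for reduced variance. A minimal sketch (the time-bandwidth parameter and taper count are illustrative choices, not those of the dissertation):

    ```python
    import numpy as np
    from scipy.signal.windows import dpss

    def multitaper_psd(x, fs, nw=3.0):
        """Multitaper power spectrum: average periodograms over DPSS tapers.

        x  : 1-D signal (e.g. an LFP trace), fs : sampling rate in Hz,
        nw : time-bandwidth product; 2*nw - 1 tapers is the usual choice.
        """
        n = len(x)
        k = int(2 * nw) - 1                      # standard taper count
        tapers = dpss(n, nw, Kmax=k)             # shape (k, n), unit-energy tapers
        # Periodogram of each tapered copy of the (demeaned) signal, then average.
        specs = np.abs(np.fft.rfft(tapers * (x - x.mean()), axis=1)) ** 2
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        return freqs, specs.mean(axis=0)
    ```

    Applied to a signal containing a beta-band oscillation, the averaged spectrum shows a clear peak at the oscillation frequency with less variance than a single periodogram would.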

    Hierarchically nested networks optimize the analysis of audiovisual speech

    In conversational settings, seeing the speaker’s face elicits internal predictions about the upcoming acoustic utterance. Understanding how the listener’s cortical dynamics tune to the temporal statistics of audiovisual (AV) speech is thus essential. Using magnetoencephalography, we explored how large-scale frequency-specific dynamics of human brain activity adapt to AV speech delays. First, we show that the amplitude of phase-locked responses parametrically decreases with natural AV speech synchrony, a pattern that is consistent with predictive coding. Second, we show that the temporal statistics of AV speech affect large-scale oscillatory networks at multiple spatial and temporal resolutions. We demonstrate a spatial nestedness of oscillatory networks during the processing of AV speech: these oscillatory hierarchies are such that high-frequency activity (beta, gamma) is contingent on the phase response of low-frequency (delta, theta) networks. Our findings suggest that the endogenous temporal multiplexing of speech processing confers adaptability within the temporal regimes that are essential for speech comprehension.
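    The nesting of high-frequency activity on low-frequency phase described in this abstract is commonly quantified with a phase-amplitude coupling index. A rough sketch of a mean-vector-length measure (the band limits, filter settings, and thresholds are my assumptions, not the study's pipeline):

    ```python
    import numpy as np
    from scipy.signal import hilbert, butter, filtfilt

    def phase_amplitude_coupling(x, fs, phase_band=(4, 8), amp_band=(30, 80)):
        """Mean-vector-length index of cross-frequency nesting.

        Returns a value near 0 when the bands are unrelated, and larger when
        the amplitude of the fast band varies with the phase of the slow band.
        """
        def bandpass(sig, lo, hi):
            b, a = butter(4, [lo, hi], btype='band', fs=fs)
            return filtfilt(b, a, sig)

        phase = np.angle(hilbert(bandpass(x, *phase_band)))   # slow-band phase
        amp = np.abs(hilbert(bandpass(x, *amp_band)))         # fast-band envelope
        # Length of the amplitude-weighted mean phase vector, normalized.
        return np.abs(np.mean(amp * np.exp(1j * phase))) / amp.mean()
    ```

    When the fast envelope is genuinely modulated by the slow phase, the amplitude-weighted phase vectors add coherently instead of cancelling, so the index rises well above its value for unrelated bands.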

    Electrophysiological signatures of conscious perception: The influence of cognitive, cortical and pathological states on multisensory integration

    At any given moment, information reaches us via our different sensory systems. In order to navigate this multitude of information, associated information needs to be integrated into a coherent percept. In recent years, the hypothesis that synchronous neural oscillations play a prominent role in unisensory and multisensory processing has received substantial support. Current findings further convey the idea that local oscillations and functional connectivity reflect bottom-up as well as top-down processes during multisensory integration and perception. In the current work, I review recent findings on the role of neural oscillations in conscious multisensory perception. Subsequently, I present an integrative network model for multisensory integration that describes the cortical correlates of conscious multisensory perception, the influence of fluctuations of oscillatory neural activity on subsequent perception, and the influence of cognitive processes on neural oscillations and perception. I propose that neural oscillations in distinct, coexisting frequencies reflect the various processing steps underlying multisensory perception.

    Exploring the neural entrainment to musical rhythms and meter: a steady-state evoked potential approach

    Doctoral thesis completed under joint supervision (cotutelle) with the Université catholique de Louvain, Belgium (Faculty of Medicine, Institute of Neuroscience). The ability to perceive a regular beat in music and synchronize to it is a widespread human skill. Fundamental to musical behavior, beat and meter refer to the perception of periodicities while listening to musical rhythms, and usually involve spontaneous entrainment to move on these periodicities. However, the neural mechanisms underlying entrainment to beat and meter in humans remain unclear. The present work tests a novel experimental approach, inspired by the steady-state evoked potential method, to explore the neural dynamics supporting the perception of rhythmic inputs. Using human electroencephalography (EEG), neural responses to beat and meter were recorded in various contexts: (1) mental imagery of meter, (2) spontaneous induction of a beat from rhythmic patterns, (3) multisensory integration, and (4) sensorimotor synchronization. Our results support the view that entrainment and resonance phenomena subtend the processing of musical rhythms in the human brain. Furthermore, our results suggest that this novel approach could help investigate the link between the phenomenology of musical beat and meter and neurophysiological evidence of a bias towards periodicities arising under certain circumstances in the nervous system. Hence, entrainment to music provides an original framework to explore general entrainment phenomena occurring at various levels, from the inter-neural to the inter-individual level.
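    The steady-state evoked potential approach rests on reading out spectral amplitude at the stimulation (or beat) frequency against the surrounding noise bins. A minimal sketch; the neighbor and skip counts are illustrative choices, not those of the thesis:

    ```python
    import numpy as np

    def ssep_snr(x, fs, f_target, n_neighbors=10, skip=2):
        """SNR of a steady-state response: spectral amplitude at the tagged
        frequency divided by the mean amplitude of surrounding bins.

        Skipping the bins immediately adjacent to the target avoids counting
        spectral leakage from the response as noise.
        """
        amp = np.abs(np.fft.rfft(x - x.mean())) / len(x)
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        k = int(round(f_target / (fs / len(x))))      # index of the target bin
        neigh = np.r_[k - skip - n_neighbors:k - skip,
                      k + skip + 1:k + skip + n_neighbors + 1]
        return amp[k] / amp[neigh].mean()
    ```

    In practice the epoch length is chosen so that the tagged frequency falls exactly on an FFT bin; the SNR then rises far above 1 at the beat frequency (and its harmonics) while staying near 1 elsewhere.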

    Being first matters: topographical representational similarity analysis of ERP signals reveals separate networks for audiovisual temporal binding depending on the leading sense

    In multisensory integration, processing in one sensory modality is enhanced by complementary information from other modalities. Inter-sensory timing is crucial in this process, as only inputs reaching the brain within a restricted temporal window are perceptually bound. Previous research in the audiovisual field has investigated various features of the temporal binding window (TBW), revealing asymmetries in its size and plasticity depending on the leading input (auditory-visual, AV; visual-auditory, VA). We here tested whether separate neuronal mechanisms underlie this AV-VA dichotomy in humans. We recorded high-density EEG while participants performed an audiovisual simultaneity judgment task including various AV/VA asynchronies and unisensory control conditions (visual-only, auditory-only) and tested whether AV and VA processing generate different patterns of brain activity. After isolating the multisensory components of AV/VA event-related potentials (ERPs) from the sum of their unisensory constituents, we ran a time-resolved topographical representational similarity analysis (tRSA) comparing AV and VA ERP maps. Spatial cross-correlation matrices were built from real data to index the similarity between AV and VA maps at each time point (500 ms post-stimulus window) and then correlated with two alternative similarity model matrices: AVmaps = VAmaps vs. AVmaps ≠ VAmaps. The tRSA results favored the AVmaps ≠ VAmaps model across all time points, suggesting that audiovisual temporal binding (indexed by synchrony perception) engages different neural pathways depending on the leading sense. The existence of such a dual route supports recent theoretical accounts proposing that multiple binding mechanisms are implemented in the brain to accommodate different information parsing strategies in the auditory and visual sensory systems.
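    The per-time-point map comparison at the heart of tRSA can be sketched as a spatial Pearson correlation between two scalp topographies at each sample. The full analysis additionally correlates the resulting similarity matrices with model matrices, which is omitted in this sketch:

    ```python
    import numpy as np

    def topographic_similarity(maps_a, maps_b):
        """Per-time-point spatial correlation between two ERP map series.

        maps_a, maps_b : arrays of shape (n_timepoints, n_electrodes).
        Returns one Pearson correlation per time point, computed across
        electrodes after removing each map's spatial mean.
        """
        a = maps_a - maps_a.mean(axis=1, keepdims=True)
        b = maps_b - maps_b.mean(axis=1, keepdims=True)
        num = (a * b).sum(axis=1)
        den = np.sqrt((a ** 2).sum(axis=1) * (b ** 2).sum(axis=1))
        return num / den
    ```

    Under the AVmaps = VAmaps model this time course should stay high; systematically low values, as reported above, favor distinct AV and VA topographies.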

    A Visionary Approach to Listening: Determining The Role Of Vision In Auditory Scene Analysis

    To recognize and understand the auditory environment, the listener must first separate sounds that arise from different sources and capture each event. This process is known as auditory scene analysis. The aim of this thesis is to investigate whether and how visual information can influence auditory scene analysis. The thesis consists of four chapters. First, I reviewed the literature to provide a clear framework for the impact of visual information on the analysis of complex acoustic environments. In chapter II, I examined psychophysically whether temporal coherence between auditory and visual stimuli was sufficient to promote auditory stream segregation in a mixture. I found that listeners were better able to report brief deviants in an amplitude-modulated target stream when a visual stimulus changed in size in a temporally coherent manner than when the visual stream was coherent with the non-target auditory stream. This work demonstrates that temporal coherence between auditory and visual features can influence the way people analyse an auditory scene. In chapter III, the integration of auditory and visual features in auditory cortex was examined by recording neuronal responses in awake and anaesthetised ferret auditory cortex in response to the modified stimuli used in chapter II. I demonstrated that temporal coherence between auditory and visual stimuli enhances the neural representation of a sound and influences which sound a neuron represents in a sound mixture. Visual stimuli elicited reliable changes in the phase of the local field potential, which provides mechanistic insight into this finding. Together these findings provide evidence that early cross-modal integration underlies the behavioural effects in chapter II.
    Finally, in chapter IV, I investigated whether training can influence the ability of listeners to utilize visual cues for auditory stream analysis, and showed that this ability improved when listeners were trained to detect auditory-visual temporal coherence.
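    The temporal-coherence manipulation described in chapters II and III pairs an amplitude-modulated sound with a size-changing visual stimulus. A toy index of such coherence (not the thesis's actual analysis) is the zero-lag correlation between the two time courses:

    ```python
    import numpy as np

    def envelope_coherence(aud_env, vis_size):
        """Zero-lag Pearson correlation between an auditory amplitude
        envelope and a visual size time course, as a simple index of
        audio-visual temporal coherence."""
        a = aud_env - aud_env.mean()
        v = vis_size - vis_size.mean()
        return float(a @ v / (np.linalg.norm(a) * np.linalg.norm(v)))
    ```

    A visual stimulus driven by the same slow modulation as the target sound scores high on this index, while one modulated at an unrelated rate scores near zero, mirroring the coherent vs. non-coherent conditions of the psychophysical task.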