
    The Role of the Primary Sensory Cortices in Early Language Processing.

    The results of this magnetoencephalography study challenge two long-standing assumptions regarding the brain mechanisms of language processing: first, that linguistic processing proper follows sensory feature processing, effected by bilateral activation of the primary sensory cortices lasting about 100 ms from stimulus onset; and second, that subsequent linguistic processing is effected by left-hemisphere networks outside the primary sensory areas, including Broca's and Wernicke's association cortices. Here we present evidence that linguistic analysis begins almost synchronously with sensory, prelinguistic analysis of verbal input, and that the primary cortices are also engaged in these linguistic analyses, thereby becoming part of the left-hemisphere language network during language tasks. These findings call for extensive revision of our conception of linguistic processing in the brain.

    Neural Modeling and Imaging of the Cortical Interactions Underlying Syllable Production

    This paper describes a neural model of speech acquisition and production that accounts for a wide range of acoustic, kinematic, and neuroimaging data concerning the control of speech movements. The model is a neural network whose components correspond to regions of the cerebral cortex and cerebellum, including premotor, motor, auditory, and somatosensory cortical areas. Computer simulations of the model verify its ability to account for compensation to lip and jaw perturbations during speech. Specific anatomical locations of the model's components are estimated, and these estimates are used to simulate fMRI experiments of simple syllable production with and without jaw perturbations. Funding: National Institute on Deafness and Other Communication Disorders (R01 DC02852, R01 DC01925).
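    As a rough illustration of the feedforward-plus-feedback control idea this abstract describes, the toy script below simulates a single articulatory dimension responding to an unexpected load, showing how a sensory-feedback term pulls the trajectory back toward its target after a simulated jaw perturbation. The gains, time step, target value, and perturbation magnitude are all assumptions for demonstration; this is a minimal sketch, not the authors' published model or its parameters.

```python
# Toy feedback-control sketch of articulator compensation (illustrative only;
# all constants are assumed and do not come from the modeling paper above).
import numpy as np

def simulate(perturb: bool, steps: int = 400, dt: float = 0.005) -> np.ndarray:
    target = 1.0      # desired sensory state (arbitrary units; assumed)
    x, v = 0.0, 0.0   # articulator position and velocity
    k_fb = 30.0       # sensory-feedback gain (assumed)
    damping = 8.0     # damping coefficient (assumed)
    trace = np.zeros(steps)
    for t in range(steps):
        # Apply a constant external load halfway through, mimicking a jaw perturbation.
        load = -5.0 if (perturb and t >= steps // 2) else 0.0
        error = target - x                  # mismatch between target and sensed state
        accel = k_fb * error - damping * v + load
        v += accel * dt
        x += v * dt
        trace[t] = x
    return trace

if __name__ == "__main__":
    baseline = simulate(perturb=False)
    perturbed = simulate(perturb=True)
    # The feedback term partially cancels the load, so the perturbed trajectory
    # stays close to the target instead of drifting away.
    print(f"final position without load: {baseline[-1]:.3f}")
    print(f"final position with load:    {perturbed[-1]:.3f}")
```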

    Music in the first days of life

    In adults, specific neural systems with right-hemispheric weighting are necessary to process pitch, melody, and harmony, as well as the structure and meaning emerging from musical sequences. To what extent does this neural specialization result from exposure to music rather than from neurobiological predispositions? We used fMRI to measure brain activity in 1- to 3-day-old newborns while they listened to Western tonal music and to the same excerpts altered to include tonal violations or dissonance. Music evoked predominantly right-hemisphere activations in primary and higher-order auditory cortex. For altered music, activations were seen in the left inferior frontal cortex and limbic structures. Thus, the newborn brain is able to receive music fully and to detect even small perceptual and structural differences in musical sequences. This neural architecture, present at birth, provides us with the potential to process basic and complex aspects of music, a uniquely human capacity.

    Top-down effects on early visual processing in humans: a predictive coding framework

    An increasing number of human electroencephalography (EEG) studies examining the earliest component of the visual evoked potential, the so-called C1, have cast doubt on the previously prevalent notion that this component is impermeable to top-down effects. This article reviews the original studies that (i) described the C1, (ii) linked it to primary visual cortex (V1) activity, and (iii) suggested that its electrophysiological characteristics are determined exclusively by low-level stimulus attributes, particularly the spatial position of the stimulus within the visual field. We then describe conflicting evidence from animal studies and human neuroimaging experiments and provide an overview of recent EEG and magnetoencephalography (MEG) work showing that initial V1 activity in humans may be strongly modulated by higher-level cognitive factors. Finally, we formulate a theoretical framework for understanding top-down effects on early visual processing in terms of predictive coding.

    Neural Dynamics of Autistic Behaviors: Cognitive, Emotional, and Timing Substrates

    What brain mechanisms underlie autism, and how do they give rise to autistic behavioral symptoms? This article describes a neural model, called the iSTART model, which proposes how cognitive, emotional, timing, and motor processes may interact to create and perpetuate autistic symptoms. These model processes were originally developed to explain data concerning how the brain controls normal behaviors. The iSTART model shows how autistic behavioral symptoms may arise from prescribed breakdowns in these brain processes. Funding: Air Force Office of Scientific Research (F49620-01-1-0397); Office of Naval Research (N00014-01-1-0624).

    Speech rhythms and multiplexed oscillatory sensory coding in the human brain

    Cortical oscillations are likely candidates for the segmentation and coding of continuous speech. Here, we monitored continuous speech processing with magnetoencephalography (MEG) to unravel the principles of speech segmentation and coding. We demonstrate that speech entrains the phase of low-frequency (delta, theta) oscillations and the amplitude of high-frequency (gamma) oscillations in the auditory cortex. Phase entrainment is stronger in the right auditory cortex, and amplitude entrainment is stronger in the left. Furthermore, edges in the speech envelope phase-reset auditory cortex oscillations, thereby enhancing their entrainment to speech. This mechanism adapts to the changing physical features of the speech envelope and enables efficient, stimulus-specific speech sampling. Finally, we show that within the auditory cortex, coupling between delta, theta, and gamma oscillations increases following speech edges. Importantly, all couplings (i.e., both brain-to-speech and within-cortex) attenuate for backward-presented speech, suggesting top-down control. We conclude that segmentation and coding of speech rely on a nested hierarchy of entrained cortical oscillations.
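    To make the entrainment and cross-frequency coupling measures mentioned above concrete, the sketch below computes a phase-locking value between a toy "speech envelope" and a toy band-limited signal, plus a simple theta-gamma coupling index (mean-vector length). The sampling rate, band edges, and synthetic signals are assumptions for illustration; this is not the authors' MEG analysis pipeline.

```python
# Illustrative entrainment and phase-amplitude coupling metrics on synthetic data.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 250.0  # assumed sampling rate in Hz

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / fs)
envelope = 0.5 * (1 + np.sin(2 * np.pi * 5 * t))                            # toy 5 Hz "speech envelope"
meg = np.sin(2 * np.pi * 5 * t + 0.3) + 0.5 * rng.standard_normal(t.size)   # toy neural trace

# Phase entrainment: phase-locking value between the theta-band neural signal
# and the theta-band envelope (1.0 = perfect locking, ~0 = none).
theta_meg = hilbert(bandpass(meg, 4, 8, fs))
theta_env = hilbert(bandpass(envelope - envelope.mean(), 4, 8, fs))
plv = np.abs(np.mean(np.exp(1j * (np.angle(theta_meg) - np.angle(theta_env)))))

# Cross-frequency coupling: modulation of gamma amplitude by theta phase,
# summarized as a normalized mean-vector length.
gamma_amp = np.abs(hilbert(bandpass(meg, 30, 80, fs)))
mvl = np.abs(np.mean(gamma_amp * np.exp(1j * np.angle(theta_meg)))) / gamma_amp.mean()

print(f"theta phase-locking to envelope: {plv:.2f}")
print(f"theta-gamma coupling (normalized MVL): {mvl:.2f}")
```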

    Primary interoceptive cortex activity during simulated experiences of the body

    Studies of the classic exteroceptive sensory systems (e.g., vision, touch) consistently demonstrate that vividly imagining a sensory experience of the world – simulating it – is associated with increased activity in the corresponding primary sensory cortex. We hypothesized, analogously, that simulating internal bodily sensations would be associated with increased neural activity in primary interoceptive cortex. An immersive, language-based mental imagery paradigm was used to test this hypothesis (e.g., imagine your heart pounding during a roller coaster ride, or your face drenched in sweat during a workout). In two neuroimaging experiments, participants listened to vividly described situations and imagined “being there” in each scenario. In Study 1, we observed significantly heightened activity in primary interoceptive cortex (dorsal posterior insula) during imagined experiences involving vivid internal sensations. This effect was specific to interoceptive simulation: it was not observed during a separate affect-focus condition in Study 1, nor in an independent Study 2 that did not involve detailed simulation of internal sensations (instead involving simulation of other sensory experiences). These findings underscore the large-scale predictive architecture of the brain and reveal that words can be powerful drivers of bodily experiences.

    Frontal top-down signals increase coupling of auditory low-frequency oscillations to continuous speech in human listeners

    Humans show a remarkable ability to understand continuous speech even under adverse listening conditions. This ability critically relies on dynamically updated predictions of incoming sensory information, but exactly how top-down predictions improve speech processing is still unclear. Brain oscillations are a likely mechanism for these top-down predictions [1 and 2]. Quasi-rhythmic components in speech are known to entrain low-frequency oscillations in auditory areas [3 and 4], and this entrainment increases with intelligibility [5]. We hypothesize that top-down signals from frontal brain areas causally modulate the phase of brain oscillations in auditory cortex. We use magnetoencephalography (MEG) to monitor brain oscillations in 22 participants during continuous speech perception. We characterize prominent spectral components of speech-brain coupling in auditory cortex and use causal connectivity analysis (transfer entropy) to identify the top-down signals that drive this coupling more strongly during intelligible than during unintelligible speech. We report three main findings. First, frontal and motor cortices significantly modulate the phase of speech-coupled low-frequency oscillations in auditory cortex, and this effect depends on the intelligibility of the speech. Second, top-down signals are significantly stronger for left than for right auditory cortex. Third, speech-auditory cortex coupling is enhanced as a function of stronger top-down signals. Together, our results suggest that low-frequency brain oscillations play a role in implementing predictive top-down control during continuous speech perception and that this top-down control is largely directed at left auditory cortex. This suggests a close relationship between (left-lateralized) speech production areas and the implementation of top-down control in continuous speech perception.
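    The directed-coupling measure named here, transfer entropy, can be illustrated with a deliberately minimal binned estimator on toy signals: it quantifies how much knowing the past of a "frontal" signal reduces uncertainty about the next sample of an "auditory" signal beyond what the auditory signal's own past explains. The signal names, bin count, and one-sample histories are assumptions for demonstration only; published MEG analyses use more careful estimators, embedding parameters, and statistics.

```python
# Minimal binned transfer-entropy estimator on toy signals (illustrative only).
import numpy as np

def transfer_entropy(source, target, n_bins=8):
    """Binned TE(source -> target) with one-sample histories, in bits."""
    s = np.digitize(source, np.histogram_bin_edges(source, n_bins)[1:-1])
    x = np.digitize(target, np.histogram_bin_edges(target, n_bins)[1:-1])
    x_next, x_past, s_past = x[1:], x[:-1], s[:-1]

    def entropy(*cols):
        # Joint Shannon entropy of the discretized columns, in bits.
        _, counts = np.unique(np.stack(cols, axis=1), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    # TE(S -> X) = I(X[t+1]; S[t] | X[t]), written as a sum of joint entropies.
    return (entropy(x_next, x_past) + entropy(x_past, s_past)
            - entropy(x_next, x_past, s_past) - entropy(x_past))

rng = np.random.default_rng(1)
frontal = rng.standard_normal(20000)
# Toy "auditory" signal driven by the frontal signal one sample earlier.
auditory = np.roll(frontal, 1) + 0.5 * rng.standard_normal(20000)

# TE should be clearly larger in the frontal -> auditory direction.
print(f"TE frontal -> auditory: {transfer_entropy(frontal, auditory):.3f} bits")
print(f"TE auditory -> frontal: {transfer_entropy(auditory, frontal):.3f} bits")
```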

    From early markers to neuro-developmental mechanisms of autism

    A fast-growing field, the study of infants at risk by virtue of having an older sibling with autism (i.e., infant sibs) aims to identify the earliest signs of this disorder, which would allow earlier diagnosis and intervention. More importantly, we argue, these studies offer the opportunity to validate existing neuro-developmental models of autism against experimental evidence. Although autism is mainly seen as a disorder of social interaction and communication, emerging early markers do not exclusively reflect impairments of the “social brain”. Evidence for atypical development of sensory and attentional systems highlights the need to move away from localized deficits toward models proposing brain-wide involvement in autism pathology. We discuss the implications of infant-sib findings for future work on the biology of autism and the development of interventions.