How musical rhythms entrain the human brain: clarifying the neural mechanisms of sensory-motor entrainment to rhythms
When listening to music, people across cultures tend to spontaneously perceive and move the body along a periodic pulse-like meter. Increasing evidence suggests that this ability is supported by neural mechanisms that selectively amplify periodicities corresponding to the perceived metric pulses. However, the nature of these neural mechanisms, i.e., the endogenous or exogenous factors that may selectively enhance meter periodicities in brain responses to rhythm, remains largely unknown. This question was investigated in a series of studies in which the electroencephalogram (EEG) of healthy participants was recorded while they listened to musical rhythm. From this EEG, selective contrast at meter periodicities in the elicited neural activity was captured using frequency-tagging, a method allowing direct comparison of this contrast between the sensory input, EEG response, biologically-plausible models of auditory subcortical processing, and behavioral output. The results show that the selective amplification of meter periodicities is shaped by a continuously updated combination of factors including sound spectral content, long-term training and recent context, irrespective of attentional focus and beyond auditory subcortical nonlinear processing. Together, these observations demonstrate that perception of rhythm involves a number of processes that transform the sensory input via fixed low-level nonlinearities, but also through flexible mappings shaped by prior experience at different timescales. These higher-level neural mechanisms could represent a neurobiological basis for the remarkable flexibility and stability of meter perception relative to the acoustic input, which is commonly observed within and across individuals. 
Fundamentally, the current results add to the evidence that evolution has endowed the human brain with an extraordinary capacity to organize, transform, and interact with rhythmic signals, to achieve adaptive behavior in a complex dynamic environment
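The frequency-tagging method described in this abstract can be illustrated with a minimal sketch: amplitudes are read out from the EEG spectrum at frequencies corresponding to the meter and compared with amplitudes at other rhythm-related frequencies. All parameters below (sampling rate, frequencies, amplitudes) are hypothetical and chosen for illustration only; this is not the authors' actual analysis pipeline.

```python
import numpy as np

# Hypothetical parameters (illustration only; not taken from the study)
fs = 250.0        # sampling rate in Hz
duration = 64.0   # seconds of simulated EEG (chosen so all test
                  # frequencies fall exactly on FFT bins)
t = np.arange(0, duration, 1.0 / fs)

meter_freqs = [1.25, 2.5]     # assumed meter-related periodicities (Hz)
other_freqs = [1.875, 3.125]  # assumed non-meter rhythm frequencies (Hz)

# Simulated "EEG": meter frequencies selectively amplified, plus noise
rng = np.random.default_rng(0)
signal = sum(1.0 * np.sin(2 * np.pi * f * t) for f in meter_freqs)
signal += sum(0.4 * np.sin(2 * np.pi * f * t) for f in other_freqs)
signal += rng.normal(0.0, 0.5, t.size)

# Frequency tagging: amplitude spectrum, then read out the tagged bins
spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

def amplitude_at(f):
    # Nearest FFT bin; with 64 s of data the resolution is ~0.016 Hz
    return spectrum[np.argmin(np.abs(freqs - f))]

meter_amp = np.mean([amplitude_at(f) for f in meter_freqs])
other_amp = np.mean([amplitude_at(f) for f in other_freqs])
print(f"meter amplitude: {meter_amp:.3f}")
print(f"non-meter amplitude: {other_amp:.3f}")
# A meter/non-meter contrast above 1 indicates selective enhancement
print(f"contrast: {meter_amp / other_amp:.2f}")
```

The same readout can be applied identically to the sound envelope, the EEG, a model output, or tapping data, which is what makes the method suited to the direct input-output comparisons the abstract describes.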
Attentional control in depression: a translational affective neuroscience approach
Translational research refers to the application of basic science to address clinical problems and acquire knowledge that can be used to guide and refine clinical practice. This special issue of Cognitive, Affective, & Behavioral Neuroscience seeks to explore and integrate some of the most promising findings offered by recent cognitive and affective neuroscience studies in hopes of filling the gap between basic and applied research, thereby heightening our understanding of vulnerability for depression. The studies presented in this special issue focus specifically on attentional processes. We solicited contributions from leading researchers involved in basic cognitive and neuroscience research investigating processes underlying depression-related disturbances in emotion processing. In this introductory article, we present an integrative overview to demonstrate how these specific contributions might be valuable for translational research
Neural responses to sounds presented on and off the beat of ecologically valid music
The tracking of rhythmic structure is a vital component of speech and music perception. It is known that sequences of identical sounds can give rise to the percept of alternating strong and weak sounds, and that this percept is linked to enhanced cortical and oscillatory responses. The neural correlates of the perception of rhythm elicited by ecologically valid, complex stimuli, however, remain unexplored. Here we report the effects of a stimulus' alignment with the beat on the brain's processing of sound. Human subjects listened to short popular music pieces while simultaneously hearing a target sound. Cortical and brainstem electrophysiological onset responses to the sound were enhanced when it was presented on the beat of the music, as opposed to shifted away from it. Moreover, the size of the effect of alignment with the beat on the cortical response correlated strongly with the ability to tap to a beat, suggesting that the ability to synchronize to the beat of simple isochronous stimuli and the ability to track the beat of complex, ecologically valid stimuli may rely on overlapping neural resources. These results suggest that the perception of musical rhythm may have robust effects on processing throughout the auditory system
Hearing through your eyes: neural basis of audiovisual cross-activation, revealed by transcranial alternating current stimulation
Some people experience auditory sensations when seeing visual flashes or movements. This prevalent synaesthesia-like ‘visual-evoked auditory response’ (vEAR) could result either from over-exuberant cross-activation between brain areas, and/or from reduced inhibition of normally-occurring cross-activation. We have used transcranial alternating current stimulation (tACS) to test these theories. We applied tACS at 10 Hz (alpha-band frequency) or 40 Hz (gamma-band), bilaterally to either temporal or occipital sites, while measuring same/different discrimination of paired auditory (A) versus visual (V) 'Morse code' sequences. At debriefing, participants were classified as vEAR or non-vEAR depending on whether they reported 'hearing' the silent flashes.
In non-vEAR participants, temporal 10 Hz tACS impaired A performance, and this impairment correlated with improved V performance; conversely, under occipital tACS, poorer V performance correlated with improved A performance. This reciprocal pattern suggests that sensory cortices are normally mutually inhibitory, and that alpha-frequency tACS may bias the balance of competition between them. vEAR participants showed no tACS effects, consistent with reduced inhibition of, or enhanced cooperation between, modalities. In addition, temporal 40 Hz tACS impaired V performance, specifically in individuals who showed a performance advantage for V (relative to A). Gamma-frequency tACS may therefore modulate the ability of these individuals to benefit from recoding flashes into the auditory modality, possibly by disrupting cross-activation of auditory areas by visual stimulation.
Our results support both theories, suggesting that vEAR may depend on disinhibition of normally-occurring sensory cross-activation, which may be expressed more strongly in some individuals. Furthermore, endogenous alpha and gamma-frequency oscillations may function respectively to inhibit or promote this cross-activation
The mechanisms of tinnitus: perspectives from human functional neuroimaging
In this review, we highlight the contribution of advances in human neuroimaging to the current understanding of central mechanisms underpinning tinnitus and explain how interpretations of neuroimaging data have been guided by animal models. The primary motivation for studying the neural substrates of tinnitus in humans has been to demonstrate objectively its representation in the central auditory system and to develop a better understanding of its diverse pathophysiology and of the functional interplay between sensory, cognitive and affective systems. The ultimate goal of neuroimaging is to identify subtypes of tinnitus in order to better inform treatment strategies. The three neural mechanisms considered in this review may provide a basis for tinnitus classification. While human neuroimaging evidence strongly implicates the central auditory system and emotional centres in tinnitus, evidence for the precise contribution from the three mechanisms is unclear because the data are somewhat inconsistent. We consider a number of methodological issues limiting the field of human neuroimaging and recommend approaches to overcome potential inconsistency in results arising from poorly matched participants, lack of appropriate controls and low statistical power
The Development of Spontaneous Sound-Shape Matching in Monolingual and Bilingual Infants During the First Year
Online First November 17, 2016. Supplemental materials: http://dx.doi.org/10.1037/dev0000237.supp

Recently it has been proposed that sensitivity to nonarbitrary relationships between speech sounds and objects potentially bootstraps lexical acquisition. However, it is currently unclear whether preverbal infants (e.g., before 6 months of age) with different linguistic profiles are sensitive to such nonarbitrary relationships. Here, the authors assessed 4- and 12-month-old Basque monolingual and Spanish-Basque bilingual infants’ sensitivity to cross-modal correspondences between sound-symbolic nonwords without syllable repetition (buba, kike) and drawings of rounded and angular shapes. The findings demonstrate that sensitivity to sound-shape correspondences emerges by 12 months of age in both monolinguals and bilinguals. This finding suggests that spontaneous sound-shape matching is likely to be the product of language learning and development and may not be readily available prior to the onset of word learning
Multisensory Processes: A Balancing Act across the Lifespan
Multisensory processes are fundamental in scaffolding perception, cognition, learning, and behavior. How and when stimuli from different sensory modalities are integrated rather than treated as separate entities is poorly understood. We review how the relative reliance on stimulus characteristics versus learned associations dynamically shapes multisensory processes. We illustrate the dynamism in multisensory function across two timescales: one long term that operates across the lifespan and one short term that operates during the learning of new multisensory relations. In addition, we highlight the importance of task contingencies. We conclude that these highly dynamic multisensory processes, based on the relative weighting of stimulus characteristics and learned associations, provide both stability and flexibility to brain functions over a wide range of temporal scales