
    The influence of external and internal motor processes on human auditory rhythm perception

    Musical rhythm is composed of organized temporal patterns, and the processes underlying rhythm perception have been found to engage both auditory and motor systems. Despite converging behavioral and neuroscientific evidence for this audio-motor interaction, relatively little is known about the effect of specific motor processes on auditory rhythm perception. This doctoral thesis investigated the influence of both external and internal motor processes on the way we perceive an auditory rhythm. The first half of the thesis aimed to establish whether overt body movement has a facilitatory effect on our ability to perceive auditory rhythmic structure, and whether this effect is modulated by musical training. To this end, musicians and non-musicians performed a pulse-finding task either using natural body movement or through listening only, and produced their identified pulse by finger tapping. The results showed that overt movement benefited rhythm (pulse) perception, especially for non-musicians, confirming the facilitatory role of external motor activity in hearing rhythm, as well as its interaction with musical training. The second half of the thesis tested the idea that indirect, covert motor input, such as that derived from visual stimuli, could influence the perceived structure of an auditory rhythm. Three experiments examined the subjectively perceived tempo of an auditory sequence under different visual motion stimulations, with the auditory and visual streams presented independently of each other. The results revealed that the perceived auditory tempo was influenced by the concurrent visual motion conditions, and that the effect was tied to the increment or decrement of visual motion speed. This supported the hypothesis that internal motor information extracted from visuomotor stimulation can be incorporated into the percept of an auditory rhythm. 
Taken together, the present thesis concludes that, rather than being a mere reaction to the given auditory input, our motor system plays an important role in the perceptual processing of auditory rhythm. This can occur via both external and internal motor activities, and may not only influence how we hear a rhythm but also, under some circumstances, improve our ability to hear it

    ECoG high gamma activity reveals distinct cortical representations of lyrics passages, harmonic and timbre-related changes in a rock song

    Listening to music moves our minds and moods, stirring interest in its neural underpinnings. A multitude of compositional features drives the appeal of natural music. How such original music, where a composer's opus is not manipulated for experimental purposes, engages a listener's brain has not been studied until recently. Here, we report an in-depth analysis of two electrocorticographic (ECoG) data sets obtained over the left hemisphere in ten patients during presentation of either a rock song or a read-out narrative. First, the time courses of five acoustic features (intensity, presence/absence of vocals with lyrics, spectral centroid, harmonic change, and pulse clarity) were extracted from the audio tracks and found to be correlated with each other to varying degrees. In a second step, we uncovered the specific impact of each musical feature on ECoG high-gamma power (70–170 Hz) by calculating partial correlations to remove the influence of the other four features. In the music condition, the onset and offset of vocal lyrics in ongoing instrumental music was consistently identified within the group as the dominant driver for ECoG high-gamma power changes over temporal auditory areas, while concurrently subject-individual activation spots were identified for sound intensity, timbral, and harmonic features. The distinct cortical activations to vocal speech-related content embedded in instrumental music directly demonstrate that song integrated in instrumental music represents a distinct dimension in complex music. In contrast, in the speech condition, the full sound envelope was reflected in the high gamma response rather than the onset or offset of the vocal lyrics. This demonstrates how the contributions of stimulus features that modulate the brain response differ across the two examples of a full-length natural stimulus, which suggests a context-dependent feature selection in the processing of complex auditory stimuli
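The partial-correlation step described above (isolating each acoustic feature's contribution to high-gamma power by removing the influence of the other four features) can be sketched as follows. This is a minimal illustration on invented data; the helper name and the toy signals are not taken from the study:

```python
import numpy as np

def partial_corr(x, y, covariates):
    """Correlation between x and y after linearly regressing the
    covariate columns (plus an intercept) out of both signals."""
    Z = np.column_stack([np.ones(len(x)), covariates])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# toy check: two variables that correlate only through a shared confound
rng = np.random.default_rng(0)
confound = rng.standard_normal(2000)
feature = confound + 0.1 * rng.standard_normal(2000)
power = confound + 0.1 * rng.standard_normal(2000)
raw = np.corrcoef(feature, power)[0, 1]                    # high
partial = partial_corr(feature, power, confound[:, None])  # near zero
```

The raw correlation is large because both variables follow the confound, while the partial correlation collapses toward zero once the confound is regressed out, which is exactly the logic of separating correlated musical features.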

    Cortical and behavioural tracking of rhythm in music: Effects of pitch predictability, enjoyment, and expertise

    Cortical tracking of stimulus features (such as the envelope) is a crucial and tractable neural mechanism, allowing us to investigate how we process continuous music. Here we tested whether cortical and behavioural tracking of the beat, typically related to rhythm processing, are modulated by pitch predictability. In two experiments (n=20, n=52), participants’ ability to tap along to the beat of musical sequences was measured for tonal (high pitch predictability) and atonal (low pitch predictability) music. In Experiment 1, we additionally measured participants’ EEG and analysed cortical tracking of the acoustic envelope and of pitch surprisal (estimated with IDyOM). In both experiments, finger-tapping performance was better in the tonal than in the atonal condition, indicating a positive effect of pitch predictability on behavioural rhythm processing. Neural data revealed that the acoustic envelope was tracked more strongly while listening to atonal than to tonal music, potentially reflecting listeners’ violated pitch expectations. Our findings show that cortical envelope tracking, beyond reflecting musical rhythm processing, is modulated by pitch predictability (as well as by musical expertise and enjoyment). Stronger cortical surprisal tracking was linked to overall worse envelope tracking and to worse finger-tapping performance for atonal music. Specifically, the low pitch predictability of atonal music seems to draw on attentional resources, resulting in a reduced ability to follow the rhythm behaviourally. Overall, cortical envelope and surprisal tracking were differentially related to behaviour for tonal and atonal music, likely reflecting differential processing under conditions of high and low predictability. Taken together, our results show diverse effects of pitch predictability on musical rhythm processing
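Envelope tracking of the kind measured here is commonly computed from the broadband amplitude envelope of the audio. A minimal sketch, assuming a simple Pearson-correlation metric (the paper's actual pipeline may differ, e.g. temporal response functions); the toy signals are invented:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def envelope(audio, fs, cutoff=8.0):
    """Amplitude envelope: magnitude of the analytic signal,
    low-pass filtered to the slow rates relevant for beat tracking."""
    env = np.abs(hilbert(audio))
    b, a = butter(4, cutoff / (fs / 2), btype='low')
    return filtfilt(b, a, env)

def tracking_strength(neural, env):
    """One simple tracking metric: the Pearson correlation."""
    return np.corrcoef(neural, env)[0, 1]

# toy demo: a 5 Hz amplitude-modulated tone and a noisy "neural" copy of its envelope
fs = 250
t = np.arange(0, 4, 1 / fs)
audio = (1 + np.sin(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * 40 * t)
env = envelope(audio, fs)
neural = env + 0.2 * np.random.default_rng(1).standard_normal(env.size)
r = tracking_strength(neural, env)
```

With the modulation dominating the noise, the correlation comes out high; in real data the same quantity would be compared across conditions (tonal vs. atonal) rather than interpreted in absolute terms.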

    Neural responses to sounds presented on and off the beat of ecologically valid music

    The tracking of rhythmic structure is a vital component of speech and music perception. It is known that sequences of identical sounds can give rise to the percept of alternating strong and weak sounds, and that this percept is linked to enhanced cortical and oscillatory responses. The neural correlates of the perception of rhythm elicited by ecologically valid, complex stimuli, however, remain unexplored. Here we report the effects of a stimulus' alignment with the beat on the brain's processing of sound. Human subjects listened to short popular music pieces while simultaneously hearing a target sound. Cortical and brainstem electrophysiological onset responses to the sound were enhanced when it was presented on the beat of the music, as opposed to shifted away from it. Moreover, the size of the effect of alignment with the beat on the cortical response correlated strongly with the ability to tap to a beat, suggesting that the ability to synchronize to the beat of simple isochronous stimuli and the ability to track the beat of complex, ecologically valid stimuli may rely on overlapping neural resources. These results suggest that the perception of musical rhythm may have robust effects on processing throughout the auditory system

    Prominence of delta oscillatory rhythms in the motor cortex and their relevance for auditory and speech perception

    In the motor cortex, beta oscillations (∼12-30 Hz) are generally considered a principal rhythm contributing to movement planning and execution. Beta oscillations cohabit and dynamically interact with slow delta oscillations (0.5-4 Hz), but the role of delta oscillations and the subordinate relationship between these rhythms in the perception-action loop remains unclear. Here, we review evidence that motor delta oscillations shape the dynamics of motor behaviors and sensorimotor processes, in particular during auditory perception. We describe the functional coupling between delta and beta oscillations in the motor cortex during spontaneous and planned motor acts. In an active sensing framework, perception is strongly shaped by motor activity, in particular in the delta band, which imposes temporal constraints on the sampling of sensory information. By encoding temporal contextual information, delta oscillations modulate auditory processing and impact behavioral outcomes. Finally, we consider the contribution of motor delta oscillations in the perceptual analysis of speech signals, providing a contextual temporal frame to optimize the parsing and processing of slow linguistic information
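The delta-beta coupling discussed in this review is often quantified with phase-amplitude coupling metrics. A minimal sketch using the normalized mean-vector-length approach, one common choice among several and not necessarily the metric used in the reviewed studies; the band limits and toy signals are illustrative:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def pac_mvl(sig, fs, phase_band=(0.5, 4), amp_band=(12, 30)):
    """Normalized mean vector length: how strongly the amplitude in
    amp_band is locked to the phase in phase_band (0 = no coupling)."""
    def bandpass(x, lo, hi):
        sos = butter(4, [lo, hi], btype='band', fs=fs, output='sos')
        return sosfiltfilt(sos, x)
    phase = np.angle(hilbert(bandpass(sig, *phase_band)))
    amp = np.abs(hilbert(bandpass(sig, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

# toy signals: beta whose amplitude waxes and wanes with delta phase, vs. constant beta
fs = 200
t = np.arange(0, 20, 1 / fs)
delta = np.sin(2 * np.pi * 2 * t)
coupled = delta + (1 + 0.8 * delta) * 0.5 * np.sin(2 * np.pi * 20 * t)
uncoupled = delta + 0.5 * np.sin(2 * np.pi * 20 * t)
```

Applied to the two signals, the metric is clearly above zero only for the coupled case, mirroring the kind of delta-beta relationship the review describes in motor cortex.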

    Revealing spatio-spectral electroencephalographic dynamics of musical mode and tempo perception by independent component analysis.

    Background: Music conveys emotion by manipulating musical structures, particularly musical mode and tempo. The neural correlates of musical mode and tempo perception revealed by electroencephalography (EEG) have not been adequately addressed in the literature. Method: This study used independent component analysis (ICA) to systematically assess the spatio-spectral EEG dynamics associated with changes of musical mode and tempo. Results: Music in a major mode augmented delta-band activity over the right sensorimotor cortex, suppressed theta activity over the superior parietal cortex, and moderately suppressed beta activity over the medial frontal cortex, compared with minor-mode music, whereas fast-tempo music elicited significant alpha suppression over the right sensorimotor cortex. Conclusion: The resulting EEG brain sources were comparable with those of previous studies obtained with other neuroimaging modalities, such as functional magnetic resonance imaging (fMRI) and positron emission tomography (PET). In conjunction with advanced dry and mobile EEG technology, these results might facilitate the translation from laboratory-oriented research to real-life applications in music therapy, training and entertainment in naturalistic environments
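The ICA decomposition at the heart of this method can be illustrated on synthetic data. The sources and mixing matrix below are invented for the demo, and scikit-learn's FastICA stands in for whatever EEG toolbox the study used:

```python
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)
# two synthetic source rhythms: a 2 Hz sinusoid and a 10 Hz square wave
sources = np.column_stack([np.sin(2 * np.pi * 2 * t),
                           np.sign(np.sin(2 * np.pi * 10 * t))])
mixing = np.array([[1.0, 0.5],
                   [0.4, 1.0]])            # hypothetical channel mixing
channels = sources @ mixing.T              # simulated 2-channel recording
ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(channels)   # recovered source time courses
```

Each recovered component matches one of the original sources up to sign and scale, which is the property that lets ICA separate functionally distinct EEG generators from mixed scalp channels.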

    Interpretations of frequency domain analyses of neural entrainment: Periodicity, fundamental frequency, and harmonics

    Brain activity can follow the rhythms of dynamic sensory stimuli, such as speech and music, a phenomenon called neural entrainment. It has been hypothesized that low-frequency neural entrainment in the delta and theta bands provides a potential mechanism to represent and integrate temporal information. Low-frequency neural entrainment is often studied using periodically changing stimuli and is analyzed in the frequency domain using Fourier analysis. Fourier analysis decomposes a periodic signal into harmonically related sinusoids, but it is not intuitive how these harmonically related components relate to the response waveform. Here, we explain the interpretation of response harmonics, with a special focus on very low frequency neural entrainment near 1 Hz. We illustrate why neural responses repeating at f Hz do not necessarily generate any response at f Hz in the Fourier spectrum. A strong neural response at f Hz indicates that the time scales of the neural response waveform within each cycle match the time scales of the stimulus rhythm. Therefore, neural entrainment at very low frequency implies not only that the neural response repeats at f Hz but also that each period of the neural response is a slow wave matching the time scale of an f Hz sinusoid.

    With a few exceptions, the literature on face recognition and its neural basis derives from the presentation of single faces. However, in many ecologically typical situations, we see more than one face, in different communicative contexts. One of the principal ways in which we interact using our faces is kissing. Although there is no obvious taxonomy of kissing, we kiss in various interpersonal situations (greeting, ceremony, sex), with different goals and partners. Here, we assess the visual cortical responses elicited by viewing different couples kissing with different intents. The study thus lies at the nexus of face recognition, action recognition, and social neuroscience. Magnetoencephalography (MEG) data were recorded from nine participants in a passive-viewing paradigm. We presented images of couples kissing, with the images differing along two dimensions: kiss type and couple type. We quantified event-related field amplitudes and latencies. In each participant, the canonical sequence of event-related fields was observed, including an M100, an M170, and a later M400 response. The two earliest responses were significantly modulated in latency (M100) or amplitude (M170) by the sex composition of the images, with male-male and female-female pairings yielding shorter-latency M100 and larger-amplitude M170 responses. In contrast, kiss type modulated no brain response. The early cortical evoked fields that we typically associate with the presentation and analysis of single faces are thus differentially sensitive to complex social and action information in kissing face pairs. The early responses, typically associated with perceptual analysis, exhibit a consistent grouping and suggest a high and rapid sensitivity to the composition of the kissing pairs
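The frequency-domain point made in the entrainment abstract above (a response that repeats at f Hz need not contain any energy at f Hz) can be demonstrated with a signal built solely from the 2nd and 3rd harmonics of a 1 Hz rhythm; the sampling rate and amplitudes are illustrative:

```python
import numpy as np

fs, dur, f = 100, 8, 1.0                      # 8 s sampled at 100 Hz
t = np.arange(0, dur, 1 / fs)
# only the 2nd and 3rd harmonics are present, yet the waveform
# still repeats exactly once per second (period 1/f)
x = np.sin(2 * np.pi * 2 * f * t) + 0.5 * np.sin(2 * np.pi * 3 * f * t)
spectrum = np.abs(np.fft.rfft(x)) / len(x)
freqs = np.fft.rfftfreq(len(x), 1 / fs)
# strong peaks at 2 Hz and 3 Hz, essentially zero at the 1 Hz repetition rate
```

The waveform is strictly 1 Hz periodic, but the Fourier spectrum is empty at 1 Hz, so a peak at the repetition rate itself carries extra information about the shape of each response cycle.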

    Language specificity in cortical tracking of speech rhythm at the mora, syllable, and foot levels

    Published: 05 August 2022
    Recent research shows that adults’ neural oscillations track the rhythm of the speech signal. However, the extent to which this tracking is driven by the acoustics of the signal or by language-specific processing remains unknown. Here, adult native listeners of three rhythmically different languages (English, French, Japanese) were compared on their cortical tracking of speech envelopes synthesized in the three languages, which allowed for coding at each language’s dominant rhythmic unit: the foot (2.5 Hz), the syllable (5 Hz), or the mora (10 Hz), respectively. The three language groups were also tested with a sequence in a non-native language, Polish, and a non-speech vocoded equivalent, to investigate possible differential speech/non-speech processing. The results first showed that cortical tracking was most prominent at 5 Hz (the syllable rate) for all three groups, although the French listeners showed enhanced tracking at 5 Hz compared to the English and Japanese groups. Second, across groups, there were no differences in responses for speech versus non-speech at 5 Hz, but there was better tracking for speech than for non-speech at 10 Hz (the mora rate). Together, these results provide evidence for both language-general and language-specific influences on cortical tracking.

    In Australia, this work was supported by a Transdisciplinary and Innovation Grant from the Australian Research Council Centre of Excellence in Dynamics of Language. In France, the work was funded by an ANR-DFG grant (ANR-16-FRAL-0007) and funds from LABEX EFL (ANR-10-LABX-0083) to TN, and a DIM Cerveau et Pensées grant (2013 MOBIBRAIN). In Japan, the work was funded by a JSPS grant-in-aid for Scientific Research S (16H06319) and for Specially Promoted Research (20H05617), and a MEXT grant for Innovative Areas #4903 (17H06382) to RM

    Evidence for multiple rhythmic skills

    Rhythms, or patterns in time, play a vital role in both speech and music. Proficiency in a number of rhythm skills has been linked to language ability, suggesting that certain rhythmic processes in music and language rely on overlapping resources. However, a lack of understanding about how rhythm skills relate to each other has impeded progress in understanding how language relies on rhythm processing. In particular, it is unknown whether all rhythm skills are linked together, forming a single broad rhythmic competence, or whether there are multiple dissociable rhythm skills. We hypothesized that beat tapping and rhythm memory/sequencing form two separate clusters of rhythm skills. This hypothesis was tested with a battery of two beat tapping and two rhythm memory tests. Here we show that tapping to a metronome and the ability to adjust to a changing tempo while tapping to a metronome are related skills. The ability to remember rhythms and to drum along to repeating rhythmic sequences are also related. However, we found no relationship between beat tapping skills and rhythm memory skills. Thus, beat tapping and rhythm memory are dissociable rhythmic aptitudes. This discovery may inform future research disambiguating how distinct rhythm competencies track with specific language functions