
    The Influence of Task-Irrelevant Music on Language Processing: Syntactic and Semantic Structures

    Recent research has suggested that music and language processing share neural resources, leading to new hypotheses about interference in the simultaneous processing of these two structures. The present study investigated the effect of a musical chord's tonal function on syntactic processing (Experiment 1) and semantic processing (Experiment 2) using a cross-modal paradigm that controlled for acoustic differences. Participants read sentences and performed a lexical decision task on the last word, which was syntactically or semantically expected or unexpected. The simultaneously presented (task-irrelevant) musical sequences ended on either an expected tonic or a less-expected subdominant chord. Experiment 1 revealed interactive effects between music-syntactic and linguistic-syntactic processing. Experiment 2 showed only main effects of music-syntactic and linguistic-semantic expectations. An additional analysis across the two experiments revealed that linguistic violations interacted with musical violations, and did so regardless of the type of linguistic violation. The findings are discussed in light of currently available data on the processing of music and of syntax and semantics in language, leading to the hypothesis that resources might be shared for structural integration and sequencing processes.

    (Investigating musical expectations of non-musician listeners: the musical priming paradigm)

    Western listeners become sensitive to the regularities of the Western tonal system through mere exposure to musical pieces. This implicitly acquired tonal knowledge allows listeners to develop expectations for future events in a musical sequence. These expectations play a role in musical expressivity and influence the processing of musical events. The musical priming paradigm is an indirect method for studying listeners' tonal knowledge and the influence of musical expectations on the processing speed of musical events. Behavioral data have shown that the processing of a musical event is facilitated when it is tonally related (and thus supposed to be "expected") to the preceding tonal context, compared to when it is unrelated or less related. Neurophysiological data have shown that processing a less-expected event requires more neural resources than processing more prototypical musical structures; for example, studies using functional magnetic resonance imaging have reported increased activation in the inferior frontal cortex for unexpected musical events. Studying musical expectations, as an example of the processing of complex, non-verbal acoustic structures, contributes to a better understanding of how implicit knowledge about our auditory environment is acquired and how this knowledge influences perception.

    Auditory expectations for newly acquired structures

    Our study investigated whether newly acquired knowledge of auditory structure allows listeners to develop perceptual expectations for future events. To that aim, we introduced a new experimental approach that combines implicit learning and priming paradigms. Participants were first exposed to structured tone sequences without being told about the underlying artificial grammar. They then made speeded judgements on a perceptual feature of target tones in new sequences (i.e., in-tune/out-of-tune judgements). The target tones respected or violated the structure of the artificial grammar and were thus supposed to be expected or unexpected. In this priming task, grammatical tones were processed faster and more accurately than ungrammatical ones. This processing advantage was observed for an experimental group performing a memory task during the exposure phase, but not for a control group lacking the exposure phase (Experiment 1). It persisted when participants performed an in-tune/out-of-tune detection task during exposure (Experiment 2). This finding suggests that the acquisition of new structure knowledge not only influences grammaticality judgements on entire sequences (as previously shown in implicit learning research) but also allows the development of perceptual expectations that influence the processing of single events. It further promotes the priming paradigm as implicit access to acquired knowledge of artificial structures.
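    The exposure-then-test logic described above can be illustrated with a toy example. The sketch below generates tone sequences from a small finite-state grammar and checks whether a given sequence is grammatical; the grammar, tone names, and function names are invented for illustration, since the abstract does not specify the study's actual grammar.

```python
# Toy finite-state artificial grammar over five tone names (invented for
# illustration; NOT the grammar used in the study described above).
import random

GRAMMAR = {
    "START": ["C", "E"],
    "C": ["D", "G"],
    "D": ["E", "END"],
    "E": ["G", "END"],
    "G": ["C", "END"],
}

def generate_sequence(rng, max_len=8):
    """Walk the grammar from START until END or max_len tones."""
    state, seq = "START", []
    while len(seq) < max_len:
        nxt = rng.choice(GRAMMAR[state])
        if nxt == "END":
            break
        seq.append(nxt)
        state = nxt
    return seq

def is_grammatical(seq):
    """Check that every transition in seq is licensed by the grammar."""
    state = "START"
    for tone in seq:
        if tone not in GRAMMAR.get(state, []):
            return False
        state = tone
    return True

rng = random.Random(42)
seq = generate_sequence(rng)
print(seq, is_grammatical(seq))
```

    In a priming test modeled on the one above, target tones that continue a sequence according to `GRAMMAR` would count as grammatical (expected), and targets violating a transition as ungrammatical (unexpected).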

    The role of expectation in music: from the score to emotions and the brain

    Like discourse, music is a dynamic process that unfolds over time. Listeners usually expect certain events, or structures of events, to occur in the continuation of a given context. Part of the emotional experience of music may depend on how composers (and improvisers) fulfill these expectancies. Musical expectations are a core phenomenon of music cognition, and the present article provides an overview of their foundation in the score as well as in listeners' behavior and brains, and of how they can be simulated by artificial neural networks. We highlight parallels to language processing and include the attentional and emotional dimensions of musical expectations. Studying musical expectations is thus valuable not only for our understanding of music perception and production but also for our understanding of brain function more generally. Some open and challenging issues are summarized in this article.

    (Music, syntax and semantics: shared structural and temporal integration resources?)

    Numerous behavioral, neurophysiological and neuropsychological studies have investigated the specificity of music and language processing. While some studies suggested independent processes, others revealed common neural resources and interactive influences between music and language. Patel (2003) proposed that music and language share neural resources for processes linked to the structural integration of events, notably for musical and linguistic syntactic processing (the "Shared Syntactic Integration Resource Hypothesis", SSIRH). The SSIRH has led to a series of studies suggesting more consistent interactive influences between music and syntax than between music and semantics. Based on a review of the literature, the present article aims to reconcile these data by proposing a hypothesis of shared structural and temporal integration resources for music, syntax and semantics.

    Electrophysiological changes associated with acute smartphone use and mental fatigue.

    Introduction: Acute smartphone use can decrease cognitive and physical performance, likely due to the induction of mental fatigue. To date, no study has used electrophysiological markers to support this hypothesis. Mental fatigue is known to induce changes in brain activity (e.g., increased theta and alpha power), cardiac activity (e.g., increased heart rate variability, HRV) and ocular activity (e.g., increased pupil diameter), as well as increased feelings of tiredness and lack of energy. The aim of this study was to investigate the effects of acute smartphone use on electroencephalography, pupillometry, and HRV responses in order to objectively identify the presence of mental fatigue.

    Methods: Eighteen participants performed a psychomotor vigilance task (PVT) before and after 45 min of smartphone use. Feelings of fatigue and sleepiness were assessed with visual analog scales before and after smartphone use. Brain oscillations, HRV, and pupil diameter were recorded for 2 min at rest before and after, as well as during, smartphone use. Event-related potentials (N200 and P300) were investigated during the PVT.

    Results: Acute smartphone use increased feelings of fatigue (t17 = -4.567, p < .001) and sleepiness (t17 = -3.585, p = .002). During smartphone use, theta power increased over frontal (t17 = -2.470, p = .024) and centromedial (t17 = -2.353, p = .031) regions, without any changes in pupil diameter or HRV. During the rest period after smartphone use, alpha power increased in the central left (t17 = -2.227, p = .040), central right (t17 = -2.604, p = .018), and all parietal regions (p < .033). HRV high frequencies decreased (t17 = 2.818, p = .013), but pupil diameter did not change. Reaction times increased after smartphone use (t17 = -2.335, p = .032), while no significant effects were found on the N200 and P300.

    Conclusion: Forty-five minutes of smartphone use induced mental fatigue, evidenced subjectively by an increased feeling of fatigue and objectively by decreased attentional performance. The increase in theta power during smartphone use and the decrease in HRV high frequencies after smartphone use could reflect the presence of mental fatigue. However, given the concomitant increase in sleepiness, these electrophysiological changes may not be specific to the induction of mental fatigue. Future studies should explore the dissociation between feelings of fatigue and sleepiness to validate these changes as markers of mental fatigue, and further investigation is needed to obtain reliable electrophysiological markers of mental fatigue after acute smartphone use.
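    The within-subject comparisons above (reported as t17 values, consistent with n = 18 participants and df = n - 1 = 17) can be sketched as paired t-tests. The data below are simulated for illustration only; the study's raw measurements are not available here.

```python
# Paired (within-subject) t-test on hypothetical pre/post fatigue ratings
# (visual analog scale, 0-100); simulated data, NOT the study's measurements.
import numpy as np
from scipy import stats

n = 18  # participants, as in the study
rng = np.random.default_rng(0)
pre = rng.normal(30, 10, n)        # ratings before smartphone use
post = pre + rng.normal(15, 8, n)  # simulated increase after smartphone use

t, p = stats.ttest_rel(pre, post)  # paired test: same participants twice
df = n - 1                         # degrees of freedom, matching "t17"
print(f"t{df} = {t:.3f}, p = {p:.3f}")
```

    The negative t values in the abstract arise from the same convention: the first (pre) condition is subtracted from the second, so an increase after smartphone use yields t < 0.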

    Children's implicit knowledge of harmony in Western music

    Three experiments examined children's knowledge of harmony in Western music. The children heard a series of chords followed by a final, target chord. In Experiment 1, French 6- and 11-year-olds judged whether the target was sung with the vowel /i/ or /u/. In Experiment 2, Australian 8- and 11-year-olds judged whether the target was played on a piano or a trumpet. In Experiment 3, Canadian 8- and 11-year-olds judged whether the target sounded good (i.e., consonant) or bad (dissonant). The target was either the most stable chord in the established musical key (i.e., the tonic, built on do, the first note of the scale) or a less stable chord. Performance was faster (Experiments 1, 2 and 3) and more accurate (Experiment 3) when the target was the tonic chord. The findings confirm that children have implicit knowledge of the syntactic functions that typify Western harmony.