
    Introduction to Vertex Algebras

    These lecture notes are intended to give a modest impulse to anyone willing to start or pursue a journey into the theory of Vertex Algebras by reading one of Kac's or Lepowsky-Li's books. The primary goal is therefore to provide the required tools and to help the reader become acquainted with the machinery on which the whole theory is based. The exposition follows Kac's approach. Fundamental examples relevant to Theoretical Physics are also discussed. No particular prerequisites are assumed.

    Exploring the neural entrainment to musical rhythms and meter: a steady-state evoked potential approach

    Doctoral thesis completed under joint supervision (cotutelle) with the Université catholique de Louvain, Belgium (Faculté de médecine, Institut de Neuroscience). The ability to perceive a regular beat in music and synchronize to it is a widespread human skill. Fundamental to musical behavior, beat and meter refer to the perception of periodicities while listening to musical rhythms, and usually involve spontaneous entrainment to move on these periodicities. However, the neural mechanisms underlying entrainment to beat and meter in humans remain unclear. The present work tests a novel experimental approach, inspired by the steady-state evoked potential method, to explore the neural dynamics supporting the perception of rhythmic inputs. Using human electroencephalography (EEG), neural responses to beat and meter were recorded in various contexts: (1) mental imagery of meter, (2) spontaneous induction of a beat from rhythmic patterns, (3) multisensory integration, and (4) sensorimotor synchronization. Our results support the view that entrainment and resonance phenomena subtend the processing of musical rhythms in the human brain. Furthermore, our results suggest that this novel approach could help investigate the link between the phenomenology of musical beat and meter and neurophysiological evidence of a bias towards periodicities arising under certain circumstances in the nervous system. Hence, entrainment to music provides an original framework to explore general entrainment phenomena occurring at various levels, from the inter-neural to the inter-individual level.

    Modality effects in implicit artificial grammar learning: An EEG study

    Recently, it has been proposed that sequence learning engages a combination of modality-specific operating networks and modality-independent computational principles. In the present study, we compared the behavioural and EEG outcomes of implicit artificial grammar learning in the visual vs. auditory modality. We controlled for the influence of surface characteristics of sequences (Associative Chunk Strength), thus focusing on the strictly structural aspects of sequence learning, and we adapted the paradigms to compensate for known frailties of the visual modality compared to audition (temporal presentation, fast presentation rate). The behavioural outcomes were similar across modalities. Favouring the idea of modality-specificity, ERPs in response to grammar violations differed in topography and latency (earlier and more anterior component in the visual modality), and ERPs in response to surface features emerged only in the auditory modality. In favour of modality-independence, we observed three common functional properties in the late ERPs of the two grammars: both were free of interactions between structural and surface influences, both were more extended in a grammaticality classification test than in a preference classification test, and both correlated positively and strongly with theta event-related synchronization during baseline testing. Our findings support the idea of modality-specificity combined with modality-independence, and suggest that memory for visual vs. auditory sequences may largely contribute to cross-modal differences. Funding: Max Planck Institute for Psycholinguistics; Donders Institute for Brain, Cognition and Behaviour; Fundação para a Ciência e Tecnologia [PTDC/PSI-PC0/110734/2009, UID/BIM/04773/2013, CBMR 1334, PEst-OE/EQB/1A0023/2013, UM/PSI/00050/2013].

    Directed motor-auditory EEG connectivity is modulated by music tempo

    Beat perception is fundamental to how we experience music, and yet the mechanism behind this spontaneous building of the internal beat representation is largely unknown. Existing findings support links between the tempo (speed) of the beat and enhancement of electroencephalogram (EEG) activity at tempo-related frequencies, but there are no studies looking at how tempo may affect the underlying long-range interactions between EEG activity at different electrodes. The present study investigates these long-range interactions using EEG activity recorded from 21 volunteers listening to music stimuli played at 4 different tempi (50, 100, 150 and 200 beats per minute). The music stimuli consisted of piano excerpts designed to convey the emotion of “peacefulness”. Noise stimuli with an identical acoustic content to the music excerpts were also presented for comparison purposes. The brain activity interactions were characterized with the imaginary part of coherence (iCOH) in the frequency range 1.5–18 Hz (δ, θ, α and lower β) between all pairs of EEG electrodes for the four tempi and the music/noise conditions, as well as a baseline resting state (RS) condition obtained at the start of the experimental task. Our findings can be summarized as follows: (a) there was an ongoing long-range interaction in the RS engaging fronto-posterior areas; (b) this interaction was maintained in both music and noise, but its strength and directionality were modulated as a result of acoustic stimulation; (c) the topological patterns of iCOH were similar for music, noise and RS; however, statistically significant differences in strength and direction of iCOH were identified; and (d) tempo had an effect on the direction and strength of motor-auditory interactions. Our findings are in line with existing literature and illustrate part of the mechanism by which musical stimuli with different tempi can entrain changes in cortical activity.
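    As a concrete illustration of the connectivity measure used in this study, the sketch below computes the imaginary part of coherence (iCOH) between two signals from their cross- and auto-spectra. iCOH discards zero-phase-lag coupling, which is why it is often favoured for discounting volume conduction in EEG. All parameters here (sampling rate, window length, synthetic data) are illustrative assumptions, not values from the study.

```python
# Minimal sketch: imaginary part of coherence (iCOH) between two channels.
import numpy as np
from scipy.signal import csd, welch

fs = 512                             # assumed sampling rate (Hz)
rng = np.random.default_rng(0)
x = rng.standard_normal(60 * fs)     # stand-in for one EEG electrode
y = rng.standard_normal(60 * fs)     # stand-in for another electrode

nperseg = 4 * fs                     # 4 s windows -> 0.25 Hz resolution
f, Sxy = csd(x, y, fs=fs, nperseg=nperseg)   # complex cross-spectrum
_, Sxx = welch(x, fs=fs, nperseg=nperseg)    # auto-spectra
_, Syy = welch(y, fs=fs, nperseg=nperseg)

# iCOH: imaginary part of the normalised cross-spectrum; purely
# instantaneous (zero-lag) coupling contributes nothing to it.
icoh = np.imag(Sxy / np.sqrt(Sxx * Syy))

band = (f >= 1.5) & (f <= 18.0)      # frequency range analysed in the abstract
print(f"peak |iCOH| in 1.5-18 Hz: {np.abs(icoh[band]).max():.3f}")
```

    In a full analysis this computation would be repeated for every electrode pair and condition, with the sign of iCOH carrying the lead/lag directionality discussed above.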

    Spatial and temporal (non)binding of audiovisual rhythms in sensorimotor synchronisation

    All data are held in a public repository, available in the OSF database (URL: https://osf.io/2jr48/?view_only=17e3f6f57651418c980832e00d818072). Human movement synchronisation with moving objects strongly relies on visual input. However, auditory information also plays an important role, since real environments are intrinsically multimodal. We used electroencephalography (EEG) frequency tagging to investigate the selective neural processing and integration of visual and auditory information during motor tracking and tested the effects of spatial and temporal congruency between audiovisual modalities. EEG was recorded while participants tracked with their index finger a red dot, flickering at fV = 15 Hz, that oscillated horizontally on a screen. The simultaneous auditory stimulus was modulated in pitch (rate fA = 32 Hz) and lateralised between left and right audio channels to induce perception of a periodic displacement of the sound source. Audiovisual congruency was manipulated in terms of space in Experiment 1 (no motion, same direction or opposite direction), and timing in Experiment 2 (no delay, medium delay or large delay). For both experiments, significant EEG responses were elicited at the fV and fA tagging frequencies. It was also hypothesised that intermodulation products at frequencies fV ± fA, reflecting the nonlinear integration of visual and auditory stimuli, would be elicited, especially in congruent conditions. However, these components were not observed. Moreover, synchronisation and EEG results were not influenced by the congruency manipulations, which invites further exploration of the conditions that may modulate audiovisual processing and the motor tracking of moving objects. Acknowledgements: We thank Ashleigh Clibborn and Ayah Hammoud for their assistance with data collection. This work was supported by a grant from the Australian Research Council (DP170104322, DP220103047). OML is supported by the Portuguese Foundation for Science and Technology and the Portuguese Ministry of Science, Technology and Higher Education, through national funds, within the scope of the Transitory Disposition of Decree No. 57/2016, of 29 August, amended by Law No. 57/2017 of 19 July (Ref.: SFRH/BPD/72710/2010).
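    A minimal sketch of the frequency-tagging readout described above: take the EEG amplitude spectrum and inspect the bins at the visual tag fV = 15 Hz, the auditory tag fA = 32 Hz, and the intermodulation products fA − fV = 17 Hz and fA + fV = 47 Hz, whose presence would signal nonlinear audiovisual integration. Epoch length, sampling rate and the synthetic signal are assumptions for illustration.

```python
# Sketch: amplitude-spectrum readout at tagged and intermodulation frequencies.
import numpy as np

fs = 1024                  # assumed sampling rate (Hz)
dur = 10.0                 # assumed epoch duration (s) -> 0.1 Hz bin resolution
n = int(fs * dur)
eeg = np.random.default_rng(1).standard_normal(n)  # stand-in for one EEG epoch

amp = np.abs(np.fft.rfft(eeg)) / n       # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

def amp_at(f_target):
    # Amplitude at the FFT bin nearest the target frequency.
    return amp[np.argmin(np.abs(freqs - f_target))]

f_v, f_a = 15.0, 32.0
for label, f_t in [("visual tag", f_v), ("auditory tag", f_a),
                   ("IM fA-fV", f_a - f_v), ("IM fA+fV", f_a + f_v)]:
    print(f"{label:12s} {f_t:5.1f} Hz  amplitude = {amp_at(f_t):.4f}")
```

    With real data these bin amplitudes would be compared against the surrounding noise floor and across conditions; the absence of reliable peaks at 17 and 47 Hz is what the study reports as a lack of intermodulation components.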

    Lateralised dynamic modulations of corticomuscular coherence associated with bimanual learning of rhythmic patterns

    Supplementary Information: the online version contains supplementary material available at https://doi.org/10.1038/s41598-022-10342-5. Human movements are spontaneously attracted to auditory rhythms, triggering an automatic activation of the motor system, a phenomenon central to music perception and production. Cortico-muscular coherence (CMC) in the theta, alpha, beta and gamma frequencies has been used as an index of the synchronisation between cortical motor regions and the muscles. Here we investigated how learning to produce a bimanual rhythmic pattern composed of low- and high-pitch sounds affects CMC in the beta frequency band. Electroencephalography (EEG) and electromyography (EMG) from the left and right First Dorsal Interosseus and Flexor Digitorum Superficialis muscles were concurrently recorded while participants maintained constant pressure on a force sensor held between the thumb and index finger and listened to the rhythmic pattern, before and after a bimanual training session. During the training, participants learnt to produce the rhythmic pattern, guided by visual cues, by pressing the force sensors with their left or right hand to produce the low- and high-pitch sounds, respectively. Results revealed no changes after training in overall beta CMC or beta oscillation amplitude, nor in the correlation between the left and right sides for EEG and EMG separately. However, correlation analyses indicated that left- and right-hand beta EEG–EMG coherence were positively correlated over time before training but became uncorrelated after training. This suggests that learning to bimanually produce a rhythmic musical pattern reinforces lateralised and segregated cortico-muscular communication. This work was supported by a grant from the Australian Research Council (DP170104322).
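    The sketch below shows one common way to estimate the beta-band cortico-muscular coherence at the heart of this study: Welch-based magnitude-squared coherence between an EEG motor channel and a rectified EMG signal. The rectification step, channel choice, band limits and all parameters are illustrative assumptions rather than the authors' exact pipeline.

```python
# Sketch: beta-band cortico-muscular coherence (CMC) between EEG and EMG.
import numpy as np
from scipy.signal import coherence

fs = 1000                                  # assumed sampling rate (Hz)
rng = np.random.default_rng(2)
eeg = rng.standard_normal(120 * fs)        # stand-in for a motor EEG channel
emg = rng.standard_normal(120 * fs)        # stand-in for FDI/FDS EMG
emg = np.abs(emg - emg.mean())             # full-wave rectification (common, though debated)

# Magnitude-squared coherence with 2 s windows -> 0.5 Hz resolution.
f, coh = coherence(eeg, emg, fs=fs, nperseg=2 * fs)
beta = (f >= 13) & (f <= 30)               # assumed beta band limits
print(f"mean beta CMC: {coh[beta].mean():.3f}")
```

    Computing this separately for left- and right-hand EEG–EMG pairs, before and after training, would yield the per-side coherence values whose cross-correlation over time the study examines.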

    Spatial and temporal (non)binding of audio-visual stimuli: effects on motor tracking and underlying neural sensory processing

    Objectives: Compare the steady-state evoked potentials (SSEPs) of spatially or temporally congruent and incongruent audio-visual stimuli and evaluate how congruency affects the motion tracking of visual stimuli. Research question: Does spatial or temporal congruency of audio-visual stimuli affect motion tracking and evoke differential SSEPs? Methods: We used EEG frequency-tagging techniques to investigate the selective neural processing and integration of visual and auditory information in the tracking of a moving stimulus, and how spatial and temporal (in)congruency between the two modalities modulate these sensory neural processes and synchronization performance. Participants were instructed to track with their index finger a red dot flickering at 15 Hz that oscillated horizontally with a complex trajectory on a computer screen. An auditory pure tone with continuous pitch modulation at 32 Hz was presented with lateralised amplitude modulations in left and right audio channels (panning) that were, in Experiment 1, either spatially congruent or incongruent (same direction vs. opposite direction vs. no panning), and in Experiment 2, either temporally congruent or incongruent (no delay vs. medium or large delay), with the oscillating visual stimulus. Results: Both experiments yielded significant EEG responses at the visual (15 Hz) and auditory (32 Hz) tagging frequencies. Further, in Experiment 1 participants showed lower performance and larger amplitudes at the auditory frequency during the no-panning condition. No significant correlation between the two measures was found. In Experiment 2 no changes in the amplitude of the EEG responses or in performance were found. Conclusion: The movement synchronization performance and the neural processing of visual and auditory information were not influenced by the phase congruency manipulation. For spatial congruency, the moving auditory stimuli led to better performance, irrespective of congruency, when compared to the non-moving sound. Importantly, there were no significant responses at 17 and 47 Hz, corresponding to the intermodulation frequencies of 15 and 32 Hz, suggesting an absence of global integration of visual and auditory information. These results encourage further exploration of the conditions that may result in the selective processing of visual and auditory information and their integration in the motor tracking of moving environmental objects.

    Neural tracking of the musical beat is enhanced by low-frequency sounds

    Music makes us move, and using bass instruments to build the rhythmic foundations of music is especially effective at inducing people to dance to periodic pulse-like beats. Here, we show that this culturally widespread practice may exploit a neurophysiological mechanism whereby low-frequency sounds shape the neural representations of rhythmic input by boosting selective locking to the beat. Cortical activity was captured using electroencephalography (EEG) while participants listened to a regular rhythm or to a relatively complex syncopated rhythm conveyed either by low tones (130 Hz) or high tones (1236.8 Hz). We found that cortical activity at the frequency of the perceived beat is selectively enhanced compared with other frequencies in the EEG spectrum when rhythms are conveyed by bass sounds. This effect is unlikely to arise from early cochlear processes, as revealed by auditory physiological modeling, and was particularly pronounced for the complex rhythm requiring endogenous generation of the beat. The effect is likewise not attributable to differences in perceived loudness between low and high tones, as a control experiment manipulating sound intensity alone did not yield similar results. Finally, the privileged role of bass sounds is contingent on the allocation of attentional resources to the temporal properties of the stimulus, as revealed by a further control experiment examining the role of a behavioral task. Together, our results provide a neurobiological basis for the convention of using bass instruments to carry the rhythmic foundations of music and to drive people to move to the beat.
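    Selective enhancement "at the frequency of the perceived beat compared with other frequencies in the EEG spectrum" is conventionally quantified in frequency-tagging work by subtracting, at each bin, the mean amplitude of surrounding bins, so that any residual positive value reflects a narrowband response above the background spectrum. A minimal sketch of that noise-subtraction step, with assumed beat frequency and epoch parameters:

```python
# Sketch: noise-subtracted amplitude spectrum for beat-frequency analysis.
import numpy as np

fs, dur = 512, 32.0                  # assumed sampling rate (Hz) and epoch length (s)
n = int(fs * dur)
eeg = np.random.default_rng(3).standard_normal(n)  # stand-in for a trial-averaged epoch

amp = np.abs(np.fft.rfft(eeg)) / n
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

def noise_subtracted(amp, skip=1, width=4):
    # Subtract the mean of `width` bins on each side of every bin,
    # excluding the `skip` bins immediately adjacent to it.
    out = np.zeros_like(amp)
    for i in range(len(amp)):
        lo = amp[max(0, i - skip - width):max(0, i - skip)]
        hi = amp[i + skip + 1:i + skip + 1 + width]
        neigh = np.concatenate([lo, hi])
        out[i] = amp[i] - (neigh.mean() if neigh.size else 0.0)
    return out

snr_amp = noise_subtracted(amp)
beat_hz = 2.0                        # assumed beat frequency for illustration
print(snr_amp[np.argmin(np.abs(freqs - beat_hz))])
```

    Comparing these noise-subtracted beat-frequency amplitudes between low-tone and high-tone conditions is the kind of contrast the reported bass-enhancement effect rests on.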

    Neural tracking and integration of 'self' and 'other' in improvised interpersonal coordination

    Humans coordinate their movements with one another in a range of everyday activities and skill domains. Optimal joint performance requires the continuous anticipation of and adaptation to each other's movements, especially when actions are spontaneous rather than pre-planned. Here we employ dual-EEG and frequency-tagging techniques to investigate how the neural tracking of self- and other-generated movements supports interpersonal coordination during improvised motion. LEDs flickering at 5.7 and 7.7 Hz were attached to participants’ index fingers in 28 dyads as they produced novel patterns of synchronous horizontal forearm movements. EEG responses at these frequencies revealed enhanced neural tracking of self-generated movements when leading and of other-generated movements when following. A marker of self-other integration at 13.4 Hz (the intermodulation frequency of 5.7 and 7.7 Hz) peaked when no leader was designated, and mutual adaptation and movement synchrony were maximal. Furthermore, the amplitude of EEG responses reflected differences in the capacity of dyads to synchronize their movements, offering a neurophysiologically grounded perspective for understanding the perceptual-motor mechanisms underlying joint action.

    Dynamic modulation of beta band cortico-muscular coupling induced by audio-visual rhythms

    Human movements often spontaneously fall into synchrony with auditory and visual environmental rhythms. Related behavioral studies have shown that motor responses are automatically and unintentionally coupled with external rhythmic stimuli. However, the neurophysiological processes underlying such motor entrainment remain largely unknown. Here we investigated with electroencephalography (EEG) and electromyography (EMG) the modulation of neural and muscular activity induced by periodic audio and/or visual sequences. The sequences were presented at either 1 Hz or 2 Hz while participants maintained constant finger pressure on a force sensor. The results revealed that although there was no change of amplitude in participants’ EMG in response to the sequences, the synchronization between EMG and EEG recorded over motor areas in the beta (12–40 Hz) frequency band was dynamically modulated, with maximal coherence occurring about 100 ms before each stimulus. These modulations in beta EEG–EMG motor coherence were found for the 2 Hz audio-visual sequences, confirming at a neurophysiological level the enhancement of motor entrainment with multimodal rhythms that fall within preferred perceptual and movement frequency ranges. Our findings identify beta band cortico-muscular coupling as a potential underlying mechanism of motor entrainment, further elucidating the nature of the link between sensory and motor systems in humans.
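    A dynamically modulated EEG–EMG coupling of the kind reported here can be estimated by computing coherence across trials at each time point around stimulus onset, for example from Hilbert analytic signals after beta band-pass filtering. The sketch below is one such approach under assumed parameters; it is not the authors' exact method.

```python
# Sketch: time-resolved beta-band EEG-EMG coherence across trials.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                            # assumed sampling rate (Hz)
n_trials, n_samp = 60, fs            # assumed 1 s epochs centred on each stimulus
rng = np.random.default_rng(4)
eeg = rng.standard_normal((n_trials, n_samp))  # stand-in epoched EEG (motor channel)
emg = rng.standard_normal((n_trials, n_samp))  # stand-in epoched EMG

b, a = butter(4, [12, 40], btype="bandpass", fs=fs)  # beta band as given in the abstract
X = hilbert(filtfilt(b, a, eeg, axis=1), axis=1)     # analytic signals per trial
Y = hilbert(filtfilt(b, a, emg, axis=1), axis=1)

# Coherence over trials at each time point: |trial-averaged cross-spectrum|^2
# normalised by the product of trial-averaged powers.
num = np.abs((X * np.conj(Y)).mean(axis=0)) ** 2
den = (np.abs(X) ** 2).mean(axis=0) * (np.abs(Y) ** 2).mean(axis=0)
coh_t = num / den

t = np.arange(n_samp) / fs - 0.5     # time relative to stimulus onset (s)
print(f"peak coupling at t = {t[np.argmax(coh_t)]:.3f} s")
```

    On real data, a coherence trace of this kind peaking shortly before each stimulus is what the reported anticipatory modulation of beta cortico-muscular coupling would look like.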