54 research outputs found

    Application of Intermediate Multi-Agent Systems to Integrated Algorithmic Composition and Expressive Performance of Music

    We investigate the properties of a new multi-agent system (MAS) for computer-aided composition called IPCS (pronounced β€œipp-siss”), the Intermediate Performance Composition System, which generates expressive performance as part of its compositional process and produces emergent melodic structures through a novel multi-agent process. IPCS consists of a small-to-medium-sized collection (2 to 16) of agents, each of which can perform monophonic tunes and learn monophonic tunes from other agents. Each agent has an affective state (an β€œartificial emotional state”) which affects how it performs music to other agents; for example, a β€œhappy” agent will perform β€œhappier” music. An agent's performance not only involves compositional changes to the music, but also adds smaller changes based on expressive music performance algorithms for humanization. Every agent is initialized with a tune containing the same single note, and over the interaction period longer tunes are built through agent interaction. Agents will only learn tunes performed to them by other agents if the affective content of the tune is similar to their current affective state; learned tunes are concatenated to the end of their current tune. Each agent in the society thus grows its own tune during the interaction process. Agents develop β€œopinions” of the agents that perform to them, depending on how much the performing agent helps their tunes grow, and these opinions affect who they interact with in the future. IPCS is not a mapping from multi-agent interaction onto musical features; rather, the agents use music itself to communicate emotions. In spite of the lack of explicit melodic intelligence in IPCS, the system is shown to generate non-trivial melodic pitch sequences as a result of emotional communication between agents. The melodies also have a hierarchical structure that emerges from the social interaction structure of the multi-agent system. The interactive humanizations produce micro-timing and loudness deviations in the melody which are shown to express its hierarchical generative structure without the need for the structural analysis software frequently used in computer music humanization.
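
    The interaction rule sketched in this abstract (affinity-gated tune exchange, concatenation of learned tunes, and growing opinions) can be pictured with a short, hypothetical Python fragment; the Agent class, the scalar affect value, the pitch-shift rule and all parameter values below are illustrative assumptions, not the authors' implementation.

```python
import random

class Agent:
    """Toy agent with an affective state and a growing monophonic tune."""
    def __init__(self, affect, seed_note=60):
        self.affect = affect          # scalar "affect" in [0, 1], illustrative only
        self.tune = [seed_note]       # MIDI pitches; every agent starts with one note
        self.opinions = {}            # other agent -> how much they have helped this tune grow

    def perform(self):
        # Stand-in for the compositional/expressive changes applied during performance:
        # shift pitches according to the performer's affective state.
        shift = int(round((self.affect - 0.5) * 4))
        return [p + shift for p in self.tune], self.affect

    def listen(self, performer, tune, tune_affect, tolerance=0.2):
        # Learn (concatenate) the tune only if its affective content is close
        # to the listener's current affective state.
        if abs(tune_affect - self.affect) <= tolerance:
            self.tune.extend(tune)
            self.opinions[performer] = self.opinions.get(performer, 0) + 1

agents = [Agent(affect=random.random()) for _ in range(8)]
for _ in range(50):                               # interaction period
    performer, listener = random.sample(agents, 2)
    tune, affect = performer.perform()
    listener.listen(performer, tune, affect)

print(max(len(a.tune) for a in agents), "notes in the longest emergent tune")
```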

    Programming Gate-Based Hardware Quantum Computers for Music

    There have been significant attempts previously to use the equations of quantum mechanics for generating sound, and to sonify simulated quantum processes. For new forms of computation to be utilized in computer music, eventually hardware must be utilized. This has rarely happened with quantum computer music. One reason for this is that it is currently not easy to get access to such hardware. A second is that the hardware available requires some understanding of quantum computing theory. This paper moves the process forward by utilizing two hardware quantum computation systems: IBMQASM v1.1 and a D-Wave 2X. It also introduces the ideas behind the gate-based IBM system, in a way hopefully more accessible to computer-literate readers. This is a presentation of the first hybrid quantum computer algorithm, involving two hardware machines. Although neither of these algorithms explicitly utilizes the promised quantum speed-ups, they are a vital first step in introducing QC to the musical field. The article also introduces some key quantum computer algorithms and discusses their possible future contribution to computer music. The article begins with a brief overview of quantum computing and of how it can be applied in the arts, followed by a survey of previous projects that used real or simulated quantum processes in musical works or performances. The next section discusses the best-known type of quantum computer, the gate-based kind, and describes the hardware of one of IBM's smaller quantum computers; a short introduction to quantum computing theory follows, and these ideas are then mapped onto the language used by the IBM machines, IBMQASM. A brief overview of the other type of quantum computer used, the D-Wave, comes next; more detailed descriptions of the algorithm are available in other articles referenced here. Finally, qGen is described: the IBM machine generates a melody and the D-Wave harmonises it. The focus is on the melodic algorithm, since the D-Wave algorithm is described in a referenced book chapter. The β€œsimplest possible” melodic algorithm was developed, and a corresponding example is provided.
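
    The qGen melodic step is only summarised here. As a loose illustration of the general idea of mapping gate-based measurement outcomes onto pitches, the following plain-Python sketch replaces the quantum hardware with a uniform random draw over bitstrings; it is an assumed stand-in, not the qGen algorithm or IBMQASM code.

```python
import random

# Measuring an n-qubit register in equal superposition yields one of 2**n bitstrings;
# here the quantum hardware is replaced by a uniform classical random draw.
def measure_register(n_qubits):
    return ''.join(random.choice('01') for _ in range(n_qubits))

# Map 3-bit outcomes onto a C major scale (MIDI pitches), one note per shot.
SCALE = [60, 62, 64, 65, 67, 69, 71, 72]

def quantum_style_melody(n_notes=8, n_qubits=3):
    return [SCALE[int(measure_register(n_qubits), 2)] for _ in range(n_notes)]

print(quantum_style_melody())
```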

    Learning and Co-operation in Mobile Multi-Robot Systems

    This thesis addresses the problem of setting the balance between exploration and exploitation in teams of learning robots that exchange information. Specifically it looks at groups of robots whose tasks include moving between salient points in the environment. To deal with unknown and dynamic environments, such robots need to be able to discover and learn the routes between these points themselves. A natural extension of this scenario is to allow the robots to exchange learned routes so that only one robot needs to learn a route for the whole team to use that route. One contribution of this thesis is to identify a dilemma created by this extension: once one robot has learned a route between two points, all other robots will follow that route without looking for shorter versions. This trade-off will be labeled the Distributed Exploration vs. Exploitation Dilemma, since increasing distributed exploitation (allowing robots to exchange more routes) means decreasing distributed exploration (reducing the robots' ability to learn new versions of routes), and vice versa. At different times, teams may be required with different balances of exploitation and exploration. The main contribution of this thesis is to present a system for setting the balance between exploration and exploitation in a group of robots. This system is demonstrated through experiments involving simulated robot teams. The experiments show that increasing and decreasing the value of a parameter of the novel system leads to a significant increase and decrease respectively in average exploitation (and an equivalent decrease and increase in average exploration) over a series of team missions. A further set of experiments shows that this holds true for a range of team sizes and numbers of goals.
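
    A minimal sketch of the kind of balance parameter discussed here, assuming a single probability that a robot reuses a teammate's shared route rather than exploring for its own version; the function and example data are hypothetical, not the system developed in the thesis.

```python
import random

def choose_route(shared_routes, accept_prob):
    """Decide whether a robot exploits a shared route or explores for its own.

    accept_prob plays the role of the balance parameter described above:
    raising it increases distributed exploitation (more shared routes reused),
    lowering it increases distributed exploration (more independent learning).
    """
    if shared_routes and random.random() < accept_prob:
        return min(shared_routes, key=len)   # exploit: reuse the shortest known shared route
    return None                              # explore: learn a route independently

# Illustrative routes between two salient points, as waypoint lists (hypothetical data).
shared = [["A", "C", "B"], ["A", "D", "E", "B"]]
print(choose_route(shared, accept_prob=0.8))
```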

    Wireless Interactive Sonification of Large Water Waves to Demonstrate the Facilities of a Large-Scale Research Wave Tank

    Interactive sonification can provide a platform for demonstration and education as well as for monitoring and investigation. We present a system designed to demonstrate the facilities of the UK's most advanced large-scale research wave tank. The interactive sonification of water waves in the β€œocean basin” wave tank at Plymouth University consisted of a number of elements: generation of ocean waves, acquisition and sonification of ocean-wave measurement data, and gesture-controlled pitch and amplitude of sonifications. The generated water waves were linked in real time to sonic features via depth monitors and motion tracking of a floating buoy. Types of water-wave patterns, varying in shape and size, were selected and triggered using wireless motion detectors attached to the demonstrator's arms. The system was implemented on a network of five computers utilizing Max/MSP alongside specialist marine research software, and was demonstrated live in a public performance for the formal opening of the Marine Institute building.
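
    One element described above, mapping a measured water-surface height to the pitch and amplitude of a sonification voice, might look like the following sketch; the value ranges and the linear mapping are assumptions for illustration, not the Max/MSP patch used in the demonstration.

```python
# Scale a stream of water-surface heights into frequency and amplitude values.
def sonify_sample(height_m, min_h=-1.0, max_h=1.0,
                  low_hz=110.0, high_hz=880.0):
    # Clamp and normalise the wave-height reading to [0, 1].
    t = (min(max(height_m, min_h), max_h) - min_h) / (max_h - min_h)
    frequency = low_hz + t * (high_hz - low_hz)   # higher crest -> higher pitch
    amplitude = 0.2 + 0.8 * t                     # higher crest -> louder
    return frequency, amplitude

for h in (-0.8, 0.0, 0.9):                        # hypothetical buoy readings in metres
    print(h, sonify_sample(h))
```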

    Electroencephalography reflects the activity of sub-cortical brain regions during approach-withdrawal behaviour while listening to music

    The ability of music to evoke activity changes in the core brain structures that underlie the experience of emotion suggests that it has the potential to be used in therapies for emotion disorders. A large volume of research has identified a network of sub-cortical brain regions underlying music-induced emotions. Additionally, separate evidence from electroencephalography (EEG) studies suggests that prefrontal asymmetry in the EEG reflects the approach-withdrawal response to music-induced emotion. However, fMRI and EEG measure quite different brain processes and we do not have a detailed understanding of the functional relationships between them in relation to music-induced emotion. We employ a joint EEG – fMRI paradigm to explore how EEG-based neural correlates of the approach-withdrawal response to music reflect activity changes in the sub-cortical emotional response network. The neural correlates examined are asymmetry in the prefrontal EEG, and the degree of disorder in that asymmetry over time, as measured by entropy. Participants’ EEG and fMRI were recorded simultaneously while the participants listened to music that had been specifically generated to target the elicitation of a wide range of affective states. While listening to this music, participants also continuously reported their felt affective states. Here we report on co-variations in the dynamics of these self-reports, the EEG, and the sub-cortical brain activity. We find that a set of sub-cortical brain regions in the emotional response network exhibits activity that significantly relates to prefrontal EEG asymmetry. Specifically, EEG in the pre-frontal cortex reflects not only cortical activity, but also changes in activity in the amygdala, posterior temporal cortex, and cerebellum. We also find that, while the magnitude of the asymmetry reflects activity in parts of the limbic and paralimbic systems, the entropy of that asymmetry reflects activity in parts of the autonomic response network such as the auditory cortex. This suggests that asymmetry magnitude reflects affective responses to music, while asymmetry entropy reflects autonomic responses to music. Thus, we demonstrate that it is possible to infer activity in the limbic and paralimbic systems from pre-frontal EEG asymmetry. These results show how EEG can be used to measure and monitor changes in the limbic and paralimbic systems. Specifically, they suggest that EEG asymmetry acts as an indicator of sub-cortical changes in activity induced by music. This shows that EEG may be used as a measure of the effectiveness of music therapy to evoke changes in activity in the sub-cortical emotion response network. This is also the first time that the activity of sub-cortical regions, normally considered β€œinvisible” to EEG, has been shown to be characterisable directly from EEG dynamics measured during music listening
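
    The two EEG-derived measures referred to above, prefrontal asymmetry and the entropy of that asymmetry over time, can be illustrated numerically as follows; the alpha-band limits, window length and surrogate data are assumptions made for illustration rather than the study's actual pipeline.

```python
import numpy as np
from scipy.signal import welch

def alpha_asymmetry(left, right, fs):
    """Prefrontal asymmetry as log alpha power (8-13 Hz), right minus left."""
    def alpha_power(x):
        f, p = welch(x, fs=fs, nperseg=fs)
        return p[(f >= 8) & (f <= 13)].mean()
    return np.log(alpha_power(right)) - np.log(alpha_power(left))

def asymmetry_entropy(asym_series, bins=8):
    """Shannon entropy of the asymmetry values over time (the 'disorder' measure)."""
    counts, _ = np.histogram(asym_series, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

fs = 256                                    # assumed sampling rate
rng = np.random.default_rng(0)
left = rng.standard_normal((20, fs * 2))    # 20 two-second windows, left prefrontal electrode
right = rng.standard_normal((20, fs * 2))   # matching windows, right prefrontal electrode
series = [alpha_asymmetry(l, r, fs) for l, r in zip(left, right)]
print(asymmetry_entropy(series))
```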

    Affective calibration of musical feature sets in an emotionally intelligent music composition system

    Affectively driven algorithmic composition (AAC) is a rapidly growing field that exploits computer-aided composition in order to generate new music with particular emotional qualities or affective intentions. An AAC system was devised in order to generate a stimulus set covering nine discrete sectors of a two-dimensional emotion space by means of a 16-channel feed-forward artificial neural network. This system was used to generate a stimulus set of short pieces of music, which were rendered using a sampled piano timbre and evaluated by a group of experienced listeners who ascribed a two-dimensional valence-arousal coordinate to each stimulus. The underlying musical feature set, initially drawn from the literature, was subsequently adjusted by amplifying or attenuating the quantity of each feature in order to maximize the spread of stimuli in the valence-arousal space before a second listener evaluation was conducted. This process was repeated a third time in order to maximize the spread of valence-arousal coordinates ascribed to the generated stimulus set in comparison to a spread taken from an existing pre-rated database of stimuli, demonstrating that this prototype AAC system is capable of creating short sequences of music with a slight improvement on the range of emotion found in a stimulus set comprising real-world, traditionally composed musical excerpts.
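
    The calibration loop described above, amplifying or attenuating each musical feature between listener evaluations so as to widen the spread of ratings in valence-arousal space, is sketched below with hypothetical feature names and a simple multiplicative update; it is not the system's actual procedure.

```python
import numpy as np

def spread(points):
    """Spread of rated (valence, arousal) coordinates: mean distance from the centroid."""
    pts = np.asarray(points, dtype=float)
    return float(np.linalg.norm(pts - pts.mean(axis=0), axis=1).mean())

def recalibrate(feature_gain, evaluation, step=0.1):
    """Amplify or attenuate each feature after a listener evaluation round.

    `evaluation` maps feature name -> (intended_spread, observed_spread),
    two hypothetical scalars summarising how far ratings moved along that feature's axis.
    """
    new_gain = {}
    for name, (intended, observed) in evaluation.items():
        gain = feature_gain[name]
        # Under-expressed feature -> amplify; over-expressed -> attenuate.
        new_gain[name] = gain * (1 + step) if observed < intended else gain * (1 - step)
    return new_gain

gains = {"tempo": 1.0, "mode": 1.0, "loudness": 1.0}
evaluation = {"tempo": (1.0, 0.6), "mode": (1.0, 1.3), "loudness": (1.0, 0.9)}
print(recalibrate(gains, evaluation))
print(spread([(0.2, 0.1), (-0.5, 0.4), (0.3, -0.6)]))
```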

    Personalised, multi-modal, affective state detection for hybrid brain-computer music interfacing

    Brain-computer music interfaces (BCMIs) may be used to modulate affective states, with applications in music therapy, composition, and entertainment. However, for such systems to work they need to be able to reliably detect their user's current affective state. We present a method for personalised affective state detection for use in BCMI. We compare it to a population-based detection method trained on 17 users and demonstrate that personalised affective state detection is significantly (p < 0.01) more accurate, with average improvements in accuracy of 10.2 percent for valence and 9.3 percent for arousal. We also compare a hybrid BCMI (a BCMI that combines physiological signals with neurological signals) to a conventional BCMI design (one based upon the use of only EEG features) and demonstrate that the hybrid design results in a significant (p < 0.01) 6.2 percent improvement in performance for arousal classification and a significant (p < 0.01) 5.9 percent improvement for valence classification.
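
    A toy comparison of a population-trained classifier against a personalised, within-user cross-validated one, in the spirit of the evaluation described above; the synthetic features, labels and the choice of an SVM are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

def fake_user_data(offset, n=60, n_features=8):
    """Placeholder features (e.g. EEG band powers plus physiological channels)
    with a user-specific offset; stands in for real recordings."""
    X = rng.standard_normal((n, n_features)) + offset
    y = (X[:, 0] + 0.5 * X[:, 1] > offset).astype(int)   # toy binary valence label
    return X, y

# Population model: pooled data from other users, applied to the target user.
X_pool, y_pool = fake_user_data(offset=0.0, n=300)
X_user, y_user = fake_user_data(offset=0.7)
population = SVC().fit(X_pool, y_pool)
pop_acc = population.score(X_user, y_user)

# Personalised model: trained and cross-validated on the target user's own data.
pers_acc = cross_val_score(SVC(), X_user, y_user, cv=5).mean()

print(f"population: {pop_acc:.2f}  personalised: {pers_acc:.2f}")
```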

    Affective brain–computer music interfacing

    We aim to develop and evaluate an affective brain–computer music interface (aBCMI) for modulating the affective states of its users. Approach. An aBCMI is constructed to detect a user's current affective state and attempt to modulate it in order to achieve specific objectives (for example, making the user calmer or happier) by playing music which is generated according to a specific affective target by an algorithmic music composition system and a case-based reasoning system. The system is trained and tested in a longitudinal study on a population of eight healthy participants, with each participant returning for multiple sessions. Main results. The final online aBCMI is able to detect its users' current affective states with classification accuracies of up to 65% (3 class, p < 0.01) and modulate its users' affective states significantly above chance level (p < 0.05). Significance. Our system represents one of the first demonstrations of an online aBCMI that is able to accurately detect and respond to its users' affective states. Possible applications include use in music therapy and entertainment.
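
    The case-based reasoning step mentioned above can be pictured as a nearest-case lookup from a detected affective state and a therapeutic target to music-generation parameters; the cases and parameter values below are hypothetical placeholders, not the aBCMI's stored cases.

```python
# Stored cases: starting (valence, arousal) state, therapeutic target, and the
# music-generation parameters that were reused for that combination (all hypothetical).
CASES = [
    {"start": (-0.6, 0.7),  "target": "calmer",  "music": {"tempo": 60,  "mode": "major"}},
    {"start": (-0.4, -0.5), "target": "happier", "music": {"tempo": 110, "mode": "major"}},
    {"start": (0.5, 0.6),   "target": "calmer",  "music": {"tempo": 72,  "mode": "major"}},
]

def retrieve_case(state, target):
    """Retrieve the case with the matching target whose starting state is closest."""
    candidates = [c for c in CASES if c["target"] == target]
    return min(candidates,
               key=lambda c: (c["start"][0] - state[0]) ** 2 + (c["start"][1] - state[1]) ** 2)

detected = (-0.5, 0.6)            # e.g. tense: negative valence, high arousal
print(retrieve_case(detected, "calmer")["music"])
```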

    RadioMe: Adaptive Radio to Support People with Mild Dementia in Their Own Home

    People with dementia and their carers experience a complicated and highly personal health journey. The RadioMe system, an adaptive live-radio system enriched with reminders, agitation detection, and intervention with personalised calming music, is being developed to support people with mild dementia in their own home. RadioMe is an ongoing, interdisciplinary project combining expertise in dementia, music therapy, music computation and human-computer interaction.

    Directed motor-auditory EEG connectivity is modulated by music tempo

    Beat perception is fundamental to how we experience music, and yet the mechanism behind this spontaneous building of the internal beat representation is largely unknown. Existing findings support links between the tempo (speed) of the beat and enhancement of electroencephalogram (EEG) activity at tempo-related frequencies, but there are no studies looking at how tempo may affect the underlying long-range interactions between EEG activity at different electrodes. The present study investigates these long-range interactions using EEG activity recorded from 21 volunteers listening to music stimuli played at 4 different tempi (50, 100, 150 and 200 beats per minute). The music stimuli consisted of piano excerpts designed to convey the emotion of β€œpeacefulness”. Noise stimuli with an identical acoustic content to the music excerpts were also presented for comparison purposes. The brain activity interactions were characterized with the imaginary part of coherence (iCOH) in the frequency range 1.5–18 Hz (Ξ΄, ΞΈ, Ξ± and lower Ξ²) between all pairs of EEG electrodes for the four tempi and the music/noise conditions, as well as a baseline resting state (RS) condition obtained at the start of the experimental task. Our findings can be summarized as follows: (a) there was an ongoing long-range interaction in the RS engaging fronto-posterior areas; (b) this interaction was maintained in both music and noise, but its strength and directionality were modulated as a result of acoustic stimulation; (c) the topological patterns of iCOH were similar for music, noise and RS, however statistically significant differences in strength and direction of iCOH were identified; and (d) tempo had an effect on the direction and strength of motor-auditory interactions. Our findings are in line with existing literature and illustrate a part of the mechanism by which musical stimuli with different tempi can entrain changes in cortical activity
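
    The connectivity measure used here, the imaginary part of coherence (iCOH), can be estimated for one electrode pair from standard spectral estimates, as in the following sketch; the surrogate signals, sampling rate and window length are assumptions made purely for illustration.

```python
import numpy as np
from scipy.signal import csd, welch

def imaginary_coherence(x, y, fs, nperseg=256):
    """Imaginary part of coherency between two channels, per frequency bin."""
    f, sxy = csd(x, y, fs=fs, nperseg=nperseg)     # complex cross-spectral density
    _, sxx = welch(x, fs=fs, nperseg=nperseg)      # auto-spectra
    _, syy = welch(y, fs=fs, nperseg=nperseg)
    return f, np.imag(sxy) / np.sqrt(sxx * syy)

fs = 256                                           # assumed sampling rate
rng = np.random.default_rng(2)
common = rng.standard_normal(fs * 30)              # shared source
x = common + 0.5 * rng.standard_normal(fs * 30)
y = np.roll(common, 10) + 0.5 * rng.standard_normal(fs * 30)   # lagged copy -> nonzero iCOH
f, icoh = imaginary_coherence(x, y, fs)

band = (f >= 1.5) & (f <= 18)                      # δ, θ, α and lower β range, as above
print(float(np.abs(icoh[band]).mean()))
```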