159 research outputs found
Crowdsourcing Linked Data on listening experiences through reuse and enhancement of library data
Research has approached the practice of musical reception in a multitude of ways, such as the analysis of professional critique, sales figures and psychological processes activated by the act of listening. Studies in the Humanities, on the other hand, have been hindered by the lack of structured evidence of actual experiences of listening as reported by the listeners themselves, a concern voiced since the early days of the Web. It was, however, assumed that such evidence existed, albeit in purely textual form, and could not be leveraged until it was digitised and aggregated. The Listening Experience Database (LED) responds to this research need by providing a centralised hub for evidence of listening in the literature. Not only does LED support search and reuse across nearly 10,000 records, but it also provides machine-readable structured data on the knowledge around the contexts of listening. To take advantage of the mass of formal knowledge that already exists on the Web concerning these contexts, the entire framework adopts Linked Data principles and technologies. This also allows LED to directly reuse open data from the British Library for source documentation that is already published. Reused data are re-published as open data with enhancements obtained by extending the model of the original data, such as the partitioning of published books and collections into individual stand-alone documents. The database was populated through crowdsourcing and seamlessly incorporates data reuse from the very early data entry phases. As the sources of the evidence often contain vague, fragmentary, or uncertain information, facilities were put in place to generate structured data out of such fuzziness.
Alongside elaborating on these functionalities, this article provides insights into the most recent features of the latest instalment of the dataset and portal, such as the interlinking with the MusicBrainz database, the relaxation of geographical input constraints through text mining, and the plotting of key locations in an interactive geographical browser.
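The abstract's point about generating structured data out of fuzzy evidence can be illustrated with a small sketch. The actual LED facilities are not described at this level of detail, so the mapping below is an assumption for illustration only: a vague date expression such as "circa 1850" is converted into an explicit year interval that can then be stored as structured data.

```python
import re

def parse_fuzzy_date(text):
    """Map a vague date expression to a (start_year, end_year) interval.

    Illustrative only: the real LED mechanism for handling fuzzy
    evidence is not specified in the abstract.
    """
    text = text.lower().strip()
    # "circa 1850" -> a +/- 5 year window around the stated year
    m = re.match(r"circa\s+(\d{4})$", text)
    if m:
        year = int(m.group(1))
        return (year - 5, year + 5)
    # "early 1920s" / "late 1920s" -> first or second half of the decade
    m = re.match(r"(early|late)\s+(\d{3})0s$", text)
    if m:
        decade = int(m.group(2)) * 10
        if m.group(1) == "early":
            return (decade, decade + 4)
        return (decade + 5, decade + 9)
    # an exact year collapses to a zero-width interval
    m = re.match(r"(\d{4})$", text)
    if m:
        year = int(m.group(1))
        return (year, year)
    return None

print(parse_fuzzy_date("circa 1850"))   # (1845, 1855)
print(parse_fuzzy_date("early 1920s"))  # (1920, 1924)
```

Representing vague dates as intervals rather than single values keeps the uncertainty machine-readable, which is what makes such records usable as Linked Data.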
Reciprocal Modulation of Cognitive and Emotional Aspects in Pianistic Performances
Background: High-level piano performance requires complex integration of perceptual, motor, cognitive and emotive skills. Observations in psychology and neuroscience studies have suggested reciprocal inhibitory modulation of cognition by emotion and of emotion by cognition. However, it is still unclear how cognitive states may influence pianistic performance. The aim of the present study is to verify the influence of cognitive and affective attention on piano performances. Methods and Findings: Nine pianists were instructed to play the same piece of music, first focusing only on cognitive aspects of musical structure (cognitive performances) and then paying attention solely to affective aspects (affective performances). Audio files from the pianistic performances were examined using a computational model that retrieves nine specific musical features (descriptors): loudness, articulation, brightness, harmonic complexity, event detection, key clarity, mode detection, pulse clarity and repetition. In addition, the number of volunteers' errors in the recording sessions was counted. Comments from the pianists about their thoughts during the performances were also evaluated. The analyses of the audio files through the musical descriptors indicated that the affective performances had more agogic accents, more legato and piano phrasing, and lower perceived event density when compared to the cognitive ones. Error analysis demonstrated that volunteers misplayed more left-hand notes in the cognitive performances than in the affective ones. Volunteers also played more wrong notes in affective than in cognitive performances. These results correspond to the volunteers' comments that in the affective performances the cognitive aspects of piano execution were inhibited, whereas in the cognitive performances the expressiveness was inhibited.
Conclusions: The present results indicate that attention to the emotional aspects of performance enhances expressiveness but constrains cognitive and motor skills in piano execution. In contrast, attention to the cognitive aspects may constrain the expressivity and automatism of piano performances.
Funding: Brazilian government research agency Fundacao de Amparo a Pesquisa do Estado de Sao Paulo (FAPESP), grants 08/54844-7 and 07/59826-4.
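Two of the nine descriptors named above can be sketched numerically. The study used a dedicated computational model, which is not reproduced here; as an assumption, loudness is approximated below as frame-wise RMS energy and brightness as the spectral centroid, both computed with plain NumPy on a synthetic tone.

```python
import numpy as np

def loudness_rms(frame):
    """Approximate loudness as the root-mean-square energy of an audio frame."""
    return float(np.sqrt(np.mean(frame ** 2)))

def brightness_centroid(frame, sr):
    """Approximate brightness as the spectral centroid (in Hz) of a frame:
    the magnitude-weighted mean of the spectrum's frequencies."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    return float(np.sum(freqs * spectrum) / np.sum(spectrum))

# Synthetic example: one second of a 440 Hz tone at 44.1 kHz.
sr = 44100
t = np.arange(sr) / sr
tone = 0.5 * np.sin(2 * np.pi * 440 * t)

print(loudness_rms(tone))             # ~0.354, i.e. 0.5 / sqrt(2)
print(brightness_centroid(tone, sr))  # ~440 Hz for a pure tone
```

A real descriptor pipeline would run these per frame over a sliding window, yielding the time series on which the cognitive/affective comparison was made.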
Locus of emotion influences psychophysiological reactions to music
It is now widely accepted that the perception of emotional expression in music can be vastly different from the feelings evoked by it. However, less understood is how the locus of emotion affects the experience of music, that is, how the act of perceiving the emotion in music compares with the act of assessing the emotion induced in the listener by the music. In the current study, we compared these two emotion loci based on the psychophysiological responses of 40 participants listening to 32 musical excerpts taken from movie soundtracks. Facial electromyography, skin conductance, respiration and heart rate were continuously measured while participants were required to assess either the emotion expressed by, or the emotion they felt in response to, the music. Using linear mixed effects models, we found a higher mean response in psychophysiological measures for the “perceived” than for the “felt” task. This result suggests that the focus on one’s self distracts from the music, leading to weaker bodily reactions during the “felt” task. In contrast, paying attention to the expression of the music, and consequently to changes in timbre, loudness and harmonic progression, enhances bodily reactions. This study has methodological implications for emotion induction research using psychophysiology and for the conceptualization of emotion loci. Firstly, different tasks can elicit different psychophysiological responses to the same stimulus; secondly, both tasks elicit bodily responses to music. The latter finding questions the possibility of a listener taking on a purely cognitive mode when evaluating emotion expression.
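The logic of the within-participant task comparison can be sketched with synthetic data. The study fit linear mixed effects models; the simplified stand-in below (an assumption, using only NumPy) shows the core intuition: per-participant baselines act as random intercepts, and differencing the two task conditions within each participant removes them, isolating the task effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n_participants = 40  # matches the sample size reported above

# Synthetic psychophysiological responses: each participant has their own
# baseline level (a random intercept), plus a higher mean response in the
# "perceived" task, mirroring the reported direction of the effect.
baseline = rng.normal(5.0, 1.0, n_participants)
perceived = baseline + rng.normal(0.8, 0.5, n_participants)  # assumed effect
felt = baseline + rng.normal(0.0, 0.5, n_participants)

# Within-participant differences cancel the baselines; this is the
# intuition a random-intercept mixed model formalizes.
diff = perceived - felt
print(diff.mean())  # positive: higher responses in the "perceived" task
```

A full mixed model would additionally handle crossed random effects for the 32 excerpts, which a simple paired difference cannot.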
Dynamic Emotional and Neural Responses to Music Depend on Performance Expression and Listener Experience
Apart from its natural relevance to cognition, music provides a window into the intimate relationships between production, perception, experience, and emotion. Here, emotional responses and neural activity were observed as they evolved together with stimulus parameters over several minutes. Participants listened to a skilled music performance that included the natural fluctuations in timing and sound intensity that musicians use to evoke emotional responses. A mechanical performance of the same piece served as a control. Before and after fMRI scanning, participants reported real-time emotional responses on a 2-dimensional rating scale (arousal and valence) as they listened to each performance. During fMRI scanning, participants listened without reporting emotional responses. Limbic and paralimbic brain areas responded to the expressive dynamics of human music performance, and both emotion- and reward-related activations during music listening were dependent upon musical training. Moreover, dynamic changes in timing predicted ratings of emotional arousal, as well as real-time changes in neural activity. BOLD signal changes correlated with expressive timing fluctuations in cortical and subcortical motor areas consistent with pulse perception, and in a network consistent with the human mirror neuron system. These findings show that expressive music performance evokes emotion- and reward-related neural activations, and that music's affective impact on the brains of listeners is altered by musical training. Our observations are consistent with the idea that music performance evokes an emotional response through a form of empathy that is based, at least in part, on the perception of movement and on violations of pulse-based temporal expectancies.
How Psychological Stress Affects Emotional Prosody
We explored how experimentally induced psychological stress affects the production and recognition of vocal emotions. In Study 1a, we demonstrate that sentences spoken by stressed speakers are judged by naive listeners as sounding more stressed than sentences uttered by non-stressed speakers. In Study 1b, negative emotions produced by stressed speakers are generally less well recognized than the same emotions produced by non-stressed speakers. Multiple mediation analyses suggest this poorer recognition of negative stimuli was due to a mismatch between the variation of volume voiced by speakers and the range of volume expected by listeners. Together, this suggests that the stress level of the speaker affects judgments made by the receiver. In Study 2, we demonstrate that participants who were induced with a feeling of stress before carrying out an emotional prosody recognition task performed worse than non-stressed participants. Overall, findings suggest detrimental effects of induced stress on interpersonal sensitivity
Acoustic Intensity Causes Perceived Changes in Arousal Levels in Music: An Experimental Investigation
Listener perceptions of changes in the arousal expressed by classical music have been found to correlate with changes in sound intensity/loudness over time. This study manipulated the intensity profiles of different pieces of music in order to test the causal nature of this relationship. Listeners (N = 38) continuously rated their perceptions of the arousal expressed by each piece. An extract from Dvořák's Slavonic Dance Opus 46 No. 1 was used to create a variant in which the direction of change in intensity was inverted, while other features were retained. Even though only the intensity was inverted, perceived arousal was also inverted. The original intensity profile was also superimposed on three new pieces of music. The time variation in the perceived arousal of all pieces was similar to their intensity profile. Time series analyses revealed that intensity variation was a major influence on arousal perception in all pieces, in spite of their stylistic diversity.
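The intensity-inversion manipulation can be sketched in a few lines. The study's actual signal processing is not specified, so the following is an assumption: reflecting a signal's frame-wise RMS envelope (in dB) about its mean level reverses the direction of intensity change while leaving pitch, timing and timbre within each frame untouched.

```python
import numpy as np

def invert_intensity(signal, frame_len=1024):
    """Reflect a signal's frame-wise RMS envelope (in dB) about its mean,
    so rising intensity becomes falling and vice versa, then reapply it
    as a per-frame gain. Trailing samples beyond a whole frame are dropped."""
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    db = 20 * np.log10(np.maximum(rms, 1e-12))
    inverted_db = 2 * db.mean() - db        # reflect about the mean level
    gain = 10 ** ((inverted_db - db) / 20)  # per-frame gain to apply
    return (frames * gain[:, None]).ravel()

# Example: a crescendo (linearly rising amplitude) becomes a decrescendo.
sr = 8000
t = np.arange(sr) / sr
crescendo = np.linspace(0.1, 1.0, sr) * np.sin(2 * np.pi * 220 * t)
decrescendo = invert_intensity(crescendo)
```

A production version would smooth the per-frame gains (e.g. with overlapping windows) to avoid audible steps at frame boundaries.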
Empathy Manipulation Impacts Music-Induced Emotions: A Psychophysiological Study on Opera
This study investigated the effects of voluntarily empathizing with a musical performer (i.e., cognitive empathy) on music-induced emotions and their underlying physiological activity. N = 56 participants watched video clips of two operatic compositions performed in concerts, under low or high empathy instructions. Heart rate and heart rate variability, skin conductance level (SCL), and respiration rate (RR) were measured during music listening, and music-induced emotions were quantified using the Geneva Emotional Music Scale immediately after music listening. Listening to the aria with sad content in the high empathy condition facilitated the emotion of nostalgia and decreased SCL, in comparison to the low empathy condition. Listening to the song with happy content in the high empathy condition also facilitated the emotion of power and increased RR, in comparison to the low empathy condition. To our knowledge, this study offers the first experimental evidence that cognitive empathy influences emotion psychophysiology during music listening.
Neural and physiological data from participants listening to affective music
Music provides a means of communicating affective meaning. However, the neurological mechanisms by which music induces affect are not fully understood. Our project sought to investigate this through a series of experiments into how humans react to affective musical stimuli and how physiological and neurological signals recorded from those participants change in accordance with self-reported changes in affect. In this paper, the datasets recorded over the course of this project are presented, including details of the musical stimuli, participant reports of their felt changes in affective states as they listened to the music, and concomitant recordings of physiological and neurological activity. We also include non-identifying metadata on our participant populations for purposes of further exploratory analysis. These datasets provide a large and valuable novel resource for researchers investigating emotion, music, and how they affect our neural and physiological activity.