Single chords convey distinct emotional qualities to both naïve and expert listeners
Previous research on music and emotions has been able to pinpoint many structural features conveying emotions. Empirical research on vertical harmony’s emotional qualities, however, has been rare. The main studies in harmony and emotions usually concern the horizontal aspects of harmony, ignoring emotional qualities of chords as such.
An empirical experiment was conducted in which participants (N = 269) evaluated pre-chosen chords on a 9-item scale of given emotional dimensions. Fourteen different chords (major, minor, diminished, and augmented triads, and dominant, major, and minor seventh chords with inversions) were played with two distinct timbres (piano and strings).
The results suggest significant differences in emotion perception across chords. These were consistent with notions about musical conventions, while providing novel data on how seventh chords affect emotion perception. The inversions and timbre also contributed to the evaluations. Moreover, certain chords played on the strings scored moderately high on the dimension of ‘nostalgia/longing,’ which is usually held to be a musical emotion arising only from extra-musical connotations and conditioning, not intrinsically from the structural features of the music. The influence of background variables on the results was largely negligible, suggesting the capacity of vertical harmony to convey distinct emotional qualities to both naïve and expert listeners.
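The stimulus set described above (four triad types and three seventh-chord types, with inversions) can be sketched as semitone-interval patterns above a root. The following is an illustrative reconstruction, not the authors’ actual stimulus-generation code; the helper name and octave-shift inversion scheme are assumptions.

```python
# Chord types from the study, as semitone intervals above a root.
# Illustrative reconstruction, not the authors' actual stimulus set.
CHORD_INTERVALS = {
    "major":            [0, 4, 7],
    "minor":            [0, 3, 7],
    "diminished":       [0, 3, 6],
    "augmented":        [0, 4, 8],
    "dominant seventh": [0, 4, 7, 10],
    "major seventh":    [0, 4, 7, 11],
    "minor seventh":    [0, 3, 7, 10],
}

def chord_midi_notes(root_midi, chord_type, inversion=0):
    """Return MIDI note numbers for a chord, optionally inverted.

    An inversion moves the lowest `inversion` notes up an octave.
    """
    notes = [root_midi + i for i in CHORD_INTERVALS[chord_type]]
    for _ in range(inversion):
        notes.append(notes.pop(0) + 12)
    return sorted(notes)

# C major triad on middle C (MIDI 60), first inversion: E-G-C
print(chord_midi_notes(60, "major", inversion=1))  # [64, 67, 72]
```

Rendering each pattern with two timbres (piano and strings) yields the 14-chord × 2-timbre design the abstract describes.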
Modelling Emotional Effects of Music: Key Areas of Improvement
Modelling emotions perceived in music and induced by music has garnered increased attention during the last five years. The present paper draws together observations of the areas that need attention in order to make progress in modelling the emotional effects of music. These broad areas are divided into theory, data, and context, which are reviewed separately. Each area is given an overview in terms of the present state of the art and promising further avenues, and the main limitations are presented. In theory, there are discrepancies in the terminology and in the justifications for particular emotion models and foci. In data, reliable estimation of high-level musical concepts and the routines for data collection and evaluation require systematic attention. In context, which is the least developed area of modelling, the primary area of improvement is incorporating musical context (music genres) into the modelling of emotions. In a broad sense, better acknowledgement of music consumption and everyday-life contexts, such as the data provided by social media, may offer novel insights into modelling the emotional effects of music.
EmoteControl: An interactive system for real-time control of emotional expression in music
Several computer systems have been designed for music emotion research that aim to identify how different structural or expressive cues of music influence the emotions conveyed by the music. However, most systems either operate offline by pre-rendering different variations of the music or operate in real-time but focus mostly on structural cues. We present a new interactive system called EmoteControl, which allows users to make changes to both structural and expressive cues (tempo, pitch, dynamics, articulation, brightness, and mode) of music in real-time. The purpose is to allow scholars to probe a variety of cues of emotional expression with non-expert participants who are unable to articulate or perform their expression of music in other ways. The benefits of an interactive system are particularly important in this topic, as it offers a massive parameter space of emotion cues and cue levels that is challenging to explore exhaustively without a dynamic system. A brief overview of previous work is given, followed by a detailed explanation of EmoteControl’s interface design and structure. A portable version of the system is also described, and specifications for the music input to the system are outlined. Several use-cases of the interface are discussed, and a formal interface evaluation study is reported. Results suggested that the elements controlling the cues were easy to use and were understood by the users. The majority of users were satisfied with the way the system allowed them to express different emotions in music and found it a useful tool for research.
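The six cues EmoteControl exposes could be represented as a simple parameter record. The sketch below is hypothetical: the class name, value ranges, and defaults are assumptions for illustration, not the system’s actual API or specification.

```python
from dataclasses import dataclass, replace

# Hypothetical parameter record for the six real-time cues named in the
# abstract; ranges and defaults are illustrative assumptions.
@dataclass
class CueSettings:
    tempo_bpm: float = 100.0    # structural: playback speed
    pitch_shift: int = 0        # structural: transposition in semitones
    dynamics: float = 0.5       # expressive: 0 = soft, 1 = loud
    articulation: float = 0.5   # expressive: 0 = legato, 1 = staccato
    brightness: float = 0.5     # expressive: timbral filter amount
    mode: str = "major"         # structural: "major" or "minor"

    def clamped(self):
        """Return a copy with continuous 0-1 cues clipped into range."""
        clip = lambda x: min(1.0, max(0.0, x))
        return replace(self,
                       dynamics=clip(self.dynamics),
                       articulation=clip(self.articulation),
                       brightness=clip(self.brightness))

# A participant shaping a "sad" expression: slow, soft, legato, dark, minor.
sad = CueSettings(tempo_bpm=60, dynamics=0.2, articulation=0.1,
                  brightness=0.15, mode="minor")
print(sad.clamped())
```

Recording such a record per participant per target emotion would give the cue-level data the system is designed to collect.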
Age trends in musical preferences in adulthood: 3. Perceived musical attributes as intrinsic determinants of preferences
Increased age has been found to be associated with differences in musical preferences in adulthood. In past research, these differences were mostly attributed to changes in the social context. However, these influences were small, and a large proportion of the variance in age trends in musical preferences remains to be explained. The aim of this article is to investigate the hypothesis that age trends in musical preferences are related to differences in preferences for some intrinsic attributes of the music, in line with the Music Preferences in Adulthood Model (Bonneville-Roussy et al., 2017). Adult participants (N = 481) were asked to rate their preferences for extracts of 51 audio-music recordings (music clips) and for musical attributes related to dynamics, pitch, structure, tempo, and timbre. Audio-features of the 51 clips were extracted using Music Information Retrieval methods. Using self-report, we found that the musical preferences of adults were linked with distinct preferences for musical attributes, with large effects. We also discovered that self-rated attributes associated with dynamics and timbre moderated the links between age and musical preferences. Using the extracted features, we found that musical preferences were linked with distinct patterns of musical features. Finally, we established that the patterns of preferences of emerging, young, and middle-aged adults were increasingly influenced by audio-features of timbre, dynamics, and tonal clarity. These findings suggest that age trends in musical preferences could be partially explained by differences in the ways individuals process the intrinsic attributes of the music with age.
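The abstract does not specify which Music Information Retrieval features were extracted, but two standard ones map directly onto the attribute categories it names: RMS energy as a crude proxy for dynamics, and the spectral centroid as a common proxy for timbral brightness. The sketch below, using only NumPy, is an assumed illustration of this kind of feature extraction, not the study’s actual pipeline.

```python
import numpy as np

def rms_energy(signal):
    """Root-mean-square energy: a crude proxy for perceived dynamics."""
    return float(np.sqrt(np.mean(np.square(signal))))

def spectral_centroid(signal, sr):
    """Spectral centroid in Hz: a common proxy for timbral brightness."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    return float(np.sum(freqs * spectrum) / np.sum(spectrum))

# A pure 440 Hz sine tone: its centroid should sit at 440 Hz and its
# RMS at 1/sqrt(2) of the peak amplitude.
sr = 22050
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
print(round(spectral_centroid(tone, sr)))  # 440
print(round(rms_energy(tone), 3))          # 0.707
```

In practice such features would be computed per clip and regressed against the preference ratings, as the abstract describes for the 51 clips.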
Coupled whole-body rhythmic entrainment between two chimpanzees
Dance is an icon of human expression. Despite astounding diversity across the world’s cultures and a dazzling abundance of reminiscent animal systems, the evolution of dance in the human clade remains obscure. Dance requires individuals to interactively synchronize their whole-body tempo to their partner’s, with near-perfect precision. This capacity is motorically heavy, engaging multiple neural circuitries, but also dependent on an acute socio-emotional bond between partners. Hitherto, these factors helped explain why no dance forms were present amongst nonhuman primates. Critically, evidence for conjoined full-body rhythmic entrainment in great apes that could help reconstruct possible proto-stages of human dance is still lacking. Here, we report an endogenously-effected case of ritualized dance-like behaviour between two captive chimpanzees – synchronized bipedalism. We submitted video recordings to rigorous time-series analysis and circular statistics. We found that individual step tempo was within the genus’ range of “solo” bipedalism. Between-individual analyses, however, revealed that synchronisation between individuals was non-random, predictable, phase concordant, maintained with instantaneous centi-second precision, and jointly regulated, with individuals also taking turns as “pace-makers”. No function was apparent besides the behaviour’s putative positive social affiliation. Our analyses show a first case of spontaneous whole-body entrainment between two ape peers, thus providing tentative empirical evidence for phylogenies of human dance. Human proto-dance, we argue, may have been rooted in mechanisms of social cohesion among small groups that might have granted stress-releasing benefits via gait-synchrony and mutual touch. An external sound/musical beat may have been initially uninvolved. We discuss dance evolution as driven by ecologically-, socially- and/or culturally-imposed “captivity”.
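The abstract does not detail its circular statistics, but a standard way to test phase concordance between two step-onset series is to compute each partner’s steps as relative phases within the other’s step cycle and then measure their circular concentration (the mean resultant length: 1 for perfect phase-locking, near 0 for uniformly scattered phases). The sketch below is an assumed illustration of that general technique, with made-up onset times, not the study’s analysis code.

```python
import numpy as np

def mean_resultant_length(phases):
    """Circular concentration of phase angles in radians:
    1 = perfectly phase-locked, ~0 = uniformly scattered."""
    return float(np.abs(np.mean(np.exp(1j * np.asarray(phases)))))

def relative_phase(events_a, events_b):
    """Phase of each event in B within the enclosing inter-step
    interval of A, in radians."""
    phases = []
    for t in events_b:
        i = np.searchsorted(events_a, t) - 1
        if 0 <= i < len(events_a) - 1:
            frac = (t - events_a[i]) / (events_a[i + 1] - events_a[i])
            phases.append(2 * np.pi * frac)
    return phases

# Two illustrative step-onset series locked at a constant lag:
# the relative phases are identical, so concentration is maximal.
a = np.arange(0.0, 10.0, 0.5)   # one individual stepping every 500 ms
b = a[:-1] + 0.05               # partner lagging a constant 50 ms
print(round(mean_resultant_length(relative_phase(a, b)), 2))  # 1.0
```

A Rayleigh test on the same phase sample would then assess whether the observed concentration could arise by chance, which is the kind of non-randomness claim the abstract makes.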
Being Moved by Unfamiliar Sad Music Is Associated with High Empathy
The paradox of enjoying listening to music that evokes sadness is yet to be fully understood. Unlike prior studies that have explored potential explanations related to lyrics, memories, and mood regulation, we investigated the types of emotions induced by unfamiliar, instrumental sad music, and whether these responses are consistently associated with certain individual difference variables. One hundred and two participants were drawn from a representative sample to minimize self-selection bias. The results suggest that the emotional responses induced by unfamiliar sad music could be characterized in terms of three underlying factors: Relaxing sadness, Moving sadness, and Nervous sadness. Relaxing sadness was characterized by felt and perceived peacefulness and positive valence. Moving sadness captured an intense experience that involved feelings of sadness and being moved. Nervous sadness was associated with felt anxiety, perceived scariness, and negative valence. These interpretations were supported by indirect measures of felt emotion. Experiences of Moving sadness were strongly associated with high trait empathy and emotional contagion, but not with other previously suggested traits such as absorption or nostalgia-proneness. Relaxing sadness and Nervous sadness were not significantly predicted by any of the individual difference variables. The findings are interpreted within a theoretical framework of embodied emotions.