
    Visual cues in musical synchronisation

    Although music performance is generally thought of as an auditory activity in the Western tradition, the presence of continuous visual information in live music contributes to the cohesiveness of music ensembles, presenting an interesting psychological phenomenon in which audio and visual cues are presumably integrated. To investigate how auditory and visual sensory information are combined in the basic process of synchronising movements with music, this thesis focuses on both musicians and non-musicians as they respond to two sources of visual information common to ensembles: the conductor, and the ancillary movements (movements that do not directly create sound, e.g. body sway or head nods) of co-performers. These visual cues were hypothesised to improve the timing of intentional synchronous action (matching a musical pulse), as well as to increase the synchrony of emergent ancillary movements between participant and stimulus. The visual cues were tested in controlled renderings of ensemble music arrangements and were derived from real, biological motion. All three experiments employed the same basic synchronisation task: participants drummed along to the pulse of tempo-changing music while observing various visual cues. For each experiment, participants’ drum timing and upper-body movements were recorded as they completed the synchronisation task. The analyses used to quantify drum timing and ancillary movements came from two theoretical approaches to movement timing and entrainment: information processing and dynamical systems. Overall, this thesis shows that basic musical timing is a common ability that is facilitated by visual cues in certain contexts, and that emergent ancillary movements and intentional synchronous movements in combination may best explain musical timing and synchronisation.
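
    As an illustration of the information-processing side of such timing analyses, the sketch below computes signed tap-beat asynchronies for a drumming task with a tempo-changing beat track. The function names, tempo values, and noise parameters are hypothetical and are not taken from the thesis.

```python
import numpy as np

def asynchronies(tap_times, beat_times):
    """Signed asynchrony (s) between each tap and its nearest beat onset.
    Negative values mean the tap anticipated the beat."""
    tap_times = np.asarray(tap_times, dtype=float)
    beat_times = np.asarray(beat_times, dtype=float)
    nearest = np.argmin(np.abs(tap_times[:, None] - beat_times[None, :]), axis=1)
    return tap_times - beat_times[nearest]

# Example: taps slightly ahead of a gradually slowing beat track
beats = np.cumsum(np.r_[0.0, np.linspace(0.50, 0.60, 7)])      # inter-beat intervals 500-600 ms
taps = beats + np.random.normal(-0.02, 0.01, size=beats.size)  # ~20 ms anticipation plus jitter
asyncs = asynchronies(taps, beats)
print(f"mean asynchrony {asyncs.mean()*1000:.1f} ms, SD {asyncs.std(ddof=1)*1000:.1f} ms")
```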

    Action-based effects on music perception

    The classical, disembodied approach to music cognition conceptualizes action and perception as separate, peripheral processes. In contrast, embodied accounts of music cognition emphasize the central role of the close coupling of action and perception. It is well established that perception spurs action tendencies. We present a theoretical framework that captures the ways in which the human motor system and its actions can reciprocally influence the perception of music. The cornerstone of this framework is the common coding theory, which postulates a representational overlap in the brain between the planning, the execution, and the perception of movement. The integration of action and perception in so-called internal models is explained as a result of associative learning processes. Characteristic of internal models is that they allow intended or perceived sensory states to be transformed into corresponding motor commands (inverse modeling), and vice versa, to predict the sensory outcomes of planned actions (forward modeling). Embodied accounts typically refer to inverse modeling to explain action effects on music perception (Leman, 2007). We extend this account by pinpointing forward modeling as an alternative mechanism by which action can modulate perception. We provide an extensive overview of recent empirical evidence in support of this idea. Additionally, we demonstrate that motor dysfunctions can cause perceptual disabilities, supporting the main idea of the paper that the human motor system plays a functional role in auditory perception. The finding that music perception is shaped by the human motor system and its actions suggests that the musical mind is highly embodied. However, we advocate for a more radical approach to embodied (music) cognition, in the sense that it needs to be considered as a dynamical process in which aspects of action, perception, introspection, and social interaction are of crucial importance.
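
    As a toy illustration of the complementary mappings described above (not a model from the paper), the sketch below pairs a forward model, which predicts the sensory outcome of a planned motor command, with an inverse model, which recovers the command needed to reach a desired or perceived sensory state. The one-dimensional linear mapping is an assumed stand-in for learned sensorimotor associations.

```python
# Hypothetical one-dimensional internal-model pair: striking force -> loudness.

def forward_model(motor_command: float) -> float:
    """Forward modeling: predict the sensory outcome of a planned command."""
    return 2.0 * motor_command + 1.0  # assumed learned mapping

def inverse_model(desired_outcome: float) -> float:
    """Inverse modeling: recover the command that yields a desired/perceived outcome."""
    return (desired_outcome - 1.0) / 2.0

planned = 0.8                          # intended striking force
predicted = forward_model(planned)     # expected loudness of the planned action
recovered = inverse_model(predicted)   # command reconstructed from the outcome
print(predicted, recovered)            # 2.6 0.8
```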

    huSync: a model and system for the measure of synchronization in small groups: a case study on musical joint action

    Human communication entails subtle non-verbal modes of expression, which can be analyzed quantitatively using computational approaches that thereby support the human sciences. In this paper we present huSync, a computational framework and system that uses trajectory information extracted from video sequences with pose estimation algorithms to quantify synchronization between individuals in small groups. The system is applied to study interpersonal coordination in musical ensembles. Musicians communicate with each other through sounds and gestures, providing non-verbal cues that regulate interpersonal coordination. huSync was applied to recordings of concert performances by a professional instrumental ensemble playing two musical pieces. We examined effects of different aspects of musical structure (texture and phrase position) on interpersonal synchronization, which was quantified by computing phase locking values of head motion for all possible within-group pairs. Results indicate that interpersonal coupling was stronger for polyphonic textures (ambiguous leadership) than homophonic textures (clear melodic leader), and this difference was greater in early portions of phrases than at endings (where coordination demands are highest). Results were cross-validated against an analysis of audio features, showing links between phase locking values and event density. This research produced a system, huSync, that can quantify synchronization in small groups from standard video recordings of naturalistic human group interaction, and that is sensitive to dynamic modulations of interpersonal coupling related to ambiguity in leadership and coordination demands. huSync enabled a better understanding of the relationship between interpersonal coupling and musical structure, thus enhancing collaborations between human and computer scientists.
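
    A minimal sketch of how phase locking values between head-motion trajectories might be computed for all within-group pairs is shown below. The band limits, filter order, and function names are assumptions chosen for illustration, not the huSync implementation.

```python
import numpy as np
from itertools import combinations
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs, band=(0.5, 4.0)):
    """Phase locking value between two motion time series.
    Band-pass filter both signals, extract instantaneous phase with the
    Hilbert transform, and return the mean resultant length of the phase
    difference (0 = no coupling, 1 = perfect phase locking)."""
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

def pairwise_plv(motion, fs):
    """PLV for every within-group pair; motion is (n_musicians, n_samples)."""
    return {(i, j): plv(motion[i], motion[j], fs)
            for i, j in combinations(range(motion.shape[0]), 2)}
```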

    Chapter 5 Audience responses in the light of perception–action theories of empathy

    In recent years, empathy has received considerable research attention as a means of understanding a range of psychological phenomena, and it is fast drawing attention within the fields of music psychology and music education. This volume seeks to promote and stimulate further research in music and empathy, with contributions from many of the leading scholars in the fields of music psychology, neuroscience, music philosophy and education. It sets out current developmental, cognitive, social and philosophical perspectives on research in music and empathy, and considers the notion in relation to our engagement with different types of music and media. Following a Prologue, the volume presents twelve chapters organised into two main areas of enquiry. The first section, entitled 'Empathy and Musical Engagement', explores empathy in music education and therapy settings, and provides social, cognitive and philosophical perspectives on empathy in relation to our interaction with music. The second section, entitled 'Empathy in Performing Together', provides insights into the role of empathy across non-Western, classical, jazz and popular performance domains. This book will be of interest to music educators, musicologists, performers and practitioners, as well as scholars from other disciplines with an interest in empathy research.

    ESCOM 2017 Book of Abstracts


    ESCOM 2017 Proceedings


    The role of gesture and non-verbal communication in popular music performance, and its application to curriculum and pedagogy

    Jane Davidson states that ‘the use of the body is vital in generating the technical and expressive qualities of a musical interpretation’ (2002, p. 146). Although technique and expression within music performance are separate elements, ‘they interact with, and depend upon, one another’ (Sloboda, 2000, p. 398) and therefore require equal consideration. Although it is possible for a musician to perform with exceptional technical prowess but little expression (Sloboda, 2000), it is important that the significance of the performer’s expressive qualities, and their ramifications for the delivery of a given performance, are acknowledged: whilst ‘sound is the greatest result of performance’ (Munoz, 2007, p. 56), music is not exclusively an auditory event, principally because ‘sound is essentially movement’ (Munoz, 2007, p. 56). As a performing art, music relies on the use of the physical self and body in the communicative process, and may require more than technical skill and proficient instrumental handling to be truly communicatively effective, not least because, as Juslin and Laukka state, ‘music is a means of emotional expression’ (2003, p. 774). Through a designed interdisciplinary framework, this thesis examines the use of expressive gesture and non-verbal communication skills in popular music performance, and investigates how these communicative facets can be incorporated into popular music performance education within a higher education curriculum. To do this, the work explores the practices of student and professional musicians, focusing on the areas of gesture, persona and interaction, and uses ethnographic case studies, qualitative interviews and extracts of video footage of 3 rehearsals and live performances to investigate the importance of the physical delivery of a given musical performance. The findings from these investigations are then applied to existing educational theories to construct a pedagogical approach that provides student musicians with the knowledge and skill to understand the implications of the art of performance through assimilated study. This approach aims to allow performers to develop their own unique style of artistic expression, and to create well-rounded, empathetic and employable musicians with a visceral understanding of their art form.