17 research outputs found

    Music and communication in music psychology

    There is a general consensus that music is both universal and communicative, and musical dialogue is a key element in much music-therapeutic practice. However, the idea that music is a communicative medium has, to date, received little attention within the cognitive sciences, and the limited amount of research that addresses how and what music communicates has resulted in findings that appear to be of limited relevance to music therapy. This article will draw on ethnomusicological evidence and an understanding of communication derived from the study of speech to sketch a framework within which to situate and understand music as communicative practice. It will outline some key features of music as an interactive participatory medium – including entrainment and floating intentionality – that can help underpin an understanding of music as communicative, and that may help guide experimental approaches in the cognitive science of music to shed light on the processes involved in musical communication and on the consequences of engagement in communication through music for interacting individuals. It will suggest that the development of such approaches may enable the cognitive sciences to provide a more comprehensive, predictive understanding of music in interaction that could be of direct benefit to music therapy. This is the accepted manuscript version. The final version is available at http://pom.sagepub.com/content/42/6/809.full.pdf+htm

    Statistical learning and probabilistic prediction in music cognition: mechanisms of stylistic enculturation

    Engineering and Physical Sciences Research Council (EPSRC) funding via grant EP/M000702/1
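
    Only a funding note is attached to this record, but the title names a concrete computational idea: listeners internalise the statistics of a musical style and use them to predict upcoming events. The sketch below is a deliberately minimal illustration of that idea, assuming a toy corpus of MIDI pitch sequences and a first-order transition model; the models discussed in this literature are considerably richer, so this should not be read as the paper's actual method.

import math
from collections import Counter, defaultdict

# Toy corpus of melodies as MIDI pitch sequences; a real study would draw on
# large notated corpora from a particular musical style.
corpus = [
    [60, 62, 64, 65, 67, 65, 64, 62, 60],
    [60, 64, 67, 64, 60],
    [62, 64, 65, 67, 69, 67, 65, 64],
]

# Statistical learning step: count first-order pitch-to-pitch transitions.
transitions = defaultdict(Counter)
for melody in corpus:
    for prev, nxt in zip(melody, melody[1:]):
        transitions[prev][nxt] += 1

def predict(prev_pitch):
    # Probability distribution over the next pitch, given the previous one.
    counts = transitions[prev_pitch]
    total = sum(counts.values())
    return {pitch: n / total for pitch, n in counts.items()} if total else {}

def surprisal(prev_pitch, next_pitch):
    # Information content (-log2 p) of the note that actually occurs;
    # high values mean the note was unexpected under the learned statistics.
    p = predict(prev_pitch).get(next_pitch, 1e-6)  # small floor for unseen events
    return -math.log2(p)

print(predict(64))        # distribution over continuations of pitch 64
print(surprisal(64, 65))  # expectedness of the step 64 -> 65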

    Musical pluralism and the science of music

    This is the author accepted manuscript. The final version is available from Springer Verlag via the DOI in this record. The scientific investigation of music requires contributions from a diverse array of disciplines (e.g. anthropology, musicology, neuroscience, psychology, music theory, music therapy, sociology, computer science, evolutionary biology, archaeology, acoustics and philosophy). Given the diverse methodologies, interests and research targets of the disciplines involved, we argue that there is a plurality of legitimate research questions about music, necessitating a focus on integration. In light of this we recommend a pluralistic conception of music: there is no unitary definition divorced from some discipline, research question or context. This has important implications for how the scientific study of music ought to proceed: we show that some definitions are complementary, that is, they reflect different research interests and ought to be retained and, where possible, integrated, while others are antagonistic, representing real empirical disagreement about music’s nature and how to account for it. We illustrate this in a discussion of two related issues: questions about the evolutionary function (if any) of music, and questions of the innateness (or otherwise) of music. Both debates have, in light of pluralism, been misconceived. We suggest that, in both cases, scientists ought to proceed by constructing integrated models which take into account the dynamic interaction between different aspects of music.

    Generalization of auditory expertise in audio engineers and instrumental musicians

    From auditory perception to general cognition, the ability to play a musical instrument has been associated with skills both related and unrelated to music. However, it is unclear whether these effects are bound to the specific characteristics of musical instrument training, as little attention has been paid to other populations such as audio engineers and designers whose auditory expertise may match or surpass that of musicians in specific auditory tasks or more naturalistic acoustic scenarios. We explored this possibility by comparing students of audio engineering (n = 20) to matched conservatory-trained instrumentalists (n = 24) and to naive controls (n = 20) on measures of auditory discrimination, auditory scene analysis, and speech-in-noise perception. We found that audio engineers and performing musicians had generally lower psychophysical thresholds than controls, with pitch perception showing the largest effect size. Compared to controls, audio engineers could better memorise and recall auditory scenes composed of non-musical sounds, whereas instrumental musicians performed best in a sustained selective attention task with two competing streams of tones. Finally, in a diotic speech-in-babble task, musicians showed lower signal-to-noise-ratio thresholds than both controls and engineers; however, a follow-up online study did not replicate this musician advantage. We also observed differences in personality that might account for group-based self-selection biases. Overall, we showed that investigating a wider range of forms of auditory expertise can help us corroborate (or challenge) the specificity of the advantages previously associated with musical instrument training.
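
    The thresholds reported above (pitch discrimination, speech-in-babble signal-to-noise ratios) are the kind of quantity usually estimated with an adaptive procedure. As a rough illustration of how such an estimate can be obtained, the sketch below implements a generic two-down-one-up staircase run against a simulated listener; the step size, stopping rule and toy observer are assumptions chosen for illustration, not the procedure used in this study.

import random

def toy_observer(level, true_threshold=5.0):
    # Simulated listener: mostly correct above its true threshold,
    # near chance below it (a crude stand-in for a real psychometric function).
    p_correct = 0.95 if level > true_threshold else 0.55
    return random.random() < p_correct

def two_down_one_up(start_level, step, n_reversals=8, respond=toy_observer):
    # Generic 2-down-1-up adaptive staircase: the stimulus level is made harder
    # (lowered) after two consecutive correct trials and easier (raised) after
    # any error, converging on roughly 70.7% correct performance.
    level = start_level
    correct_in_a_row = 0
    direction = None
    reversal_levels = []
    while len(reversal_levels) < n_reversals:
        if respond(level):
            correct_in_a_row += 1
            if correct_in_a_row == 2:
                correct_in_a_row = 0
                if direction == 'up':
                    reversal_levels.append(level)
                direction = 'down'
                level -= step
        else:
            correct_in_a_row = 0
            if direction == 'down':
                reversal_levels.append(level)
            direction = 'up'
            level += step
    # Threshold estimate: mean stimulus level at the reversal points.
    return sum(reversal_levels) / len(reversal_levels)

print(two_down_one_up(start_level=20.0, step=1.0))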

    Network science and the effects of music on the human brain

    Most people choose to listen to music that they prefer or like, such as classical, country or rock. Previous research has focused on how different characteristics of music (e.g., classical versus country) affect the brain. Yet when listening to preferred music, regardless of the type, people report that they often experience personal thoughts and memories. To date, understanding how this occurs in the brain has remained elusive. Using network science methods, I evaluated differences in functional brain connectivity when individuals listened to complete songs. The results reveal that a circuit important for internally focused thoughts, known as the default mode network, was most connected when listening to preferred music. The results also reveal that listening to a favorite song alters the connectivity between auditory brain areas and the hippocampus, a region responsible for memory and social emotion consolidation. Given that musical preferences are uniquely individualized phenomena and that music can vary in acoustic complexity and the presence or absence of lyrics, the consistency of these results was contrary to previous neuroscientific understanding. These findings may explain why comparable emotional and mental states can be experienced by people listening to music that differs as widely as Beethoven and Eminem. The neurobiological and neurorehabilitation implications of these results are discussed.
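
    To make the network-science approach mentioned above concrete, the sketch below shows the generic correlation-plus-graph recipe common in functional connectivity work: correlate regional time series, threshold the resulting matrix, and summarise the graph (it assumes numpy and networkx are available). The region count, threshold and metrics are illustrative assumptions and do not reproduce the analysis pipeline of this study.

import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Toy data: BOLD-like time series for 10 hypothetical brain regions
# (rows = regions, columns = time points). Real studies would use
# preprocessed fMRI signals extracted from a parcellation.
n_regions, n_timepoints = 10, 200
timeseries = rng.standard_normal((n_regions, n_timepoints))

# Functional connectivity: pairwise Pearson correlation between regions.
connectivity = np.corrcoef(timeseries)

# Threshold weak edges and build a graph, one common (but not the only)
# way to turn a connectivity matrix into a network.
threshold = 0.1
adjacency = (np.abs(connectivity) > threshold) & ~np.eye(n_regions, dtype=bool)
graph = nx.from_numpy_array(adjacency.astype(int))

# Simple network-science summaries: node degree and global efficiency.
degree = dict(graph.degree())
efficiency = nx.global_efficiency(graph)
print(degree, efficiency)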

    Syntax in language and music: what is the right level of comparison?

    It is often claimed that music and language share a process of hierarchical structure building, a mental "syntax." Although several lines of research point to commonalities, and possibly a shared syntactic component, differences between "language syntax" and "music syntax" can also be found at several levels, for example in conveyed meaning and in the atoms of combination. To bring music and language closer to one another, some researchers have suggested a comparison between music and phonology ("phonological syntax"), but here too one quickly arrives at a situation of intriguing similarities and obvious differences. In this paper, we suggest that a fruitful comparison between the two domains could benefit from taking the grammar of action into account. In particular, we suggest that what is called "syntax" can be investigated in terms of goals of action, action planning, motor control, and sensory-motor integration. At this level of comparison, we suggest that some of the differences between language and music can be explained in terms of the different goals reflected in the hierarchical structures of action planning: the hierarchical structures of music arise to achieve goals with a strong relation to the affective-gestural system encoding tension-relaxation patterns, as well as to the socio-intentional system, whereas hierarchical structures in language are embedded in a conceptual system that gives rise to compositional meaning. Similarities between music and language are clearest in the way several hierarchical plans for executing action are processed in time and sequentially integrated to achieve various goals.