555 research outputs found
Anticipation is the key to understanding music and the effects of music on emotion
There is certainly a need for a framework to guide the study of the physiological mechanisms underlying the experience of music and the emotions that music evokes. However, this framework should be organised hierarchically, with musical anticipation as its fundamental mechanism
Rhythmic complexity and predictive coding: A novel approach to modeling rhythm and meter perception in music
Musical rhythm, consisting of apparently abstract intervals of accented temporal events, has a remarkable capacity to move our minds and bodies. How does the cognitive system enable our experiences of rhythmically complex music? In this paper, we describe some common forms of rhythmic complexity in music and propose the theory of predictive coding (PC) as a framework for understanding how rhythm and rhythmic complexity are processed in the brain. We also consider why we feel so compelled by rhythmic tension in music. First, we consider theories of rhythm and meter perception, which provide hierarchical and computational approaches to modeling. Second, we present the theory of PC, which posits a hierarchical organization of brain responses reflecting fundamental, survival-related mechanisms associated with predicting future events. According to this theory, perception and learning are manifested through the brain's Bayesian minimization of the error between the input to the brain and the brain's prior expectations. Third, we develop a PC model of musical rhythm, in which rhythm perception is conceptualized as an interaction between what is heard ('rhythm') and the brain's anticipatory structuring of music ('meter'). Finally, we review empirical studies of the neural and behavioral effects of syncopation, polyrhythm and groove, and propose how these studies can be seen as special cases of the PC theory. We argue that musical rhythm exploits the brain's general principles of prediction and propose that the pleasure and desire for sensorimotor synchronization evoked by musical rhythm may be a result of such mechanisms
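The Bayesian error-minimization idea in the abstract above can be made concrete with a toy calculation. The sketch below is not taken from the paper; the beat grid, onset times and precision values are invented for illustration. It treats the meter as a Gaussian prior over onset times and the heard rhythm as the sensory input, combines them by precision weighting, and reports the residual prediction error, which in the PC account corresponds to syncopation or rhythmic tension.

```python
import numpy as np

# Hypothetical sketch: a metrical prior predicts onsets on a beat grid, the heard
# rhythm deviates from it, and a precision-weighted (Bayesian) combination yields
# the updated expectation. Large prediction errors play the role of syncopation.

beat_grid = np.array([0.0, 0.5, 1.0, 1.5])     # "meter": predicted onset times (s)
observed  = np.array([0.0, 0.55, 0.95, 1.70])  # "rhythm": heard onset times (s)

pi_prior = 4.0   # precision (inverse variance) of the metrical prediction
pi_input = 1.0   # precision attributed to the sensory input

prediction_error = observed - beat_grid
posterior = (pi_prior * beat_grid + pi_input * observed) / (pi_prior + pi_input)

print("prediction error:   ", np.round(prediction_error, 3))
print("updated expectation:", np.round(posterior, 3))
```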
Neural Correlates of Music Listening: Does the Music Matter?
The last decades have seen a proliferation of music and brain studies, with a major focus on plastic changes as the outcome of continuous and prolonged engagement with music. Thanks to the advent of neuroaesthetics, research on music cognition has broadened its scope by considering the multifarious phenomenon of listening in all its forms, from incidental listening up to the skillful attentive listening of experts, and all its possible effects. The latter range from objective and sensorial effects directly linked to the acoustic features of the music to subjectively affective and even transformational effects for the listener. Of special importance is the finding that neural activity in the reward circuit of the brain is a key component of a conscious listening experience. We propose that the connection between music and the reward system makes music listening a gate towards not only hedonia but also eudaimonia, namely a life well lived, full of meaning, that aims at realizing one's own 'daimon' or true nature. It is argued, further, that music listening, even when conceptualized in this aesthetic and eudaimonic framework, remains a learnable skill that changes the way brain structures respond to sounds and how they interact with each other
Music and Brain Plasticity: How Sounds Trigger Neurogenerative Adaptations
This contribution describes how music can trigger plastic changes in the brain. We elaborate on the concept of neuroplasticity by focussing on three major topics: the ontogenetic scale of musical development, the phenomenon of neuroplasticity as the outcome of interactions with sound, and a short survey of clinical and therapeutic applications. First, a distinction is made between two scales of description: the larger evolutionary scale (phylogeny) and the scale of individual development (ontogeny). In this sense, listeners are not constrained by a static dispositional machinery but can be considered dynamical systems that adapt in response to the demands of a challenging environment. Second, the neuroplastic changes are considered at both a structural and a functional level of adaptation, with a special focus on recent findings from network science. The neural activity of the medial regions of the brain seems to become more synchronised when listening to music as compared to rest, and these changes become permanent in individuals, such as musicians, with years of musical practice. As such, the question is raised as to the clinical and therapeutic applications of music as a trigger for enhancing the functionality of the brain, in both healthy and impaired people
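One common way to make the "more synchronised when listening to music than at rest" claim operational is to compare a simple connectivity measure, such as the mean pairwise correlation between regional signals, across the two conditions. The snippet below is a generic illustration with synthetic signals, not the analysis used in the studies surveyed; all variable names and numbers are assumptions.

```python
import numpy as np

# Illustrative only: quantify how "synchronised" a set of regional signals is by the
# mean pairwise Pearson correlation, and compare a listening-like condition with rest.

def mean_pairwise_correlation(signals: np.ndarray) -> float:
    """signals: array of shape (n_regions, n_timepoints)."""
    corr = np.corrcoef(signals)                      # n_regions x n_regions matrix
    upper = corr[np.triu_indices_from(corr, k=1)]    # unique region pairs
    return float(upper.mean())

rng = np.random.default_rng(0)
n_regions, n_time = 6, 1000

rest = rng.standard_normal((n_regions, n_time))      # independent noise: low synchrony
shared = rng.standard_normal(n_time)                 # common driving signal ("the music")
music = 0.6 * shared + 0.8 * rng.standard_normal((n_regions, n_time))

print("rest synchrony: ", round(mean_pairwise_correlation(rest), 3))
print("music synchrony:", round(mean_pairwise_correlation(music), 3))
```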
Comment on Solberg and Jensenius: The Temporal Dynamics of Embodied Pleasure in Music
In the paper 'Pleasurable and Intersubjective Embodied Experiences of Electronic Dance Music', Ragnhild Torvanger Solberg and Alexander Refsum Jensenius report on a study in which the movements and self-reported affective responses of a group of dancing participants were recorded and related to structural properties of Electronic Dance Music. They observed that, compared with tracks that had a relatively flat dynamic development, tracks which included a 'break-down', 'build-up' and 'drop' of textural layers were associated with greater changes in movement amount and higher ratings of pleasure. Here I comment on their results and methodological approach and use the opportunity to address the continuous pleasure that was treated as a control in this experiment, discussing some reasons why affective responses to music with more evenly distributed dynamic progressions are so often ignored
MUSIKKENS SPROG (The Language of Music)
Music has often been called a universal language, but to what extent are language and music analogous? Based on measurements of brain activity in musicians and non-musicians using EEG (electroencephalography) and MEG (magnetoencephalography), this article examines whether the brain's activity in connection with music and language can be compared, and, if so, what such an insight entails. It is argued to be plausible that the higher cognitive processing of music and language partly rests on the same neural foundation, for musicians in particular
Automatic Emphysema Detection using Weakly Labeled HRCT Lung Images
A method is presented for automatically quantifying emphysema regions in High-Resolution Computed Tomography (HRCT) scans of patients with chronic obstructive pulmonary disease (COPD) that does not require manually annotated scans for training. HRCT scans of controls and of COPD patients with diverse disease severity were acquired at two different centers. Textural features from co-occurrence matrices and Gaussian filter banks are used to characterize the lung parenchyma in the scans. Two robust versions of multiple instance learning (MIL) classifiers, miSVM and MILES, are investigated. The classifiers are trained with weak labels extracted from the forced expiratory volume in one second (FEV1) and the diffusing capacity of the lungs for carbon monoxide (DLCO). At test time, the classifiers output a patient label indicating overall COPD diagnosis and local labels indicating the presence of emphysema. The classifier performance is compared with manual annotations by two radiologists, a classical density-based method, and pulmonary function tests (PFTs). The miSVM classifier performed better than MILES on both patient and emphysema classification. The classifier has a stronger correlation with PFTs than the density-based method, the percentage of emphysema in the intersection of annotations from both radiologists, and the percentage of emphysema annotated by one of the radiologists. The correlation between the classifier and the PFTs is only outperformed by the second radiologist. The method is therefore promising for facilitating assessment of emphysema and reducing inter-observer variability.
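For readers unfamiliar with multiple instance learning, the sketch below shows the core miSVM idea referenced above: bag-level labels (here, a patient's COPD status) are propagated to instances (lung regions), an SVM is trained, and the instance labels inside positive bags are re-estimated until they stabilise. It is a minimal illustration with synthetic feature vectors standing in for the co-occurrence and filter-bank texture features, not the authors' implementation; all names and parameters are assumptions.

```python
import numpy as np
from sklearn.svm import LinearSVC

def misvm_train(bags, bag_labels, n_iter=10):
    """Minimal miSVM-style training loop.
    bags: list of (n_i, d) arrays (one bag per patient); bag_labels: {0, 1} per bag."""
    X = np.vstack(bags)
    bag_idx = np.concatenate([[i] * len(b) for i, b in enumerate(bags)])
    y = np.concatenate([[bag_labels[i]] * len(b) for i, b in enumerate(bags)])

    clf = LinearSVC(C=1.0, max_iter=5000)
    for _ in range(n_iter):
        clf.fit(X, y)
        scores = clf.decision_function(X)
        new_y = (scores > 0).astype(int)
        # Instances in negative bags must stay negative.
        new_y[np.isin(bag_idx, np.where(bag_labels == 0)[0])] = 0
        # Each positive bag must keep at least one positive instance.
        for i in np.where(bag_labels == 1)[0]:
            members = np.where(bag_idx == i)[0]
            if new_y[members].sum() == 0:
                new_y[members[np.argmax(scores[members])]] = 1
        if np.array_equal(new_y, y):
            break
        y = new_y
    return clf

# Toy data: 20 "patients"; positive bags contain a few shifted ("emphysematous") instances.
rng = np.random.default_rng(1)
bags, labels = [], []
for i in range(20):
    b = rng.standard_normal((rng.integers(5, 15), 8))
    if i % 2 == 1:
        b[:3] += 2.0
    bags.append(b)
    labels.append(i % 2)

clf = misvm_train(bags, np.array(labels))
```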
Applying Acoustical and Musicological Analysis to Detect Brain Responses to Realistic Music: A Case Study
Music information retrieval (MIR) methods offer interesting possibilities for automatically identifying time points in music recordings that relate to specific brain responses. However, how the acoustical features and the novelty of the music structure affect the brain response is not yet clear. In the present study, we tested a new method for automatically identifying time points of brain responses based on MIR analysis. We utilized an existing database including brain recordings of 48 healthy listeners measured with electroencephalography (EEG) and magnetoencephalography (MEG). While we succeeded in capturing brain responses related to acoustical changes in the modern tango piece Adios Nonino, we obtained less reliable brain responses with a metal rock piece and a modern symphony orchestra composition. However, brain responses might also relate to the novelty of the music structure. Hence, we added a manual musicological analysis of novelty in the musical structure to the computational acoustic analysis, obtaining strong brain responses even to the rock and modern pieces. Although no standardized method yet exists, these preliminary results suggest that analysis of novelty in music is an important aid to MIR analysis for investigating brain responses to realistic music.
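As an illustration of the kind of MIR analysis involved, the snippet below derives candidate trigger time points from a recording via onset/novelty strength and peak picking, which could then be used to epoch EEG/MEG data around acoustically salient moments. This is a generic librosa-based sketch, not the authors' pipeline; the file name is a placeholder.

```python
import numpy as np
import librosa

# Generic illustration: extract acoustically salient time points from a recording.
# "piece.wav" is a placeholder for the stimulus audio file.
y, sr = librosa.load("piece.wav", sr=22050, mono=True)

# Spectral-flux style onset envelope and peak-picked event times (in seconds).
onset_env = librosa.onset.onset_strength(y=y, sr=sr)
onset_frames = librosa.onset.onset_detect(y=y, sr=sr, onset_envelope=onset_env, units="frames")
onset_times = librosa.frames_to_time(onset_frames, sr=sr)

# Keep the 20 most salient onsets as candidate "trigger" time points for epoching.
strongest = onset_times[np.argsort(onset_env[onset_frames])[::-1][:20]]
print(np.round(np.sort(strongest), 2))
```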
- …