16 research outputs found
From Acceleration to Rhythmicity : Smartphone-Assessed Movement Predicts Properties of Music
Music moves us. Yet, querying music is still a disembodied process in most music recommender scenarios. New mediation technologies like querying music by movement would take into account the empirically well-founded knowledge of embodied music cognition. Thus, the goal of the presented study was to explore how movement captured by smartphone accelerometer data can be related to musical properties. Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 females, 10 males) moved a smartphone to 15 musical stimuli of 20 s length presented in random order. Motion features related to tempo, smoothness, size, and regularity were extracted from accelerometer data to predict the musical qualities “rhythmicity”, “pitch level + range”, and “complexity” assessed by three music experts. Motion features selected by a stepwise AIC model predicted the musical properties to the following degrees: “rhythmicity” (R2 = .45), “pitch level and range” (R2 = .06), and “complexity” (R2 = .15). We conclude that (rhythmic) music properties can be predicted from the movement the music evoked, and that an embodied approach to Music Information Retrieval is feasible.
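To illustrate the modeling step described in this abstract, the following is a minimal sketch of forward stepwise feature selection by AIC. It is not the authors' original analysis code; the DataFrame `df` and the column names (`tempo`, `smoothness`, `size`, `regularity`, `rhythmicity`) are assumptions used for illustration.

```python
import numpy as np
import statsmodels.api as sm

def forward_stepwise_aic(df, target, candidates):
    """Greedily add the motion feature that lowers AIC most; stop when none helps."""
    y = df[target]
    selected = []
    remaining = list(candidates)
    best_aic = sm.OLS(y, np.ones(len(df))).fit().aic  # intercept-only baseline
    while remaining:
        trial_aic = {f: sm.OLS(y, sm.add_constant(df[selected + [f]])).fit().aic
                     for f in remaining}
        best_feature = min(trial_aic, key=trial_aic.get)
        if trial_aic[best_feature] >= best_aic:
            break  # no remaining feature improves the AIC
        best_aic = trial_aic[best_feature]
        selected.append(best_feature)
        remaining.remove(best_feature)
    return selected, best_aic

# Hypothetical usage, with df holding one row per participant and stimulus:
# selected, aic = forward_stepwise_aic(df, "rhythmicity",
#                                      ["tempo", "smoothness", "size", "regularity"])
```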
Smartphone-Assessed Movement Predicts Music Properties : Towards Integrating Embodied Music Cognition into Music Recommender Services via Accelerometer
Numerous studies have shown a close relationship between movement and music [7], [17], [11], [14], [16], [3], [8]. That is why Leman calls for new mediation technologies to query music in a corporeal way [9]. Thus, the goal of the presented study was to explore how movement captured by smartphone accelerometer data can be related to musical properties. Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 females, 10 males) moved a smartphone to 15 musical stimuli of 20 s length presented in random order. Motion features related to tempo, smoothness, size, regularity, and direction were extracted from accelerometer data to predict the musical qualities “rhythmicity”, “pitch level + range”, and “complexity” assessed by three music experts. Motion features selected by a 20-fold lasso predicted the musical properties to the following degrees: “rhythmicity” (R2 = .47), “pitch level and range” (R2 = .03), and “complexity” (R2 = .10). As a consequence, we conclude that music properties can be predicted from the movement the music evoked, and that an embodied approach to Music Information Retrieval is feasible.
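A minimal sketch of the cross-validated lasso mentioned above (illustrative only, not the published analysis), assuming `X` is a matrix of motion features (tempo, smoothness, size, regularity, direction) and `y` a vector of expert ratings for one musical property:

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Penalty strength chosen by 20-fold cross-validation, as in the abstract
model = make_pipeline(StandardScaler(), LassoCV(cv=20, random_state=0))
model.fit(X, y)                       # X: motion features, y: e.g. "rhythmicity" ratings

r_squared = model.score(X, y)         # proportion of variance explained (R^2)
kept = np.flatnonzero(model.named_steps["lassocv"].coef_)  # features the lasso retained
```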
From Motion to Emotion : Accelerometer Data Predict Subjective Experience of Music
Music is often described as emotional because it reflects expressive movements in audible form. Thus, a valid approach to measuring musical emotion could be to assess movement stimulated by music. In two experiments we evaluated the discriminative power of mobile-device generated acceleration data, produced by free movement during music listening, for the prediction of ratings on the Geneva Emotion Music Scales (GEMS-9). The quality of prediction for different dimensions of the GEMS varied between experiments for tenderness (R2 = 0.50 in the first experiment vs. 0.39 in the second), nostalgia (0.42 vs. 0.30), wonder (0.25 vs. 0.34), sadness (0.24 vs. 0.35), peacefulness (0.20 vs. 0.35), joy (0.19 vs. 0.33), and transcendence (0.14 vs. 0.00). For others, such as power (0.42 vs. 0.49) and tension (0.28 vs. 0.27), the results were nearly reproduced across experiments. Furthermore, we extracted two principal components from the GEMS ratings, one representing arousal and the other valence of the experienced feeling. Both qualities, arousal and valence, could be predicted from acceleration data, indicating that acceleration data provide information on both the quantity and the quality of experience. On the one hand, these findings show how music-evoked movement patterns relate to music-evoked feelings. On the other hand, they contribute to integrating findings from the field of embodied music cognition into music recommender systems.
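A minimal sketch of the dimension-reduction and prediction steps described above (illustrative only, not the published analysis), assuming `gems` is an array of GEMS-9 ratings per trial and `accel_features` an array of acceleration-derived descriptors:

```python
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Reduce the nine GEMS ratings to two components; per the abstract, one
# component reflects arousal and the other valence of the experienced feeling.
pca = PCA(n_components=2)
components = pca.fit_transform(gems)          # shape: (n_trials, 2)
arousal, valence = components[:, 0], components[:, 1]

# Predict each component from the acceleration-derived features.
arousal_model = LinearRegression().fit(accel_features, arousal)
valence_model = LinearRegression().fit(accel_features, valence)
print("arousal R^2:", arousal_model.score(accel_features, arousal))
print("valence R^2:", valence_model.score(accel_features, valence))
```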
Polarity Profile of Mean GEMS Ratings for the First Experiment.
Principal Component Loadings of GEMS Ratings of the First Experiment.
Principal Component Loadings of GEMS Ratings of the Second Experiment.
List of Music Excerpts for the First Experiment.
R-Squared and RMSE for First Experiment Ranked According to their <i>R</i><sup>2</sup> on the Training Set.
Fixed Effects Modeling Parameter Estimates for GEMS Ratings and their Principal Components of the Second Experiment (Continued).
R-Squared and RMSE for Second Experiment Ranked According to their <i>R</i><sup>2</sup> on the Training Set.