From Acceleration to Rhythmicity: Smartphone-Assessed Movement Predicts Properties of Music

Abstract

Music moves us. Yet, querying music remains a disembodied process in most music recommender scenarios. New mediation technologies, such as querying music by movement, would take into account the empirically well-founded knowledge of embodied music cognition. The goal of the present study was therefore to explore how movement captured by smartphone accelerometer data can be related to musical properties. Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 female, 10 male) moved a smartphone to 15 musical stimuli of 20 s length presented in random order. Motion features related to tempo, smoothness, size, and regularity were extracted from the accelerometer data to predict the musical qualities "rhythmicity", "pitch level and range", and "complexity", as assessed by three music experts. Motion features selected by a stepwise AIC model predicted the musical properties to the following degrees: "rhythmicity" (R² = .45), "pitch level and range" (R² = .06), and "complexity" (R² = .15). We conclude that (rhythmic) music properties can be predicted from the movement they evoke, and that an embodied approach to Music Information Retrieval is feasible.
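As a minimal sketch of the modelling step described above (not the authors' code), the following Python example performs forward stepwise selection of accelerometer-derived motion features by AIC and then fits an OLS model predicting one expert-rated property such as "rhythmicity". The feature names and the randomly generated data are placeholders, not the study's actual variables.

    # Hypothetical sketch: forward stepwise AIC selection of motion features,
    # followed by an OLS fit predicting an expert-rated musical property.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm


    def forward_stepwise_aic(df, target, candidates):
        """Greedily add the feature whose inclusion lowers the model AIC the most."""
        selected = []
        y = df[target]
        best_aic = sm.OLS(y, np.ones(len(df))).fit().aic  # intercept-only baseline
        improved = True
        while improved and candidates:
            improved = False
            aics = {
                feat: sm.OLS(y, sm.add_constant(df[selected + [feat]])).fit().aic
                for feat in candidates
            }
            best_feat = min(aics, key=aics.get)
            if aics[best_feat] < best_aic:
                best_aic = aics[best_feat]
                selected.append(best_feat)
                candidates = [f for f in candidates if f != best_feat]
                improved = True
        return selected


    # Illustrative data: per-trial motion features and an expert rating
    rng = np.random.default_rng(0)
    data = pd.DataFrame({
        "tempo": rng.random(100),
        "smoothness": rng.random(100),
        "size": rng.random(100),
        "regularity": rng.random(100),
        "rhythmicity": rng.random(100),
    })
    features = forward_stepwise_aic(
        data, "rhythmicity", ["tempo", "smoothness", "size", "regularity"]
    )
    model = sm.OLS(data["rhythmicity"], sm.add_constant(data[features])).fit()
    print(features, round(model.rsquared, 2))  # selected features and R²

Note that this forward-selection loop is only one common variant of stepwise AIC model building; the paper does not specify whether forward, backward, or bidirectional selection was used.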
