Automatic Mood Detection from Acoustic Music Data

Abstract

Music mood describes the inherent emotional meaning of a music clip. It is useful for music understanding, music retrieval, and other music-related applications. In this paper, a hierarchical framework is presented to automate the task of mood detection from acoustic music data, following music-psychological theories from Western cultures. Three feature sets, representing intensity, timbre, and rhythm, are extracted to characterize a music clip. Moreover, a mood-tracking approach is also presented for an entire piece of music. Experimental evaluations indicate that the proposed algorithms produce satisfactory results.