
    An interactive music playlist generator that responds to user emotion and context

    This paper aims to demonstrate the mechanisms of a music recommendation system, and its accompanying graphical user interface (GUI), that is capable of generating a playlist of songs based upon an individual's emotion or context. This interactive music playlist generator has been designed as part of a broader system, intended for mobile devices, which aims to suggest music based upon 'how the user is feeling' and 'what the user is doing' by evaluating real-time physiological and contextual sensory data using machine learning technologies. For instance, heart rate and skin temperature, in conjunction with ambient light, temperature and global positioning system (GPS) data, could be used to infer one's current situation and corresponding mood. At present, this interactive music playlist generator conceptually demonstrates how a playlist can be formed in accordance with such physiological and contextual parameters. In particular, the affective aspect of the interface is visually represented as a two-dimensional arousal-valence space based upon Russell's circumplex model of affect (1980). Context refers to environmental, locomotion and activity concepts, which are visually represented in the interface as sliders. These affective and contextual components are discussed in more detail in Sections 2 and 3, respectively, and Section 4 demonstrates how an affective and contextual music playlist can be formed by interacting with the GUI parameters. For a comprehensive discussion of the development of this research, refer to Griffiths et al. (2013a, 2013b, 2015); see Teng et al. (2013) and Yang et al. (2008) for related work in these broader research areas.
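
    The abstract describes playlist formation as matching songs against a point in the arousal-valence space together with contextual slider settings, but does not give the algorithm itself. The following Python sketch is therefore only an illustration under assumed conventions: songs annotated with arousal/valence in [-1, 1] and context tags scored in [0, 1], ranked by a hypothetical combined distance (Song, generate_playlist and the weight parameters are invented names, not the paper's).

        import math
        from dataclasses import dataclass, field

        @dataclass
        class Song:
            # Hypothetical annotations: arousal and valence in [-1, 1], per
            # Russell's (1980) circumplex model; context tags scored in [0, 1].
            title: str
            arousal: float
            valence: float
            context: dict = field(default_factory=dict)

        def distance(song, arousal, valence, context, w_affect=1.0, w_context=0.5):
            """Combined affective + contextual mismatch (smaller = better fit)."""
            affective = math.hypot(song.arousal - arousal, song.valence - valence)
            contextual = sum(abs(song.context.get(tag, 0.0) - value)
                             for tag, value in context.items())
            return w_affect * affective + w_context * contextual

        def generate_playlist(library, arousal, valence, context, length=10):
            """Rank the library against the GUI's current parameter settings."""
            return sorted(library,
                          key=lambda s: distance(s, arousal, valence, context))[:length]

        # Example: request high-arousal, positive-valence music while running.
        library = [
            Song("Calm Piece", -0.6, 0.4, {"running": 0.1}),
            Song("Upbeat Track", 0.8, 0.7, {"running": 0.9}),
        ]
        print([s.title for s in generate_playlist(library, 0.7, 0.6, {"running": 1.0})])

    The weighted sum here is just one plausible way to trade off affect against context; in the described system, the physiological and contextual sensors would supply the arousal, valence and context arguments instead of the GUI sliders.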

    Interactive Latent Space for Mood-Based Music Recommendation

    The way we listen to music has changed fundamentally over the past two decades with the increasing availability of digital recordings and the portability of music players. Research in music recommendation has helped attract millions of users to online music streaming services containing tens of millions of tracks (e.g. Spotify, Pandora). The main focus of recommender systems research to date has been algorithmic accuracy and the optimization of ranking metrics. However, recent work has highlighted the importance of other aspects of the recommendation process, including explanation, transparency, control and user experience in general. Building on these aspects, this dissertation explores user interaction, control and visual explanation of music-related mood metadata during the recommendation process. It introduces a hybrid recommender system that suggests music artists by combining mood-based and audio content filtering in a novel interactive interface. The main vehicle for exploration and discovery in the music collection is a novel visualization that maps moods and artists into the same latent space, built upon reduced dimensions of high-dimensional artist-mood associations. Because it is not known what the reduced dimensions represent, this work uses a hierarchical mood model to explain the constructed space. Results of two user studies, each with over 200 participants, show that visualization and interaction in a latent space improve acceptance and understanding of both metadata and item recommendations. However, too much of either can result in cognitive overload and a negative impact on user experience. The proposed visual mood space and interactive features, along with the aforementioned findings, aim to inform the design of future interactive recommendation systems.
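
    The abstract says the latent space is built from reduced dimensions of high-dimensional artist-mood associations, without naming the reduction technique. The NumPy sketch below assumes a truncated SVD over a synthetic artist-mood matrix purely to illustrate how artists and moods can be projected into one shared two-dimensional space; the matrix sizes, the choice of SVD and artists_near_mood are all assumptions, not the dissertation's method.

        import numpy as np

        # Hypothetical artist-mood association matrix: rows are artists,
        # columns are moods, entries are association strengths (made-up data).
        rng = np.random.default_rng(0)
        A = rng.random((100, 30))          # 100 artists x 30 moods

        # Centre the matrix and take a rank-2 SVD so that artists and moods
        # can be plotted as points in the same two latent dimensions.
        U, s, Vt = np.linalg.svd(A - A.mean(axis=0), full_matrices=False)
        artist_coords = U[:, :2] * s[:2]   # one 2-D point per artist
        mood_coords = Vt[:2].T             # one 2-D point per mood, same axes

        def artists_near_mood(mood_index, k=5):
            """Indices of the k artists closest to a mood in the latent space."""
            d = np.linalg.norm(artist_coords - mood_coords[mood_index], axis=1)
            return np.argsort(d)[:k]

        print(artists_near_mood(0))

    Scaling the artist coordinates by the singular values while leaving the mood directions unit-length follows the standard biplot convention; the hierarchical mood model the abstract mentions would then be one way to label and explain regions of such a space.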