Towards an experiment on perception of affective music generation using MetaCompose
MetaCompose is a music generator based on a hybrid evolutionary technique combining FI-2POP and multi-objective optimization. In this paper we employ the MetaCompose music generator to create music in real-time that expresses different mood-states in a game-playing environment (Checkers) and present preliminary results of an experiment focusing on determining (i) if differences in player experience can be observed when using affective-dynamic music compared to static music; and (ii) if any difference is observed when the music supports the game's internal narrative/state. Participants were tasked with playing two games of Checkers while listening to two (out of three) different set-ups of game-related generated music. The possible set-ups were: static expression, consistent affective expression, and random affective expression.
MetaCompose: A Compositional Evolutionary Music Composer
This paper describes a compositional, extensible framework for music composition and a user study to systematically evaluate its core components. These components include a graph traversal-based chord sequence generator, a search-based melody generator and a pattern-based accompaniment generator. An important contribution of this paper is the melody generator, which uses a novel evolutionary technique combining FI-2POP and multi-objective optimization. A participant-based evaluation overwhelmingly confirms that all current components of the framework combine effectively to create harmonious, pleasant and interesting compositions.
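The FI-2POP/multi-objective hybrid mentioned above can be illustrated with a minimal sketch: one population evolves feasible melodies against several objectives while a second population evolves infeasible melodies toward feasibility, with offspring migrating between the two. The melody representation, objectives and constraint below are illustrative placeholders, not MetaCompose's actual implementation.

```python
# Minimal sketch of a FI-2POP-style loop with a crude Pareto filter.
# All musical details here are assumptions made for illustration only.
import random

POP_SIZE, GENERATIONS, MELODY_LEN = 50, 200, 16

def random_melody():
    # A melody is represented here simply as a list of MIDI-like pitches.
    return [random.randint(48, 84) for _ in range(MELODY_LEN)]

def mutate(melody):
    child = melody[:]
    child[random.randrange(len(child))] = random.randint(48, 84)
    return child

def constraint_violations(melody):
    # Placeholder feasibility measure: count melodic leaps larger than an octave.
    return sum(1 for a, b in zip(melody, melody[1:]) if abs(a - b) > 12)

def objectives(melody):
    # Placeholder objectives (e.g. pitch variety vs. smooth contour).
    variety = len(set(melody))
    smoothness = -sum(abs(a - b) for a, b in zip(melody, melody[1:]))
    return (variety, smoothness)

def dominates(a, b):
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

feasible, infeasible = [], []
for ind in [random_melody() for _ in range(POP_SIZE)]:
    (feasible if constraint_violations(ind) == 0 else infeasible).append(ind)

for _ in range(GENERATIONS):
    offspring = [mutate(random.choice(feasible + infeasible)) for _ in range(POP_SIZE)]
    # Offspring migrate to the population matching their feasibility.
    for ind in offspring:
        (feasible if constraint_violations(ind) == 0 else infeasible).append(ind)
    # The infeasible population evolves toward feasibility (fewer violations)...
    infeasible = sorted(infeasible, key=constraint_violations)[:POP_SIZE]
    # ...while the feasible population keeps non-dominated (Pareto) individuals.
    scored = [(ind, objectives(ind)) for ind in feasible]
    feasible = [ind for ind, s in scored
                if not any(dominates(t, s) for _, t in scored)][:POP_SIZE]

print(f"{len(feasible)} feasible melodies on the Pareto front")
```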
Mood Dependent Music Generator
Music is one of the most expressive media for showing and manipulating emotions, but there have been few studies on how to generate music connected to emotions. Such studies have long been frowned upon by musicians, who argue that a machine cannot create expressive music, as it is the composer's and player's experiences and emotions that get poured into the piece. At the same time, another problem is that music is highly complicated (and subjective), and finding out which elements transmit certain emotions is not an easy task. This demo aims to show how the manipulation of a set of features can actually change the mood the music transmits, hopefully awakening an interest in this area of research.
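To make the idea of feature manipulation concrete, the sketch below maps a valence/arousal pair onto a handful of plausible musical parameters; the specific features, ranges and thresholds are assumptions made for illustration, not the generator's actual mapping.

```python
# Illustrative mapping from a mood point (valence, arousal in [-1, 1]) to
# musical features; values and feature names are hypothetical.
def mood_to_features(valence: float, arousal: float) -> dict:
    return {
        "tempo_bpm": 80 + 60 * (arousal + 1) / 2,        # faster when aroused
        "mode": "major" if valence >= 0 else "minor",    # brighter when positive
        "volume": 0.4 + 0.5 * (arousal + 1) / 2,         # louder when aroused
        "articulation": "staccato" if arousal > 0.3 else "legato",
        "pitch_register": 60 + int(6 * valence),         # higher when positive
    }

print(mood_to_features(valence=0.8, arousal=-0.5))  # e.g. calm, happy music
```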
Evaluating Musical Foreshadowing of Videogame Narrative Experiences
We experiment with mood-expressing, procedurally generated music for narrative foreshadowing in videogames, investigating the relationship between music and the player’s experience of narrative events in a game. We designed and conducted a user study in which the game’s music expresses true foreshadowing in some trials (e.g. foreboding music before a negative event) and false foreshadowing in others (e.g. happy music that does not lead to a positive event). We observed players playing the game, recorded analytics data, and had them complete a survey upon completion of the gameplay. Thirty undergraduate and graduate students participated in the study. Statistical analyses suggest that the use of musical cues for narrative foreshadowing induces a better perceived consistency between music and game narrative. Surprisingly, false foreshadowing was found to enhance the player’s enjoyment.
Towards Diverse Non-Player Character behaviour discovery in multi-agent environments
This paper introduces a method for developing diverse Non-Player Character (NPC) behaviour through a multi-agent genetic algorithm based on MAP-Elites. We examine the outcomes of implementing our system in a test environment, with a particular emphasis on the diversity of the evolved agents in the feature space. This research is motivated by how diverse NPCs are an important factor for improving player experience. We show how our multi-agent MAP-Elites algorithm is capable of isolating the evolved NPCs in the chosen feature space. Results showed that 40% of the variation in agent fitness could be predicted from agent genomes when agents played 100 games each.
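A minimal MAP-Elites-style loop conveys the core mechanism behind such behaviour discovery: a discretised behaviour (feature) space in which each cell keeps only the fittest genome found so far, which is what spreads the evolved NPCs across the feature space. The genome encoding, fitness and behaviour descriptors below are placeholders, not the paper's actual setup.

```python
# Minimal MAP-Elites sketch: an archive of elites indexed by behaviour cell.
# Genome, fitness and descriptors are illustrative placeholders.
import random

GRID = 10            # cells per behaviour dimension
ITERATIONS = 5000
GENOME_LEN = 8

def random_genome():
    return [random.uniform(-1, 1) for _ in range(GENOME_LEN)]

def mutate(genome):
    return [g + random.gauss(0, 0.1) for g in genome]

def evaluate(genome):
    # Placeholder: a fitness value plus two behaviour descriptors in [0, 1],
    # e.g. aggressiveness and exploration measured from simulated games.
    fitness = -sum(g * g for g in genome)
    aggressiveness = (genome[0] + 1) / 2
    exploration = (genome[1] + 1) / 2
    return fitness, (aggressiveness, exploration)

def cell(descriptor):
    # Clamp descriptors into the grid in case mutation drifts outside [0, 1].
    return tuple(max(0, min(GRID - 1, int(d * GRID))) for d in descriptor)

archive = {}  # behaviour cell -> (fitness, genome)
for _ in range(ITERATIONS):
    parent = random.choice(list(archive.values()))[1] if archive else random_genome()
    child = mutate(parent)
    fitness, descriptor = evaluate(child)
    key = cell(descriptor)
    if key not in archive or fitness > archive[key][0]:
        archive[key] = (fitness, child)

print(f"{len(archive)} distinct behaviour niches filled")
```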
Adaptive Agents in 1v1 Snake Game with Dynamic Environment
This paper delves into the adaptability of Proximal Policy Optimization (PPO)-trained agents within dynamic environments. Typically, an agent is trained within a specific environment, learning to maximise reward acquisition and to navigate it effectively. However, alterations to this environment can lead to performance deficiencies. Existing research does not fully elucidate how the training of agents influences their adaptability in different environments and which parameters significantly impact this. This study aims to fill this gap, contributing to the creation of more versatile intelligent agents. The objective of this study is to explore how training agents in various environments affects their adaptability when introduced to unfamiliar environments. To this end, 36 models were trained using 36 different configurations to play a one-versus-one (1v1) Snake game. These models were subsequently evaluated against each configuration to measure their adaptability. The results reveal that map size substantially affects the adaptability of agents in different environments. Interestingly, the results showed that the most adaptive agents were not those trained on the most expansive and complex environment, but rather on the simplest.
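The train-on-one-configuration, evaluate-on-all-configurations protocol described above can be sketched as a cross-evaluation matrix. The configuration parameters and the train_agent and play_games functions below are hypothetical stand-ins for PPO training and Snake matches, not the study's actual code.

```python
# Sketch of cross-configuration adaptability scoring: train one agent per
# configuration, then score every agent on every configuration.
from itertools import product

map_sizes = [8, 12, 16]        # illustrative environment parameters
apple_counts = [1, 3]
configs = list(product(map_sizes, apple_counts))

def train_agent(config):
    # Stand-in for PPO training on a 1v1 Snake environment with this config.
    return {"trained_on": config}

def play_games(agent, config, n_games=100):
    # Stand-in for evaluation: return a mean score on the given config.
    return 1.0 if agent["trained_on"] == config else 0.5

agents = {cfg: train_agent(cfg) for cfg in configs}

# adaptability[i][j]: score of the agent trained on configs[i] when evaluated on configs[j]
adaptability = [[play_games(agents[train_cfg], eval_cfg) for eval_cfg in configs]
                for train_cfg in configs]

for train_cfg, row in zip(configs, adaptability):
    off_diagonal = [score for cfg, score in zip(configs, row) if cfg != train_cfg]
    print(train_cfg, "mean score on unseen configs:",
          sum(off_diagonal) / len(off_diagonal))
```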
Can You Feel It?: Evaluation of Affective Expression in Music Generated by MetaCompose
This paper describes an evaluation conducted on the MetaCompose music generator, which is based on evolutionary computation and uses a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. The main objective of MetaCompose is to create music in real-time that can express different mood-states. The experiment presented here aims to evaluate: (i) whether the mood participants perceive in a music score matches the intended mood the system is trying to express, and (ii) whether participants can identify transitions in the mood expression that occur mid-piece. Music clips including transitions and with static affective states were produced by MetaCompose and a quantitative user study was performed. Participants were tasked with annotating the perceived mood and, moreover, were asked to annotate changes in valence in real-time. The data collected confirms the hypothesis that people can recognize changes in music mood and that MetaCompose can express perceptibly different levels of arousal. Regarding valence, we observe that, while it is mainly perceived as expected, changes in arousal seem to also influence perceived valence, suggesting that one or more of the music features MetaCompose associates with arousal has some effect on valence as well.
Evolving in-game mood-expressive music with MetaCompose
MetaCompose is a music generator based on a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. In this paper we employ the MetaCompose music generator to create music in real-time that expresses different mood-states in a game-playing environment (Checkers). In particular, this paper focuses on determining if differences in player experience can be observed when: (i) using affective-dynamic music compared to static music, and (ii) the music supports the game's internal narrative/state. Participants were tasked with playing two games of Checkers while listening to two (out of three) different set-ups of game-related generated music. The possible set-ups were: static expression, consistent affective expression, and random affective expression. During game-play players wore an E4 wristband, allowing various physiological measures to be recorded, such as blood volume pulse (BVP) and electrodermal activity (EDA). The data collected supports, on three out of four criteria (engagement, music quality, coherency with game excitement, and coherency with performance), the hypothesis that players prefer dynamic affective music when asked to reflect on the current game-state. In the future this system could allow designers/composers to easily create affective and dynamic soundtracks for interactive applications.
