oai:eprints.mdx.ac.uk:11397

Experience-driven procedural music generation for games

Abstract

As video games have grown from crude, simple circuit-based artefacts into a multibillion-dollar worldwide industry, video-game music has become increasingly adaptive. Composers have had to adopt new techniques to move beyond the traditional, event-based approach in which music consists mostly of looped audio tracks, which can become overly repetitive. Moreover, such approaches do not scale well to the design of today’s games, whose narratives have become increasingly complex and nonlinear. This paper explores experience-driven procedural music generation, outlining possible ways forward in dynamically generating music and audio according to user gameplay metrics.
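
As one way to picture what generation "according to user gameplay metrics" could mean in practice, the sketch below maps hypothetical per-frame metrics (threat level, pace) to music parameters (tempo, intensity layer). The metric and parameter names, the linear mapping, and the layer scheme are illustrative assumptions, not the method described in the paper.

```python
from dataclasses import dataclass


@dataclass
class GameplayMetrics:
    # Hypothetical per-frame measurements of the player's experience.
    threat_level: float  # 0.0 (safe) .. 1.0 (imminent danger)
    pace: float          # 0.0 (idle) .. 1.0 (frantic movement)


@dataclass
class MusicParameters:
    tempo_bpm: float
    intensity_layer: int  # index into a set of pre-authored stem layers


def map_metrics_to_music(m: GameplayMetrics) -> MusicParameters:
    """Map gameplay metrics to music-generation parameters.

    A simple linear blend, used purely for illustration; the paper's
    approach to experience-driven generation may differ.
    """
    tempo = 80.0 + 60.0 * max(m.threat_level, m.pace)  # 80-140 BPM
    layer = round(3 * m.threat_level)                  # 0..3 intensity layers
    return MusicParameters(tempo_bpm=tempo, intensity_layer=layer)


if __name__ == "__main__":
    calm = map_metrics_to_music(GameplayMetrics(threat_level=0.1, pace=0.2))
    combat = map_metrics_to_music(GameplayMetrics(threat_level=0.9, pace=0.8))
    print(calm)    # low tempo, bottom layer
    print(combat)  # high tempo, top layer
```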


    This paper was published in Middlesex University Research Repository.
