Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/TCIAIG.2012.2212899
Abstract
As video games have grown from crude and simple
circuit-based artefacts into a multibillion dollar worldwide industry,
video-game music has become increasingly adaptive.
Composers have had to adopt new techniques to move beyond the traditional,
event-based approach in which music is composed mostly of
looped audio tracks, an approach that can lead to music that is overly repetitive.
Moreover, looped tracks do not scale well in the design of today's
games, which have become increasingly complex and nonlinear
in narrative. This paper surveys experience-driven
procedural music generation, outlining possible ways forward
in the dynamic generation of music and audio according to user
gameplay metrics.