Procedural generation of music-guided weapons
Beyond the standard use of music as a passive and, sometimes, optional component of player experience, the impact of music as a guide for the procedural generation of game content has not been explored yet. As a core elicitor of player experience, music can be used to drive the generation of personalized game content for a particular musical theme, song or sound effect being played during the game. In this paper
we introduce a proof-of-concept game demonstrator exploring
the relationship between music and visual game content across
different playing behaviors and styles. For that purpose, we
created a side-scroller shooter game where players can affect the
relationship between projectiles’ trajectories and the background
music through interactive evolution. By coupling NeuroEvolution of Augmenting Topologies (NEAT) with interactive evolution, we are able to create an initial arsenal of innovative weapons that are both interesting to play with and that create novel fusions of visual and musical aesthetics.

Thanks to Ryan Abela for his input on designing the sound extraction methods. The research was supported, in part, by the FP7 Marie Curie CIG project AutoGameDesign (project no. 630665).
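The weapon-generation mechanism described in this abstract, a neural network evolved through player choice that maps audio features to projectile motion, can be sketched roughly as follows. This is a minimal, fixed-topology stand-in (the actual system uses NEAT, which also evolves network structure), and all names and parameters here are hypothetical:

```python
import math
import random

class WeaponNet:
    """Hypothetical stand-in for a NEAT-evolved weapon network.

    Maps (time, pitch, amplitude) to a projectile's vertical offset,
    so the bullet trajectory follows the music being played.
    """

    def __init__(self, weights=None):
        if weights is None:
            # 3 inputs -> 4 hidden units -> 1 output, random initial weights
            weights = (
                [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)],
                [random.uniform(-1, 1) for _ in range(4)],
            )
        self.w1, self.w2 = weights

    def offset(self, t, pitch, amplitude):
        # Feed-forward pass; output in [-1, 1] scales the trajectory offset.
        hidden = [math.tanh(sum(w * x for w, x in zip(row, (t, pitch, amplitude))))
                  for row in self.w1]
        return math.tanh(sum(w * h for w, h in zip(self.w2, hidden)))

    def mutate(self, rate=0.1):
        # Interactive evolution step: the player selects favourite weapons,
        # and selected networks are perturbed to seed the next generation.
        child = WeaponNet(([row[:] for row in self.w1], self.w2[:]))
        for row in child.w1:
            for i in range(len(row)):
                row[i] += random.gauss(0, rate)
        for i in range(len(child.w2)):
            child.w2[i] += random.gauss(0, rate)
        return child

random.seed(0)
net = WeaponNet()
# Sample a trajectory for fixed audio features over ten time steps.
trajectory = [net.offset(t * 0.1, pitch=0.5, amplitude=0.8) for t in range(10)]
```

In the game itself, the player would rate or pick generated weapons during play, and `mutate` (or NEAT's structural mutations) would breed the next arsenal from the favourites.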
AudioInSpace: exploring the creative fusion of generative audio, visuals and gameplay
Computer games are unique creativity domains in that they elegantly fuse several facets of creative work, including visuals, narrative, music, architecture and design. While the exploration of possibilities across facets of creativity offers a more realistic approach to the game design process, most existing autonomous (or semi-autonomous) game content generators focus on the mere generation of single domains (creativity facets) in games. Motivated by the sparse literature on multifaceted game content generation, this paper introduces a multifaceted procedural content generation (PCG) approach that is based on the interactive evolution of multiple artificial neural networks that orchestrate the generation of visuals, audio and gameplay. The approach is evaluated on a spaceship shooter game. The generated artifacts, a fusion of audiovisual and gameplay elements, showcase the capacity of multifaceted PCG and its evident potential for computational game creativity.

This research is supported, in part, by the FP7 ICT project C2Learn (project no. 318480) and by the FP7 Marie Curie CIG project AutoGameDesign (project no. 630665).