Authoring augmented soundscapes with user-contributed content

By Jordi Janer, Gerard Roma and Stefan Kersten

Abstract

Augmented reality audio remains an insufficiently explored area. In this article we address the creation of soundscapes that augment the acoustic information of a physical location. In particular, we focus on authoring tools that make use of user-contributed content. To facilitate the authoring process, our tool integrates access to Freesound.org, an online repository of more than 120,000 sounds under a Creative Commons license. Sound search combines traditional text queries with content-based audio classification; the automatic classification allows searching according to a taxonomy of environmental sounds (e.g. drip, impact, wind). Finally, we implemented a complete augmented-soundscape system that autonomously and continuously spatializes virtual acoustic sources in a geographic location.
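
As an illustration of the workflow described above, the Python sketch below combines a free-text query with a tag filter against Freesound's public APIv2 text-search endpoint and applies a simple inverse-distance gain to a virtual source placed at a geographic coordinate. The endpoint, parameters, API key, tag-based approximation of the taxonomy, and attenuation law are assumptions for illustration, not taken from the paper; the 2013 system may well have used an earlier Freesound API and a different spatialization model.

    # Illustrative sketch (not code from the paper): query Freesound for sounds
    # matching an environmental-sound class and attenuate a geo-positioned
    # virtual source by listener distance.
    import math
    import requests

    FREESOUND_SEARCH = "https://freesound.org/apiv2/search/text/"
    API_KEY = "YOUR_FREESOUND_API_KEY"  # hypothetical placeholder

    def search_sounds(query, taxonomy_tag, limit=10):
        """Combine a free-text query with a tag filter that approximates a
        taxonomy class such as 'wind', 'impact' or 'drip'."""
        params = {
            "query": query,
            "filter": "tag:" + taxonomy_tag,
            "fields": "id,name,previews,license",
            "page_size": limit,
            "token": API_KEY,
        }
        resp = requests.get(FREESOUND_SEARCH, params=params, timeout=10)
        resp.raise_for_status()
        return resp.json().get("results", [])

    def distance_m(lat1, lon1, lat2, lon2):
        """Great-circle (haversine) distance in metres."""
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    def source_gain(listener, source, ref_dist=5.0):
        """Inverse-distance gain for a virtual source placed at a geographic
        coordinate; full gain within ref_dist metres of the source."""
        d = distance_m(listener[0], listener[1], source[0], source[1])
        return min(1.0, ref_dist / max(d, 1e-6))

    if __name__ == "__main__":
        for sound in search_sounds("sea waves", "wind", limit=5):
            print(sound["id"], sound["name"])
        print("gain:", source_gain((41.3874, 2.1686), (41.3880, 2.1700)))

Here the tag filter merely stands in for the content-based classifier mentioned in the abstract, and the inverse-distance gain is a placeholder for the system's actual continuous spatialization.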

Year: 2013
OAI identifier: oai:CiteSeerX.psu:10.1.1.352.7670
Provided by: CiteSeerX