
Interactive Sonification of Collaborative AR-based Planning Tasks for Enhancing Joint Attention

Abstract

Neumann A, Hermann T. Interactive Sonification of Collaborative AR-based Planning Tasks for Enhancing Joint Attention. In: Strumiłło P, Bujacz M, Popielata M, eds. Proceedings of the 19th International Conference on Auditory Displays. The International Community for Auditory Display (ICAD); 2013: 49-55.

This paper introduces a novel sonification-based interaction support for cooperating users in an Augmented Reality setting. When using head-mounted AR displays, the field of view is limited, which causes users to miss important activities of their interaction partner, such as object interactions or deictic references to (re-)establish joint attention. We introduce an interactive sonification which makes object manipulations of both interaction partners mutually transparent by sounds that convey information about the kind of activity and can optionally even identify the object itself. In this paper we focus on the sonification method, interaction design and sound design; we furthermore render the sonification both from sensor data (e.g. object tracking) and from manual annotations. As a spin-off of our approach, we further propose this method for enhancing interaction observation, data analysis, and multimodal annotation in interactional linguistics and conversation analysis.
