Interactive Sonification to Support Joint Attention in Augmented Reality-based Cooperation
Neumann A, Hermann T, Tünnermann R. Interactive Sonification to Support Joint Attention in Augmented Reality-based Cooperation. In: Proceedings of ISon 2013, 4th Interactive Sonification Workshop. 2013: 58-64.

This paper presents and evaluates interactive sonifications to support periphery sensing and joint attention in situations with a limited field of view. In particular, head-mounted AR displays limit the field of view and thus cause users to miss relevant activities of their interaction partner, such as object interactions or deictic references that would normally be effective in establishing joint attention. We give some insight into the differences between face-to-face interaction and interaction via the AR system, and introduce five different interactive sonifications that make object manipulations by interaction partners audible and convey information about the kind of activity. Finally, we present an evaluation of our designs in a study where participants observed an interaction episode and rated features of the sonifications in questionnaires. We condense the results into factors for acceptable sonifications to support dyadic interaction.
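The abstract does not describe the five designs in detail; as a purely hypothetical sketch of the underlying idea, one could map each kind of observed partner activity to a short, distinct auditory cue. All event names, frequencies, and envelopes below are illustrative assumptions, not the authors' designs:

```python
import numpy as np

SR = 44100  # audio sample rate in Hz

# Hypothetical mapping (not the paper's): each kind of partner activity
# gets its own pitch and spectrum, so events stay distinguishable even
# when they happen outside the AR field of view.
EVENT_SOUNDS = {
    "grasp":   dict(freq=440.0, dur=0.12, harmonics=(1, 2)),
    "release": dict(freq=330.0, dur=0.12, harmonics=(1, 3)),
    "move":    dict(freq=550.0, dur=0.25, harmonics=(1,)),
    "point":   dict(freq=880.0, dur=0.08, harmonics=(1, 2, 4)),  # deictic reference
}

def render_cue(kind: str) -> np.ndarray:
    """Render a short percussive cue for one observed object interaction."""
    p = EVENT_SOUNDS[kind]
    t = np.linspace(0.0, p["dur"], int(SR * p["dur"]), endpoint=False)
    tone = sum(np.sin(2 * np.pi * p["freq"] * h * t) / h for h in p["harmonics"])
    envelope = np.exp(-8.0 * t / p["dur"])  # fast decay keeps the cue unobtrusive
    return (tone * envelope / len(p["harmonics"])).astype(np.float32)

# e.g. the partner grasps an object outside the observer's field of view:
cue = render_cue("grasp")
```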
Blended Sonification: Sonification for Casual Interaction
Tünnermann R, Hammerschmidt J, Hermann T. Blended Sonification: Sonification for Casual Interaction. In: ICAD 2013 - Proceedings of the International Conference on Auditory Display. Łódź, Poland; 2013: 119-126.

In recent years, graphical user interfaces have become almost ubiquitous in the form of notebooks, smartphones and tablets. These systems normally force the user to attend to an often very specific and narrow screen, squeezing the information through a chokepoint. This ties the user's attention to the device and interferes with other activities and social interaction. In this paper we introduce Blended Sonifications: sonifications that blend into the users' environment without confronting users with any explicitly perceived technology. Blended Sonification systems can be used either to display information or to provide ambient communication channels. We present a framework that guides developers towards the identification of suitable information sources and appropriate auditory interfaces, aiming to improve the design of interactions and experiences. Along with the introduction and definition of the framework, this paper presents interface examples for both mediated communication and information display applications.
EcoSonic: Auditory Displays supporting Fuel-Efficient Driving
Hammerschmidt J, Tünnermann R, Hermann T. EcoSonic: Auditory Displays supporting Fuel-Efficient Driving. In: Thomas O, Ebba H, eds. NordiCHI '14 Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational. Helsinki, Finland: ACM New York, NY, USA; 2014: 979-982.

In this paper, we present our work towards an auditory display that supports the fuel-efficient operation of vehicles. We introduce five design approaches for employing the auditory modality in a fuel economy display. Furthermore, we have implemented a novel auditory display based on one of these approaches, focusing on giving feedback on the engine's optimal rpm range, which is a major factor in eco-driving. Finally, we report on the development of a simple but physically realistic car simulator, which allows for a reproducible evaluation of prototype auditory displays as well as a comparison with state-of-the-art visual fuel-efficiency indicators.
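The abstract does not give the actual mapping; a minimal sketch of the general rpm-feedback idea, with entirely illustrative thresholds and an assumed pulse-train display, might look like this:

```python
def rpm_urgency(rpm: float, optimal_max: float = 2400.0) -> float:
    """Normalized 'urgency' in [0, 1]: zero while the engine stays at or
    below the (assumed) optimal shift point, rising linearly above it and
    saturating 2000 rpm past the limit."""
    return min(max(rpm - optimal_max, 0.0) / 2000.0, 1.0)

def pulse_rate_hz(urgency: float) -> float:
    """One possible display mapping: a pulse train whose tempo rises with
    urgency, from a calm 1 Hz up to an insistent 8 Hz."""
    return 1.0 + 7.0 * urgency

for rpm in (1800.0, 2600.0, 3400.0, 4400.0):
    u = rpm_urgency(rpm)
    print(f"{rpm:.0f} rpm -> urgency {u:.2f}, pulse rate {pulse_rate_hz(u):.1f} Hz")
```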
EcoSonic: Towards an Auditory Display Supporting a Fuel-Efficient Driving Style
Hammerschmidt J, Tünnermann R, Hermann T. EcoSonic: Towards an Auditory Display Supporting a Fuel-Efficient Driving Style. In: Sandra P, Howard C, Radek R, eds. Sonification of Health and Environmental Data. York, England; 2014: 51-56.

In order to support drivers in adopting a more fuel-efficient driving style, there currently exists a range of fuel economy displays that provide drivers with feedback on instantaneous and long-term fuel consumption. While these displays rely almost entirely on visual components to convey relevant information, we argue that there are significant benefits in using auditory interfaces to provide feedback while driving. We review the existing literature and discuss various design strategies for auditory displays that are applicable to supporting a fuel-efficient driving style. Exploring one of these design strategies, we furthermore introduce several prototypical sonification designs.
Report on the In-vehicle Auditory Interactions Workshop: Taxonomy, Challenges, and Approaches
Jeon M, Hermann T, Bazilinskyy P, et al. Report on the In-vehicle Auditory Interactions Workshop: Taxonomy, Challenges, and Approaches. In: Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications - Automotive'UI 15. 2015: 1-5.

As driving is mainly a visual task, auditory displays play a critical role in in-vehicle interactions. To advance in-vehicle auditory interactions, auditory display researchers and automotive user interface researchers came together to discuss this timely topic at an in-vehicle auditory interactions workshop at the International Conference on Auditory Display (ICAD). The present paper reports the discussion outcomes from that workshop to stimulate further discussion at the AutoUI conference.
InfoDrops: Sonification for Enhanced Awareness of Resource Consumption in the Shower
Hammerschmidt J, Tünnermann R, Hermann T. InfoDrops: Sonification for Enhanced Awareness of Resource Consumption in the Shower. In: Strumiłło P, Bujacz M, Popielata M, eds. ICAD 2013 - Proceedings of the International Conference on Auditory Display. Łódź, Poland: Lodz University of Technology Press; 2013: 57-64.

Although most of us strive to develop a sustainable and less resource-intensive behavior, this is unfortunately a difficult task, because we are often unaware of the relevant information or our focus of attention lies elsewhere. Based on this observation, we present a new approach to an unobtrusive and affective ambient auditory information display for becoming and staying aware of water and energy consumption while taking a shower. Using the interaction sound of water drops falling onto the bathtub as a carrier of information, our system helps users stay in touch with resource-related variables. We explore the use of an affective dimension as an additional layer of information and introduce our 4/5-factor approach to adapting the auditory display's output so that it supports a slow but steady adjustment of personal showering habits over time. We present and discuss several alternative sound and interaction designs.
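The concrete sound designs are only named here, not specified; as a hypothetical illustration of the carrier idea (information riding on the natural water-drop sounds), one could detune the drops as consumption exceeds a personal target. The target value and mapping below are invented for illustration:

```python
def drop_pitch_shift(liters_used: float, target_liters: float = 40.0) -> float:
    """Semitone shift applied to each water-drop sound.

    At or below the personal target the drops sound natural (no shift);
    past the target the shift grows with overuse, capped at one octave,
    so the soundscape drifts away from 'normal' rather than raising an alarm.
    """
    overuse = max(0.0, liters_used / target_liters - 1.0)
    return min(12.0 * overuse, 12.0)

for liters in (20.0, 40.0, 50.0, 90.0):
    print(f"{liters:.0f} l -> +{drop_pitch_shift(liters):.1f} semitones")
```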
Supplementary Material for "Multi-Touch Interactions for Model-Based Sonification"
Tünnermann R, Hermann T. Supplementary Material for "Multi-Touch Interactions for Model-Based Sonification". Bielefeld University; 2009.
This paper presents novel interaction modes for Model-Based Sonification (MBS) via a multi-touch interface. We first lay out details about the constructed multi-touch surface. This is followed by a description of the Data Sonogram Sonification Model and how it is implemented using the system. Modifications from the original sonification model, such as the limited space scans, are described and discussed with sonification examples. Videos showing examples of interaction are provided for various data sets. Beyond Data Sonograms, the presented system provides a basis for the implementation of known and novel sonification models. We discuss the available interaction modes with multi-touch surfaces and how these interactions can be profitably used to control spatial and non-spatial sonification models.
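As a rough sketch of the Data Sonogram idea (a virtual shockwave expanding from the excitation point, with each data point sounding when the wavefront reaches it), the timing computation could look like the following. The wave speed and data are illustrative:

```python
import numpy as np

def data_sonogram_onsets(points: np.ndarray, touch: np.ndarray,
                         wave_speed: float = 0.5) -> np.ndarray:
    """Onset times for a Data Sonogram: a virtual shockwave expands from
    the excitation (touch) point, and each data point contributes a sound
    event the moment the wavefront reaches it, so onset = distance / speed."""
    distances = np.linalg.norm(points - touch, axis=1)
    return distances / wave_speed

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                  # toy 2-D data set
onsets = data_sonogram_onsets(X, np.zeros(2))  # touch at the origin
order = np.argsort(onsets)                     # temporal order of sound events
```

Because onsets grow with distance, dense regions near the touch point produce an early burst of events while outliers ring late, which is what makes the sonogram informative about local structure.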
Controlling Ambient Information Flow Between Smart Objects with a Mobile Mixed-Reality Interface
Kriesten B, Tünnermann R, Mertes C, Hermann T. Controlling Ambient Information Flow Between Smart Objects with a Mobile Mixed-Reality Interface. In: MobileHCI 2010: Proceedings of the 12th International Conference on Human-Computer Interaction with Mobile Devices and Services. New York, NY, USA: ACM; 2010: 405-406.

In this work we present a method to intuitively take control of devices in smart environments, to display the data that smart objects and sensors provide, and to create and manipulate flows of information between them. This makes it easy to customize smart environments by linking arbitrary data sources to various display modalities on the fly. Touchscreen smartphones, as readily available multi-purpose devices, are used to overlay real objects with virtual controls. We evaluated this system in a first qualitative user study.
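The paper does not publish its architecture in this summary; a hypothetical minimal sketch of the "link any source to any display on the fly" idea is a runtime registry that fans sensor readings out to whatever displays are currently linked. All identifiers below are invented:

```python
from typing import Callable, Dict, List

class FlowRegistry:
    """Links arbitrary data sources to display modalities at runtime."""

    def __init__(self) -> None:
        self._links: Dict[str, List[Callable[[float], None]]] = {}

    def link(self, source_id: str, display: Callable[[float], None]) -> None:
        """Route all future readings of `source_id` to `display`."""
        self._links.setdefault(source_id, []).append(display)

    def publish(self, source_id: str, value: float) -> None:
        """Fan a new sensor reading out to every linked display."""
        for display in self._links.get(source_id, []):
            display(value)

flows = FlowRegistry()
flows.link("window.temperature", lambda v: print(f"lamp color <- {v} °C"))
flows.publish("window.temperature", 21.5)
```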
Supplementary Material for "Auditory Augmentation"
Bovermann T, Tünnermann R, Hermann T. Supplementary Material for "Auditory Augmentation". Bielefeld University; 2010.

Auditory augmentations are building blocks supporting the design of data representation tools that unobtrusively alter the auditory characteristics of structure-borne sounds. The system enriches the structure-borne sound of objects with a sonification of (near) real-time data streams. The object's auditory gestalt is shaped by data-driven parameters, creating a subtle display for ambient data streams. Auditory augmentation can easily be overlaid on existing sounds and does not change prominent auditory features of the augmented objects, such as the sound's timing or its volume. In a peripheral monitoring situation, the data stays out of the users' attention if they want to concentrate on other items; a characteristic change, however, will catch their attention.
[Video](https://pub.uni-bielefeld.de/download/2763923/2763924)
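The abstract does not spell out the signal chain; as a loose sketch of the general idea (data-driven shaping of a live structure-borne sound), a data value could set the coefficient of a filter applied to the object's picked-up contact sound. The data source and mapping range are assumptions:

```python
def lowpass_coeff(value: float, v_min: float, v_max: float) -> float:
    """Map an ambient data value (say, outdoor temperature) to a one-pole
    low-pass coefficient: higher values let more of the object's natural
    brightness through; lower values dull the contact sound slightly."""
    x = min(max((value - v_min) / (v_max - v_min), 0.0), 1.0)
    return 0.05 + 0.9 * x

def one_pole(block: list[float], a: float, y: float = 0.0) -> tuple[list[float], float]:
    """Filter one block of mic samples: y[n] = y[n-1] + a * (x[n] - y[n-1])."""
    out = []
    for x in block:
        y += a * (x - y)
        out.append(y)
    return out, y
```

Since only the timbre changes and never the timing or loudness of the object's own sound, the augmentation stays peripheral, consistent with the design goal stated above.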
Supplementary Material for "Growing Neural Gas Sonification Model for Interactive Surfaces"
Kolbe L, Tünnermann R, Hermann T. Supplementary Material for "Growing Neural Gas Sonification Model for Interactive Surfaces". Bielefeld University; 2010.
In this paper we present our reimplementation of the Growing Neural Gas Sonification for interactive surfaces such as our t-Desk or touch-capable tablet PCs. Growing Neural Gas (GNG) is an unsupervised learning algorithm that incrementally 'grows' a network graph into a data distribution, revealing the distribution's intrinsic dimensionality and aspects of its structure. The GNG Sonification (GNGS) provides a method to interactively explore the GNG during the growing process, using Model-Based Sonification (MBS) to convey audible information about the data distribution in addition to the visualization. The goal of our reimplementation was to make it possible to rapidly grasp the structure of the sonified and visualized data and to give the user the ability to conduct direct A/B comparisons between different (or similar) clusters within a data distribution. The direct bi-manual interaction, as well as a simplified full-screen touchable user interface, helps the user focus on the exploration of the GNG rather than on the interaction itself. We present and discuss different interaction metaphors for the excitation of the model setup in this MBS.
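GNG itself is a published algorithm (Fritzke, 1995); for orientation, one per-sample adaptation step looks roughly like the sketch below. Unit insertion (every λ samples, next to the unit with the highest accumulated error) and removal of isolated units are omitted for brevity, and all hyperparameter values are illustrative:

```python
import numpy as np

def gng_step(x, units, error, edges,
             eps_winner=0.05, eps_neighbor=0.006, max_age=50):
    """One GNG adaptation step; `units` is an (n, d) array of unit
    positions, `error` an (n,) array, `edges` a dict {(i, j): age}, i < j."""
    # 1. Find the nearest (s1) and second-nearest (s2) units to sample x.
    d = np.linalg.norm(units - x, axis=1)
    s1, s2 = (int(i) for i in np.argsort(d)[:2])
    # 2. Accumulate the winner's squared error; GNG later inserts new
    #    units where accumulated error is largest, i.e. where to 'grow'.
    error[s1] += d[s1] ** 2
    # 3. Move the winner and its graph neighbors toward the sample,
    #    and age all edges emanating from the winner.
    units[s1] += eps_winner * (x - units[s1])
    for (i, j) in list(edges):
        if s1 in (i, j):
            n = j if i == s1 else i
            units[n] += eps_neighbor * (x - units[n])
            edges[(i, j)] += 1
    # 4. Connect winner and runner-up with a fresh edge (age 0).
    edges[(min(s1, s2), max(s1, s2))] = 0
    # 5. Drop edges that have grown too old.
    for e in [e for e, age in edges.items() if age > max_age]:
        del edges[e]
```

The sonification described above then lets the user excite this growing graph by touch; the videos below show the interaction on several data sets.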
#### Quiz

[Video](https://pub.uni-bielefeld.de/download/2698054/2698057)

#### Snake

[Video](https://pub.uni-bielefeld.de/download/2698054/2698055)

#### Three Cluster

[Video](https://pub.uni-bielefeld.de/download/2698054/2698056)

#### Video Example 1

[Video](https://pub.uni-bielefeld.de/download/2698054/2698058)

#### Video Example 2

[Video](https://pub.uni-bielefeld.de/download/2698054/2698059)

#### Video Example 3

[Video](https://pub.uni-bielefeld.de/download/2698054/2698060)