
    AltAR/table: A Platform for Plausible Auditory Augmentation

    Presented at the 27th International Conference on Auditory Display (ICAD 2022), 24-27 June 2022, virtual conference. Auditory feedback from everyday interactions can be augmented to project digital information into the physical world. For that purpose, auditory augmentation modulates irrelevant aspects of already existing sounds while preserving relevant ones. One strategy for maintaining plausibility is to metaphorically modulate the physical object itself: by mapping information to physical parameters instead of arbitrary sound parameters, even untrained users can be expected to draw on prior knowledge. Here we present AltAR/table, a hardware and software platform for plausible auditory augmentation of flat surfaces. It renders accurate augmentations of rectangular plates by capturing the structure-borne sound, feeding it through a physical sound model, and playing it back through the same object in real time. The implementation solves basic problems of equalization, active feedback control, spatialization, hand tracking, and low-latency signal processing. AltAR/table provides the technical foundations of object-centered auditory augmentations, for embedding sonifications into everyday objects such as tables, walls, or floors.
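    The capture / physical-sound-model / playback chain described above can be illustrated with a minimal modal-synthesis sketch. This is only a generic approximation of a plate's modal response, assuming hypothetical mode frequencies and decay times; it is not the model used in AltAR/table:

```python
import numpy as np

def modal_plate_filter(excitation, mode_freqs, mode_t60s, fs=48000):
    """Feed an excitation signal through a bank of damped two-pole
    resonators, a common way to approximate the modal response of a
    rectangular plate. Mode frequencies (Hz) and T60 decay times (s)
    are illustrative inputs, not parameters from the actual system."""
    out = np.zeros(len(excitation))
    for f, t60 in zip(mode_freqs, mode_t60s):
        theta = 2 * np.pi * f / fs
        r = 10 ** (-3.0 / (t60 * fs))       # pole radius from T60 decay
        b0 = (1 - r) * np.sin(theta)        # rough gain normalization
        a1, a2 = -2 * r * np.cos(theta), r * r
        y1 = y2 = 0.0
        for n, x in enumerate(excitation):  # per-sample IIR recursion
            yn = b0 * x - a1 * y1 - a2 * y2
            out[n] += yn
            y2, y1 = y1, yn
    return out
```

    In a real-time system this filtering would run block-wise at low latency; the plain per-sample loop here only illustrates the resonator recursion.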

    The influence of different BRIR modification techniques on externalization and sound quality

    In the context of binaural audio, externalization refers to the sensation of virtual sound sources being located outside the listener's head. Binaural reproduction using anechoic head-related impulse responses is known to suffer from poor externalization. The degree of externalization can be increased by reverberation, as contained in binaural room impulse responses (BRIRs). However, the presence of reverberation is not always desired, since the original sound of a recording should usually be preserved. This study addresses the resulting dilemma: creating dry-sounding yet well-externalized signals. We investigated how manipulating the impulse response length, the reverberation time, or the direct-to-reverberant energy ratio affects externalization and attributes of sound quality. As expected, each condition is a compromise between externalization and sound quality. While externalization increases with the amount of reverberation in a similar way for all methods, our findings show that the differences between them lie in sound color and perceived naturalness.
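    One of the modification strategies named above, shortening the impulse response, can be sketched together with a direct-to-reverberant energy ratio (DRR) measure. This is a simplified illustration with assumed parameter values and function names, not the processing used in the study:

```python
import numpy as np

def truncate_brir(brir, fs, length_ms, fade_ms=5.0):
    """Shorten a BRIR to length_ms, applying a raised-cosine fade-out
    to avoid an audible cut (illustrative parameter choices)."""
    n = min(int(fs * length_ms / 1000), len(brir))
    out = brir[:n].astype(float).copy()
    nf = min(int(fs * fade_ms / 1000), n)
    if nf > 0:
        out[-nf:] *= 0.5 * (1 + np.cos(np.linspace(0, np.pi, nf)))
    return out

def direct_to_reverberant_ratio(brir, fs, direct_ms=2.5):
    """Energy ratio (dB) of the direct part vs. the remaining tail,
    with the direct/reverberant split assumed at direct_ms."""
    nd = int(fs * direct_ms / 1000)
    e_direct = np.sum(brir[:nd] ** 2)
    e_reverb = np.sum(brir[nd:] ** 2)
    return 10 * np.log10(e_direct / e_reverb)
```

    Attenuating the tail raises the DRR, which is the third manipulation axis the study compares against truncation and reverberation time.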

    Plausible Auditory Augmentation of Physical Interaction

    Weger M, Hermann T, Höldrich R. Plausible Auditory Augmentation of Physical Interaction. In: Proceedings of the 24th International Conference on Auditory Display (ICAD 2018): Sonification as ADSR. Michigan: ICAD; 2018: 97-104. Interactions with physical objects usually evoke sounds, i.e., auditory feedback that depends on the interacting objects (e.g., table, hand, or pencil) and interaction type (e.g., tapping or scratching). The continuous real-time adaptation of sound during interaction enables the manipulation/refinement of perceived characteristics (size, material) of physical objects. Furthermore, when controlled by unrelated external data, the resulting ambient sonifications can keep users aware of changing data. This article introduces the concept of plausibility to the topic of auditory augmentations of physical interactions, aiming to provide an experimentation platform for investigating surface-based physical interactions, understanding relevant acoustic cues, redefining these via auditory augmentation / blended sonification, and in particular empirically measuring the plausibility limits of such auditory augmentations. Besides conceptual contributions along the trade-off between plausibility and usability, a practical experimentation system is introduced, together with a first qualitative pilot study.

    Real-time Auditory Contrast Enhancement

    Weger M, Hermann T, Höldrich R. Real-time Auditory Contrast Enhancement. In: Proceedings of the 25th International Conference on Auditory Display (ICAD 2019). Newcastle: International Community for Auditory Display (ICAD); In Press. Every day, we rely on the information that is encoded in the auditory feedback of our physical interactions. With the goal of perceptually enhancing those sound characteristics that are relevant to us, especially within professional practices such as percussion and auscultation, we introduce the method of real-time Auditory Contrast Enhancement (ACE). The method is derived from algorithms for speech enhancement as well as from the remarkable sound-processing mechanisms of our ears. ACE is achieved by individually sharpening the spectral and temporal structures contained in a sound. It is designed for real-time application with low latency. The discussed examples illustrate that the proposed algorithms can significantly enhance spectral and temporal contrast. Sound and interaction examples (e.g., ACE of impact sounds) accompany the publication; see the article for further details.
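    The idea of sharpening spectral structure can be illustrated with a generic contrast-enhancement sketch: raising the magnitude spectrum of a frame to a power greater than one makes peaks grow relative to valleys. This is a textbook-style stand-in, not the ACE algorithm from the paper:

```python
import numpy as np

def sharpen_spectrum(frame, alpha=1.5):
    """Sharpen the spectral contrast of one audio frame by exponentiating
    the magnitude spectrum (alpha > 1), keeping the phase and roughly
    preserving the frame energy. Illustrative only; the actual ACE
    method sharpens spectral and temporal structure differently."""
    spec = np.fft.rfft(frame)
    mag, phase = np.abs(spec), np.angle(spec)
    sharp = mag ** alpha
    # rescale so the overall spectral energy stays comparable
    scale = np.linalg.norm(mag) / (np.linalg.norm(sharp) + 1e-12)
    return np.fft.irfft(sharp * scale * np.exp(1j * phase), n=len(frame))
```

    Applied frame-by-frame with overlap-add, such a transform runs in real time with latency on the order of one frame.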

    Spin Quartets. Sonification of the XY Model

    Presented at the 16th International Conference on Auditory Display (ICAD 2010), June 9-15, 2010, Washington, DC. We present an intuitive sonification of data from a statistical physics model, the XY spin model. Topological structures (anti-/vortices) are hidden from the eye by random spin movement. The behavior of the vortices changes across a phase transition as a function of temperature. Our sonification builds on basic acoustic properties of phase modulation. Only interesting structures like anti-/vortices remain audible, while everything else falls silent, without additional computational effort. The researcher interacts with the data through a graphical user interface. The sonification can be extended to any lattice model in which locally turbulent structures are embedded in rather laminar fields.
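    The vortex structures that the sonification makes audible can be located numerically by the standard plaquette winding-number test for the XY model. This is a generic computation sketched for context, not code from the paper:

```python
import numpy as np

def vortex_charges(theta):
    """Return the integer topological charge of every lattice plaquette
    for a 2D field of spin angles theta (periodic boundaries): +1 marks
    a vortex, -1 an antivortex, 0 a smooth region."""
    def wrap(d):  # map angle differences into (-pi, pi]
        return (d + np.pi) % (2 * np.pi) - np.pi
    a = theta
    b = np.roll(theta, -1, axis=1)                       # right corner
    c = np.roll(np.roll(theta, -1, axis=0), -1, axis=1)  # diagonal corner
    d = np.roll(theta, -1, axis=0)                       # lower corner
    # winding of the spin angle around the plaquette a -> b -> c -> d -> a
    w = wrap(b - a) + wrap(c - b) + wrap(d - c) + wrap(a - d)
    return np.rint(w / (2 * np.pi)).astype(int)
```

    In the sonification, these charged plaquettes are exactly the regions that stay audible under phase modulation while smooth spin fields cancel out.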

    Haben Sie bereits Antworten? (Do You Already Have Answers?)


    A 3D real time rendering engine for binaural sound reproduction

    Proceedings of the 9th International Conference on Auditory Display (ICAD), Boston, MA, July 7-9, 2003. A method for computationally efficient 3D sound reproduction via headphones is presented, using a virtual Ambisonic approach. Previous studies have shown that incorporating head tracking as well as room simulation is important for improving sound-source localization. Simulating a virtual acoustic space requires filtering the stimuli with head-related transfer functions (HRTFs); in time-varying systems, this raises the problem of high-quality interpolation between different HRTFs. In the proposed approach, encoding the signals into the Ambisonic domain results in time-invariant HRTF filters. The system is implemented on an ordinary notebook computer using Pure Data (PD), a graphically based open-source real-time computer music environment.
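    The key idea, time-invariant HRTF filters behind a time-varying Ambisonic rotation, can be sketched for first-order B-format. This is a schematic illustration of the encode/rotate stages only (the actual system is implemented in Pure Data, and the decode stage through fixed HRTF-filtered virtual loudspeakers is omitted here):

```python
import numpy as np

def encode_b_format(signal, azimuth):
    """First-order Ambisonic (B-format) encoding of a mono source at a
    horizontal azimuth in radians; returns the W, X, Y channels."""
    w = signal / np.sqrt(2.0)          # conventional W attenuation
    x = signal * np.cos(azimuth)
    y = signal * np.sin(azimuth)
    return np.stack([w, x, y])

def rotate_b_format(b, yaw):
    """Head-tracking compensation: rotate the encoded sound field by
    -yaw so the scene stays fixed while the head turns. Only this cheap
    rotation varies over time; the HRTF filters that decode the virtual
    loudspeakers remain fixed, avoiding HRTF interpolation."""
    w, x, y = b
    c, s = np.cos(-yaw), np.sin(-yaw)
    return np.stack([w, c * x - s * y, s * x + c * y])
```

    Rotating the field by the tracked yaw is equivalent to re-encoding the source at its head-relative azimuth, which is what makes the downstream HRTF filters time-invariant.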

    Interaction patterns for auditory user interfaces

    Presented at the 11th International Conference on Auditory Display (ICAD 2005). This paper proposes the use of interaction patterns in the design process of auditory displays for human-computer interaction. To avoid introducing visual concepts into auditory design, a common ground for developing user interfaces without determining their means of representation is proposed. This meta domain allows the design of user interfaces that can be realised equally in different interaction modalities or multi-modal settings. Although this work focuses on the auditory domain, the concept is developed with the aim of being equally applicable to other modalities. A set of mode-independent interaction patterns for design in the meta domain is introduced, along with their transformation into the auditory domain. A real-world application was chosen to evaluate the approach: MS Explorer was analysed, described through the mode-independent interaction patterns, and transformed into the auditory domain, making extensive use of 3D audio rendering techniques. The result, a virtual-audio-reality version of a file manager, was evaluated with normally sighted persons as well as visually impaired and blind participants, showing the feasibility and usability of the approach.

    Report on the In-vehicle Auditory Interactions Workshop: Taxonomy, Challenges, and Approaches

    Jeon M, Hermann T, Bazilinskyy P, et al. Report on the In-vehicle Auditory Interactions Workshop: Taxonomy, Challenges, and Approaches. In: Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '15). 2015: 1-5. As driving is mainly a visual task, auditory displays play a critical role in in-vehicle interactions. To advance in-vehicle auditory interactions, auditory display researchers and automotive user interface researchers came together to discuss this timely topic at an in-vehicle auditory interactions workshop held at the International Conference on Auditory Display (ICAD). The present paper reports the discussion outcomes from that workshop to stimulate further discussion at the AutoUI conference.