5 research outputs found

    Spatialized data sonification in a 3D virtual environment

    Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2013. Cataloged from PDF version of thesis. Includes bibliographical references (pages 67-69). This thesis explores new ways to communicate sensor data by combining spatialized sonification with data visualization in a 3D virtual environment. A system for sonifying a space using spatialized recorded audio streams is designed, implemented, and integrated into an existing 3D graphical interface. Exploration of both real-time and archived data is enabled. In particular, algorithms for obfuscating audio to protect privacy, and for time-compressing audio to allow for exploration on diverse time scales, are implemented. Synthesized data sonification in this context is also explored. By Nicholas D. Joliat. M. Eng.
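
    The time-compression idea mentioned in this abstract can be pictured with a minimal overlap-add (OLA) sketch: unlike plain resampling, windowed frames are read at a wider hop than they are written, so the audio becomes shorter without its pitch shifting. This is a generic illustration in Python with NumPy under assumed parameter values, not the algorithm from the thesis.

    import numpy as np

    def time_compress_ola(signal, rate, frame_len=2048, synth_hop=1024):
        # Naive overlap-add time compression: frames are read `rate` times
        # further apart than they are written, shortening the audio by
        # roughly `rate` while preserving pitch (a rough sketch only).
        signal = np.asarray(signal, dtype=float)
        window = np.hanning(frame_len)
        analysis_hop = int(synth_hop * rate)
        n_frames = max(1, (len(signal) - frame_len) // analysis_hop + 1)
        out = np.zeros((n_frames - 1) * synth_hop + frame_len)
        norm = np.zeros_like(out)
        for i in range(n_frames):
            frame = signal[i * analysis_hop : i * analysis_hop + frame_len]
            win = window[: len(frame)]
            out[i * synth_hop : i * synth_hop + len(frame)] += frame * win
            norm[i * synth_hop : i * synth_hop + len(frame)] += win
        norm[norm < 1e-8] = 1.0   # avoid division by zero in silent gaps
        return out / norm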

    Ways of Guided Listening: Embodied approaches to the design of interactive sonifications

    This thesis presents three use cases for interactive feedback. In each case users interact with a system and receive feedback: the primary source of feedback is visual, while a second source of feedback is offered as sonification. The first use case is an interactive sonification system for use by pathologists in the triage stage of cancer diagnostics. Image features derived from computational homology are mapped to a soundscape, with an integrated auditory glance indicating potential regions of interest. The resulting prototype did not meet the requirements of a domain expert. In the second case, this thesis presents an interactive sonification plug-in developed for a software package for interactive visualisation of macromolecular complexes. A framework for building sonification methods in Python, together with OSC-controlled sound-producing software, was established, along with a set of sonification methods and a general sonification plug-in. It received generally positive feedback, but the mapping was deemed not very transparent. From these cases and ideas in the sonification design literature, the Subject-Position-Based Sonification Design Framework (SPBDF) was developed. It explores an alternative conception of design: that working from a frame of reference encompassing a non-expert audience will lead towards sonifications that are more easily understood. A method for analysing sonifications according to the framework's criteria is outlined and put into practice to evaluate a range of sonifications. The framework was then evaluated in the third use case, a system for sonified feedback for an exercise device designed for back pain rehabilitation. Two different sonifications, one of which used SPBDF as the basis of its design, were evaluated, indicating that interactive sonification can provide valuable feedback and improve task performance (decrease the mean speed) when the soundscape employed evokes an appropriate emotional response in the user.
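
    The Python/OSC arrangement described above can be sketched as a short parameter-mapping example: a feature value is scaled to a pitch and sent as an OSC message to a sound-producing program. The python-osc package, the port number, the OSC address, and the distance feature are all assumptions for illustration; the abstract does not name the actual software or mappings.

    from pythonosc import udp_client   # assumed OSC library

    # Hypothetical sound engine listening for OSC messages on localhost:57120.
    client = udp_client.SimpleUDPClient("127.0.0.1", 57120)

    def sonify_distance(distance, d_min=2.0, d_max=20.0):
        # Map a structural feature (here an inter-atomic distance in angstroms,
        # purely illustrative) onto a MIDI pitch and send it to the sound engine.
        t = (min(max(distance, d_min), d_max) - d_min) / (d_max - d_min)
        midi_note = 48 + t * 36        # [d_min, d_max] -> [C3, C6]
        client.send_message("/sonify/pitch", midi_note)

    sonify_distance(7.5)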

    Developing an interactive overview for non-visual exploration of tabular numerical information

    This thesis investigates the problem of obtaining overview information from complex tabular numerical data sets non-visually. Blind and visually impaired people need to access and analyse numerical data, both in education and in professional occupations. Obtaining an overview is a necessary first step in data analysis, for which current non-visual data accessibility methods offer little support. This thesis describes a new interactive parametric sonification technique called High-Density Sonification (HDS), which facilitates the process of extracting overview information from the data easily and efficiently by rendering multiple data points as single auditory events. Beyond obtaining an overview of the data, experimental studies showed that the capabilities of human auditory perception and cognition to extract meaning from HDS representations could be used to reliably estimate relative arithmetic mean values within large tabular data sets. Following a user-centred design methodology, HDS was implemented as the primary form of overview information display in a multimodal interface called TableVis. This interface supports the active process of interactive data exploration non-visually, making use of proprioception to maintain contextual information during exploration (non-visual focus+context), vibrotactile data annotations (EMA-Tactons) that can be used as external memory aids to prevent high mental workload levels, and speech synthesis to access detailed information on demand. A series of empirical studies was conducted to quantify the performance attained in the exploration of tabular data sets for overview information using TableVis, by comparing HDS with the main current non-visual accessibility technique (speech synthesis) and by quantifying the effect of data set size on user performance. These studies showed that HDS resulted in better performance than speech, and that this performance was not heavily dependent on the size of the data set. In addition, levels of subjective workload during exploration tasks using TableVis were investigated, resulting in the proposal of EMA-Tactons, vibrotactile annotations that the user can add to the data in order to prevent working memory saturation in the most demanding data exploration scenarios. An experimental evaluation found that EMA-Tactons significantly reduced mental workload in data exploration tasks. Thus, the work described in this thesis provides a basis for the interactive non-visual exploration of numerical data tables across a broad range of sizes, by offering techniques to extract overview information quickly, to perform perceptual estimations of data descriptors (relative arithmetic mean), and to manage demands on mental workload through vibrotactile data annotations, while linking seamlessly with explorations at different levels of detail and preserving spatial data representation metaphors to support collaboration with sighted users.
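
    The core idea of High-Density Sonification, rendering many data points as a single auditory event, can be sketched as follows: each row of a table is collapsed to one pitch that tracks the row's arithmetic mean, so comparing rows by ear amounts to comparing pitches. The MIDI range and linear scaling are illustrative assumptions, not the mappings used in TableVis.

    import numpy as np

    def hds_row_pitches(table, low_note=48, high_note=84):
        # One auditory event per row: the row's mean value is scaled linearly
        # into a MIDI pitch range (the choices here are illustrative only).
        table = np.asarray(table, dtype=float)
        row_means = table.mean(axis=1)
        lo, hi = row_means.min(), row_means.max()
        span = hi - lo if hi > lo else 1.0
        return low_note + (row_means - lo) / span * (high_note - low_note)

    # Rows with larger means come out as higher pitches.
    print(hds_row_pitches([[3, 4, 5], [10, 12, 11], [6, 7, 8]]))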

    Spatial Auditory Maps for Blind Travellers

    Empirical research shows that blind persons benefit in their mobility when they have the ability and opportunity to access geographic map information tactually. Unfortunately, tangible maps are not found in large numbers. Economics is the leading explanation: tangible maps are expensive to build, duplicate and distribute. SAM, short for Spatial Auditory Map, is a prototype created to address the unavailability of tangible maps. SAM presents geographic information to a blind person encoded in sound. A blind person receives maps electronically and accesses them using a small, inexpensive digitizing tablet connected to a PC. The interface provides location-dependent sound as a stylus is manipulated by the user, plus a schematic visual representation for users with residual vision. The assessment of SAM with a group of blind participants suggests that blind users can learn unknown environments as complex as the ones represented by tactile maps, in the same amount of reading time. This research opens new avenues in visualization techniques, promotes alternative communication methods, and proposes a human-computer interaction framework for conveying map information to a blind person.
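
    The location-dependent sound behaviour described above can be pictured as a simple lookup: the stylus position, in normalised tablet coordinates, is tested against pre-segmented map regions and the matching region's sound is returned. The region names, bounding boxes, and sound files below are hypothetical, not taken from SAM.

    # Hypothetical map segmentation: each region carries the sound played
    # while the stylus is over it.
    REGIONS = [
        {"name": "park",   "bbox": (0.0, 0.0, 0.4, 0.5), "sound": "birds.wav"},
        {"name": "street", "bbox": (0.4, 0.0, 1.0, 0.5), "sound": "traffic.wav"},
        {"name": "lake",   "bbox": (0.0, 0.5, 1.0, 1.0), "sound": "water.wav"},
    ]

    def sound_at(x, y):
        # Return the sound for the region under the stylus, or None over
        # empty space (x and y are normalised tablet coordinates in [0, 1]).
        for region in REGIONS:
            x0, y0, x1, y1 = region["bbox"]
            if x0 <= x < x1 and y0 <= y < y1:
                return region["sound"]
        return None

    print(sound_at(0.2, 0.3))   # -> "birds.wav"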

    A Path Based Model for Sonification

    Recently, researchers have become interested in non-visual forms of presentation: sound, touch, and smell, as well as vision, can be used to depict information. One important medium is sound; it can encode information for blind or partially sighted users, it can display information when it is impossible to use a screen, and the hardware is cheap and widely available. Researchers have investigated sonifying various data sources with different data types and configurations. In this paper we present a novel path-based model that can be used to describe each of these different sonifications. In summary, the path dictates the sonification tour, the data is mapped into sound via a transfer function, and the quantity of information being sonified is determined by both the span and how the abstract path is registered to the data.
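
    The three ingredients named in this summary, the path that dictates the tour, the transfer function that maps data to sound, and the span that sets how much data feeds each event, can be written down directly as a sketch. The function names and the example mapping are illustrative, not taken from the paper.

    def sonify_along_path(data, path, transfer, span=1):
        # Walk the path over the data, group `span` points per auditory event,
        # and map each group through the transfer function.
        events = []
        for i in range(0, len(path), span):
            group = [data[idx] for idx in path[i:i + span]]
            events.append(transfer(sum(group) / len(group)))
        return events

    def pitch(value):
        # Illustrative transfer function: data value -> frequency in hertz.
        return 220.0 * (1.0 + value)

    data = [0.1, 0.4, 0.9, 0.3, 0.7, 0.2]
    tour = list(range(len(data)))        # a simple left-to-right path
    print(sonify_along_path(data, tour, pitch, span=2))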