10 research outputs found

    Designing non-speech sounds to support navigation in mobile phone menus

    This paper describes a framework for integrating non-speech audio into hierarchical menu structures where visual feedback is limited. The first part of the paper focuses on how to extract sound design principles from actual navigation problems. These design principles are then applied in the second part through the design, implementation and evaluation of a set of sounds in a computer-based simulation of the Nokia 6110 mobile phone. The evaluation indicates that non-speech sound improves performance on navigational tasks in terms of the number of errors made and the number of keypresses taken to complete the given tasks. This study provides both theoretical and practical insights into the design of audio cues intended to support navigation in complex menu structures.

    Workplace soundscape mapping: A trial of Macaulay and Crerar's Method

    Presented at the 12th International Conference on Auditory Display (ICAD), London, UK, June 20-23, 2006. This paper describes a trial of Macaulay and Crerar's method of mapping a workplace soundscape [1] to assess its fitness as a basis for an extended soundscape mapping method. Twelve participants took part across 14 separate environments, including academic, commercial and domestic locations. Results were visualized and subsequently collapsed to produce typical responses to typical environments, as well as specialist responses to a shared workplace.

    Sound and soundscape classification: establishing key auditory dimensions and their relative importance

    Presented at the 12th International Conference on Auditory Display (ICAD), London, UK, June 20-23, 2006. This paper investigates soundscape classification using two different forms of data gathering and two different populations. The first method involves a questionnaire completed by 75 audio professionals. The second uses a speak-aloud experiment in which 40 end users were asked to describe their audio environment. Although the two approaches differ and target different audiences, together they indicate the key dimensions of soundscape perception and their relative importance. Contrasts and similarities between the questionnaire and speak-aloud results are highlighted, and their implications for establishing a set of common terms to aid future auditory designs are also discussed.

    Dynamic Dialects: an articulatory web resource for the study of accents

    Dynamic Dialects is the product of a collaboration between researchers at the University of Glasgow, Queen Margaret University Edinburgh, University College London and Napier University, Edinburgh. It is an accent database containing an articulatory, video-based corpus of speech samples from worldwide accents of English. Videos in the corpus combine synchronised audio, ultrasound tongue imaging video and video of the moving lips.

    Seeing Speech: an articulatory web resource for the study of phonetics

    This online resource is the product of a collaboration between researchers at six Scottish universities: the University of Glasgow, Queen Margaret University, Napier University, the University of Strathclyde, the University of Edinburgh and the University of Aberdeen, as well as scholars from University College London and Cardiff University. The resource provides teachers and students of practical phonetics with ultrasound tongue imaging (UTI) video of speech, magnetic resonance imaging (MRI) video of speech, and 2D midsagittal head animations based on MRI and UTI data. The first phase of the resource began in July 2011 and was completed in September 2013; further funding was obtained in 2014 to improve and augment the resource and to develop its sister site, Dynamic Dialects. The website contains two main resources: 1. an introduction to UTI and MRI vocal tract imaging techniques, with information about the production of the articulatory animations; and 2. clickable International Phonetic Association charts linking to UTI, MRI and animated speech-articulator videos.