
    An introduction to interactive sonification

    The research field of sonification, a subset of the topic of auditory display, has developed rapidly in recent decades. It brings together interests from the areas of data mining, exploratory data analysis, human–computer interfaces, and computer music. Sonification presents information by using sound (particularly non-speech), so that the user of an auditory display obtains a deeper understanding of the data or processes under investigation by listening.
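A common sonification technique (not specific to any paper listed here) is parameter mapping, in which data values are mapped to an acoustic parameter such as pitch so that trends become audible. A minimal sketch, assuming a linear mapping and an arbitrary A3–A5 frequency range:

```python
def map_to_pitch(values, f_min=220.0, f_max=880.0):
    """Map each data value linearly to a frequency in [f_min, f_max] Hz,
    so that rising data is heard as rising pitch."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    return [f_min + (v - lo) / span * (f_max - f_min) for v in values]

# The lowest value maps to f_min, the highest to f_max:
pitches = map_to_pitch([0, 5, 10])
```

The resulting frequencies would then drive a synthesiser or sample player; the mapping itself is the sonification design decision.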

    Ecological IVIS design: using EID to develop a novel in-vehicle information system

    New in-vehicle information systems (IVIS) are emerging which purport to encourage more environmentally friendly or ‘green’ driving. Meanwhile, wider concerns about road safety and in-car distractions remain. The ‘Foot-LITE’ project is an effort to balance these issues, aimed at achieving safer and greener driving through real-time driving information, presented via an in-vehicle interface which facilitates the desired behaviours while avoiding negative consequences. One way of achieving this is to use ecological interface design (EID) techniques. This article presents part of the formative human-centred design process for developing the in-car display through a series of rapid prototyping studies comparing EID against conventional interface design principles. We focus primarily on the visual display, although some development of an ecological auditory display is also presented. The results of feedback from potential users as well as subject matter experts are discussed with respect to implications for future interface design in this field.

    Considerations in Designing Human-Computer Interfaces for Elderly People

    As computing devices continue to become more heavily integrated into our lives, proper design of human-computer interfaces becomes a more important topic of discussion. Efficient and useful human-computer interfaces need to take into account the abilities of the humans who will be using such interfaces, and adapt to difficulties that different users may face – such as the difficulties that elderly users must deal with. Interfaces that allow for user-specific customization, while taking into account the multiple difficulties that older users might face, can assist the elderly in properly using these newer computing devices, and in doing so possibly achieve a better quality of life through the advanced technological support that these devices offer. In this paper, we explore common problems the elderly face when using computing devices and solutions developed for these problems. Difficulties ultimately fall into several categories: cognitive, auditory, haptic, visual, and motor-based. We also present an idea for a new adaptive operating system with advanced customizations that would simplify computing for older users.

    Spatial audio in small display screen devices

    Our work addresses the problem of (visual) clutter in mobile device interfaces. The solution we propose involves the translation of techniques from the graphical to the audio domain for exploiting space in information representation. This article presents an illustrative example in the form of a spatialised audio progress bar. In usability tests, participants performed background monitoring tasks significantly more accurately using this spatialised audio (as compared with a conventional visual) progress bar. Moreover, their performance in a simultaneously running, visually demanding foreground task was significantly improved in the eyes-free monitoring condition. These results have important implications for the design of multi-tasking interfaces for mobile devices.
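One standard way to realise a spatialised audio progress bar (a sketch of the general technique, not the paper's actual implementation) is to map progress to a stereo pan position and use constant-power panning so perceived loudness stays steady as the sound sweeps across the field. The function names below are assumptions for illustration:

```python
import math

def progress_to_pan(progress):
    """Map task progress in [0, 1] to a stereo pan position in [-1, 1]
    (-1 = hard left, +1 = hard right), so task completion is heard as a
    sound source moving across the stereo field."""
    p = min(max(progress, 0.0), 1.0)  # clamp out-of-range input
    return 2.0 * p - 1.0

def pan_gains(pan):
    """Constant-power panning: return (left, right) amplitude gains
    whose squared sum is 1, keeping perceived loudness constant."""
    theta = (pan + 1.0) * math.pi / 4.0  # -1..+1 maps to 0..pi/2
    return math.cos(theta), math.sin(theta)

# At 50% progress the source sits at centre, with equal channel gains:
left, right = pan_gains(progress_to_pan(0.5))
```

The gains would scale the left and right channels of whatever monitoring sound the interface plays.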

    Music Maker – A Camera-based Music Making Tool for Physical Rehabilitation

    The therapeutic effects of playing music are being recognized increasingly in the field of rehabilitation medicine. People with physical disabilities, however, often do not have the motor dexterity needed to play an instrument. We developed a camera-based human-computer interface called "Music Maker" to provide such people with a means to make music by performing therapeutic exercises. Music Maker uses computer vision techniques to convert the movements of a patient's body part, for example, a finger, hand, or foot, into musical and visual feedback using the open software platform EyesWeb. It can be adjusted to a patient's particular therapeutic needs and provides quantitative tools for monitoring the recovery process and assessing therapeutic outcomes. We tested the potential of Music Maker as a rehabilitation tool with six subjects who responded to or created music in various movement exercises. In these proof-of-concept experiments, Music Maker has performed reliably and shown its promise as a therapeutic device.
    Funding: National Science Foundation (IIS-0308213, IIS-039009, IIS-0093367, P200A01031, EIA-0202067 to M.B.); National Institutes of Health (DC-03663 to E.S.); Boston University Dudley Allen Sargent Research Fund (to A.L.).
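The core of such a movement-to-music mapping can be illustrated with a tiny sketch: quantise a tracked body-part position in the camera frame to a note of a musical scale. This is a generic illustration under assumed parameters, not Music Maker's actual scheme:

```python
# C-major scale over one octave as MIDI note numbers (illustrative choice):
C_MAJOR = (60, 62, 64, 65, 67, 69, 71, 72)

def position_to_midi_note(y, frame_height, scale=C_MAJOR):
    """Quantise a tracked vertical position (pixels, 0 = top of frame)
    to a scale note: higher in the frame means higher pitch."""
    rel = 1.0 - min(max(y / frame_height, 0.0), 1.0)  # invert: top = high
    idx = min(int(rel * len(scale)), len(scale) - 1)
    return scale[idx]

# A hand tracked at the top of a 480-pixel-tall frame plays the top note:
note = position_to_midi_note(0, 480)
```

A real system would feed the tracked coordinates from the vision pipeline into this mapping each frame and send the resulting notes to a synthesiser.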

    Spotting Agreement and Disagreement: A Survey of Nonverbal Audiovisual Cues and Tools

    While detecting and interpreting temporal patterns of nonverbal behavioral cues in a given context is a natural and often unconscious process for humans, it remains a rather difficult task for computer systems. Nevertheless, it is an important one to achieve if the goal is to realise a naturalistic communication between humans and machines. Machines that are able to sense social attitudes like agreement and disagreement and respond to them in a meaningful way are likely to be welcomed by users due to the more natural, efficient and human-centered interaction they are bound to experience. This paper surveys the nonverbal cues that could be present during agreement and disagreement behavioural displays and lists a number of tools that could be useful in detecting them, as well as a few publicly available databases that could be used to train these tools for analysis of spontaneous, audiovisual instances of agreement and disagreement.

    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction including: frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
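The construction parameters the abstract lists (frequency, amplitude, duration, rhythm, location) can be pictured as a simple data structure. A minimal sketch, with field names chosen here for illustration rather than taken from the paper:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Pulse:
    frequency_hz: float  # vibrotactile frequency of this pulse
    amplitude: float     # drive level in [0, 1]
    duration_ms: int

@dataclass
class Tacton:
    """A structured tactile message: a rhythmic sequence of pulses
    delivered at a given body location."""
    name: str
    location: str                               # e.g. "wrist" or "waist"
    rhythm: List[Pulse] = field(default_factory=list)

    def total_duration_ms(self) -> int:
        return sum(p.duration_ms for p in self.rhythm)

# A two-pulse "message received" Tacton, purely as an example:
alert = Tacton("message", "wrist",
               [Pulse(250.0, 0.8, 100), Pulse(250.0, 0.8, 200)])
```

Varying rhythm and location while holding the other parameters fixed is what lets a small vocabulary of pulses encode many distinct messages.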