
    The design and evaluation of a sonically enhanced tool palette

    This paper describes an experiment to investigate the effectiveness of adding sound to tool palettes. Palettes have usability problems because users need to see the information they present, but they are often outside the area of visual focus. We used nonspeech sounds called earcons to indicate the current tool and when tool changes occurred, so that users could tell which tool was active wherever they were looking. Results showed a significant reduction in the number of tasks performed with the wrong tool: users knew what the current tool was and did not try to perform tasks with the wrong one. These benefits came without making the tool palettes any more annoying to use.
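
    A minimal sketch of the idea in Python (not the authors' implementation): each tool in the palette is assigned an earcon, and a sound is played whenever the active tool changes, so the user hears which tool is current without looking. The tool names, earcon parameters and mapping below are illustrative assumptions.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Earcon:
            timbre: str      # e.g. an instrument used for the motif
            pitch_hz: float  # fundamental frequency of the motif

        # Hypothetical mapping from palette tools to earcons.
        TOOL_EARCONS = {
            "pencil": Earcon("marimba", 261.6),  # C4
            "eraser": Earcon("marimba", 392.0),  # G4
            "fill":   Earcon("organ",   523.3),  # C5
        }

        class ToolPalette:
            def __init__(self):
                self.current_tool = None

            def select(self, tool):
                if tool != self.current_tool:
                    self.current_tool = tool
                    self.play(TOOL_EARCONS[tool])  # audible cue on every tool change

            def play(self, earcon):
                # Stand-in for real audio output (e.g. MIDI or a synthesis library).
                print(f"play {earcon.timbre} motif at {earcon.pitch_hz:.1f} Hz")

        palette = ToolPalette()
        palette.select("pencil")  # -> play marimba motif at 261.6 Hz
        palette.select("fill")    # -> play organ motif at 523.3 Hz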

    Non-visual information display using tactons

    This paper describes a novel form of display using tactile output. Tactons, or tactile icons, are structured tactile messages that can be used to communicate messages to users non-visually. A range of different parameters can be used to construct Tactons, e.g. frequency, amplitude, waveform and duration of a tactile pulse, plus body location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or on mobile and wearable devices.
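
    As a rough illustration of how a Tacton might be parameterised, the sketch below models the parameters listed above as a simple data structure; the field names and example values are assumptions, not a specification from the paper.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Tacton:
            frequency_hz: float  # vibration frequency of the pulse (assumed units)
            amplitude: float     # relative intensity, 0.0 to 1.0
            waveform: str        # e.g. "sine" or "square"
            duration_ms: int     # length of the pulse
            body_location: str   # where on the body the actuator sits

        # Two hypothetical Tactons distinguished by waveform, duration and frequency.
        incoming_call = Tacton(250.0, 0.8, "sine", 500, "wrist")
        low_battery = Tacton(100.0, 0.4, "square", 150, "wrist")
        print(incoming_call)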

    Sonically-enhanced widgets: comments on Brewster and Clarke, ICAD 1997

    This paper presents a review of the research surrounding the paper “The Design and Evaluation of a Sonically Enhanced Tool Palette” by Brewster and Clarke from ICAD 1997. A historical perspective is given, followed by a discussion of how this work has fed into current developments in the area.

    Using non-speech sounds to provide navigation cues

    This article describes 3 experiments that investigate the possibility of using structured nonspeech audio messages called earcons to provide navigational cues in a menu hierarchy. A hierarchy of 27 nodes and 4 levels was created with an earcon for each node. Rules were defined for the creation of hierarchical earcons at each node. Participants had to identify their location in the hierarchy by listening to an earcon. Results of the first experiment showed that participants could identify their location with 81.5% accuracy, indicating that earcons were a powerful method of communicating hierarchy information. One proposed use for such navigation cues is in telephone-based interfaces (TBIs), where navigation is a problem. The first experiment did not address the particular problems of earcons in TBIs, such as “does the lower quality of sound over the telephone lower recall rates?”, “can users remember earcons over a period of time?” and “what effect does training type have on recall?” An experiment was conducted, and results showed that sound quality did lower the recall of earcons. However, redesign of the earcons overcame this problem, with 73% recalled correctly. Participants could still recall earcons at this level after a week had passed. Training type also affected recall: with personal training participants recalled 73% of the earcons, but with purely textual training results were significantly lower. These results show that earcons can provide good navigation cues for TBIs. The final experiment used compound, rather than hierarchical, earcons to represent the hierarchy from the first experiment. Results showed that with sounds constructed in this way participants could recall 97% of the earcons. These experiments have developed our general understanding of earcons. A hierarchy three times larger than any previously created was tested, and this was also the first test of the recall of earcons over time.
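
    The sketch below illustrates one way hierarchical earcons of this kind can be derived: each level of the hierarchy fixes one more sound parameter, so a node's earcon inherits its ancestors' sound plus a distinguishing feature. The parameter-per-level assignment is an illustrative assumption, not the rules used in the experiments.

        # Each level of the tree fixes one more sound parameter (assumed ordering).
        LEVEL_PARAMETERS = ["timbre", "rhythm", "pitch", "tempo"]

        def hierarchical_earcon(path):
            """Map a path of child indices, e.g. [0, 2, 1], to earcon parameters."""
            earcon = {}
            for level, choice in enumerate(path):
                # Deeper nodes inherit their ancestors' parameters and add one more,
                # so a node sounds like its parent plus a distinguishing feature.
                param = LEVEL_PARAMETERS[level]
                earcon[param] = f"{param}-variant-{choice}"
            return earcon

        # A node three levels down: root -> child 0 -> child 2 -> child 1.
        print(hierarchical_earcon([0, 2, 1]))
        # {'timbre': 'timbre-variant-0', 'rhythm': 'rhythm-variant-2', 'pitch': 'pitch-variant-1'}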

    Caring, sharing widgets: a toolkit of sensitive widgets

    Although most of us communicate using multiple sensory modalities in our lives, and many of our computers are similarly capable of multi-modal interaction, most human-computer interaction is predominantly in the visual mode. This paper describes a toolkit of widgets that are capable of presenting themselves in multiple modalities, and further of adapting their presentation to suit the contexts and environments in which they are used. This is of increasing importance as the use of mobile devices becomes ubiquitous.
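
    A minimal sketch of such a context-sensitive widget, under assumed context fields and an assumed selection policy (neither taken from the toolkit itself):

        from dataclasses import dataclass

        @dataclass
        class Context:
            screen_visible: bool     # can the user currently see the display?
            ambient_noise_db: float  # how loud the environment is
            has_vibration: bool      # is a vibration actuator available?

        class SensitiveButton:
            def __init__(self, label):
                self.label = label

            def present(self, ctx):
                # Prefer the visual channel when the screen can be seen; otherwise
                # fall back to audio, and to vibration when it is too noisy to hear.
                if ctx.screen_visible:
                    return f"draw '{self.label}' on screen"
                if ctx.ambient_noise_db < 70:
                    return f"play earcon / speech for '{self.label}'"
                if ctx.has_vibration:
                    return f"vibrate pattern for '{self.label}'"
                return f"defer '{self.label}' until a channel is available"

        btn = SensitiveButton("Save")
        # Device in a noisy pocket: no screen, too loud for audio.
        print(btn.present(Context(False, 85.0, True)))  # -> vibrate pattern for 'Save'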

    Radiation Effects on Flow Characteristics in Combustion Chambers

    A JANNAF sponsored workshop was held to discuss the importance and role of radiative heat transfer in rocket combustion chambers. The potential impact of radiative transfer on hardware design, reliability, and performance was discussed. The current state of radiative transfer prediction capability in CFD modeling was reviewed and concluded to be substantially lacking in both the physical models used and the radiative property data available. There is a clear need to begin to establish a database for making radiation calculations in rocket combustion chambers. A natural starting point for this effort would be the NASA thermochemical equilibrium code (CEC).

    A Dose of Reality: Overcoming Usability Challenges in VR Head-Mounted Displays

    We identify usability challenges facing consumers adopting Virtual Reality (VR) head-mounted displays (HMDs) in a survey of 108 VR HMD users. Users reported significant issues in interacting with, and being aware of, their real-world context when using an HMD. Building upon existing work on blending real and virtual environments, we performed three design studies to address these usability concerns. In a typing study, we show that augmenting VR with a view of reality significantly corrected the performance impairment of typing in VR. We then investigated how much reality should be incorporated and when, so as to preserve users’ sense of presence in VR. For interaction with objects and peripherals, we found that selectively presenting reality as users engaged with it was optimal in terms of performance and users’ sense of presence. Finally, we investigated how this selective, engagement-dependent approach could be applied in social environments, to support the user’s awareness of the proximity and presence of others.
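
    One way to picture the engagement-dependent blending described above is sketched below: a real-world inset fades in only while the user reaches toward a physical peripheral. The distance threshold and linear fade are illustrative assumptions, not the paper's design.

        def reality_alpha(hand_distance_m, engage_at_m=0.3):
            """Opacity of the real-world inset: 0 = fully virtual, 1 = fully real."""
            if hand_distance_m >= engage_at_m:
                return 0.0  # not engaged: stay fully in VR to preserve presence
            # Fade reality in linearly as the hand approaches the peripheral.
            return 1.0 - hand_distance_m / engage_at_m

        for d in (0.50, 0.25, 0.05):
            print(f"hand at {d:.2f} m -> reality inset alpha {reality_alpha(d):.2f}")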

    Spatial audio in small display screen devices

    Our work addresses the problem of (visual) clutter in mobile device interfaces. The solution we propose involves the translation of a technique (from the graphical to the audio domain) for exploiting space in information representation. This article presents an illustrative example in the form of a spatialised audio progress bar. In usability tests, participants performed background monitoring tasks significantly more accurately using this spatialised audio (as compared with a conventional visual) progress bar. Moreover, their performance in a simultaneously running, visually demanding foreground task was significantly improved in the eyes-free monitoring condition. These results have important implications for the design of multi-tasking interfaces for mobile devices.
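
    As a rough sketch of how a progress bar might be spatialised, the example below pans a monitoring sound from left to right as the task completes; the constant-power pan mapping is an assumption for illustration, not the article's actual design.

        import math

        def progress_pan(progress):
            """Map progress in [0, 1] to (left, right) gains, constant-power panning."""
            angle = progress * math.pi / 2  # 0 = hard left, pi/2 = hard right
            return math.cos(angle), math.sin(angle)

        for p in (0.0, 0.5, 1.0):
            left, right = progress_pan(p)
            print(f"{p:.0%} done -> left gain {left:.2f}, right gain {right:.2f}")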

    A toolkit of mechanism and context independent widgets

    Most human-computer interfaces are designed to run on a static platform (e.g. a workstation with a monitor) in a static environment (e.g. an office). However, with mobile devices becoming ubiquitous and capable of running applications similar to those found on static devices, it is no longer valid to design static interfaces. This paper describes a user-interface architecture which allows interactors to be flexible about the way they are presented. This flexibility is defined by the different input and output mechanisms used. An interactor may use different mechanisms depending upon their suitability in the current context, user preference and the resources available for presentation using that mechanism.
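
    A minimal sketch of the architectural idea, with assumed mechanism names and an assumed selection policy: the interactor states what it needs, and a presentation layer picks a concrete mechanism from those currently usable.

        class Mechanism:
            def __init__(self, name, needs_screen=False, needs_audio=False):
                self.name = name
                self.needs_screen = needs_screen
                self.needs_audio = needs_audio

        # Assumed set of available presentation mechanisms.
        MECHANISMS = [
            Mechanism("on-screen dialog", needs_screen=True),
            Mechanism("earcon plus speech", needs_audio=True),
            Mechanism("vibration pulse"),
        ]

        def choose_mechanism(screen_available, audio_available, preferred=None):
            usable = [m for m in MECHANISMS
                      if (not m.needs_screen or screen_available)
                      and (not m.needs_audio or audio_available)]
            # Honour the user's preference when the preferred mechanism is usable.
            for m in usable:
                if m.name == preferred:
                    return m
            return usable[0] if usable else None

        # Phone in a pocket: screen not visible, audio permitted.
        print(choose_mechanism(screen_available=False, audio_available=True).name)
        # -> earcon plus speech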

    Haptic feedback in the training of veterinary students

    This paper reports on an initial study into the use of haptic (or touch) technology in the training of veterinary students. One major problem faced in veterinary education is that animals can be harmed by inexperienced students who are trying to learn the skills they need. The aim of the work described here is to provide haptic models to simulate internal examinations of horses, so that students can learn the basic skills required on a computer and then transfer to real animals with much less risk of injuring them.