
    The design and evaluation of a sonically enhanced tool palette

    This paper describes an experiment investigating the effectiveness of adding sound to tool palettes. Palettes have usability problems because users need to see the information they present, yet palettes often lie outside the area of visual focus. We used non-speech sounds called earcons to indicate the current tool and to signal tool changes, so that users could tell which tool was active wherever they were looking. Results showed a significant reduction in the number of tasks performed with the wrong tool: users knew what the current tool was and did not attempt tasks with the wrong one. None of this came at the expense of making the tool palettes any more annoying to use.
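
    As a minimal sketch of the mechanism described above (not the authors' implementation), each tool can be mapped to a distinct earcon that is played whenever the active tool changes; the play_tone helper below is a hypothetical stand-in for real audio playback.

        def play_tone(pitch_hz, duration_ms):
            # Placeholder for real earcon playback on an audio device.
            print(f"earcon: {pitch_hz} Hz for {duration_ms} ms")

        class SonicToolPalette:
            def __init__(self, earcons):
                self.earcons = earcons          # tool name -> (pitch_hz, duration_ms)
                self.current_tool = None

            def select(self, tool):
                if tool != self.current_tool:   # sound only on an actual change
                    self.current_tool = tool
                    play_tone(*self.earcons[tool])

        palette = SonicToolPalette({"pen": (440.0, 150), "eraser": (660.0, 150)})
        palette.select("pen")     # plays the pen earcon
        palette.select("eraser")  # plays the eraser earcon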

    Sonically-enhanced widgets: comments on Brewster and Clarke, ICAD 1997

    This paper presents a review of the research surrounding the paper “The Design and Evaluation of a Sonically Enhanced Tool Palette” by Brewster and Clarke from ICAD 1997. A historical perspective is given, followed by a discussion of how this work has fed into current developments in the area.

    Using non-speech sounds to provide navigation cues

    This article describes three experiments that investigate the possibility of using structured non-speech audio messages called earcons to provide navigational cues in a menu hierarchy. A hierarchy of 27 nodes and four levels was created, with an earcon for each node, and rules were defined for creating the hierarchical earcons at each node. Participants had to identify their location in the hierarchy by listening to an earcon. Results of the first experiment showed that participants could identify their location with 81.5% accuracy, indicating that earcons are a powerful method of communicating hierarchy information. One proposed use for such navigation cues is in telephone-based interfaces (TBIs), where navigation is a problem. The first experiment did not address the particular problems of earcons in TBIs, such as: “Does the lower quality of sound over the telephone lower recall rates?”, “Can users remember earcons over a period of time?” and “What effect does training type have on recall?” A second experiment showed that sound quality did lower the recall of earcons. However, a redesign of the earcons overcame this problem, with 73% recalled correctly, and participants could still recall earcons at this level after a week had passed. Training type also affected recall: with personal training, participants recalled 73% of the earcons, but with purely textual training results were significantly lower. These results show that earcons can provide good navigation cues for TBIs. The final experiment used compound, rather than hierarchical, earcons to represent the hierarchy from the first experiment; with sounds constructed in this way, participants could recall 97% of the earcons. These experiments have developed our general understanding of earcons: a hierarchy three times larger than any previously created was tested, and this was also the first test of the recall of earcons over time.
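
    The rule-based construction of hierarchical earcons can be sketched as follows; the specific pitch and duration rules are illustrative assumptions, not the article's actual sound design. The key property is that a child's earcon begins with its parent's motifs, so the sound itself encodes the path through the hierarchy.

        def hierarchical_earcon(path):
            """Return (pitch_hz, duration_ms) notes for the node at `path`,
            where `path` lists sibling indices per level, e.g. [2, 0]."""
            earcon = []
            for level, sibling in enumerate(path):
                pitch = 261.6 * (1.25 ** sibling)       # sibling index shifts pitch
                duration = max(100, 400 - 100 * level)  # deeper levels: shorter notes
                earcon.append((round(pitch, 1), duration))
            return earcon

        print(hierarchical_earcon([2]))     # a top-level node's earcon
        print(hierarchical_earcon([2, 0]))  # its child: same motif plus one more note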

    Non-visual information display using tactons

    This paper describes a novel form of display using tactile output. Tactons, or tactile icons, are structured tactile messages that can be used to communicate messages to users non-visually. A range of different parameters can be used to construct Tactons, e.g. the frequency, amplitude, waveform and duration of a tactile pulse, plus body location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as in interfaces for blind people or on mobile and wearable devices.
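
    A minimal sketch of a Tacton as a structured message built from the parameters listed above; the field names and values are illustrative assumptions, and render simply prints where real code would drive a tactile actuator.

        from dataclasses import dataclass

        @dataclass
        class TactilePulse:
            frequency_hz: float   # felt as roughness/"pitch" on the skin
            amplitude: float      # drive level, 0.0-1.0
            waveform: str         # e.g. "sine" or "square"
            duration_ms: int

        @dataclass
        class Tacton:
            pulses: list          # sequence of TactilePulse
            body_location: str    # e.g. "left wrist", "fingertip"

            def render(self):
                for p in self.pulses:
                    print(f"{self.body_location}: {p.waveform} {p.frequency_hz} Hz "
                          f"@ {p.amplitude:.1f} for {p.duration_ms} ms")

        # A hypothetical "message received" Tacton: two short pulses at the wrist.
        alert = Tacton([TactilePulse(250, 0.8, "sine", 100)] * 2, "left wrist")
        alert.render()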

    Two-handed navigation in a haptic virtual environment

    This paper describes the initial results from a study of a two-handed interaction paradigm for tactile navigation by blind and visually impaired users. Participants were set the task of navigating a virtual maze environment using their dominant hand to move the cursor, while receiving contextual information in the form of tactile cues presented to their non-dominant hand. Results suggest that most participants were comfortable with the two-handed style of interaction even with little training. Two sets of contextual cues were examined, with information presented through either static patterns or tactile flow of raised pins. The initial results suggest that while both sets of cues were usable, participants performed significantly better and faster with the static cues.

    The design and evaluation of a vibrotactile progress bar

    We present an investigation into the use of Tactons to present progress information. Progress bars are common but must compete for screen space and visual attention with other visual tasks. We created a tactile progress indicator that encodes progress into a series of vibrotactile pulses. An experiment comparing the tactile progress indicator to a standard visual one showed a significant improvement in performance and an overall preference for the tactile display.
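
    One plausible encoding, sketched below as an assumption rather than the paper's exact design: play a fixed reference pulse followed by a second pulse whose gap shrinks as the task nears completion, so the rhythm alone conveys how much work remains. vibrate is a hypothetical stand-in for a real actuator call.

        import time

        def vibrate(duration_ms):
            print(f"bzz ({duration_ms} ms)")  # placeholder for a real actuator

        def progress_pulse(fraction_done, max_gap_ms=1000):
            vibrate(50)                                  # reference pulse
            gap_ms = max_gap_ms * (1.0 - fraction_done)  # gap shrinks with progress
            time.sleep(gap_ms / 1000.0)
            vibrate(50)                                  # progress pulse

        for step in range(5):
            progress_pulse(step / 4)  # at 100% progress, the two pulses coincide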

    "Sitting too close to the screen can be bad for your ears": A study of audio-visual location discrepancy detection under different visual projections

    In this work, we look at the perception of event locality under conditions of disparate audio and visual cues. We address an aspect of the so-called “ventriloquism effect” relevant for multimedia designers; namely, how auditory perception of event locality is influenced by the size and scale of the accompanying visual projection of those events. We observed that recalibration of the visual axes of an audio-visual animation (by resizing and zooming) exerts a recalibrating influence on auditory space perception. In particular, sensitivity to audio-visual discrepancies (between a centrally located visual stimulus and a laterally displaced audio cue) increases near the edge of the screen on which the visual cue is displayed. In other words, discrepancy detection thresholds are not fixed for a particular pair of stimuli, but are influenced by the size of the display space. Moreover, the discrepancy thresholds are influenced by scale as well as size. That is, the boundary of auditory space perception is not rigidly fixed to the boundaries of the screen; it also depends on the spatial relationship depicted. For example, the ventriloquism effect will break down within the boundaries of a large screen if zooming is used to exaggerate the proximity of the audience to the events. The latter effect appears to be much weaker than the former.

    Multi-Moji: Combining Thermal, Vibrotactile and Visual Stimuli to Expand the Affective Range of Feedback

    This paper explores the combination of multiple concurrent modalities for conveying emotional information in HCI: temperature, vibration and abstract visual displays. Each modality has been studied individually, but each can only convey a limited range of emotions within the two-dimensional valence-arousal space. This paper is the first to systematically combine multiple modalities to expand the available affective range. Three studies were conducted: Study 1 measured the emotionality of vibrotactile feedback by itself; Study 2 measured the perceived emotional content of three bimodal combinations (vibrotactile + thermal, vibrotactile + visual and visual + thermal); Study 3 then combined all three modalities. Results show that combining modalities increases the available range of emotional states, particularly in the problematic top-right and bottom-left quadrants of the dimensional model. We also provide a novel lookup resource for designers to identify stimuli that convey a range of emotions.

    Using compound earcons to represent hierarchies

    Previous research on non-speech audio messages called earcons showed that they could provide powerful navigation cues in menu hierarchies. That work used hierarchical earcons. In this paper we suggest that compound earcons provide a more flexible method for presenting this information. A set of sounds was created to represent the numbers 0-4 and “dot”; a sound for any node in a hierarchy can then be created by concatenating these simple sounds. A hierarchy of four levels and 27 nodes was constructed. An experiment was conducted in which participants had to identify their location in the hierarchy by listening to an earcon. Results showed that participants could identify their location with over 97% accuracy, significantly better than with hierarchical earcons. Participants were also able to recognise previously unheard earcons with over 97% accuracy. These results show that compound earcons are an effective way of representing hierarchies in sound.
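
    The construction described above can be sketched directly: one fixed sound per symbol (the digits 0-4 plus “dot”), concatenated to spell out a node's address. The pitch and duration values below are illustrative assumptions. Because any address can be sounded out symbol by symbol, even previously unheard earcons remain decodable, which matches the recognition results reported.

        SYMBOL_SOUNDS = {
            "0": (262.0, 200), "1": (294.0, 200), "2": (330.0, 200),
            "3": (349.0, 200), "4": (392.0, 200), ".": (523.0, 100),
        }  # symbol -> (pitch_hz, duration_ms)

        def compound_earcon(node_address):
            """Concatenate per-symbol sounds for an address like '1.2.3'."""
            return [SYMBOL_SOUNDS[ch] for ch in node_address]

        # Any node in the 4-level, 27-node hierarchy can be sounded out,
        # including nodes a listener has never heard before:
        print(compound_earcon("1.2.3"))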

    Touching the invisible: Localizing ultrasonic haptic cues

    While mid-air gestures offer new possibilities to interact with or around devices, some situations, such as interacting with applications, playing games or navigating, may require visual attention to be focused on a main task. Ultrasonic haptic feedback can provide 3D spatial haptic cues that do not demand visual attention in these contexts. In this paper, we present an initial study of active exploration of ultrasonic haptic virtual points, investigating spatial localization with and without the use of the visual modality. Our results show that, when haptic feedback gives the location of a widget, users perform 50% more accurately than with visual feedback alone. When given a haptic location of a widget alone, users are more than 30% more accurate than when given a visual location. When users are aware of the location of the haptic feedback, active exploration decreases the minimum recommended widget size from 2 cm² to 1 cm², compared with passive exploration in previous studies. Our results will allow designers to create better mid-air interactions using this new form of haptic feedback.