    Non-visual information display using tactons

    This paper describes a novel form of display using tactile output. Tactons, or tactile icons, are structured tactile messages that can be used to communicate messages to users non-visually. A range of different parameters can be used to construct Tactons, e.g. the frequency, amplitude, waveform and duration of a tactile pulse, plus body location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or on mobile and wearable devices.
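    The parameter space listed above can be pictured as a simple data structure. The following is a hypothetical sketch for illustration only; the paper does not prescribe any particular encoding, and the names and values here are assumptions:

    ```python
    from dataclasses import dataclass
    from enum import Enum

    class Waveform(Enum):
        SINE = "sine"
        SQUARE = "square"
        SAWTOOTH = "sawtooth"

    @dataclass
    class TactilePulse:
        frequency_hz: float   # vibration frequency of the pulse
        amplitude: float      # relative intensity, 0.0 to 1.0
        waveform: Waveform
        duration_ms: int

    @dataclass
    class Tacton:
        pulses: list          # ordered sequence of TactilePulse
        body_location: str    # where the actuator sits, e.g. "wrist"

    # A short two-pulse Tacton delivered to the wrist: a strong pulse
    # followed by a weaker, longer one.
    alert = Tacton(
        pulses=[TactilePulse(250, 0.8, Waveform.SINE, 100),
                TactilePulse(250, 0.4, Waveform.SINE, 200)],
        body_location="wrist",
    )
    ```

    Structuring the message this way makes the abstract's point concrete: meaning can be carried by varying any of these fields rather than by a visual display.
    
    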

    Messy Tabletops: Clearing Up The Occlusion Problem

    When introducing interactive tabletops into the home and office, lack of space will often mean that these devices play two roles: interactive display and a place for putting things. Clutter on the table surface may occlude information on the display, preventing the user from noticing it or interacting with it. We present a technique for dealing with clutter on tabletops which finds a suitable unoccluded area of the display in which to show content. We discuss the implementation of this technique and some design issues which arose during implementation.
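    One way to realise such a technique is to scan candidate windows over an occlusion map of the display surface. This is an illustrative sketch under assumed inputs (a boolean grid marking cluttered cells), not the paper's actual implementation:

    ```python
    import numpy as np

    def find_unoccluded_area(occlusion, w, h):
        """Return the top-left (row, col) of the first w x h window that
        contains no occluded cells, or None if no such window exists.
        `occlusion` is a 2-D boolean array where True marks cells covered
        by physical clutter on the tabletop."""
        rows, cols = occlusion.shape
        for r in range(rows - h + 1):
            for c in range(cols - w + 1):
                if not occlusion[r:r + h, c:c + w].any():
                    return (r, c)
        return None

    # A 4x6 display grid with clutter covering the top-left corner.
    occ = np.zeros((4, 6), dtype=bool)
    occ[0:2, 0:3] = True
    print(find_unoccluded_area(occ, w=3, h=2))  # -> (0, 3)
    ```

    A production version would presumably weight candidate areas by distance to the user or to the occluded content, but the core search reduces to this kind of window scan.
    
    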

    An Evaluation of Touch and Pressure-Based Scrolling and Haptic Feedback for In-car Touchscreens

    An in-car study was conducted to examine different input techniques for list-based scrolling tasks and the effectiveness of haptic feedback for in-car touchscreens. The use of physical switchgear on centre consoles is decreasing, which allows designers to develop new ways to interact with in-car applications. However, these new methods need to be evaluated to ensure they are usable. Therefore, three input techniques were tested: direct scrolling, pressure-based scrolling and scrolling using onscreen buttons on a touchscreen. The results showed that direct scrolling was less accurate than using onscreen buttons and pressure input, but took almost half the time when compared to the onscreen buttons and was almost three times quicker than pressure input. Vibrotactile feedback did not improve input performance but was preferred by the users. Understanding the speed vs. accuracy trade-off between these input techniques will allow better decisions when designing safer in-car interfaces for scrolling applications.

    The challenges of mobile devices for human computer interaction

    Current mobile computing devices such as palmtop computers, personal digital assistants (PDAs) and mobile phones, and future devices such as Bluetooth and GSM enabled cameras and music players, have many implications for the design of the user interface. These devices share a common problem: attempting to give users access to powerful computing services and resources through small interfaces, which typically have tiny visual displays, poor audio interaction facilities and limited input techniques. They also introduce new challenges such as designing for intermittent and expensive network access, and designing for position awareness and context sensitivity. No longer can designers base computing designs around the traditional model of a single user working with a personal computer at his/her workplace. In addition to mobility and size requirements, mobile devices will also typically be used by a larger population spread than traditional PCs and without any training or support networks, whether formal or informal. Furthermore, unlike early computers which had many users per computer, and PCs with usually one computer per user, a single user is likely to own many mobile devices [1] which they interact with in different ways and for different tasks.

    Human computer interaction with mobile devices (editorial for special edition)

    The second international workshop on human-computer interaction with mobile devices took place on 30th August, 1999 as part of the IFIP INTERACT '99 conference held in Edinburgh, UK. We had over 60 participants with an almost equal mix between academic and industrial attendees from within Europe, North America and Asia. The first workshop had been held in Glasgow the year before and was one of the first to bring together researchers interested in how to design usable interfaces for mobile computers. It was such a success that we decided to run another; this was obviously an area where there were many problems and many people looking for solutions. The growth of the mobile computing market is rapid. The take-up of mobile telephones and personal digital assistants has been dramatic: huge numbers of people now own a mobile device of some kind. But there are still big problems with usability. It is hard to design interfaces and interactions for devices that have small or no screens and limited computing resources, and this is becoming worse as more and more complexity is being integrated into these small devices.

    Multi-Moji: Combining Thermal, Vibrotactile and Visual Stimuli to Expand the Affective Range of Feedback

    This paper explores the combination of multiple concurrent modalities for conveying emotional information in HCI: temperature, vibration and abstract visual displays. Each modality has been studied individually, but can only convey a limited range of emotions within two-dimensional valence-arousal space. This paper is the first to systematically combine multiple modalities to expand the available affective range. Three studies were conducted: Study 1 measured the emotionality of vibrotactile feedback by itself; Study 2 measured the perceived emotional content of three bimodal combinations: vibrotactile + thermal, vibrotactile + visual and visual + thermal. Study 3 then combined all three modalities. Results show that combining modalities increases the available range of emotional states, particularly in the problematic top-right and bottom-left quadrants of the dimensional model. We also provide a novel lookup resource for designers to identify stimuli to convey a range of emotions.
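    A designer-facing lookup resource of this kind might take the shape below. The quadrant keys follow the valence-arousal model mentioned in the abstract, but every stimulus value here is a made-up placeholder, not data from the paper:

    ```python
    # Hypothetical lookup: target quadrant of valence-arousal space ->
    # a candidate multimodal stimulus combination. Illustrative only.
    AFFECTIVE_LOOKUP = {
        ("positive", "high"): {"vibration": "fast pulse",  "thermal": "warm", "visual": "bright expanding"},
        ("positive", "low"):  {"vibration": "gentle hum",  "thermal": "warm", "visual": "slow fade"},
        ("negative", "high"): {"vibration": "rough burst", "thermal": "cold", "visual": "sharp flicker"},
        ("negative", "low"):  {"vibration": "slow weak",   "thermal": "cold", "visual": "dim static"},
    }

    def stimuli_for(valence, arousal):
        """Look up a candidate stimulus set for a target emotional state."""
        return AFFECTIVE_LOOKUP[(valence, arousal)]

    # Target the top-right quadrant (positive valence, high arousal).
    print(stimuli_for("positive", "high")["thermal"])  # -> warm
    ```
    
    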

    The design and evaluation of a sonically enhanced tool palette

    This paper describes an experiment to investigate the effectiveness of adding sound to tool palettes. Palettes have usability problems because users need to see the information they present, but they are often outside the area of visual focus. We used non-speech sounds called earcons to indicate the current tool and when tool changes occurred, so that users could tell which tool was active wherever they were looking. Results showed a significant reduction in the number of tasks performed with the wrong tool. Therefore, users knew what the current tool was and did not try to perform tasks with the wrong one. These gains came without making the tool palettes any more annoying to use.

    Evaluating Multimodal Driver Displays of Varying Urgency

    Previous studies have evaluated audio, visual and tactile warnings for drivers, highlighting the importance of conveying the appropriate level of urgency through the signals. However, these modalities have never been combined exhaustively with different urgency levels and tested while using a driving simulator. This paper describes two experiments investigating all multimodal combinations of such warnings across three different levels of designed urgency. The warnings were first evaluated in terms of perceived urgency and perceived annoyance in the context of a driving simulator. The results showed that the perceived urgency matched the designed urgency of the warnings. More urgent warnings were also rated as more annoying, but the effect on annoyance was smaller than the effect on urgency. The warnings were then tested for recognition time when presented during a simulated driving task. It was found that warnings of high urgency induced quicker and more accurate responses than warnings of medium and of low urgency. In both studies, the number of modalities used in warnings (one, two or three) affected both subjective and objective responses. More modalities led to higher ratings of urgency and annoyance, with annoyance again showing the smaller effect, and also led to quicker responses. These results provide implications for multimodal warning design and reveal how modalities and modality combinations can influence participant responses during a simulated driving task.

    Rhythmic Micro-Gestures: Discreet Interaction On-the-Go

    We present rhythmic micro-gestures, micro-movements of the hand that are repeated in time with a rhythm. We present a user study that investigated how well users can perform rhythmic micro-gestures and whether they can use them eyes-free with non-visual feedback. We found that users could successfully use our interaction technique (97% success rate across all gestures) with short interaction times, and rated them as low in difficulty. Simple audio cues that only convey the rhythm outperformed animations showing the hand movements, supporting rhythmic micro-gestures as an eyes-free input technique.
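    Recognising a rhythmic micro-gesture amounts to checking whether a sequence of detected movement events matches a target rhythm within some timing tolerance. This is an illustrative sketch under assumed inputs (event timestamps in seconds), not the paper's recognition method:

    ```python
    def matches_rhythm(event_times, template_intervals, tol=0.15):
        """Check whether gesture event timestamps reproduce a target rhythm.
        `template_intervals` gives the expected gaps between successive
        events; `tol` is the allowed relative timing error per gap."""
        if len(event_times) != len(template_intervals) + 1:
            return False
        for i, expected in enumerate(template_intervals):
            actual = event_times[i + 1] - event_times[i]
            if abs(actual - expected) > tol * expected:
                return False
        return True

    # A "short-short-long" rhythm: 0.25 s, 0.25 s, 0.5 s between taps.
    print(matches_rhythm([0.0, 0.26, 0.50, 1.02], [0.25, 0.25, 0.5]))  # -> True
    ```

    Using a relative tolerance means the same template tolerates the natural timing jitter of human movement, which is what makes eyes-free use plausible.
    
    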

    Automatically Adapting Home Lighting to Assist Visually Impaired Children

    For visually impaired children, activities like finding everyday items, locating favourite toys and moving around the home can be challenging. Assisting them during these activities is important because it promotes independence and encourages them to use and develop their remaining visual function. We describe our work towards a system that adapts the lighting conditions at home to help visually impaired children with everyday tasks. We discuss scenarios that show how they may benefit from adaptive lighting, report on our progress and describe our planned future work and evaluation.