
    Augmenting Graphical User Interfaces with Haptic Assistance for Motion-Impaired Operators

    Haptic assistance is an emerging field of research that is designed to improve human-computer interaction (HCI) by reducing error rates and targeting times through the use of force feedback. Haptic feedback has previously been investigated as a way of assisting motion-impaired computer users; however, limitations such as target distracters have hampered its integration with graphical user interfaces (GUIs). In this paper two new haptic assistive techniques are presented that utilise the 3DOF capabilities of the Phantom Omni: deformable haptic cones and deformable virtual switches. The assistance is designed specifically to enable motion-impaired operators to use existing GUIs more effectively. Experiment 1 investigates the performance benefits of the new haptic techniques when used in conjunction with the densely populated Windows on-screen keyboard (OSK). Experiment 2 utilises the ISO 9241-9 point-and-click task to investigate the effects of target size and shape. The results of the study show that the newly proposed techniques improve interaction rates and can be integrated with existing software without many of the drawbacks of traditional haptic assistance. Deformable haptic cones and deformable virtual switches were shown to reduce the mean number of missed clicks by at least 75% and to reduce targeting times by at least 25%.
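    The ISO 9241-9 task mentioned in Experiment 2 is conventionally scored with Fitts's-law throughput. As a rough illustration only (the paper's exact analysis is not reproduced here), the standard Shannon formulation can be computed as follows; the function name and the example numbers are illustrative, not values from the study.

        import math

        def fitts_throughput(distance, width, movement_time_s):
            """Fitts's-law throughput (bits/s) in the Shannon formulation
            commonly used to score ISO 9241-9 point-and-click trials.

            distance: centre-to-centre movement amplitude
            width:    target width along the movement axis (same unit)
            movement_time_s: time from movement onset to selection
            """
            index_of_difficulty = math.log2(distance / width + 1.0)  # bits
            return index_of_difficulty / movement_time_s

        # Example: a 128 px movement to a 16 px target selected in 1.2 s
        print(round(fitts_throughput(128, 16, 1.2), 2))  # ~2.64 bits/s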

    Guidelines for the design of haptic widgets

    Haptic feedback has been shown to improve user performance in Graphical User Interface (GUI) targeting tasks in a number of studies. These studies have typically focused on interactions with individual targets, and it is unclear whether the performance increases reported will generalise to the more realistic situation where multiple targets are presented simultaneously. This paper addresses this issue in two ways. Firstly, two empirical studies dealing with groups of haptically augmented widgets are presented. These reveal that haptic augmentation of complex widgets can reduce performance, although carefully designed feedback can result in performance improvements. The results of these studies are then used in conjunction with the previous literature to generate general design guidelines for the creation of haptic widgets.

    Augmenting User Interfaces with Haptic Feedback

    Computer assistive technologies have developed considerably over the past decades. Advances in computer software and hardware have provided motion-impaired operators with much greater access to computer interfaces. For people with motion impairments, the main difficulty in the communication process is the input of data into the system. For example, the use of a mouse or a keyboard demands a high level of dexterity and accuracy. Traditional input devices are designed for able-bodied users and often do not meet the needs of someone with disabilities. As the key feature of most graphical user interfaces (GUIs) is to point-and-click with a cursor, this can make a computer inaccessible for many people. Human-computer interaction (HCI) is an important area of research that aims to improve communication between humans and machines. Previous studies have identified haptics as a useful method for improving computer access. However, traditional haptic techniques suffer from a number of shortcomings that have hindered their inclusion in real-world software. The focus of this thesis is to develop haptic rendering algorithms that will permit motion-impaired operators to use haptic assistance with existing graphical user interfaces. The main goal is to improve interaction by reducing error rates and improving targeting times. A number of novel haptic assistive techniques are presented that utilise the three degrees-of-freedom (3DOF) capabilities of modern haptic devices to produce assistance that is designed specifically for motion-impaired computer users. To evaluate the effectiveness of the new techniques, a series of point-and-click experiments were undertaken in parallel with cursor analysis to compare the levels of performance. The task required the operator to produce a predefined sentence on the densely populated Windows on-screen keyboard (OSK). The results of the study show that higher performance levels can be achieved using techniques that are less constricting than traditional assistance.
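    The deformable-cone and virtual-switch algorithms themselves are not reproduced in this abstract. As a loose sketch of the general idea behind force-feedback target assistance (not the thesis's actual rendering method), a simple spring-based attraction towards a target centre could be rendered as below; the stiffness, capture radius and force limit are illustrative values only.

        import numpy as np

        def attraction_force(cursor, target_centre, capture_radius,
                             stiffness=200.0, max_force=3.0):
            """Spring-like pull towards a target once the cursor enters its
            capture radius; zero force elsewhere.

            cursor, target_centre: 3-element positions in the device frame (m)
            stiffness: N/m; max_force: N, clamped to respect device limits
            """
            offset = np.asarray(target_centre, float) - np.asarray(cursor, float)
            distance = np.linalg.norm(offset)
            if distance == 0.0 or distance > capture_radius:
                return np.zeros(3)               # outside the assistance region
            force = stiffness * offset           # Hooke's law towards the centre
            magnitude = np.linalg.norm(force)
            if magnitude > max_force:
                force *= max_force / magnitude   # clamp for safety
            return force

    In a real haptic loop a function like this would be evaluated at the device's servo rate (typically around 1 kHz) and the result sent to the device as the commanded force.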

    Tactons: structured tactile messages for non-visual information display

    Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction, including the frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
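    As a rough sketch of how the parameters listed above might be represented in software (the class and field names below are illustrative, not an API from the paper), a Tacton can be modelled as a short sequence of pulses.

        from dataclasses import dataclass
        from typing import List

        @dataclass
        class TactilePulse:
            frequency_hz: float   # vibration frequency of the pulse
            amplitude: float      # normalised intensity, 0.0 to 1.0
            duration_ms: int      # length of the pulse

        @dataclass
        class Tacton:
            pulses: List[TactilePulse]   # the pulse/gap pattern gives the rhythm
            gap_ms: int                  # silent interval between pulses
            body_location: str           # actuator site, e.g. "wrist"

        # Example: a two-pulse alert Tacton delivered at the wrist
        alert = Tacton(
            pulses=[TactilePulse(250.0, 0.8, 100), TactilePulse(250.0, 0.8, 100)],
            gap_ms=60,
            body_location="wrist",
        )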

    Tactile-STAR: A Novel Tactile STimulator And Recorder System for Evaluating and Improving Tactile Perception

    Many neurological diseases impair the motor and somatosensory systems. While several different technologies are used in clinical practice to assess and improve motor functions, somatosensation is evaluated subjectively with qualitative clinical scales. Treatment of somatosensory deficits has received limited attention. To bridge the gap between the assessment and training of motor vs. somatosensory abilities, we designed, developed, and tested a novel, low-cost, two-component (bimanual) mechatronic system targeting tactile somatosensation: the Tactile-STAR, a tactile stimulator and recorder. The stimulator is an actuated pantograph structure driven by two servomotors, with an end-effector covered by a rubber material that can apply two different types of skin stimulation: brush and stretch. The stimulator has a modular design and can be used to test tactile perception in different parts of the body, such as the hand, arm, leg, or big toe. The recorder is a passive pantograph that can measure hand motion using two potentiometers. The recorder can serve multiple purposes: participants can move its handle to match the direction and amplitude of the tactile stimulator, or they can use it as a master manipulator to control the tactile stimulator as a slave. Our ultimate goal is to assess and affect tactile acuity and somatosensory deficits. To demonstrate the feasibility of our novel system, we tested the Tactile-STAR with 16 healthy individuals and with three stroke survivors using the skin-brush stimulation. We verified that the system enables the mapping of tactile perception on the hand in both populations. We also tested the extent to which 30 min of training in healthy individuals led to an improvement of tactile perception. The results provide a first demonstration of the ability of this new system to characterize tactile perception in healthy individuals, as well as a quantification of the magnitude and pattern of tactile impairment in a small cohort of stroke survivors. The finding that short-term training with Tactile-STAR can improve the acuity of tactile perception in healthy individuals suggests that Tactile-STAR may have utility as a therapeutic intervention for somatosensory deficits.
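    The abstract does not describe the control software for the master-slave mode at code level. As a purely hypothetical sketch of that mode, the two recorder potentiometer readings might be mapped to the stimulator's two servo angles roughly as follows; the function names, ADC range, angle range and update rate are all assumptions, not details from the paper.

        import time

        def adc_to_angle(adc_value, adc_max=1023, angle_range_deg=180.0):
            """Map a raw potentiometer reading to a servo angle in degrees."""
            return (adc_value / adc_max) * angle_range_deg

        def master_slave_step(read_potentiometers, command_servos):
            """One control-loop iteration: read the two recorder potentiometers
            and drive the stimulator's two servomotors to matching angles.

            read_potentiometers: () -> (int, int), hypothetical ADC reads
            command_servos: (float, float) -> None, hypothetical servo driver
            """
            left_raw, right_raw = read_potentiometers()
            command_servos(adc_to_angle(left_raw), adc_to_angle(right_raw))

        def run_master_slave(read_potentiometers, command_servos, rate_hz=100):
            """Run the master-slave loop at a fixed update rate."""
            period = 1.0 / rate_hz
            while True:
                master_slave_step(read_potentiometers, command_servos)
                time.sleep(period)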

    Considerations in Designing Human-Computer Interfaces for Elderly People

    As computing devices continue to become more heavily integrated into our lives, proper design of human-computer interfaces becomes a more important topic of discussion. Efficient and useful human-computer interfaces need to take into account the abilities of the humans who will be using them, and adapt to the difficulties that different users may face – such as those that elderly users must deal with. Interfaces that allow for user-specific customization, while taking into account the multiple difficulties that older users might face, can help the elderly use these newer computing devices properly, and in doing so possibly achieve a better quality of life through the advanced technological support that these devices offer. In this paper, we explore common problems the elderly face when using computing devices and the solutions developed for these problems. These difficulties ultimately fall into several categories: cognitive, auditory, haptic, visual, and motor-based. We also present an idea for a new adaptive operating system with advanced customizations that would simplify computing for older users.