
    Augmenting Graphical User Interfaces with Haptic Assistance for Motion-Impaired Operators

    Haptic assistance is an emerging field of research that is designed to improve human-computer interaction (HCI) by reducing error rates and targeting times through the use of force feedback. Haptic feedback has previously been investigated to assist motion-impaired computer users; however, limitations such as target distracters have hampered its integration with graphical user interfaces (GUIs). In this paper, two new haptic assistive techniques are presented that utilise the 3DOF capabilities of the Phantom Omni. These are referred to as deformable haptic cones and deformable virtual switches. The assistance is designed specifically to enable motion-impaired operators to use existing GUIs more effectively. Experiment 1 investigates the performance benefits of the new haptic techniques when used in conjunction with the densely populated Windows on-screen keyboard (OSK). Experiment 2 utilises the ISO 9241-9 point-and-click task to investigate the effects of target size and shape. The results of the study show that the newly proposed techniques improve interaction rates and can be integrated with existing software without many of the drawbacks of traditional haptic assistance. Deformable haptic cones and deformable virtual switches were shown to reduce the mean number of missed clicks by at least 75% and to reduce targeting times by at least 25%.
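
    A common building block for this kind of target assistance is a bounded, spring-like pull toward the target centre that stays weak enough for the user to push through deliberately. The minimal Python sketch below illustrates that general idea only; the function name, constants, and the linear spring model are illustrative assumptions, not the authors' cone or switch algorithms.

        import numpy as np

        def assistive_force(cursor, target, radius=0.02, k=150.0, f_max=1.5):
            """Capped spring-like pull toward a target centre (illustrative)."""
            offset = np.asarray(target) - np.asarray(cursor)  # metres
            dist = float(np.linalg.norm(offset))
            if dist == 0.0 or dist > radius:   # outside the basin: no assistance
                return np.zeros(3)
            force = k * offset                 # linear spring pull (newtons)
            mag = float(np.linalg.norm(force))
            if mag > f_max:                    # cap so assistance stays escapable
                force *= f_max / mag
            return force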

    Guidelines for the design of haptic widgets

    Haptic feedback has been shown to improve user performance in Graphical User Interface (GUI) targeting tasks in a number of studies. These studies have typically focused on interactions with individual targets, and it is unclear whether the performance increases reported will generalise to the more realistic situation where multiple targets are presented simultaneously. This paper addresses this issue in two ways. First, two empirical studies dealing with groups of haptically augmented widgets are presented. These reveal that haptic augmentations of complex widgets can reduce performance, although carefully designed feedback can result in performance improvements. The results of these studies are then used in conjunction with the previous literature to generate general design guidelines for the creation of haptic widgets.

    Putting the feel in 'look and feel'

    Haptic devices are now commercially available, and touch has thus become a potentially realistic solution to a variety of interaction design challenges. We report on an investigation of the use of touch as a way of reducing visual overload in the conventional desktop. In a two-phase study, we investigated the use of the PHANToM haptic device as a means of interacting with a conventional graphical user interface. The first experiment compared the effects of four different haptic augmentations on usability in a simple targeting task. The second experiment involved a more ecologically oriented searching and scrolling task. Results indicated that the haptic effects did not improve users' performance in terms of task completion time. However, the number of errors made was significantly reduced. Subjective workload measures showed that participants perceived many aspects of workload as significantly lower with haptics. The results are described and the implications for the use of haptics in user interface design are discussed.

    Augmenting User Interfaces with Haptic Feedback

    Computer assistive technologies have developed considerably over the past decades. Advances in computer software and hardware have provided motion-impaired operators with much greater access to computer interfaces. For people with motion impairments, the main difficulty in the communication process is the input of data into the system. For example, the use of a mouse or a keyboard demands a high level of dexterity and accuracy. Traditional input devices are designed for able-bodied users and often do not meet the needs of someone with disabilities. As the key feature of most graphical user interfaces (GUIs) is to point-and-click with a cursor, this can make a computer inaccessible for many people. Human-computer interaction (HCI) is an important area of research that aims to improve communication between humans and machines. Previous studies have identified haptics as a useful method for improving computer access. However, traditional haptic techniques suffer from a number of shortcomings that have hindered their inclusion in real-world software. The focus of this thesis is to develop haptic rendering algorithms that will permit motion-impaired operators to use haptic assistance with existing graphical user interfaces. The main goal is to improve interaction by reducing error rates and improving targeting times. A number of novel haptic assistive techniques are presented that utilise the three degrees-of-freedom (3DOF) capabilities of modern haptic devices to produce assistance that is designed specifically for motion-impaired computer users. To evaluate the effectiveness of the new techniques, a series of point-and-click experiments were undertaken in parallel with cursor analysis to compare the levels of performance. The task required the operator to produce a predefined sentence on the densely populated Windows on-screen keyboard (OSK). The results of the study show that higher performance levels can be achieved using techniques that are less constricting than traditional assistance.
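
    As a concrete illustration of the kind of cursor analysis described above, a missed-click rate and mean targeting time can be computed from a click log; the log format below is a hypothetical stand-in, not the thesis's actual data schema.

        # Each record: (timestamp in seconds, whether the click hit its target).
        clicks = [(0.9, True), (2.4, False), (2.9, True), (4.1, True)]

        misses = sum(1 for _, hit in clicks if not hit)
        miss_rate = misses / len(clicks)              # fraction of missed clicks

        # Targeting time: interval between successive successful clicks.
        hit_times = [t for t, hit in clicks if hit]
        gaps = [b - a for a, b in zip(hit_times, hit_times[1:])]
        mean_targeting_time = sum(gaps) / len(gaps)

        print(f"miss rate {miss_rate:.0%}, mean targeting time {mean_targeting_time:.2f} s")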

    CobotTouch: AR-based Interface with Fingertip-worn Tactile Display for Immersive Operation/Control of Collaborative Robots

    Complex robotic tasks require human collaboration to benefit from humans' high dexterity. Frequent human-robot interaction is mentally demanding and time-consuming. Intuitive and easy-to-use robot control interfaces reduce the negative influence on workers, especially inexperienced users. In this paper, we present CobotTouch, a novel intuitive robot control interface with fingertip haptic feedback. The proposed interface consists of a Graphical User Interface projected onto the robotic arm to control the position of the robot end-effector based on gesture recognition, and a wearable haptic interface to deliver tactile feedback on the user's fingertips. We evaluated users' perception of the designed tactile patterns presented by the haptic interface and the intuitiveness of the proposed system for robot control in a use case. The results revealed a high average recognition rate of 75.25% for the tactile patterns. Average NASA Task Load Index (TLX) scores indicated low mental and temporal demands, suggesting a high level of intuitiveness of CobotTouch for interaction with collaborative robots.
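
    The abstract does not specify the tactile patterns themselves; one simple scheme for a fingertip display of this kind is to weight a small actuator array by the cosine between each actuator's position and a planar cue direction. The layout and names below are hypothetical, for illustration only.

        import numpy as np

        # Hypothetical 2x2 fingertip actuator layout in normalized coordinates.
        ACTUATORS = np.array([[-1.0, 1.0], [1.0, 1.0], [-1.0, -1.0], [1.0, -1.0]])
        UNIT = ACTUATORS / np.linalg.norm(ACTUATORS, axis=1, keepdims=True)

        def direction_pattern(dx, dy):
            """Map a planar cue direction to per-actuator intensities in [0, 1]."""
            d = np.array([dx, dy], dtype=float)
            n = float(np.linalg.norm(d))
            if n == 0.0:
                return np.zeros(len(ACTUATORS))
            return np.clip(UNIT @ (d / n), 0.0, 1.0)  # cosine-weighted activation

        print(direction_pattern(1.0, 0.0))  # right-hand actuators fire hardest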

    An introduction to interactive sonification

    The research field of sonification, a subset of the topic of auditory display, has developed rapidly in recent decades. It brings together interests from the areas of data mining, exploratory data analysis, human–computer interfaces, and computer music. Sonification presents information by using sound (particularly nonspeech), so that the user of an auditory display obtains a deeper understanding of the data or processes under investigation by listening.
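
    A worked example of the idea, using the common parameter-mapping approach in which data values are rescaled to pitch; this sketch is illustrative and not tied to any particular system from the article.

        import numpy as np

        def sonify(values, sr=44100, note_s=0.2, f_lo=220.0, f_hi=880.0):
            """Render each data value as a short sine tone, pitch-mapped linearly."""
            v = np.asarray(values, dtype=float)
            span = v.max() - v.min()
            norm = (v - v.min()) / span if span else np.zeros_like(v)
            freqs = f_lo + norm * (f_hi - f_lo)   # low values -> low pitch
            t = np.arange(int(sr * note_s)) / sr
            return np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])

        audio = sonify([3, 1, 4, 1, 5, 9, 2, 6])  # each value becomes a 0.2 s tone
        # e.g. scipy.io.wavfile.write("tones.wav", 44100, audio.astype(np.float32))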

    A haptic-enabled multimodal interface for the planning of hip arthroplasty

    Multimodal environments help fuse a diverse range of sensory modalities, which is particularly important when integrating the complex data involved in surgical preoperative planning. The authors apply a multimodal interface for preoperative planning of hip arthroplasty with a user interface that integrates immersive stereo displays and haptic modalities. This article overviews this multimodal application framework and discusses the benefits of incorporating the haptic modality in this area.

    Interplayable surface: an exploration on augmented GUI that co-exists with physical environments

    The main goal of this experiment-driven thesis is to envision and design an interactive GUI (graphical user interface) that coexists with physical surfaces. Based on an understanding of user behavioral patterns for accessing information in these types of situations, experiments and prototypes are implemented and tested with participants. In particular, to observe user behavioral patterns for augmented GUIs within certain environments and circumstances, this thesis presents several types of participatory experiments with physical GUIs. Participants were encouraged to re-create and reorganize physical GUIs in relation to their own situational specificity or informational tendencies. Based on insights extracted from the research and experiments, in the last phase I propose two thesis models for how an interactive GUI applies to a physical environment: simulation mock-ups for user scenarios of an augmented GUI, and an interactive GUI surface combined with projection mapping. In relation to people's behavioral patterns with augmented GUIs, the thesis models show several types of information structures and interactions. Also, in framing the overall data structure and wireframe for the thesis product model, informative affordance corresponding with users' situational specificity is treated as a crucial direction point, actualized on an artifact in a perceptible way. By experimentally prototyping a thesis model, I aim to expand the speculative usability that interactive GUIs will offer in the near future.