
    Bimodal Feedback for In-car Mid-air Gesture Interaction

    This demonstration showcases novel multimodal feedback designs for in-car mid-air gesture interaction. It explores the potential of multimodal feedback types for mid-air gestures in cars and how these can reduce eyes-off-the-road time and thus make driving safer. We will show four different bimodal feedback combinations to provide effective information about interaction with systems in a car. These feedback techniques are visual-auditory, auditory-ambient (peripheral vision), ambient-tactile, and tactile-auditory. Users can interact with the system after a short introduction, creating an exciting opportunity to deploy these displays in cars in the future.

    Evaluation of Haptic Patterns on a Steering Wheel

    Infotainment systems can increase mental workload and divert visual attention away from the road ahead. When these systems give information to the driver, providing it through the tactile channel on the steering wheel might improve driving behaviour and safety. This paper describes an investigation into the perceivability of haptic feedback patterns using an actuated surface on a steering wheel. Six solenoids were embedded along the rim of the steering wheel, creating three bumps under each palm. At most four of the six solenoids were actuated simultaneously, resulting in 56 patterns to test. Participants were asked to keep to the middle of the road in the driving simulator as well as possible. Overall recognition accuracy of the haptic patterns was 81.3%, and the identification rate increased as the number of active solenoids decreased (up to 92.2% for a single solenoid). There was no significant increase in lane deviation or steering angle during haptic pattern presentation. These results suggest that drivers can reliably distinguish between cutaneous patterns presented on the steering wheel. Our findings can assist in delivering non-critical messages to the driver (e.g. driving performance, incoming text messages, etc.) without decreasing driving performance or increasing perceived mental workload.
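
    The 56-pattern figure follows from counting the subsets of solenoids that can be active at once. A minimal Python sketch of that count, assuming (purely for illustration) that a pattern is any distinct combination of one to four of the six solenoids:

        from itertools import combinations

        # Six solenoids, three under each palm; here a pattern is assumed to be
        # any subset of one to four solenoids actuated at the same time,
        # which matches the count reported in the abstract.
        SOLENOIDS = range(6)
        patterns = [combo
                    for k in range(1, 5)
                    for combo in combinations(SOLENOIDS, k)]

        # C(6,1) + C(6,2) + C(6,3) + C(6,4) = 6 + 15 + 20 + 15 = 56
        print(len(patterns))  # 56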

    Thermal in-car interaction for navigation

    In this demonstration we show a thermal interaction design on the steering wheel for navigational cues in a car. Participants will be able to use a thermally enhanced steering wheel to follow instructions in a turn-by-turn navigation task in a virtual city. The thermal cues will be provided on both sides of the steering wheel and will indicate the turning direction by warming the corresponding side while the opposite side is cooled.
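
    A minimal sketch of the cue mapping described above, assuming a hypothetical set_side_temperature() driver for the thermal elements; the temperatures are illustrative, not values from the demonstration:

        NEUTRAL_C = 32.0  # assumed skin-neutral baseline
        WARM_C = 38.0     # assumed warm cue on the turning side
        COOL_C = 26.0     # assumed cool cue on the opposite side

        def set_side_temperature(side: str, temp_c: float) -> None:
            # Placeholder for whatever hardware driver controls each side.
            print(f"{side} side -> {temp_c:.1f} C")

        def present_turn_cue(direction: str) -> None:
            """Warm the side matching the turn, cool the other side."""
            other = "right" if direction == "left" else "left"
            set_side_temperature(direction, WARM_C)
            set_side_temperature(other, COOL_C)

        present_turn_cue("left")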

    Novel Multimodal Feedback Techniques for In-Car Mid-Air Gesture Interaction

    This paper presents an investigation into the effects of different feedback modalities on mid-air gesture interaction for infotainment systems in cars. Car crashes and near-crash events are most commonly caused by driver distraction. Mid-air interaction is a way of reducing driver distraction by reducing the visual demand of infotainment. Despite a range of available modalities, feedback in mid-air gesture systems is generally provided through visual displays. We conducted a simulated driving study to investigate how different types of multimodal feedback can support in-air gestures. The effects of different feedback modalities on eye gaze behaviour and on the driving and gesturing tasks are considered. We found that feedback modality influenced gesturing behaviour. However, drivers corrected falsely executed gestures more often in non-visual conditions. Our findings show that non-visual feedback can significantly reduce visual distraction.

    Investigation of Thermal Stimuli for Lane Changes

    Haptic feedback has been widely studied for in-car interactions. However, most of this research has used vibrotactile cues. This paper presents two studies that examine novel thermal feedback for navigation in a simulated-driving lane change task. In the first, we compare the distraction and completion-time differences between audio and thermal feedback. The results show that the presentation of thermal stimuli does not increase lane deviation, but the time needed to complete a lane change increased by 1.82 seconds. In the second study, we tested how varying the thermal stimulus parameters influenced lane change performance. We found that the same stimulus design for warm and cold temperatures does not always elicit the same results. Furthermore, varying these parameters can have different effects on different tasks. This suggests that the design of thermal stimuli depends strongly on which aspect of task performance should be maximized.

    You Got It in Your Hands: Stop-Signal Modality Influences on Reactive Response Inhibition with Gaming Controls

    Mastering the art of stopping initiated actions is vital when playing video games. However, what characteristics make up the perfect warning or stop-signal remains unclear. In the present study we compared performance in a basic and a gamified stop-signal task with different stop-signal modalities: auditory, haptic, and audio-haptic. Data from a complete within-subjects design (N = 24) revealed an advantage of haptic and audio-haptic stop-signals over purely auditory ones. Further, results show overall slower performance in the game version compared to the basic version. With regard to subjective experience, the results revealed higher motivation to perform in the gamified task but a somewhat deeper flow experience in the basic task. In sum, these results confirm that stop-signal modality influences reactive response inhibition in both basic and gamified tasks. Future research may extend and generalize these findings to other cross-modal and more complex gaming setups. Game developers may draw on these findings to optimize the communication of stop signals via vibrations in a handheld controller.
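
    For readers unfamiliar with the paradigm, the sketch below illustrates the typical stop-signal staircase, in which the stop-signal delay (SSD) adapts so that stopping succeeds on roughly half of the stop trials. All values are assumptions for illustration, not parameters of this study:

        import random

        STOP_PROBABILITY = 0.25  # assumed fraction of trials carrying a stop signal
        STEP_MS = 50             # assumed staircase step size

        def update_ssd(ssd_ms: int, stop_succeeded: bool) -> int:
            """One-up/one-down staircase: harder after a successful stop, easier after a failed one."""
            return ssd_ms + STEP_MS if stop_succeeded else max(0, ssd_ms - STEP_MS)

        ssd = 250  # assumed starting stop-signal delay in ms
        for _ in range(200):
            if random.random() < STOP_PROBABILITY:
                stop_succeeded = random.random() < 0.5  # stand-in for a participant's response
                ssd = update_ssd(ssd, stop_succeeded)
        print(ssd)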

    Purring Wheel: Thermal and Vibrotactile Notifications on the Steering Wheel

    Haptic feedback can improve safety and driving behaviour. While vibration has been widely studied, other haptic modalities have been neglected. To address this, we present two studies investigating the use of uni- and bimodal vibrotactile and thermal cues on the steering wheel. First, notifications with three levels of urgency were subjectively rated and then identified during simulated driving. Bimodal feedback showed an increased identification time over unimodal vibrotactile cues. Thermal feedback was consistently rated less urgent, showing its suitability for less time-critical notifications, where vibration would be unnecessarily attention-grabbing. The second study investigated more complex thermal and bimodal haptic notifications comprising two different types of information (the nature and importance of an incoming message). Results showed that both modalities could be identified with high recognition rates of up to 92% for both types together and up to 99% for a single type, opening up a novel design space for haptic in-car feedback.
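
    A hedged sketch of how two message dimensions could share one bimodal cue; the abstract does not give the exact mapping, so the thermal and vibration assignments below are hypothetical:

        from dataclasses import dataclass

        @dataclass
        class HapticNotification:
            thermal_cue: str       # hypothetical: warm vs. cool encodes the nature of the message
            vibration_pulses: int  # hypothetical: pulse count encodes its importance

        def encode(nature: str, importance: int) -> HapticNotification:
            thermal = "warm" if nature == "personal" else "cool"
            return HapticNotification(thermal_cue=thermal, vibration_pulses=importance)

        print(encode("personal", 3))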

    Doctor of Philosophy

    The study of haptic interfaces focuses on the use of the sense of touch in human-machine interaction. This document presents a detailed investigation of lateral skin stretch at the fingertip as a means of directional communication. Such tactile communication has applications in a variety of situations where traditional audio and visual channels are inconvenient, unsafe, or already saturated. Examples include handheld consumer electronics, where tactile communication would allow a user to control a device without having to look at it, or in-car navigation systems, where the audio and visual directions provided by existing GPS devices can distract the driver's attention away from the road. Lateral skin stretch, the displacement of the skin of the fingerpad in a plane tangent to the fingerpad, is a highly effective means of communicating directional information. Users are able to correctly identify the direction of skin stretch stimuli with skin displacements as small as 0.1 mm at rates as slow as 2 mm/s. Such stimuli can be rendered by a small, portable device suitable for integration into handheld devices. The design of the device-finger interface affects the ability of the user to perceive the stimuli accurately. A properly designed conical aperture effectively constrains the motion of the finger and provides an interface that is practical for use in handheld devices. When a handheld device renders directional tactile cues on the fingerpad, the user must often mentally rotate those cues from the reference frame of the finger to the world-centered reference frame where those cues are to be applied. Such mental rotation incurs a cognitive cost, requiring additional time to mentally process the stimuli. The magnitude of these cognitive costs is a function of the angle of rotation, and of the specific orientations of the arm, wrist and finger. Even with the difficulties imposed by required mental rotations, lateral skin stretch is a promising means of communicating information using the sense of touch, with the potential to substantially improve certain types of human-machine interaction.
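
    The reference-frame rotation mentioned above can be illustrated with a short sketch; the angles are arbitrary example values, not data from the dissertation:

        def finger_to_world(cue_angle_deg: float, finger_orientation_deg: float) -> float:
            """Rotate a cue direction from the finger's frame into the world frame."""
            return (cue_angle_deg + finger_orientation_deg) % 360.0

        # A cue pointing straight along the finger (0 degrees) while the finger
        # is rotated 90 degrees to the left maps to 90 degrees in world coordinates.
        print(finger_to_world(0.0, 90.0))  # 90.0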

    Enhancing navigation information with tactile output embedded into the steering wheel

    Navigation systems are in common use by drivers and typically present information using either audio or visual representations. However, there are many pressures on the driver's cognitive systems in a car, and navigation systems can add to this complexity. In this paper, we present two studies which investigated how vibrotactile representations of navigational information might be presented to the driver via the steering wheel to ameliorate this problem. Our results show that adding tactile information to existing audio, or particularly visual, representations can improve both driving performance and experience.

    Supporting Eyes-Free Human–Computer Interaction with Vibrotactile Haptification

    The sense of touch is a crucial sense when using our hands in complex tasks. We even learn to do some tasks without sight, using only the sense of touch in our fingers and hands. Modern touchscreen devices, however, have lost some of that tactile feeling by removing physical controls from the interaction. Touch is also a sense that is underutilized in interactions with technology and could provide new ways of interaction to support users. In certain situations, users cannot focus completely, visually and mentally, on their interaction with information technology. Humans can utilize their sense of touch more comprehensively and learn to understand tactile information while interacting with information technology. This thesis introduces a set of experiments that evaluate human capabilities to notice and understand tactile information provided by current actuator technology, and further introduces examples of haptic user interfaces (HUIs) for eyes-free use scenarios. It evaluates the benefits of such interfaces for users and concludes with guidelines and methods for creating this kind of user interface. The experiments in this thesis can be divided into three groups. In the first group, comprising the first two experiments, the detection of vibrotactile stimuli and the interpretation of the abstract meaning of vibrotactile feedback were evaluated. Experiments in the second group evaluated how to design rhythmic vibrotactile tactons to serve as basic vibrotactile primitives for HUIs. The last group of two experiments evaluated how these HUIs benefit users in distracted and eyes-free interaction scenarios. The primary aim of this series of experiments was to evaluate whether the current level of actuation technology could be used more comprehensively than in current-day solutions with simple haptic alerts and notifications, and thus whether the comprehensive use of vibrotactile feedback in interactions would provide additional benefits for users compared to current haptic and non-haptic interaction methods. The main finding of this research is that when more comprehensive HUIs are used in eyes-free, distracted-use scenarios, such as while driving a car, the user's main task of driving is performed better. Furthermore, users liked the comprehensively haptified user interfaces.
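
    As an illustration of the rhythmic vibrotactile tactons mentioned in the second group of experiments, a tacton can be thought of as a short on/off pulse sequence; the durations below are assumptions, not values from the thesis:

        # (vibration_ms, pause_ms) pairs; purely illustrative durations.
        TACTONS = {
            "short-short": [(100, 100), (100, 0)],
            "long":        [(400, 0)],
            "short-long":  [(100, 100), (400, 0)],
        }

        def total_duration_ms(name: str) -> int:
            return sum(on + off for on, off in TACTONS[name])

        print(total_duration_ms("short-long"))  # 600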