3 research outputs found

    Supporting Eyes-Free Human–Computer Interaction with Vibrotactile Haptification

    The sense of touch is crucial when we use our hands in complex tasks. Some tasks we learn to do even without sight, relying only on the touch sensations in our fingers and hands. Modern touchscreen devices, however, have lost some of that tactile feeling by removing physical controls from the interaction. Touch is also underutilized in interactions with technology and could provide new ways of interaction to support users. In certain situations, users cannot fully focus visually and mentally on the interaction while using information technology. Humans could use their sense of touch more comprehensively and learn to interpret tactile information while interacting with information technology. This thesis introduces a set of experiments that evaluate human capabilities to notice and understand tactile information provided by current actuator technology, and further introduces examples of haptic user interfaces (HUIs) for eyes-free use scenarios. These experiments evaluate the benefits of such interfaces for users, and the thesis concludes with guidelines and methods for creating this kind of user interface. The experiments in this thesis can be divided into three groups. The first group, consisting of the first two experiments, evaluated the detection of vibrotactile stimuli and the interpretation of the abstract meaning of vibrotactile feedback. The second group evaluated how to design rhythmic vibrotactile tactons to serve as basic vibrotactile primitives for HUIs. The last group of two experiments evaluated how these HUIs benefit users in distracted and eyes-free interaction scenarios. The primary aim of this series of experiments was to evaluate whether current actuation technology could be used more comprehensively than in current-day solutions that rely on simple haptic alerts and notifications; in other words, to find out whether comprehensive use of vibrotactile feedback in interaction would provide additional benefits for users compared with current haptic and non-haptic interaction methods. The main finding of this research is that with more comprehensive HUIs in eyes-free, distracted-use scenarios, such as driving a car, the user’s main task (driving) is performed better. Furthermore, users liked the comprehensively haptified user interfaces.

    Haptic feedback in eye typing

    Proper feedback is essential in gaze-based interfaces, where the same modality is used for both perception and control. We measured how vibrotactile feedback, a form of haptic feedback, compares with the commonly used visual and auditory feedback in eye typing. Haptic feedback was found to produce results close to those of auditory feedback; both were easy to perceive, and participants liked both the auditory “click” and the tactile “tap” of the selected key. Implementation details (such as the placement of the haptic actuator) were also found to be important.

    Latency guidelines for touchscreen virtual button feedback

    Touchscreens are very widely used, especially in mobile phones. They support many interaction methods, with pressing a virtual button being one of the most popular. In addition to inherent visual feedback, a virtual button can provide audio and tactile feedback. Since mobile phones are essentially computers, processing causes latency in the interaction. However, it has not been known whether latency is an issue in mobile touchscreen virtual button interaction, or what the latency recommendations for visual, audio and tactile feedback should be. The research in this thesis has investigated multimodal latency in mobile touchscreen virtual button interaction. For the first time, an affordable but accurate tool was built to measure all three feedback latencies in touchscreens. Also for the first time, simultaneity perception of touch and feedback, as well as the effect of latency on the perceived quality of virtual buttons, was studied, and thresholds were found for both unimodal and bimodal feedback. The results from these studies were combined into latency guidelines for the first time. These guidelines enable interaction designers to establish requirements that help mobile phone engineers optimise latencies to the right level. The latency measurement tool consisted of a high-speed camera, a microphone and an accelerometer for visual, audio and tactile feedback measurements. It was built from off-the-shelf components and was portable, so it could be copied at low cost or moved wherever needed. The tool enables touchscreen interaction designers to validate latencies in their experiments, making their results more valuable and accurate. It could also benefit touchscreen phone manufacturers, since it enables engineers to validate latencies during the development of mobile phones. The tool has been used in mobile phone R&D within Nokia Corporation and for validation of a research device within the University of Glasgow. The guidelines established for unimodal feedback were as follows: visual feedback latency should be between 30 and 85 ms, audio between 20 and 70 ms, and tactile between 5 and 50 ms. The guidelines differed for bimodal feedback: visual feedback latency should be 95 ms and audio 70 ms for visual-audio feedback, visual 100 ms and tactile 55 ms for visual-tactile feedback, and tactile 25 ms and audio 100 ms for tactile-audio feedback. These guidelines will help engineers and interaction designers to select and optimise latencies that are low enough, but not too low. Designers using these guidelines can make sure that most users will both perceive the feedback as simultaneous with their touch and experience high-quality virtual buttons. The results of this thesis show that latency has a considerable effect on touchscreen virtual buttons and is a key part of virtual button feedback design. The novel results enable researchers, designers and engineers to master the effect of latencies in research and development. This will lead to more accurate and reliable research results and help mobile phone manufacturers make better products.
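    As a rough illustration of how the unimodal guideline ranges quoted above could be applied in practice, the short Python sketch below encodes those figures and checks a measured latency against them. The data structure and function names are illustrative assumptions, not part of the thesis or its measurement tool; the bimodal recommendations could be encoded analogously.

```python
# Illustrative sketch (not from the thesis): the unimodal latency guideline
# ranges from the abstract, in milliseconds, and a simple range check.

UNIMODAL_GUIDELINES_MS = {
    "visual": (30, 85),   # visual feedback latency: 30-85 ms
    "audio": (20, 70),    # audio feedback latency: 20-70 ms
    "tactile": (5, 50),   # tactile feedback latency: 5-50 ms
}

def within_guideline(modality: str, measured_latency_ms: float) -> bool:
    """Return True if a measured unimodal feedback latency falls inside
    the recommended range for the given modality."""
    low, high = UNIMODAL_GUIDELINES_MS[modality]
    return low <= measured_latency_ms <= high

if __name__ == "__main__":
    print(within_guideline("tactile", 42.0))   # True: inside 5-50 ms
    print(within_guideline("visual", 100.0))   # False: above the 85 ms upper bound
```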