
    Latency guidelines for touchscreen virtual button feedback

    Touchscreens are very widely used, especially in mobile phones. They support many interaction methods, pressing a virtual button being one of the most popular. In addition to its inherent visual feedback, a virtual button can provide audio and tactile feedback. Since mobile phones are essentially computers, their processing introduces latency into the interaction. However, it has not been known whether latency is an issue in mobile touchscreen virtual button interaction, or what the latency recommendations for visual, audio and tactile feedback should be. The research in this thesis investigated multimodal latency in mobile touchscreen virtual button interaction. For the first time, an affordable yet accurate tool was built to measure all three feedback latencies on touchscreens. Also for the first time, simultaneity perception of touch and feedback, as well as the effect of latency on the perceived quality of virtual buttons, was studied, and thresholds were found for both unimodal and bimodal feedback. The results from these studies were combined into latency guidelines for the first time. These guidelines enable interaction designers to set requirements that let mobile phone engineers optimise latencies to the right level. The latency measurement tool consisted of a high-speed camera, a microphone and an accelerometer for visual, audio and tactile feedback measurements respectively. It was built from off-the-shelf components and was portable, so it could be copied at low cost or moved wherever needed. The tool enables touchscreen interaction designers to validate latencies in their experiments, making their results more accurate and valuable. It could also benefit touchscreen phone manufacturers, since it lets engineers validate latencies during mobile phone development. The tool has been used in mobile phone R&D within Nokia Corporation and for validation of a research device at the University of Glasgow.
The guidelines established for unimodal feedback were as follows: visual feedback latency should be between 30 and 85 ms, audio between 20 and 70 ms and tactile between 5 and 50 ms. The guidelines were different for bimodal feedback: visual feedback latency should be 95 ms and audio 70 ms when the feedback was visual-audio; visual 100 ms and tactile 55 ms when the feedback was visual-tactile; and tactile 25 ms and audio 100 ms when the feedback was tactile-audio. These guidelines will help engineers and interaction designers select and optimise latencies that are low enough, but not too low. Designers using these guidelines can make sure that most users will both perceive the feedback as simultaneous with their touch and experience high-quality virtual buttons. The results of this thesis show that latency has a considerable effect on touchscreen virtual buttons and is a key part of virtual button feedback design. These novel results enable researchers, designers and engineers to master the effect of latency in research and development, leading to more accurate and reliable research results and helping mobile phone manufacturers make better products.
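The numeric guidelines above lend themselves to a simple automated check. The sketch below encodes them as data so a measured latency can be validated against the thresholds; the function names and data layout are illustrative (not from the thesis), and the bimodal figures are read here as upper bounds:

```python
# Hypothetical encoding of the thesis's latency guidelines (all values in ms).
# Structure and names are illustrative, not taken from the thesis itself.

UNIMODAL_MS = {            # (lower, upper) acceptable latency band
    "visual":  (30, 85),
    "audio":   (20, 70),
    "tactile": (5, 50),
}

BIMODAL_MS = {             # per-modality bounds for each feedback pair
    ("visual", "audio"):   {"visual": 95,  "audio": 70},
    ("visual", "tactile"): {"visual": 100, "tactile": 55},
    ("tactile", "audio"):  {"tactile": 25, "audio": 100},
}

def unimodal_ok(modality: str, latency_ms: float) -> bool:
    """True if a unimodal feedback latency falls inside the guideline band."""
    lo, hi = UNIMODAL_MS[modality]
    return lo <= latency_ms <= hi

def bimodal_ok(pair, latencies_ms) -> bool:
    """True if both latencies of a bimodal feedback stay within their bounds."""
    bounds = BIMODAL_MS[pair]
    return all(latencies_ms[m] <= bounds[m] for m in bounds)

print(unimodal_ok("tactile", 40))                                    # → True
print(bimodal_ok(("visual", "audio"), {"visual": 90, "audio": 60}))  # → True
```

A check like this could sit in a phone's automated test suite, flagging builds whose measured feedback latencies drift outside the guideline bands.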

    Electrostatic Friction Displays to Enhance Touchscreen Experience

    Touchscreens are versatile devices that can display visual content and receive touch input, but they lack the ability to provide programmable tactile feedback. This limitation has been addressed by a few approaches generally called surface haptics technology. This technology modulates the friction between a user’s fingertip and a touchscreen surface to create different tactile sensations as the finger explores the touchscreen. This functionality enables the user to see and feel digital content simultaneously, leading to improved usability and user experience. One major approach in surface haptics relies on the electrostatic force induced between the finger and an insulating surface on the touchscreen by supplying a high AC voltage. The use of AC also induces in the user a vibrational sensation called electrovibration. Electrostatic friction displays require only electrical components and provide uniform friction over the screen. This tactile feedback technology not only allows easy and lightweight integration into touchscreen devices but also provides dynamic, rich and satisfying user interfaces. In this chapter, we review the fundamental operation of electrovibration technology as well as the applications that have been built upon it.
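The induced force described above is commonly modelled with a parallel-plate capacitor approximation; the symbols below are generic textbook notation, not necessarily those used in this chapter (A: contact area, d: insulator thickness, \epsilon_0 \epsilon_r: permittivity, V(t): applied AC voltage):

```latex
F_e(t) = \frac{\epsilon_0 \epsilon_r A \, V(t)^2}{2 d^2},
\qquad V(t) = V_0 \sin(2\pi f t)
\;\Rightarrow\;
F_e(t) = \frac{\epsilon_0 \epsilon_r A V_0^2}{4 d^2}\bigl(1 - \cos(4\pi f t)\bigr)
```

Because the force depends on the square of the voltage, the modulated component oscillates at twice the drive frequency, which is why an AC drive produces the vibration-like electrovibration sensation rather than a constant pull.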

    Does It Ping or Pong? Auditory and Tactile Classification of Materials by Bouncing Events

    Two experiments studied the role of impact sounds and vibrations in the classification of materials. The task consisted of feeling, on an actuated surface, and listening to, through headphones, the recorded feedback of a ping-pong ball hitting three flat objects respectively made of wood, plastic, and metal, and then identifying their material. In Experiment 1, sounds and vibrations were recorded while keeping the objects in mechanical isolation. In Experiment 2, recordings were taken while the same objects stood on a table, causing their resonances to fade faster due to mechanical coupling with the support. A control experiment, where participants listened to and touched the real objects in mechanical isolation, showed high accuracy of classification from either sounds (90% correct) or vibrations (67% correct). Classification of reproduced bounces in Experiments 1 and 2 was less precise. In both experiments, the main effect of material was statistically significant; conversely, the main effect of modality (auditory or tactile) was significant only in the control. Identification of plastic and especially metal was less accurate in Experiment 2, suggesting that participants, when possible, classified materials by longer resonance tails. Audio-tactile summation of classification accuracy was found, suggesting that multisensory integration influences the perception of materials. Such results have prospective application to the nonvisual design of virtual buttons, which is the object of our current research.

    StateLens: A Reverse Engineering Solution for Making Existing Dynamic Touchscreens Accessible

    Full text link
    Blind people frequently encounter inaccessible dynamic touchscreens in their everyday lives that are difficult, frustrating, and often impossible to use independently. Touchscreens are often the only way to control everything from coffee machines and payment terminals to subway ticket machines and in-flight entertainment systems. Interacting with dynamic touchscreens is difficult non-visually because the visual user interface changes, interactions often span multiple different screens, and it is easy to trigger interface actions accidentally while exploring the screen. To solve these problems, we introduce StateLens, a three-part reverse engineering solution that makes existing dynamic touchscreens accessible. First, StateLens reverse engineers the underlying state diagrams of existing interfaces from point-of-view videos found online or taken by users, using a hybrid crowd-computer vision pipeline. Second, using the state diagrams, StateLens automatically generates conversational agents that guide blind users through specifying the tasks that the interface can perform, allowing the StateLens iOS application to provide interactive guidance and feedback so that blind users can access the interface. Finally, a set of 3D-printed accessories enables blind people to explore capacitive touchscreens without the risk of triggering accidental touches on the interface. Our technical evaluation shows that StateLens can accurately reconstruct interfaces from stationary, hand-held, and web videos; and a user study of the complete system demonstrates that StateLens successfully enables blind users to access otherwise inaccessible dynamic touchscreens. Comment: ACM UIST 2019
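Once an interface's state diagram has been recovered, the guidance step reduces to path-finding: from the current screen, compute the shortest sequence of touch actions that reaches the goal screen. The sketch below illustrates this idea with an invented coffee-machine graph; it is not StateLens code, and all names are hypothetical:

```python
# Illustrative sketch of guidance over a reverse-engineered state diagram.
# The graph, labels, and function are invented for this example.
from collections import deque

# state -> {action_label: next_state}
STATE_DIAGRAM = {
    "home":        {"tap Coffee": "coffee_menu", "tap Tea": "tea_menu"},
    "coffee_menu": {"tap Latte": "confirm", "tap Back": "home"},
    "tea_menu":    {"tap Back": "home"},
    "confirm":     {"tap Start": "brewing"},
}

def guidance(start: str, goal: str) -> list:
    """Breadth-first search for the shortest action sequence reaching goal."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, actions = queue.popleft()
        if state == goal:
            return actions
        for action, nxt in STATE_DIAGRAM.get(state, {}).items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, actions + [action]))
    return []  # goal unreachable from start

print(guidance("home", "brewing"))
# → ['tap Coffee', 'tap Latte', 'tap Start']
```

In a full system, each returned action label would be spoken to the user in turn, with the current state re-estimated from the camera after every touch.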

    Eignung von virtueller Physik und Touch-Gesten in Touchscreen-Benutzerschnittstellen für kritische Aufgaben

    The goal of this research was to examine whether modern touchscreen interaction concepts that are established on consumer electronics devices such as smartphones can be used in time-critical and safety-critical use cases such as machine control or healthcare appliances. Several prevalent interaction concepts, with and without touch gestures and virtual physics, were tested experimentally in common use cases to assess their efficiency, error rate and user satisfaction during task completion. Based on the results, design recommendations for list scrolling and horizontal dialog navigation are given.

    Implementation and Characterization of Vibrotactile Interfaces

    While a standard approach is more or less established for rendering basic vibratory cues in consumer electronics, the implementation of advanced vibrotactile feedback still requires designers and engineers to solve a number of technical issues. Several off-the-shelf vibration actuators are currently available, with different characteristics and limitations that should be considered in the design process. We suggest an iterative approach to design in which vibrotactile interfaces are validated by testing their accuracy both in rendering vibratory cues and in measuring input gestures. Several examples of prototype interfaces yielding audio-haptic feedback are described, ranging from open-ended devices to musical interfaces, addressing their design and the characterization of their vibratory output.

    An Innovative Human Machine Interface for UAS Flight Management System

    This thesis concerns the development of an innovative Human Machine Interface for a UAS Flight Management System. In particular, touchscreens have been selected as the data entry interface. The thesis was carried out in collaboration with Alenia Aermacchi.

    A Toolkit for Tracking and Mediating Parametric Objects upon Commodity Mobile Devices

    The large number of mobile devices introduced in the market in recent years provides numerous interaction opportunities. We used commercially available products to create an interface for interaction with diverse datasets. We extended this scenario to include multiple physical instances, realizing a real-time multi-party interaction session. Tangibles act as tools and containers, in a fashion that requires their physical location and orientation to be evaluated continuously during a session. We present a technique to detect an object using only the touchscreen of a mobile device as input, by means of touch patterns created on the screen. This avoids additional embedded electronics and allows an ordinary object to be used as a tangible. We also conducted various other experiments using sensors available on mobile devices to detect objects. Finally, we discuss the architecture and implementation of a low-latency backend to relay and propagate events to all participating devices. This includes a low-latency in-memory key-value store implemented using Redis, and persistent storage using MongoDB. The server-side scripts are implemented using the non-blocking Node.js JavaScript runtime to minimise the latency added during data processing.
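The backend architecture described in the last abstract (a fast key-value store for the latest state, an append-only persistent log, and fan-out of events to every connected device) can be sketched in-memory. The original system used Redis, MongoDB, and Node.js; the plain-Python class below only mirrors the roles of those components, and all names are invented:

```python
# Minimal in-memory sketch of the relay/propagate architecture.
# In the real system: self.state ≈ Redis, self.log ≈ MongoDB,
# and the publish loop ≈ the Node.js event fan-out. Names are illustrative.

class EventRelay:
    def __init__(self):
        self.state = {}        # latest value per key (Redis-like role)
        self.log = []          # append-only history (MongoDB-like role)
        self.subscribers = []  # callbacks standing in for connected devices

    def subscribe(self, callback):
        """Register a device; it will receive every subsequent event."""
        self.subscribers.append(callback)

    def publish(self, key, value):
        """Store the latest state, persist the event, and propagate it."""
        self.state[key] = value
        self.log.append((key, value))
        for notify in self.subscribers:
            notify(key, value)

relay = EventRelay()
received = []
relay.subscribe(lambda k, v: received.append((k, v)))
relay.publish("tangible/42/pose", {"x": 10, "y": 20, "theta": 0.5})
print(received[0][0])   # → tangible/42/pose
```

Separating the hot path (latest state, fan-out) from durable storage (the log) is what keeps per-event latency low: subscribers are notified without waiting on disk writes.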