2,548 research outputs found

    Investigating Performance and Usage of Input Methods for Soft Keyboard Hotkeys

    Touch-based devices, despite their mainstream availability, do not support a unified and efficient command selection mechanism available on every platform and application. We advocate that hotkeys, conventionally used as a shortcut mechanism on desktop computers, could be generalized as a command selection mechanism for touch-based devices, even for keyboard-less applications. In this paper, we investigate the performance and usage of soft keyboard shortcuts, or hotkeys (abbreviated SoftCuts), through two studies comparing different input methods across sitting, standing and walking conditions. Our results suggest that SoftCuts are not only appreciated by participants but also support rapid command selection with different devices and hand configurations. We also did not find evidence that walking deters their performance when using the Once input method. Comment: 17+2 pages, published at Mobile HCI 202

    Tap 'N' Shake: Gesture-based Smartwatch-Smartphone Communications System

    Smartwatches have recently seen a surge in popularity, and the new technology presents a number of interesting opportunities and challenges, many of which have not been adequately dealt with by existing applications. Current smartwatch messaging systems fail to adequately address the problem of smartwatches requiring two-handed interactions. This paper presents Tap 'n' Shake, a novel gesture-based messaging system for Android smartwatches and smartphones that addresses the problem of two-handed interactions by utilising various motion gestures within the applications. The results of a user evaluation carried out with sixteen subjects demonstrated the usefulness and usability of using gestures over two-handed interactions for smartwatches. Additionally, the study provides insight into the types of gestures that subjects preferred to use for various actions in a smartwatch-smartphone messaging system.

    Fingers of a Hand Oscillate Together: Phase Synchronisation of Tremor in Hover Touch Sensing

    When using non-contact finger tracking, fingers can be classified as to which hand they belong by analysing the phase relation of physiological tremor. In this paper, we show how 3D capacitive sensors can pick up muscle tremor in fingers above a device. We develop a signal processing pipeline based on nonlinear phase synchronisation that can reliably group fingers to hands, and we experimentally validate our technique. This allows significant new gestural capabilities for 3D finger sensing without additional hardware.
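    The grouping idea in this abstract can be illustrated with a toy computation. Below is a minimal sketch, not the authors' pipeline (which uses nonlinear phase synchronisation on capacitive sensor data): it scores whether two tremor signals hold a fixed phase relation via the phase-locking value of the analytic signal. The ~9 Hz tremor frequency, sampling rate, and noise levels are illustrative assumptions.

    ```python
    import numpy as np

    def analytic_signal(x):
        """FFT-based analytic signal (equivalent to a Hilbert transform)."""
        n = x.size
        spectrum = np.fft.fft(x)
        h = np.zeros(n)
        h[0] = 1.0
        h[1:(n + 1) // 2] = 2.0
        if n % 2 == 0:
            h[n // 2] = 1.0
        return np.fft.ifft(spectrum * h)

    def phase_locking_value(x, y):
        """PLV in [0, 1]; near 1 when x and y hold a constant phase offset."""
        dphi = np.angle(analytic_signal(x)) - np.angle(analytic_signal(y))
        return float(np.abs(np.mean(np.exp(1j * dphi))))

    # Illustrative tremor: two fingers of one hand share a ~9 Hz rhythm with a
    # fixed phase offset; a finger of the other hand drifts at ~8.5 Hz.
    t = np.arange(0, 2, 1 / 200)  # 2 s at 200 Hz
    rng = np.random.default_rng(0)
    finger_a = np.sin(2 * np.pi * 9 * t) + 0.3 * rng.standard_normal(t.size)
    finger_b = np.sin(2 * np.pi * 9 * t + 0.8) + 0.3 * rng.standard_normal(t.size)
    finger_c = np.sin(2 * np.pi * 8.5 * t + 1.7) + 0.3 * rng.standard_normal(t.size)

    same_hand = phase_locking_value(finger_a, finger_b)   # high: fixed phase offset
    other_hand = phase_locking_value(finger_a, finger_c)  # low: phase drifts apart
    ```

    With a fixed phase offset the phase difference stays roughly constant and the PLV approaches 1; with a small frequency mismatch the difference drifts through a full cycle and the PLV collapses toward 0, which is what lets fingers be grouped by hand.
    
    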

    Mobile interaction design : is it time for a universal gestural design system?

    Interaction design in the last 12 years has transitioned to the small screen. Portable devices such as touch-screen mobile phones have become the new personal computer, and with this touch-screen technology, gestures have risen to become the dominant interaction for mobile devices. Gestures are the branch of interaction design this thesis focuses on, as they are the most relevant to portable touch screens. The purpose of this thesis is to determine, through theory and a quantitative survey, whether a universal gestural design system is possible, and whether end users would benefit from universal gestures when using their touch-screen devices. The theoretical part of the study focuses on User Experience (UX) and User Interface (UI) principles along with interaction design concepts. Based on the survey replies, mirrored against UX and UI principles as well as interaction design theory, the study concludes that a universal gestural design system would be better for end users. UI theory states, through the five dimensions of interaction design, that some gestures and interactions are far more learnable, memorable and rewarding for users than others. According to the survey, only eight percent of respondents know all possible gestures of their operating systems, and many feel tutorials are needed to help learn the gestures. The majority of participants owned multiple touch-screen devices from different manufacturers. Even though end users might benefit from a universal gestural design system for touch-screen devices, it is still unlikely to happen due to the nature of competition in the market between different operating systems.

    Touchalytics: On the Applicability of Touchscreen Input as a Behavioral Biometric for Continuous Authentication

    We investigate whether a classifier can continuously authenticate users based on the way they interact with the touchscreen of a smart phone. We propose a set of 30 behavioral touch features that can be extracted from raw touchscreen logs and demonstrate that different users populate distinct subspaces of this feature space. In a systematic experiment designed to test how this behavioral pattern exhibits consistency over time, we collected touch data from users interacting with a smart phone using basic navigation maneuvers, i.e., up-down and left-right scrolling. We propose a classification framework that learns the touch behavior of a user during an enrollment phase and is able to accept or reject the current user by monitoring interaction with the touch screen. The classifier achieves a median equal error rate of 0% for intra-session authentication, 2%-3% for inter-session authentication and below 4% when the authentication test was carried out one week after the enrollment phase. While our experimental findings disqualify this method as a standalone authentication mechanism for long-term authentication, it could be implemented as a means to extend screen-lock time or as a part of a multi-modal biometric authentication system. Comment: to appear at IEEE Transactions on Information Forensics & Security; Download data from http://www.mariofrank.net/touchalytics
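    The equal-error-rate figures quoted in this abstract can be made concrete with a small computation. A minimal sketch with made-up classifier scores (not the paper's 30-feature framework): the EER is the operating point where the false-accept rate on impostor scores equals the false-reject rate on genuine scores.

    ```python
    import numpy as np

    def equal_error_rate(genuine, impostor):
        """Sweep thresholds over all scores; return the rate where FAR ~= FRR."""
        genuine, impostor = np.asarray(genuine), np.asarray(impostor)
        best_gap, eer = np.inf, 1.0
        for t in np.sort(np.concatenate([genuine, impostor])):
            far = np.mean(impostor >= t)  # impostors wrongly accepted
            frr = np.mean(genuine < t)    # genuine users wrongly rejected
            if abs(far - frr) < best_gap:
                best_gap, eer = abs(far - frr), (far + frr) / 2
        return float(eer)

    # Perfectly separated scores give an EER of 0, like the paper's
    # intra-session result; overlapping scores push the EER up.
    clean = equal_error_rate([0.9, 0.8, 0.95, 0.85], [0.1, 0.2, 0.15, 0.05])  # -> 0.0
    noisy = equal_error_rate([0.9, 0.4, 0.95, 0.85], [0.1, 0.2, 0.6, 0.05])   # -> 0.25
    ```

    A real deployment would compute FAR/FRR curves over many thresholds per session; this sketch only shows why a median EER of 0% corresponds to perfectly separable genuine and impostor score distributions.
    
    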

    Mobile Pointing Task in the Physical World: Balancing Focus and Performance while Disambiguating

    We address the problem of mobile distal selection of physical objects when pointing at them in augmented environments. We focus on the disambiguation step needed when several objects are selected with a rough pointing gesture. A usual disambiguation technique forces the users to switch their focus from the physical world to a list displayed on a handheld device's screen. In this paper, we explore the balance between change of users' focus and performance. We present two novel interaction techniques allowing the users to maintain their focus in the physical world. Both use a cycling mechanism, performed with a wrist rolling gesture for P2Roll or with a finger sliding gesture for P2Slide. A user experiment showed that keeping users' focus in the physical world outperforms techniques that require the users to switch their focus to a digital representation distant from the physical objects, when disambiguating up to 8 objects.

    Different strokes for different folks? Revealing the physical characteristics of smartphone users from their swipe gestures

    Anthropometrics show that the lengths of many human body segments follow a common proportional relationship. To know the length of one body segment - such as a thumb - potentially provides a predictive route to other physical characteristics, such as overall standing height. In this study, we examined whether it is feasible that the length of a person's thumb could be revealed from the way in which they complete swipe gestures on a touchscreen-based smartphone. From a corpus of approximately 19,000 swipe gestures captured from 178 volunteers, we found that people with longer thumbs complete swipe gestures with shorter completion times, higher speeds and higher accelerations than people with shorter thumbs. These differences were also observed to exist between our male and female volunteers, along with additional differences in the amount of touch pressure applied to the screen. Results are discussed in terms of linking behavioural and physical biometrics. Keywords: Touchscreen gestures, behavioral biometrics, physical biometrics
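    The kind of relationship this abstract reports can be sketched with synthetic data (not the study's corpus): assumed thumb lengths and swipe completion times with a built-in negative linear relationship, checked with a Pearson correlation. All values and the slope are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical data: 200 users; longer thumbs assumed to swipe faster.
    thumb_mm = rng.uniform(45, 70, size=200)                       # thumb lengths
    swipe_ms = 400 - 3.0 * thumb_mm + rng.normal(0, 20, size=200)  # completion times

    # Pearson correlation between thumb length and swipe completion time.
    r = float(np.corrcoef(thumb_mm, swipe_ms)[0, 1])  # strongly negative
    ```

    A strongly negative `r` here is the signature described in the abstract: longer thumbs, shorter completion times. The study's actual analysis of speeds, accelerations, and pressure would extend this to several features per gesture.
    
    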

    A Single-Handed Partial Zooming Technique for Touch-Screen Mobile Devices

    Despite its ubiquitous use, the pinch zooming technique is not effective for one-handed interaction. We propose ContextZoom, a novel technique for single-handed zooming on touch-screen mobile devices. It allows users to specify any place on a device screen as the zooming center, ensuring that the intended zooming target is always visible on the screen after zooming. ContextZoom supports zooming in/out on a portion of a viewport, and provides a quick switch between the partial and whole viewports. We conducted an empirical evaluation of ContextZoom through a controlled lab experiment comparing ContextZoom with Google Maps' single-handed zooming technique. Results show that ContextZoom outperforms the latter in task completion time and the number of discrete actions taken. Participants also reported higher levels of perceived effectiveness and overall satisfaction with ContextZoom than with Google Maps' single-handed zooming technique, as well as a similar level of perceived ease of use.
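    The core geometry behind zooming about a user-chosen center, which ContextZoom's visible-target guarantee relies on, can be sketched as follows. Function and variable names are illustrative, not from the paper; the view model assumed is `screen = content * scale + offset`.

    ```python
    def zoom_about(center, offset, old_scale, new_scale):
        """Rescale a pannable view so the content under `center` stays put.

        `offset` is the screen position of the content origin, under the
        assumed mapping: screen = content * scale + offset.
        """
        cx, cy = center
        ox, oy = offset
        # Content coordinate currently shown at the chosen zooming center.
        content_x = (cx - ox) / old_scale
        content_y = (cy - oy) / old_scale
        # New offset that maps that content point back to the same screen spot.
        return (cx - content_x * new_scale, cy - content_y * new_scale)

    # Zooming 2x about (100, 100): that screen point keeps showing the same
    # content afterwards, so the intended target stays visible.
    new_offset = zoom_about((100.0, 100.0), (0.0, 0.0), 1.0, 2.0)  # -> (-100.0, -100.0)
    ```

    Pinch zoom applies the same math with the pinch midpoint as `center`; a single-handed technique like ContextZoom instead lets the user nominate `center` explicitly before the scale changes.
    
    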