
    Tap 'N' Shake: Gesture-based Smartwatch-Smartphone Communications System

    Smartwatches have recently seen a surge in popularity, and the new technology presents a number of interesting opportunities and challenges, many of which have not been adequately dealt with by existing applications. Current smartwatch messaging systems fail to adequately address the problem of smartwatches requiring two-handed interactions. This paper presents Tap 'n' Shake, a novel gesture-based messaging system for Android smartwatches and smartphones that addresses the problem of two-handed interactions by utilising various motion-gestures within the applications. The results of a user evaluation carried out with sixteen subjects demonstrated the usefulness and usability of using gestures over two-handed interactions for smartwatches. Additionally, the study provides insight into the types of gestures that subjects preferred to use for various actions in a smartwatch-smartphone messaging system.
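A motion-gesture like the "shake" in Tap 'n' Shake is typically detected from accelerometer samples. The paper does not publish its detection code; the following is a minimal, hypothetical sketch of threshold-based shake detection, with the threshold value chosen purely for illustration.

```python
from math import sqrt

# Illustrative values only; a real system would tune these empirically.
SHAKE_THRESHOLD = 15.0  # m/s^2 of deviation from the gravity baseline
GRAVITY = 9.81          # m/s^2

def is_shake(samples):
    """Return True if any accelerometer sample deviates from the
    gravity baseline by more than the shake threshold.

    `samples` is a sequence of (x, y, z) accelerations in m/s^2.
    """
    for x, y, z in samples:
        magnitude = sqrt(x * x + y * y + z * z)
        if abs(magnitude - GRAVITY) > SHAKE_THRESHOLD:
            return True
    return False

# A resting wrist stays near the baseline; a vigorous shake exceeds it.
print(is_shake([(0.1, 9.8, 0.2), (18.0, 22.0, 5.0)]))  # True
```

In practice such a detector would also debounce over a time window so a single noisy sample is not mistaken for a deliberate gesture.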

    GlobalFestival: Evaluating Real World Interaction on a Spherical Display

    Spherical displays present compelling opportunities for interaction in public spaces. However, there is little research into how touch interaction should control a spherical surface, or into how these displays are used in real-world settings. This paper presents an in-the-wild deployment of an application for a spherical display called GlobalFestival that utilises two different touch interaction techniques. The first version of the application allows users to spin and tilt content on the display, while the second version only allows spinning the content. During the 4-day deployment, we collected overhead video data and on-display interaction logs. The analysis brings together quantitative and qualitative methods to understand how users approach and move around the display, how on-screen interaction compares in the two versions of the application, and how the display supports social interaction given its novel form factor.

    Investigating Performance and Usage of Input Methods for Soft Keyboard Hotkeys

    Touch-based devices, despite their mainstream availability, do not support a unified and efficient command selection mechanism available on every platform and application. We advocate that hotkeys, conventionally used as a shortcut mechanism on desktop computers, could be generalized as a command selection mechanism for touch-based devices, even for keyboard-less applications. In this paper, we investigate the performance and usage of soft keyboard shortcuts, or hotkeys (abbreviated SoftCuts), through two studies comparing different input methods across sitting, standing and walking conditions. Our results suggest that SoftCuts are not only appreciated by participants but also support rapid command selection with different devices and hand configurations. We also did not find evidence that walking deters their performance when using the Once input method. Comment: 17+2 pages, published at Mobile HCI 202

    Musical Gesture through the Human Computer Interface: An Investigation using Information Theory

    This study applies information theory to investigate human ability to communicate using continuous control sensors, with a particular focus on informing the design of digital musical instruments. There is an active practice of building and evaluating such instruments, for instance in the New Interfaces for Musical Expression (NIME) conference community. The fidelity of the instruments can depend on the included sensors, and although much anecdotal evidence and craft experience informs the use of these sensors, relatively little is known about the ability of humans to control them accurately. This dissertation addresses this issue and related concerns, including continuous control performance in increasing degrees-of-freedom, pursuit tracking in comparison with pointing, and the estimations of musical interface designers and researchers of human performance with continuous control sensors. The methodology used models the human-computer system as an information channel, applying concepts from information theory to performance data collected in studies of human subjects using sensing devices. These studies not only add to knowledge about human abilities, but also inform on issues in musical mappings, ergonomics, and usability.
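Modeling the human-computer system as an information channel typically means estimating how many bits per trial a user actually transmits. The dissertation's own analysis is not reproduced here; the sketch below shows the standard mutual-information calculation over a stimulus-response joint distribution, which is one common way to quantify such a channel.

```python
from math import log2

def mutual_information(joint):
    """Mutual information in bits, given a joint distribution matrix
    where joint[i][j] = P(stimulus i, response j)."""
    px = [sum(row) for row in joint]            # marginal over stimuli
    py = [sum(col) for col in zip(*joint)]      # marginal over responses
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * log2(p / (px[i] * py[j]))
    return mi

# A noiseless 4-symbol channel carries log2(4) = 2 bits per trial.
perfect = [[0.25 if i == j else 0.0 for j in range(4)] for i in range(4)]
print(mutual_information(perfect))  # 2.0
```

With noisy human control data the off-diagonal entries grow and the measured information rate drops below the channel's nominal capacity, which is exactly the degradation such studies set out to measure.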

    Interaction tasks and controls for public display applications

    Public displays are becoming increasingly interactive, and a broad range of interaction mechanisms can now be used to create multiple forms of interaction. However, the lack of interaction abstractions forces each developer to create specific approaches for dealing with interaction, preventing users from building consistent expectations on how to interact across different display systems. There is a clear analogy with the early days of the graphical user interface, when a similar problem was addressed by the emergence of high-level interaction abstractions that provided consistent interaction experiences to users and shielded developers from low-level details. This work takes a first step in that same direction by uncovering interaction abstractions that may lead to the emergence of interaction controls for applications in public displays. We identify a new set of interaction tasks focused on the specificities of public displays; we characterise interaction controls that may enable those interaction tasks to be integrated into applications; and we create a mapping between the high-level abstractions provided by the interaction tasks and the concrete interaction mechanisms that can be implemented by those displays. Together, these contributions constitute a step towards the emergence of programming toolkits with widgets that developers could incorporate into their public display applications. The research has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under Grant agreement no. 244011 (PD-NET). Jorge Cardoso has been supported by "Fundação para a Ciência e a Tecnologia" (FCT) and "Programa Operacional Ciência e Inovação 2010", co-funded by the Portuguese Government and the European Union through the FEDER Programme, and by FCT training Grant SFRH/BD/47354/2008.
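The idea of mapping high-level interaction tasks onto multiple concrete input mechanisms can be illustrated with a toy control abstraction. This is not the paper's toolkit; class and method names below are hypothetical, chosen only to show how one abstract "select" event could be fed by several mechanisms.

```python
class SelectionControl:
    """Hypothetical high-level 'select an item' control for a
    public display application."""

    def __init__(self, items):
        self.items = items
        self.handlers = []

    def on_select(self, callback):
        """Application code subscribes to the abstract selection event."""
        self.handlers.append(callback)

    # Concrete interaction mechanisms all funnel into the same event,
    # so the application never deals with mechanism-specific details.
    def input_from_touch(self, index):
        self._fire(self.items[index])

    def input_from_sms(self, keyword):
        match = next(i for i in self.items if i.lower() == keyword.lower())
        self._fire(match)

    def _fire(self, item):
        for handler in self.handlers:
            handler(item)

control = SelectionControl(["News", "Weather", "Events"])
control.on_select(lambda item: print("selected:", item))
control.input_from_touch(1)       # selected: Weather
control.input_from_sms("events")  # selected: Events
```

The design mirrors how GUI widgets decoupled applications from mouse and keyboard details: new mechanisms can be added behind the control without touching application code.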

    Effective Identity Management on Mobile Devices Using Multi-Sensor Measurements

    Due to the dramatic increase in popularity of mobile devices in the past decade, sensitive user information is stored and accessed on these devices every day. Securing sensitive data stored on and accessed from mobile devices makes user-identity management a problem of paramount importance. The tension between security and usability renders the task of user-identity verification on mobile devices challenging. Meanwhile, an appropriate identity management approach is missing, since most existing technologies for user-identity verification either perform one-shot user verification or only work in restricted, controlled environments. To solve the aforementioned problems, we investigated and sought approaches from the sensor data generated by human-mobile interactions. The data are collected from the on-board sensors, including voice data from the microphone, acceleration data from the accelerometer, angular velocity data from the gyroscope, magnetic force data from the magnetometer, and multi-touch gesture input data from the touchscreen. We studied the feasibility of extracting biometric and behavioural features from the on-board sensor data, and how to efficiently employ the extracted features to perform user-identity verification on the smartphone. Based on the experimental results of the single-sensor modalities, we further investigated how to integrate them with hardware such as fingerprint sensors and TrustZone to practically fulfil a usable identity management system for both local application and remote service control. User studies and on-device testing sessions were held for privacy and usability evaluation.
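Combining several sensor modalities into one verification decision is commonly done with score-level fusion. The thesis's actual integration scheme is not given in the abstract; the sketch below shows a generic weighted-average fusion, with weights and threshold that are purely illustrative.

```python
# Hypothetical score-level fusion of per-modality verification scores.
# Weights and threshold are illustrative, not taken from the thesis.

def fuse_scores(scores, weights, threshold=0.5):
    """Combine per-modality match scores (each in [0, 1]) into a fused
    score and an accept/reject decision via a weighted average."""
    total = sum(weights.values())
    fused = sum(scores[m] * w for m, w in weights.items()) / total
    return fused, fused >= threshold

# Example: the touch modality is trusted twice as much as the others.
scores  = {"touch": 0.8, "gait": 0.6, "voice": 0.9}
weights = {"touch": 2.0, "gait": 1.0, "voice": 1.0}
fused, accept = fuse_scores(scores, weights)
print(round(fused, 3), accept)  # 0.775 True
```

In a deployed system the weights would be learned from per-modality error rates, and the threshold tuned for the desired false-accept/false-reject trade-off.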