35 research outputs found

    Effects of common keyboard layouts on physical effort: implications for kiosks and Internet banking

    This study investigates the effect common keyboard layouts have on physical effort. First, alphabetic keyboard layouts are experimentally compared to the QWERTY layout. Second, the number row often found on QWERTY keyboards is experimentally compared to a numeric keypad layout. Our study shows that users operate more effectively with a QWERTY layout than with an alphabetical layout. Moreover, users operate more effectively with a numeric keypad than with a row of number keys. Implications for two important application areas in society, namely touch-based self-service kiosks and numeric input in the context of Internet banking, are discussed.

    On object selection in gaze controlled environments

    In the past twenty years, gaze control has become a reliable alternative input method, not only for handicapped users. The selection of objects, however, which is of the highest importance and frequency in computer control, requires explicit control not inherent in eye movements. Objects have therefore usually been selected via prolonged fixations (dwell times). For many years, dwell times seemed to be the only reliable selection method. In this paper, we review the pros and cons of classical selection methods and of novel metaphors based on pies and gestures. The focus is on the effectiveness and efficiency of selections. In order to estimate the potential of current suggestions for selection, a basic empirical comparison is recommended.
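    The dwell-time selection that this review discusses can be pictured with a minimal sketch: a selection fires once the gaze stays on the same object longer than a threshold. The 0.5 s threshold and the `get_gaze_sample` and `hit_test` callables below are illustrative assumptions, not details from the paper.

```python
import time

DWELL_THRESHOLD_S = 0.5  # assumed threshold; real systems tune this per user

def dwell_select(get_gaze_sample, hit_test):
    """Return the object selected by a prolonged fixation (dwell).

    get_gaze_sample() -> (x, y) gaze point in screen coordinates (assumed API).
    hit_test(x, y)    -> object under the gaze point, or None (assumed API).
    """
    current_obj = None
    dwell_start = None
    while True:
        x, y = get_gaze_sample()          # blocks until the next gaze sample
        obj = hit_test(x, y)
        if obj != current_obj:
            # Gaze moved to a different object: restart the dwell timer.
            current_obj, dwell_start = obj, time.monotonic()
        elif obj is not None and time.monotonic() - dwell_start >= DWELL_THRESHOLD_S:
            return obj  # fixation lasted long enough -> treat it as a selection
```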

    Filteryedping: a dwell-free eye typing technique

    The ability to type using eye gaze only is extremely important for individuals with a severe motor disability. To eye type, the user currently must sequentially gaze at letters on a virtual keyboard and dwell on each desired letter for a specific amount of time to input that key. Dwell-based eye typing has two possible drawbacks: unwanted input if the dwell threshold is too short, or slow typing rates if the threshold is long. We demonstrate an eye typing technique that does not require the user to dwell on the letters that she wants to input. Our method automatically filters out unwanted letters from the sequence of letters gazed at while typing a word. It ranks candidate words based on their length and frequency and presents them to the user for confirmation. Spell correction and support for typing words not in the corpus are also included. Funding: São Paulo Research Foundation (FAPESP) grant #2012/01510-0, CAPES, CNP.
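    The filtering-and-ranking step can be illustrated with a small sketch: keep only the lexicon words whose letters appear, in order, as a subsequence of the gazed letter sequence, then rank them by length and corpus frequency. The scoring order, toy lexicon, and frequencies below are illustrative assumptions, not the paper's actual parameters.

```python
def is_subsequence(word, gazed_letters):
    """True if `word` can be formed from `gazed_letters` by deleting letters."""
    it = iter(gazed_letters)
    return all(ch in it for ch in word)

def rank_candidates(gazed_letters, lexicon):
    """Filter a lexicon {word: frequency} against the gazed letter sequence and
    rank the surviving words by length, then frequency (illustrative ordering)."""
    candidates = [w for w in lexicon if is_subsequence(w, gazed_letters)]
    return sorted(candidates, key=lambda w: (-len(w), -lexicon[w]))

# Example: the user glanced over extra letters while intending to type "help".
lexicon = {"help": 900, "held": 300, "he": 5000, "hep": 10}
print(rank_candidates("hweqlrtp", lexicon))  # -> ['help', 'hep', 'he'] with these toy counts
```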

    Pervasive and standalone computing: The perceptual effects of variable multimedia quality.

    The introduction of multimedia on pervasive and mobile communication devices raises a number of perceptual quality issues; however, limited work has been done examining the 3-way interaction between use of equipment, quality of perception, and quality of service. Our work measures levels of informational transfer (objective) and user satisfaction (subjective) when users are presented with multimedia video clips at three different frame rates, using four different display devices, simulating variation in participant mobility. Our results show that variation in frame rate does not impact a user’s level of information assimilation, but it does impact a user’s perception of multimedia video ‘quality’. Additionally, increased visual immersion can be used to increase transfer of video information, but can negatively affect the user’s perception of ‘quality’. Finally, we illustrate the significant effect of clip content on the transfer of video, audio, and textual information, calling into question the use of purely objective quality definitions when considering multimedia presentations.

    Wrist-worn pervasive gaze interaction

    This paper addresses gaze interaction for smart home control, conducted from a wrist-worn unit. First we asked ten people to enact the gaze movements they would propose for, e.g., opening a door or adjusting the room temperature. On the basis of their suggestions we built and tested different versions of a prototype applying off-screen stroke input. Command prompts were given to twenty participants by text or arrow displays. The success rate achieved by the end of their first encounter with the system was 46% on average; it took them 1.28 seconds to connect with the system and 1.29 seconds to make a correct selection. Their subjective evaluations were positive with regard to the speed of the interaction. We conclude that gaze gesture input seems feasible for fast and brief remote control of smart home technology, provided that the robustness of tracking is improved.
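    One way to picture off-screen stroke input is a classifier that maps the dominant direction of a gaze stroke to a smart-home command. The direction set, minimum stroke length, and command names below are assumptions for illustration only, not the prototype's actual design.

```python
import math

# Illustrative mapping from stroke direction to a smart-home command (assumed).
COMMANDS = {"up": "raise temperature", "down": "lower temperature",
            "left": "open door", "right": "close door"}

def classify_stroke(start, end, min_length=100.0):
    """Classify an off-screen gaze stroke by its dominant direction.

    start/end are (x, y) gaze points in pixels (screen y grows downward).
    Returns a command string, or None if the stroke is too short.
    """
    dx, dy = end[0] - start[0], end[1] - start[1]
    if math.hypot(dx, dy) < min_length:
        return None  # ignore small gaze jitter
    if abs(dx) > abs(dy):
        direction = "right" if dx > 0 else "left"
    else:
        direction = "down" if dy > 0 else "up"
    return COMMANDS[direction]

print(classify_stroke((500, 400), (900, 420)))  # -> 'close door'
```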

    Exploring Gaze for Assisting Freehand Selection-based Text Entry in AR

    With eye-tracking increasingly available in Augmented Reality, we explore how gaze can be used to assist freehand gestural text entry. Here the eyes are often coordinated with manual input across the spatial positions of the keys. Inspired by this, we investigate gaze-assisted selection-based text entry through the concept of spatial alignment of both modalities. Users can enter text by aligning both gaze and a manual pointer at each key, as a novel alternative to existing dwell-time or explicit manual triggers. We present a text entry user study comparing two such alignment techniques to a gaze-only and a manual-only baseline. The results show that one alignment technique reduces physical finger movement by more than half compared to standard in-air finger typing, and is faster and exhibits less perceived eye fatigue than an eyes-only dwell-time technique. We discuss trade-offs between uni- and multimodal text entry techniques, pointing to novel ways to integrate eye movements to facilitate virtual text entry.
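    The alignment trigger can be sketched as a simple check: a key is selected only when the gaze point and the manual (finger) pointer both fall on it at the same time. The circular key geometry, radius, and function names below are assumptions for illustration, not the study's implementation.

```python
from dataclasses import dataclass

@dataclass
class Key:
    label: str
    x: float              # key centre, screen coordinates
    y: float
    radius: float = 30.0  # assumed circular hit area

def hit(key, point):
    """True if `point` (x, y) lies within the key's hit area."""
    return (point[0] - key.x) ** 2 + (point[1] - key.y) ** 2 <= key.radius ** 2

def aligned_selection(keys, gaze_point, finger_point):
    """Return the key both modalities agree on, or None.

    Selection fires only when gaze and the manual pointer are spatially
    aligned on the same key, instead of a dwell time or an explicit trigger.
    """
    for key in keys:
        if hit(key, gaze_point) and hit(key, finger_point):
            return key
    return None

keys = [Key("A", 100, 200), Key("B", 180, 200)]
sel = aligned_selection(keys, gaze_point=(105, 195), finger_point=(98, 210))
print(sel.label if sel else "no selection")  # -> 'A'
```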