
    Dwell-free input methods for people with motor impairments

    Millions of individuals affected by disorders or injuries that cause severe motor impairments have difficulty performing compound manipulations using traditional input devices. This thesis first explores how effective various assistive technologies are for people with motor impairments. The following questions are studied: (1) What activities are performed? (2) What tools are used to support these activities? (3) What are the advantages and limitations of these tools? (4) How do users learn about and choose assistive technologies? (5) Why do users adopt or abandon certain tools? A qualitative study of fifteen people with motor impairments indicates that users have strong needs for efficient text entry and communication tools that are not met by existing technologies. To address these needs, this thesis proposes three dwell-free input methods, designed to improve the efficacy of target selection and text entry based on eye-tracking and head-tracking systems. They are: (1) the Target Reverse Crossing selection mechanism, (2) the EyeSwipe eye-typing interface, and (3) the HGaze Typing interface. With Target Reverse Crossing, a user moves the cursor into a target and reverses over a goal to select it. This mechanism is significantly more efficient than dwell-time selection. Target Reverse Crossing is then adapted in EyeSwipe to delineate the start and end of a word that is eye-typed with a gaze path connecting the intermediate characters (as with traditional gesture typing). When compared with a dwell-based virtual keyboard, EyeSwipe affords higher text entry rates and a more comfortable interaction. Finally, HGaze Typing adds head gestures to gaze-path-based text entry to enable simple and explicit command activations. Results from a user study demonstrate that HGaze Typing has better performance and user satisfaction than a dwell-time method.
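The core idea of dwell-free selection can be illustrated with a minimal sketch. The abstract does not specify the trigger geometry, so the code below assumes a simplified variant: a circular target is selected when the cursor (or gaze point) enters it and then crosses back out, rather than resting inside for a dwell timeout. The class and method names are hypothetical, not from the thesis.

```python
from dataclasses import dataclass


@dataclass
class Target:
    """A circular on-screen target (center and radius in pixels)."""
    x: float
    y: float
    r: float


def inside(t: Target, px: float, py: float) -> bool:
    """True if point (px, py) lies within the target."""
    return (px - t.x) ** 2 + (py - t.y) ** 2 <= t.r ** 2


class ReverseCrossingSelector:
    """Fires a selection when the pointer enters the target and then
    exits it again (an enter-then-reverse crossing), with no dwell timer."""

    def __init__(self, target: Target):
        self.target = target
        self.entered = False

    def update(self, px: float, py: float) -> bool:
        """Feed one pointer sample; returns True when a selection fires."""
        now_inside = inside(self.target, px, py)
        if now_inside:
            self.entered = True      # arm the selector on entry
            return False
        if self.entered:
            self.entered = False     # crossing back out completes the gesture
            return True
        return False
```

A dwell-based selector would instead accumulate time while `now_inside` is true; the crossing-based trigger avoids that fixed timeout, which is the efficiency gain the abstract reports.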

    Ability-Based Methods for Personalized Keyboard Generation

    This study introduces an ability-based method for personalized keyboard generation, wherein an individual's own movement and human-computer interaction data are used to automatically compute a personalized virtual keyboard layout. Our approach integrates a multidirectional point-select task to characterize cursor control over time, distance, and direction. The characterization is automatically employed to develop a computationally efficient keyboard layout that prioritizes each user's movement abilities by capturing directional constraints and preferences. We evaluated our approach in a study involving 16 participants using inertial sensing and facial electromyography as an access method, resulting in significantly increased communication rates using the personalized keyboard (52.0 bits/min) when compared to a generically optimized keyboard (47.9 bits/min). Our results demonstrate the ability to effectively characterize an individual's movement abilities to design a personalized keyboard for improved communication. This work underscores the importance of integrating a user's motor abilities when designing virtual interfaces. Comment: 20 pages, 7 figures
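The pipeline described here, measure per-direction movement ability and then assign letters to keys accordingly, can be sketched in miniature. This is not the paper's method: the directional cost model below is a made-up stand-in for the measured time/distance/direction profile, and the greedy frequency-to-cheapest-slot assignment replaces the paper's optimization. All names are illustrative.

```python
import math

# Hypothetical per-direction cost weights (lower = easier), standing in
# for a user's measured multidirectional point-select performance.
def direction_cost(angle_rad: float) -> float:
    # Assume, for this example user, that horizontal moves are easier
    # than vertical ones.
    return 1.0 + 0.5 * abs(math.sin(angle_rad))


def key_cost(col: int, row: int, home=(4.5, 1.0)) -> float:
    """Movement cost from a home position to a key slot."""
    dx, dy = col - home[0], row - home[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0
    return dist * direction_cost(math.atan2(dy, dx))


# Approximate English letter frequencies, most frequent first.
FREQ_ORDER = "etaoinshrdlcumwfgypbvkjxqz"


def personalized_layout(cols: int = 10, rows: int = 3) -> dict:
    """Greedily place frequent letters on the cheapest-to-reach slots."""
    slots = [(c, r) for r in range(rows) for c in range(cols)][:26]
    slots.sort(key=lambda s: key_cost(*s))  # cheapest slots first
    return {letter: slot for letter, slot in zip(FREQ_ORDER, slots)}
```

Swapping in a different `direction_cost` (e.g., penalizing diagonals for a user with limited oblique control) produces a different layout, which is the personalization effect the study quantifies in bits/min.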

    Exploitation of multiplayer interaction and development of virtual puppetry storytelling using gesture control and stereoscopic devices

    With the rapid development of human-computer interaction technologies, the new media generation demands novel learning experiences with natural interaction and immersive experience. Because digital storytelling is a powerful pedagogical tool for young children, in this paper we design an immersive storytelling environment that allows multiple players to use naturally interactive hand gestures to manipulate virtual puppetry for assisting narration. A set of multimodal interaction techniques is presented for a hybrid user interface that integrates existing 3D visualization and interaction devices, including head-mounted displays and depth motion sensors. In this system, young players can intuitively use hand gestures to manipulate virtual puppets to perform a story and interact with props in a virtual stereoscopic environment. We conducted a user experiment with four young children for pedagogical evaluation, as well as system acceptability and interactivity evaluation by postgraduate students. The results show that our framework has great potential to stimulate the learning abilities of young children through collaboration tasks. In a comparison between the two displays, the stereoscopic head-mounted display outperformed the traditional monoscopic display.

    Using Virtual Reality as a Tool in the Rehabilitation of Movement Abnormalities in Schizophrenia.

    Movement abnormalities are prevalent across all stages of schizophrenia, contributing to poor social functioning and reduced quality of life. To date, treatments are scarce, often involving pharmacological agents, and none have been shown to improve movement abnormalities effectively. Virtual reality (VR) is a tool used to simulate virtual environments where behavioral performance can be quantified safely across different tasks while exerting control over stimulus delivery, feedback, and measurement in real time. Sensory information is transmitted via a head-mounted display, allowing users to directly interact with virtual objects and bodies using gestures and body movements in the real world to perform different actions, permitting a sense of immersion in the simulated virtual environment. Although VR has been widely used for successful motor rehabilitation across a variety of neurological domains, it has not been exploited for motor rehabilitation in schizophrenia. The objectives of this article are to review movement abnormalities specific to schizophrenia and to discuss how VR can be utilized to restore and improve motor functioning in patients with schizophrenia. Constructing VR-mediated motor-cognitive interventions that can help retain and transfer the learned outcomes to real life is also discussed.

    A Framework for Research in Gamified Mobile Guide Applications using Embodied Conversational Agents (ECAs)

    Mobile Guides are mobile applications that provide players with local and location-based services (LBS), such as navigation assistance, where and when they need them most. Advances in mobile technologies in recent years have enabled the gamification of these applications, opening up new opportunities to transfer education and culture through game play. However, adding traditional game elements such as PBLs (points, badges, and leaderboards) alone cannot ensure that the intended learning outcomes will be met, as the player's cognitive resources are shared between the application and the surrounding environment. This distribution of resources prevents players from easily immersing themselves in the educational scenario. Adding artificial conversational characters (ECAs) that simulate the social norms found in real-life human-to-human guide scenarios has the potential to address this problem and improve the player's experience and learning of cultural narratives [1]. Although significant progress has been made towards creating game-like mobile guides with ECAs ([2], [3]), there is still no unified framework that enables researchers and practitioners to investigate the potential effects of such applications on players, or to approach the concepts of player experience, cognitive accessibility, and usability in this context. This paper presents a theoretically well-supported research framework consisting of four key components: differences in players, different features of the gamified task, aspects of how the ECA looks, sounds, or behaves, and different mobile environments. Based on this framework, it also provides working definitions of player experience, cognitive accessibility, and usability in the context of game-like mobile guide applications.
    Finally, a synthesis of the results of six empirical studies conducted within this research framework is discussed, and a series of design guidelines for the effective gamification of mobile guide applications using ECAs is presented. Results show that an ECA can positively affect the quality of the player's experience, but it did not improve players' retention of cultural narratives or navigation of routes.