34,986 research outputs found

    Evaluating Camera Mouse as a computer access system for augmentative and alternative communication in cerebral palsy: a case study

    Full text link
    PURPOSE: Individuals with disabilities who do not have reliable motor control to manipulate a standard computer mouse require alternate access methods for complete computer access and for communication. The Camera Mouse system visually tracks the movement of selected facial features using a camera to directly control the mouse pointer of a computer. Current research suggests that this system can successfully provide a means of computer access and communication for individuals with motor impairments. However, there are no existing data on the efficacy of the software's communication output capabilities. The goal of this case study is to provide a comprehensive evaluation of Camera Mouse as a computer access method for Augmentative and Alternative Communication (AAC) for an individual with cerebral palsy who prefers to use her unintelligible dysarthric speech to communicate her desires and thoughts despite having access to a traditional AAC system. METHOD: The current study compared the Camera Mouse system, the Tobii PCEye Mini (a popular commercially available eye tracking device) paired with speech generating technology, and natural speech using a variety of tasks in a single dysarthric speaker. Tasks consisted of two questionnaires designed to measure psychosocial impact and satisfaction with assistive technology, two sentence intelligibility tasks that were judged by four unfamiliar listeners, and two language samples designed to measure expressive language. Each task was completed three times, once for each communication modality in question: natural speech, the Camera Mouse-to-speech system, and the Tobii eye tracker-to-speech system. Participant responses were recorded and transcribed. RESULTS: Data were analyzed in terms of psychosocial effects, user satisfaction, communication efficiency (using intelligibility and rate), and various measures of expressive output ability, to determine which modality offered the highest communicative aptitude. Measures showed that when paired with an orthographic selection interface and speech-generating device, the Camera Mouse and Tobii eye tracker resulted in greatly increased intelligibility. However, natural speech was superior to the assistive technology options in all other measures, including psychosocial impact, satisfaction, communication efficiency, and several expressive language components. Though results indicate that use of the Tobii eye tracker resulted in a slightly higher rate and intelligibility, the participant reported increased satisfaction and psychosocial impact when using the novel Camera Mouse access system. CONCLUSION: This study is the first to provide quantitative information regarding the efficiency, psychosocial impact, user satisfaction, and expressive language capabilities of Camera Mouse as a computer access system for AAC. This study shows promising results for Camera Mouse as a functional access system for individuals with disabilities and for future AAC applications.
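
    The core mechanism this abstract describes, tracking a selected facial feature with a camera and mapping its displacement onto the mouse pointer, can be illustrated with a short sketch. The sketch below is not the Camera Mouse implementation; it assumes OpenCV and pyautogui are available, uses simple template matching, and the GAIN constant is a hypothetical tuning parameter.

        # Minimal sketch (not the Camera Mouse code): track one facial feature by
        # template matching and let its displacement drive the system cursor.
        import cv2
        import pyautogui

        GAIN = 4.0  # hypothetical gain: screen pixels moved per pixel of feature motion

        cap = cv2.VideoCapture(0)
        ok, frame = cap.read()
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Let the user draw a box around the feature to track (e.g., the tip of the nose).
        x, y, w, h = map(int, cv2.selectROI("select feature", frame))
        template = gray[y:y + h, x:x + w]
        home = (x + w // 2, y + h // 2)
        cx0, cy0 = pyautogui.position()  # cursor starting point

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Find the best match for the selected feature in the new frame.
            score = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
            _, _, _, (mx, my) = cv2.minMaxLoc(score)
            dx = (mx + w // 2) - home[0]
            dy = (my + h // 2) - home[1]
            # Feature displacement from its resting position drives the pointer
            # (horizontal axis mirrored, since the camera faces the user).
            pyautogui.moveTo(cx0 - dx * GAIN, cy0 + dy * GAIN)
            cv2.imshow("select feature", frame)
            if cv2.waitKey(1) == 27:  # Esc to quit
                break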

    Enabling Disabled Persons to Gain Access to Digital Media

    Get PDF
    A report describes the first phase in an effort to enhance the NaviGaze software to enable profoundly disabled persons to operate computers. (Running on a Windows-based computer equipped with a video camera aimed at the user's head, the original NaviGaze software processes the user's head movements and eye blinks into cursor movements and mouse clicks to enable hands-free control of the computer.) To accommodate large variations in movement capabilities among disabled individuals, one of the enhancements was the addition of a graphical user interface for selection of parameters that affect the way the software interacts with the computer and tracks the user's movements. Tracking algorithms were improved to reduce sensitivity to rotations and reduce the likelihood of tracking the wrong features. Visual feedback to the user was improved to provide an indication of the state of the computer system. It was found that users can quickly learn to use the enhanced software, performing single clicks, double clicks, and drags within minutes of first use. Available programs that could increase the usability of NaviGaze were identified. One of these enables entry of text by using NaviGaze as a mouse to select keys on a virtual keyboard.
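
    One of the enhancements described above is a graphical user interface for selecting parameters that control how head movements and blinks are mapped to cursor actions. The sketch below is not NaviGaze code; it only illustrates that kind of parameter panel with Tkinter, and every parameter name and default value in it is hypothetical.

        # Illustrative parameter panel (hypothetical parameters, not NaviGaze's):
        # sliders let a caregiver tune how head motion and blinks become cursor events.
        import tkinter as tk

        root = tk.Tk()
        root.title("Head-tracking parameters (sketch)")

        params = {
            "cursor_gain": tk.DoubleVar(value=4.0),       # cursor travel per unit of head motion
            "blink_ms": tk.DoubleVar(value=300.0),        # blink duration interpreted as a click
            "double_click_ms": tk.DoubleVar(value=600.0), # window for two blinks = double click
        }
        ranges = {"cursor_gain": (1, 10), "blink_ms": (100, 1000), "double_click_ms": (300, 1500)}

        for row, (name, var) in enumerate(params.items()):
            tk.Label(root, text=name).grid(row=row, column=0, sticky="w")
            lo, hi = ranges[name]
            tk.Scale(root, variable=var, from_=lo, to=hi, resolution=1,
                     orient="horizontal", length=250).grid(row=row, column=1)

        def apply_settings():
            # In a real system these values would be pushed into the tracking loop.
            print({k: v.get() for k, v in params.items()})

        tk.Button(root, text="Apply", command=apply_settings).grid(row=len(params), column=0, columnspan=2)
        root.mainloop()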

    Low-cost natural interface based on head movements

    Get PDF
    Sometimes people look for freedom in the virtual world. However, not everyone has the same possibility to interact with a computer. Nowadays, almost every job requires interaction with computerized systems, so people with physical impairments do not have the same freedom to control a mouse, a keyboard or a touchscreen. In recent years, some of the government programs that help people with reduced mobility suffered from the global economic crisis, and some of those programs were even cut back to reduce costs. This paper focuses on the development of a touchless human-computer interface that allows anyone to control a computer without using a keyboard, mouse or touchscreen. By reusing Microsoft Kinect sensors from old videogame consoles, a low-cost, easy-to-use, open-source interface was developed that allows control of a computer using only head, eye or mouth movements, with the possibility of complementary sound commands. Similar commercial solutions are already available, but they are expensive enough that price tends to be a real obstacle to their purchase; on the other hand, free solutions usually do not offer the freedom that people with reduced mobility need. The present solution tries to address these drawbacks.
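
    The sketch below is a minimal illustration, not the paper's Kinect-based system: it shows one common way a head-movement interface of this kind can be driven, by turning head yaw and pitch into joystick-style relative cursor motion with a dead zone. The read_head_pose() function is a hypothetical stub standing in for the sensor, and the dead-zone and speed constants are assumptions.

        # Joystick-style head control sketch: pose angles beyond a dead zone become
        # a cursor velocity. The sensor is a hypothetical stub; stop with Ctrl+C.
        import time
        import pyautogui

        DEAD_ZONE = 3.0   # degrees of head rotation ignored as noise
        SPEED = 15.0      # cursor pixels per second per degree beyond the dead zone

        def read_head_pose():
            """Hypothetical stub: return (yaw, pitch) of the head in degrees."""
            return 0.0, 0.0

        def axis(value):
            # Suppress small tremors, then scale the remainder into a velocity.
            if abs(value) < DEAD_ZONE:
                return 0.0
            return (value - DEAD_ZONE if value > 0 else value + DEAD_ZONE) * SPEED

        last = time.time()
        while True:
            yaw, pitch = read_head_pose()
            now = time.time()
            dt, last = now - last, now
            pyautogui.moveRel(axis(yaw) * dt, axis(pitch) * dt)
            time.sleep(0.01)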

    Surface electromyographic control of a novel phonemic interface for speech synthesis

    Full text link
    Many individuals with minimal movement capabilities use AAC to communicate. These individuals require both an interface with which to construct a message (e.g., a grid of letters) and an input modality with which to select targets. This study evaluated the interaction of two such systems: (a) an input modality using surface electromyography (sEMG) of spared facial musculature, and (b) an onscreen interface from which users select phonemic targets. These systems were evaluated in two experiments: (a) participants without motor impairments used the systems during a series of eight training sessions, and (b) one individual who uses AAC used the systems for two sessions. Both the phonemic interface and the electromyographic cursor show promise for future AAC applications. (F31 DC014872 - NIDCD NIH HHS; R01 DC002852 - NIDCD NIH HHS; R01 DC007683 - NIDCD NIH HHS; T90 DA032484 - NIDA NIH HHS)
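
    As a rough illustration of the input-modality side described above (not the authors' system), the sketch below shows one common way a single sEMG channel is turned into a cursor command: rectify the raw signal, smooth it into an amplitude envelope, and drive cursor velocity once the envelope crosses an onset threshold. The acquisition stub, threshold, and gain are assumptions.

        # sEMG-to-cursor sketch: rectified, averaged muscle activity above an onset
        # threshold becomes a one-dimensional cursor velocity.
        import numpy as np

        FS = 1000            # assumed sampling rate, Hz
        ONSET = 0.05         # hypothetical activation threshold (normalised units)
        GAIN = 400.0         # cursor pixels per second per unit of envelope

        def read_emg_window(n=100):
            """Hypothetical stub: return the latest n raw sEMG samples."""
            return np.zeros(n)

        def envelope(raw):
            # Full-wave rectification followed by a moving-average envelope.
            rectified = np.abs(raw - np.mean(raw))
            return float(np.mean(rectified))

        def cursor_velocity(raw):
            e = envelope(raw)
            return 0.0 if e < ONSET else (e - ONSET) * GAIN

        print(cursor_velocity(read_emg_window()))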

    Interactive form creation: exploring the creation and manipulation of free form through the use of interactive multiple input interface

    Get PDF
    Most current CAD systems support only the two most common input devices: a mouse and a keyboard, which imposes a limit on the degree of interaction that a user can have with the system. However, it is not uncommon for users to work together on the same computer during a collaborative task. Besides that, people tend to use both hands to manipulate 3D objects; one hand is used to orient the object while the other hand is used to perform some operation on the object. The same approach could be applied to computer modelling in the conceptual phase of the design process. A designer can rotate and position an object with one hand, and manipulate the shape [deform it] with the other hand. Accordingly, the 3D object can be easily and intuitively changed through interactive manipulation of both hands. The research investigates the manipulation and creation of free form geometries through the use of interactive interfaces with multiple input devices. First the creation of the 3D model will be discussed; several different types of models will be illustrated. Furthermore, different tools that allow the user to control the 3D model interactively will be presented. Three experiments were conducted using different interactive interfaces; two bi-manual techniques were compared with the conventional one-handed approach. Finally, it will be demonstrated that the use of new and multiple input devices can offer many opportunities for form creation. The problem is that few, if any, systems make it easy for the user or the programmer to use new input devices.
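
    The bi-manual pattern described above, one hand orienting the model while the other deforms it, can be sketched as two independent input streams updating one shared model state. The sketch below is illustrative only; the device-reading functions are hypothetical stubs and the model is reduced to a rotation angle plus a small vertex array.

        # Bi-manual interaction sketch: two input streams compose their updates on
        # a single shared model, one per frame of input.
        import numpy as np

        class Model:
            def __init__(self, vertices):
                self.vertices = np.asarray(vertices, dtype=float)
                self.rotation = 0.0  # rotation about z, radians

            def orient(self, delta_angle):           # driven by the non-dominant hand
                self.rotation += delta_angle

            def deform(self, index, displacement):   # driven by the dominant hand
                self.vertices[index] += displacement

        def read_orientation_device():
            """Hypothetical stub: incremental rotation from device one."""
            return 0.01

        def read_deformation_device():
            """Hypothetical stub: (vertex index, xyz displacement) from device two."""
            return 0, np.array([0.0, 0.0, 0.1])

        model = Model([[0, 0, 0], [1, 0, 0], [0, 1, 0]])
        for _ in range(3):                # one iteration per input frame
            model.orient(read_orientation_device())
            model.deform(*read_deformation_device())
        print(model.rotation, model.vertices[0])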

    Towards predicting web searcher gaze position from mouse movements

    Get PDF
    A key problem in information retrieval is inferring the searcher's interest in the results, which can be used for implicit feedback, query suggestion, and result ranking and summarization. One important indicator of searcher interest is gaze position, that is, the results or the terms in a result listing where a searcher concentrates her attention. Capturing this information normally requires eye tracking equipment, which until now has limited the use of gaze-based feedback to the laboratory. While previous research has reported a correlation between mouse movement and gaze position, we are not aware of prior work on automatically inferring a searcher's gaze position from mouse movement or similar interface interactions. In this paper, we report the first results on automatically inferring whether the searcher's gaze position is coordinated with the mouse position, a crucial step towards predicting the searcher's gaze position by analyzing computer mouse movements.
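
    The inference task described above can be framed as binary classification: given features of recent mouse movement, predict whether gaze and mouse are currently coordinated. The sketch below is not the paper's model; the feature set, the synthetic data, and the logistic-regression choice are assumptions used only to make the framing concrete.

        # Coordination-prediction sketch on synthetic mouse features.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        # Columns: mean cursor speed (px/s), longest pause (s), total distance (px)
        X = rng.random((200, 3)) * [500, 2.0, 1500]
        y = (X[:, 1] > 1.0).astype(int)  # toy label: long pauses ~ eyes left the cursor

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
        print("held-out accuracy:", clf.score(X_test, y_test))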

    A Self-initializing Eyebrow Tracker for Binary Switch Emulation

    Full text link
    We designed the Eyebrow-Clicker, a camera-based human-computer interface system that implements a new form of binary switch. When the user raises his or her eyebrows, the binary switch is activated and a selection command is issued. The Eyebrow-Clicker thus replaces the "click" functionality of a mouse. The system initializes itself by detecting the user's eyes and eyebrows, tracks these features at frame rate, and recovers in the event of errors. The initialization uses the natural blinking of the human eye to select suitable templates for tracking. Once execution has begun, a user therefore never has to restart the program or even touch the computer. In our experiments with human-computer interaction software, the system correctly detected when a user raised his eyebrows 93% of the time. (Office of Naval Research; National Science Foundation IIS-0093367)
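
    The switch logic described above can be sketched in a few lines: once the eyebrow-to-eye distance rises a fixed fraction above its resting baseline, issue a single click, then re-arm only after the brow returns to rest. The sketch below is not the Eyebrow-Clicker code; the tracker call is a hypothetical stub and the raise threshold is an assumption.

        # Eyebrow-raise binary switch sketch with simple debouncing.
        import time
        import pyautogui

        RAISE_FACTOR = 1.3   # assumed: 30% above baseline separation counts as a raise

        def eyebrow_eye_distance():
            """Hypothetical stub: vertical distance (px) between tracked eyebrow and eye."""
            return 20.0

        baseline = eyebrow_eye_distance()
        armed = True
        while True:
            d = eyebrow_eye_distance()
            if armed and d > baseline * RAISE_FACTOR:
                pyautogui.click()   # the eyebrow raise replaces the mouse click
                armed = False       # ignore the raise until the brow comes back down
            elif d < baseline * 1.05:
                armed = True
            time.sleep(0.02)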

    SymbolDesign: A User-centered Method to Design Pen-based Interfaces and Extend the Functionality of Pointer Input Devices

    Full text link
    A method called "SymbolDesign" is proposed that can be used to design user-centered interfaces for pen-based input devices. It can also extend the functionality of pointer input devices such as the traditional computer mouse or the Camera Mouse, a camera-based computer interface. Users can create their own interfaces by choosing single-stroke movement patterns that are convenient to draw with the selected input device and by mapping them to a desired set of commands. A pattern could be the trace of a moving finger detected with the Camera Mouse or a symbol drawn with an optical pen. The core of the SymbolDesign system is a dynamically created classifier, in the current implementation an artificial neural network. The architecture of the neural network automatically adjusts according to the complexity of the classification task. In experiments, subjects used the SymbolDesign method to design and test the interfaces they created, for example, to browse the web. The experiments demonstrated good recognition accuracy and responsiveness of the user interfaces. The method provided an easily designed and easily used computer input mechanism for people without physical limitations, and, with some modifications, has the potential to become a computer access tool for people with severe paralysis. (National Science Foundation IIS-0093367, IIS-0308213, IIS-0329009, EIA-0202067)
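
    In the same spirit as the description above (though not the SymbolDesign implementation), the sketch below resamples each single-stroke pattern to a fixed number of points and trains a small neural network whose hidden-layer size is scaled with the number of commands, a crude stand-in for an architecture that adjusts to task complexity. The resampling scheme and sizing rule are assumptions.

        # Single-stroke gesture classifier sketch: fixed-length resampling + MLP.
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        N_POINTS = 16  # each stroke becomes 16 (x, y) samples -> a 32-dim feature vector

        def resample(stroke, n=N_POINTS):
            """Resample a list of (x, y) points to n evenly spaced points."""
            stroke = np.asarray(stroke, dtype=float)
            t = np.linspace(0, 1, len(stroke))
            ti = np.linspace(0, 1, n)
            return np.concatenate([np.interp(ti, t, stroke[:, 0]),
                                   np.interp(ti, t, stroke[:, 1])])

        def train(strokes, labels):
            n_classes = len(set(labels))
            hidden = max(8, 4 * n_classes)   # assumed rule: capacity grows with task size
            X = np.stack([resample(s) for s in strokes])
            return MLPClassifier(hidden_layer_sizes=(hidden,), max_iter=2000).fit(X, labels)

        # Toy usage: two user-defined gestures mapped to commands.
        line = [(i, 0) for i in range(10)]   # horizontal stroke -> "next"
        diag = [(i, i) for i in range(10)]   # diagonal stroke  -> "open"
        clf = train([line, diag] * 5, ["next", "open"] * 5)
        print(clf.predict([resample([(i, 0.2 * i) for i in range(10)])]))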