26 research outputs found

    On object selection in gaze controlled environments

    Get PDF
    Over the past twenty years, gaze control has become a reliable alternative input method, and not only for users with disabilities. The selection of objects, however, which is among the most important and most frequent actions in computer control, requires explicit control that is not inherent in eye movements. Objects have therefore usually been selected via prolonged fixations (dwell times), and for many years dwell times appeared to be the only reliable selection method. In this paper, we review the pros and cons of classical selection methods and of novel metaphors based on pies and gestures. The focus is on the effectiveness and efficiency of selections. To estimate the potential of current suggestions for selection, a basic empirical comparison is recommended.
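
    For readers unfamiliar with the mechanism, the dwell-time approach reviewed here can be summarised in a few lines of code. The sketch below is a generic illustration with assumed names and an assumed 0.8 s threshold, not an implementation from any of the reviewed systems.

    ```python
    # Minimal dwell-time selector: a target is "selected" once the gaze has
    # rested on it for at least DWELL_THRESHOLD_S seconds. All names and the
    # threshold value are illustrative assumptions.
    DWELL_THRESHOLD_S = 0.8

    def hit_test(gaze_xy, targets):
        """Return the target rectangle containing the gaze point, if any."""
        x, y = gaze_xy
        for t in targets:
            if t["x"] <= x <= t["x"] + t["w"] and t["y"] <= y <= t["y"] + t["h"]:
                return t
        return None

    def dwell_select(gaze_samples, targets):
        """Yield a selection event whenever one target has been fixated long enough.

        gaze_samples: iterable of (timestamp_s, (x, y)) tuples from an eye tracker.
        """
        current, dwell_start = None, None
        for ts, xy in gaze_samples:
            target = hit_test(xy, targets)
            if target is not current:              # gaze moved to a new target (or off-target)
                current, dwell_start = target, ts
            elif target is not None and ts - dwell_start >= DWELL_THRESHOLD_S:
                yield target                       # selection triggered by the completed dwell
                current, dwell_start = None, None  # reset so one fixation fires only once

    # Toy usage: one on-screen button fixated for ~0.9 s
    buttons = [{"name": "OK", "x": 0, "y": 0, "w": 100, "h": 50}]
    samples = [(0.0, (10, 10)), (0.5, (12, 11)), (0.9, (11, 12))]
    print([t["name"] for t in dwell_select(samples, buttons)])   # -> ['OK']
    ```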

    Eye typing in application: A comparison of two systems with ALS patients

    Get PDF
    A variety of eye typing systems has been developed over the last decades. Such systems can provide support for people who have lost the ability to communicate, e.g. patients suffering from motor neuron diseases such as amyotrophic lateral sclerosis (ALS). In the current retrospective analysis, two eye typing applications (EyeGaze, GazeTalk) were tested by ALS patients (N = 4) in order to analyze objective performance measures and subjective ratings. An advantage of the EyeGaze system was found for most of the evaluated criteria. The results are discussed with respect to the special target population and in relation to the requirements of eye tracking devices.
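
    The abstract does not name the objective performance measures; assuming the standard text-entry metrics used in eye-typing studies (words per minute and character-level error rate), the sketch below shows how such measures are typically computed.

    ```python
    # Hypothetical helper functions; not the measures or code from the study itself.
    def words_per_minute(transcribed: str, seconds: float) -> float:
        """Standard text-entry WPM: one 'word' = 5 characters, spaces included."""
        return (len(transcribed) / 5.0) / (seconds / 60.0)

    def error_rate(presented: str, transcribed: str) -> float:
        """Character-level error rate based on edit (Levenshtein) distance."""
        m, n = len(presented), len(transcribed)
        d = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(m + 1):
            d[i][0] = i
        for j in range(n + 1):
            d[0][j] = j
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                cost = 0 if presented[i - 1] == transcribed[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
        return d[m][n] / max(m, n, 1)

    # Example: 42 characters entered in 3 minutes corresponds to 2.8 WPM.
    print(words_per_minute("the quick brown fox jumps over the lazy do", 180.0))
    ```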

    Enhancing an Eye-Tracker based Human-Computer Interface with Multi-modal Accessibility Applied for Text Entry

    Get PDF
    In the natural course of things, human beings usually make use of multiple sensory modalities for effective communication and for efficiently executing day-to-day tasks. For instance, during verbal conversations we make use of voice, eyes, and various body gestures. Effective human-computer interaction likewise involves the hands, eyes, and voice, when available. By combining multiple modalities, the whole process can therefore be made more natural and performance can be enhanced even for users with disabilities. Towards this end, we have developed a multi-modal human-computer interface (HCI) by combining an eye-tracker with a soft-switch, which may be regarded as representing another modality. This multi-modal HCI is applied to text entry using a virtual keyboard designed in-house, facilitating enhanced performance. Our experimental results demonstrate that using multiple modalities for text entry through the virtual keyboard is more efficient and less strenuous than a single-modality system, and that it also solves the Midas-touch problem, which is inherent in an eye-tracker-based HCI in which only dwell time is used for selecting a character.
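
    As a rough illustration of the gaze-plus-switch idea (the event format, function names, and toy keyboard below are assumptions, not the authors' implementation): gaze only points at and highlights a key, while a soft-switch press commits the selection, which is what sidesteps the Midas-touch problem of dwell-only selection.

    ```python
    def multimodal_type(events, key_at):
        """Combine gaze pointing with a soft-switch for committing selections.

        events: stream of ("gaze", (x, y)) and ("switch_press", None) tuples.
        key_at: function mapping a gaze coordinate to the virtual-keyboard key
                under it, or None if the gaze is off the keyboard.
        """
        typed = []
        highlighted = None
        for kind, payload in events:
            if kind == "gaze":
                highlighted = key_at(payload)      # pointing only: nothing is selected yet
            elif kind == "switch_press" and highlighted is not None:
                typed.append(highlighted)          # explicit trigger commits the selection
        return "".join(typed)

    # Toy usage: a one-row "keyboard" where each 50-px column is one letter
    keys = "abcde"
    key_at = lambda p: keys[int(p[0]) // 50] if 0 <= p[0] < 250 else None
    events = [("gaze", (10, 0)), ("switch_press", None),    # select 'a'
              ("gaze", (120, 0)),                           # glance at 'c' without pressing
              ("gaze", (160, 0)), ("switch_press", None)]   # select 'd'
    print(multimodal_type(events, key_at))  # -> "ad"
    ```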

    Technology and the Politics of Mobility: Evidence Generation in Accessible Transport Activism

    Get PDF
    Digital technologies offer the possibility of community empowerment via the reconfiguration of public services. This potential relies on actively involved citizens engaging with decision makers to pursue civic goals. In this paper we study one such group of involved citizens, examining the evidencing practices of a rare-disease charity campaigning for accessible public transport. Through fieldwork and interviews, we highlight the ways in which staff and volunteers assembled and presented different forms of evidence, in doing so reframing what is conceived of as 'valid knowledge'. We note the challenges this group faced in capturing experiential knowledge around the accessibility barriers of public transport, and the trade-offs that are made when presenting evidence to policy and decision makers. We offer a number of design considerations for future HCI research, focusing on how digital technology might be configured more appropriately to support campaigning around the politics of mobility.

    Design of a portable device: Toward assisting in tongue-strengthening exercises and dysphagia management

    Get PDF
    A Tongue-Machine Interaction System (TMIS) can serve as a valuable tool for tongue-strengthening training, which could contribute to the rehabilitation of patients with dysphagia and eventually help restore the oropharyngeal pattern of swallowing. A TMIS can also facilitate research into dysphagia, as tongue positioning and range of motion are commonly used outcome parameters in dysphagia research. Using a TMIS (for interacting with computers, a variety of communication devices, and mobility support systems) is tantamount to performing tongue muscle strengthening exercises, and such exercises can help patients with dysphagia improve the strength of their oral musculature. A TMIS can also provide valuable biofeedback during these exercises. The adoption of TMISs in clinical practice has been limited in the past, since many of them require patients to have a palatal plate or some interactive component mounted in the mouth and/or on the tongue. This paper reports the design and implementation of a portable, low-cost, minimally invasive, and easy-to-learn TMIS that can be used for training and strengthening the tongue musculature. The selection and incorporation of design features important to the target patient demographic are also discussed.

    Advantages of Eye-Gaze over Head-Gaze-Based Selection in Virtual and Augmented Reality under Varying Field of Views

    Get PDF
    Blattgerste J, Renner P, Pfeiffer T. Advantages of Eye-Gaze over Head-Gaze-Based Selection in Virtual and Augmented Reality under Varying Field of Views. In: COGAIN '18: Proceedings of the Symposium on Communication by Gaze Interaction. New York: ACM; 2018.
    The current best practice for hands-free selection using Virtual and Augmented Reality (VR/AR) head-mounted displays is to use head-gaze for aiming and dwell time or clicking for triggering the selection. There is an observable trend for new VR and AR devices to come with integrated eye-tracking units, intended to improve rendering, to provide means for attention analysis, or to support social interactions. Eye-gaze has been used successfully for human-computer interaction in other domains, primarily on desktop computers. In VR/AR systems, aiming via eye-gaze could be significantly faster and less exhausting than aiming via head-gaze. To evaluate the benefits of eye-gaze-based interaction in VR and AR, we compared aiming via head-gaze with aiming via eye-gaze. We show that eye-gaze outperforms head-gaze in terms of speed, task load, required head movement, and user preference. We furthermore show that the advantages of eye-gaze increase with larger field-of-view (FOV) sizes.
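
    The aiming step is the same in both conditions; only the origin of the ray differs (head pose versus eye-tracker gaze direction). The sketch below illustrates one common way to resolve which target a ray is aiming at; the function names and the 2° angular tolerance are assumptions for illustration, not the authors' implementation.

    ```python
    import math

    def angle_between(ray_dir, to_target):
        """Angular distance (radians) between an aiming ray and the direction to a target."""
        dot = sum(a * b for a, b in zip(ray_dir, to_target))
        norm = math.sqrt(sum(a * a for a in ray_dir)) * math.sqrt(sum(b * b for b in to_target))
        return math.acos(max(-1.0, min(1.0, dot / norm)))

    def aimed_target(origin, ray_dir, targets, max_angle_rad=math.radians(2.0)):
        """Return the target closest to the aiming ray, if within an angular tolerance.

        With head-gaze, ray_dir comes from the head pose; with eye-gaze, from the
        eye tracker. Triggering (dwell or click) is handled separately.
        """
        best, best_angle = None, max_angle_rad
        for t in targets:
            to_t = [t["pos"][i] - origin[i] for i in range(3)]
            a = angle_between(ray_dir, to_t)
            if a <= best_angle:
                best, best_angle = t, a
        return best

    # Toy usage: two targets in front of the user; the ray points almost exactly at the right one
    targets = [{"name": "left", "pos": (-0.5, 0.0, 2.0)},
               {"name": "right", "pos": (0.5, 0.0, 2.0)}]
    print(aimed_target((0.0, 0.0, 0.0), (0.25, 0.0, 1.0), targets))  # -> the "right" target
    ```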

    Intelligent Interfaces to Empower People with Disabilities

    Full text link
    Severe motion impairments can result from non-progressive disorders, such as cerebral palsy, or from degenerative neurological diseases, such as Amyotrophic Lateral Sclerosis (ALS), Multiple Sclerosis (MS), or muscular dystrophy (MD). They can also be due to traumatic brain injuries, caused for example by a traffic accident, or to brainstem …

    Disabling Access: Barriers to Eye Gaze Technology for Students with Disabilities

    Get PDF
    Major Research Paper (Master's), Critical Disability Studies, School of Health Policy and Management, Faculty of Health, York University.
    The MRP concludes that scientific and biomedical models of disability have historically shaped government policy responses to disability and continue to do so today. Canadian policies and programs meant to facilitate access to Eye Gaze technology are guided by scientific understandings of disability, which embed systematic, procedural, and training barriers into the very programs that are supposed to provide funding support to overcome financial barriers. A list of 10 classroom recommendations for barrier-free access to Eye Gaze technology is presented using the social model approach, to help parents, educators, and support workers identify and eliminate obstacles for users. The MRP ends with a call for further discussion and scholarship on Eye Gaze technology in classrooms, and provides readers with 6 recommended areas of Eye Gaze technology research.