314 research outputs found

    SymbolDesign: A User-centered Method to Design Pen-based Interfaces and Extend the Functionality of Pointer Input Devices

    Full text link
    A method called "SymbolDesign" is proposed that can be used to design user-centered interfaces for pen-based input devices. It can also extend the functionality of pointer input devices such as the traditional computer mouse or the Camera Mouse, a camera-based computer interface. Users can create their own interfaces by choosing single-stroke movement patterns that are convenient to draw with the selected input device and by mapping them to a desired set of commands. A pattern could be the trace of a moving finger detected with the Camera Mouse or a symbol drawn with an optical pen. The core of the SymbolDesign system is a dynamically created classifier, in the current implementation an artificial neural network. The architecture of the neural network automatically adjusts according to the complexity of the classification task. In experiments, subjects used the SymbolDesign method to design and test the interfaces they created, for example, to browse the web. The experiments demonstrated good recognition accuracy and responsiveness of the user interfaces. The method provided an easily-designed and easily-used computer input mechanism for people without physical limitations, and, with some modifications, has the potential to become a computer access tool for people with severe paralysis.National Science Foundation (IIS-0093367, IIS-0308213, IIS-0329009, EIA-0202067
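
    As a minimal sketch of the core idea (not the authors' implementation): a single-stroke classifier whose hidden layer is grown until it fits the user's training symbols, loosely mirroring the paper's network that adapts its architecture to task complexity. The resampling size, growth schedule, and use of scikit-learn's MLPClassifier are all assumptions for illustration.

```python
# Hedged sketch: grow a small neural network until it separates the user's
# training strokes, echoing an architecture that adapts to task complexity.
import numpy as np
from sklearn.neural_network import MLPClassifier

def resample_stroke(points, n=32):
    """Resample a variable-length stroke to n evenly spaced (x, y) points."""
    points = np.asarray(points, dtype=float)
    t = np.linspace(0.0, 1.0, len(points))
    ti = np.linspace(0.0, 1.0, n)
    feat = np.stack([np.interp(ti, t, points[:, 0]),
                     np.interp(ti, t, points[:, 1])], axis=1).ravel()
    return (feat - feat.mean()) / (feat.std() + 1e-8)  # position/scale invariance

def train_adaptive_classifier(strokes, commands, max_hidden=64):
    """strokes: list of (x, y) point sequences; commands: matching labels."""
    X = np.stack([resample_stroke(s) for s in strokes])
    hidden = 4
    while True:
        clf = MLPClassifier(hidden_layer_sizes=(hidden,), max_iter=2000,
                            random_state=0).fit(X, commands)
        # Stop growing once the training symbols are fit (or a cap is hit).
        if clf.score(X, commands) > 0.99 or hidden >= max_hidden:
            return clf
        hidden *= 2  # add capacity only when the symbol set demands it
```

    A recognized pattern would then dispatch whatever command the user mapped to it, for example a browser "back" action.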

    Personalised tiling paradigm for motor impaired users

    Get PDF

    On the benefits of speech and touch interaction with communication services for mobility impaired users

    Get PDF
    Although technology for communication has evolved tremendously over the past decades, mobility impaired individuals still face many difficulties interacting with communication services, due either to HCI issues or to intrinsic design problems with the services. In this paper we present the results of a usability study conducted with a group of five mobility impaired users, comprising paraplegic and quadriplegic individuals. The study participants carried out a set of tasks with a multimodal (speech, touch, gesture, keyboard and mouse) and multiplatform (mobile, desktop) prototype offering integrated access to communication and entertainment services such as email, agenda, conferencing and social media. The prototype was designed to take into account the requirements captured from these users, with the objective of evaluating whether the use of multimodal interfaces for communication and social media services could improve interaction with such services. Our study revealed that a multimodal prototype system offering natural interaction modalities, especially speech and touch, can in fact improve access to the presented services, contributing to better social inclusion of mobility impaired individuals.

    Design and evaluation of Multi-function Scanning System: a case study

    Get PDF
    We present in this paper an assistive technology for communication and command for quadriplegic users. To develop this assistive technology, a user-centered design approach was conducted with the patient, his occupational therapists, and his family. Successive iterative versions of the prototype were defined by means of the SOKEYTO platform to meet the needs and abilities of the quadriplegic person. The options implemented and the resulting design choices are reported, along with the difficulties encountered during implementation. The assistive technology was used by one quadriplegic person, and a qualitative evaluation is also reported.

    "One-button” brain-computer interfaces

    Get PDF

    Accessibility and dimensionality: enhanced real time creative independence for digital musicians with quadriplegic cerebral palsy

    Get PDF
    Inclusive music activities for people with physical disabilities commonly emphasise facilitated processes, based both on constrained gestural capabilities and on the simplicity of the available interfaces. Inclusive music processes employ consumer controllers, computer access tools and/or specialized digital musical instruments (DMIs). The first category reveals a design ethos identified by the authors as artefact multiplication (many sliders, buttons, dials and menu layers); the latter types offer ergonomic accessibility through artefact magnification. We present a prototype DMI that eschews artefact multiplication in pursuit of enhanced real time creative independence. We reconceptualise the universal click-drag interaction model via a single sensor type, which affords both binary and continuous performance control. Accessibility is optimized via a familiar interaction model and through customized ergonomics, but it is the mapping strategy that emphasizes transparency and sophistication in the hierarchical correspondences between the available gesture dimensions and expressive musical cues. Through a participatory and progressive methodology we identify an ostensibly simple targeting gesture rich in dynamic and reliable features: (1) contact location; (2) contact duration; (3) momentary force; (4) continuous force; and (5) dyad orientation. These features are mapped onto dynamic musical cues, most notably via new mappings for vibrato and arpeggio execution.
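
    As a rough illustration of such a hierarchical mapping (the feature names follow the list above; the ranges, scale, and cue formulas are assumptions, not the authors' actual design):

```python
# Hedged sketch: map the five gesture features named in the abstract onto
# musical cues. All numeric ranges and mappings are illustrative guesses.
from dataclasses import dataclass

@dataclass
class Gesture:
    location: float    # 0..1 along the sensor -> pitch selection
    duration: float    # seconds of contact -> note sustain
    peak_force: float  # 0..1 momentary force -> attack velocity
    force: float       # 0..1 continuous force -> vibrato depth
    dyad_angle: float  # degrees between two contacts -> arpeggio spread

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI notes; hypothetical scale

def map_gesture(g: Gesture) -> dict:
    """Translate gesture dimensions into hierarchical musical cues."""
    note = C_MAJOR[min(int(g.location * len(C_MAJOR)), len(C_MAJOR) - 1)]
    return {
        "note": note,
        "velocity": int(1 + g.peak_force * 126),   # tap harder -> louder
        "sustain_s": g.duration,                   # hold contact to sustain
        "vibrato_depth": g.force * 0.5,            # lean in for expression
        "arpeggio_notes": [note + i * 4
                           for i in range(int(abs(g.dyad_angle) // 30) + 1)],
    }
```

    The design point the sketch tries to capture is that a single contact yields both discrete cues (which note, how loud) and continuous ones (vibrato depth, arpeggio spread), so expressive range does not require multiplying controls.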

    Multimodal access to social media services

    Get PDF
    Tese de mestrado integrado. Engenharia Informåtica e Computação. Faculdade de Engenharia. Universidade do Porto, Microsoft Language Development Center. 201

    Music Maker – A Camera-based Music Making Tool for Physical Rehabilitation

    Full text link
    The therapeutic effects of playing music are being recognized increasingly in the field of rehabilitation medicine. People with physical disabilities, however, often do not have the motor dexterity needed to play an instrument. We developed a camera-based human-computer interface called "Music Maker" to provide such people with a means to make music by performing therapeutic exercises. Music Maker uses computer vision techniques to convert the movements of a patient's body part, for example, a finger, hand, or foot, into musical and visual feedback using the open software platform EyesWeb. It can be adjusted to a patient's particular therapeutic needs and provides quantitative tools for monitoring the recovery process and assessing therapeutic outcomes. We tested the potential of Music Maker as a rehabilitation tool with six subjects who responded to or created music in various movement exercises. In these proof-of-concept experiments, Music Maker performed reliably and showed its promise as a therapeutic device. National Science Foundation (IIS-0308213, IIS-039009, IIS-0093367, P200A01031, EIA-0202067 to M.B.); National Institutes of Health (DC-03663 to E.S.); Boston University Dudley Allen Sargent Research Fund (to A.L.)
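
    Music Maker itself is built on EyesWeb, but the underlying loop is easy to approximate. Below is a hedged OpenCV sketch, not the authors' system: the camera index, motion threshold, and pitch mapping are arbitrary assumptions. It tracks frame-to-frame motion and turns the motion centroid into a controllable parameter.

```python
# Hedged sketch of a camera-based motion-to-feedback loop (illustrative only).
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                      # webcam; index is an assumption
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)             # where did the patient move?
    prev = gray
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    ys, xs = np.nonzero(mask)
    if len(xs):
        cx = xs.mean() / mask.shape[1]         # normalized motion centroid
        pitch = 48 + int(cx * 24)              # map x-position to a MIDI pitch
        # A real system would send `pitch` to a synthesizer and tune the
        # motion range per patient, as the paper describes for EyesWeb.
    cv2.imshow("motion", mask)
    if cv2.waitKey(1) == 27:                   # Esc quits
        break
cap.release()
```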
