
    A Customizable Camera-based Human Computer Interaction System Allowing People With Disabilities Autonomous Hands Free Navigation of Multiple Computing Tasks

    Many people suffer from conditions that lead to deterioration of motor control and make access to the computer using traditional input devices difficult. In particular, they may lose control of hand movement to the extent that the standard mouse cannot be used as a pointing device. Most current alternatives use markers or specialized hardware to track and translate a user's movement to pointer movement. These approaches may be perceived as intrusive, for example, wearable devices. Camera-based assistive systems that use visual tracking of features on the user's body often require cumbersome manual adjustment. This paper introduces an enhanced computer vision based strategy where features, for example on a user's face, viewed through an inexpensive USB camera, are tracked and translated to pointer movement. The main contributions of this paper are (1) enhancing a video-based interface with a mechanism for mapping feature movement to pointer movement, which allows users to navigate to all areas of the screen even with very limited physical movement, and (2) providing a customizable, hierarchical navigation framework for human computer interaction (HCI). This framework provides effective use of the vision-based interface system for accessing multiple applications in an autonomous setting. Experiments with several users show the effectiveness of the mapping strategy and its usage within the application framework as a practical tool for desktop users with disabilities. National Science Foundation (IIS-0093367, IIS-0329009, 0202067)
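    The core of contribution (1) is a gain applied to small feature displacements so that the full screen stays reachable. The following is a minimal sketch of that idea, not the paper's implementation: it assumes OpenCV face detection as the tracked feature and pyautogui for pointer control (neither is named in the abstract), and the gain constant is illustrative.

```python
# Sketch: amplified feature-to-pointer mapping (assumptions: OpenCV
# Haar-cascade face detection, pyautogui pointer control, GAIN = 8.0).
import cv2
import pyautogui

GAIN = 8.0  # amplifies small feature movements so the whole screen is reachable

screen_w, screen_h = pyautogui.size()
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)
neutral = None  # resting position of the tracked feature, set on first detection

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        cx, cy = x + w / 2, y + h / 2      # feature position this frame
        if neutral is None:
            neutral = (cx, cy)             # first detection defines "rest"
        else:
            # Amplified displacement from rest, clamped inside the screen
            # (staying off the exact corners avoids pyautogui's fail-safe).
            px = min(max(screen_w / 2 + GAIN * (cx - neutral[0]), 1), screen_w - 2)
            py = min(max(screen_h / 2 + GAIN * (cy - neutral[1]), 1), screen_h - 2)
            pyautogui.moveTo(px, py)
    cv2.imshow("preview", frame)
    if cv2.waitKey(1) & 0xFF == 27:        # Esc exits
        break
cap.release()
cv2.destroyAllWindows()
```

    With a larger gain, a user with a very limited range of head motion can still sweep the pointer across the entire screen, which is the point of the mapping mechanism described above.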

    EmotionFace: Prototype facial expression display of emotion in music

    Presented at the 10th International Conference on Auditory Display (ICAD 2004). EmotionFace is a software interface for visually displaying the self-reported emotion expressed by music. Taken in reverse, it can be viewed as a facial expression whose auditory connection or exemplar is the time-synchronized, associated music. The present instantiation of the software uses a simple schematic face with eyes and mouth moving according to a parabolic model: smiling and frowning of the mouth represent valence (happiness and sadness), and the degree of eye opening represents arousal. Continuous emotional responses to music collected in previous research have been used to test and calibrate EmotionFace. The interface provides an alternative to the presentation of data on a two-dimensional emotion space, the same space used for the collection of emotional data in response to music. These synthesized facial expressions make the emotion data expressed by music easier for the human observer to process and may be a more natural interface between the human and the computer. Future research will include optimization of EmotionFace, using more sophisticated algorithms and facial expression databases, and the examination of the lag structure between facial expression and musical structure. Eventually, with more elaborate systems, automation, and greater knowledge of emotion and associated musical structure, it may be possible to compose music meaningfully from synthesized and real facial expressions.
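    The parabolic face model lends itself to a compact illustration. Below is a minimal sketch of the described mapping, not EmotionFace itself: it assumes valence and arousal are already normalized to [-1, 1], uses matplotlib for drawing, and all scaling constants are illustrative.

```python
# Sketch: schematic face where mouth curvature follows valence and
# eye opening follows arousal (assumed ranges: both in [-1, 1]).
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.patches import Circle, Ellipse

def draw_face(ax, valence, arousal):
    """Draw one schematic face for a (valence, arousal) pair."""
    ax.add_patch(Circle((0, 0), 1.0, fill=False))        # head outline
    eye_h = 0.05 + 0.15 * (arousal + 1) / 2              # taller eyes = more aroused
    for ex in (-0.35, 0.35):
        ax.add_patch(Ellipse((ex, 0.3), 0.15, eye_h, fill=False))
    # Parabolic mouth y = a*x^2 + c: a > 0 (corners up) smiles,
    # a < 0 (corners down) frowns, so a tracks valence.
    x = np.linspace(-0.4, 0.4, 50)
    a = 1.5 * valence
    ax.plot(x, a * x**2 - 0.4, color="black")
    ax.set_xlim(-1.2, 1.2)
    ax.set_ylim(-1.2, 1.2)
    ax.set_aspect("equal")
    ax.axis("off")

fig, axes = plt.subplots(1, 3, figsize=(9, 3))
for ax, (v, a) in zip(axes, [(-1, -0.5), (0, 0), (1, 1)]):
    draw_face(ax, v, a)
    ax.set_title(f"valence={v}, arousal={a}")
plt.show()
```

    Animating such a face over the continuous emotion responses described above would amount to re-drawing it at each time step with the current (valence, arousal) sample.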

    Human Machine Interface for Controlling a Robot Using Image Processing

    This paper introduces a head-movement-based Human Machine Interface (HMI) that uses left and right head movements to control a robot's motion. We present an approach to building an effective real-time face-orientation detection system for controlling a robot, which can also be applied to an Electrical Powered Wheelchair (EPW). The system identifies the orientation of the face from the pixel values in certain areas of the image. The captured image is first divided into three parts based on its number of columns. Depending on the orientation of the face, the maximum number of pixels within a given range of (R, G, B) values falls in one of these parts. This information is transferred to a microcontroller through a serial communication port, which controls the robot's motion (forward, left turn, right turn, and stop) in real time from the head movements.
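    The decision rule described above, splitting the frame into three column bands, finding which band holds the most pixels in the target color range, and sending the matching command over serial, can be sketched in a few lines. This is a minimal illustration assuming OpenCV and pyserial; the HSV skin range, pixel threshold, port name, and one-byte commands are assumptions, not the authors' values.

```python
# Sketch: three-column face-orientation rule driving serial commands
# (assumed: HSV skin range, /dev/ttyUSB0 at 9600 baud, 'L'/'F'/'R'/'S' bytes).
import cv2
import numpy as np
import serial

LOWER_SKIN = np.array([0, 40, 60])      # rough HSV skin-tone range (assumed)
UPPER_SKIN = np.array([25, 180, 255])

port = serial.Serial("/dev/ttyUSB0", 9600)  # hypothetical port and baud rate
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    # Split the mask into three equal column bands and count skin pixels.
    w = mask.shape[1] // 3
    counts = [cv2.countNonZero(mask[:, i * w:(i + 1) * w]) for i in range(3)]
    i = counts.index(max(counts))
    # Stop if hardly any skin is visible (threshold assumed), otherwise the
    # band with the most skin pixels picks the command: left/forward/right.
    cmd = b"S" if max(counts) < 500 else b"LFR"[i:i + 1]
    port.write(cmd)
    cv2.imshow("mask", mask)
    if cv2.waitKey(30) & 0xFF == 27:    # Esc exits
        break
cap.release()
port.close()
cv2.destroyAllWindows()
```

    On the microcontroller side, the matching firmware would read one byte per frame and set the motor outputs accordingly, which keeps the serial protocol trivial.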