    A multi-modal human machine interface for controlling an intelligent wheelchair using face movements

    This paper introduces a novel face-movement-based human machine interface (HMI) that uses jaw-clenching and eye-closing movements to control an electric powered wheelchair (EPW). A multi-modal HMI derived from both facial EMG and face image information is developed and tested against a traditional joystick control in an indoor corridor environment. In the experiment, ten repeated trials of a navigation task are carried out, controlling the EPW with either face-movement control or joystick control. Wheelchair trajectories and execution times during the task are recorded to evaluate the performance of the new face HMI. © 2011 IEEE
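    As a rough illustration of the control scheme the abstract describes, the sketch below maps discrete face-movement events (jaw clench, eye close) to wheelchair motion commands. The event names and the event-to-command mapping are assumptions for illustration only; the paper does not specify them, and the real system fuses continuous facial EMG and face-image signals rather than pre-labelled events.

    ```python
    from enum import Enum

    class Command(Enum):
        """Basic EPW motion commands (hypothetical set)."""
        STOP = "stop"
        FORWARD = "forward"
        TURN_LEFT = "turn_left"
        TURN_RIGHT = "turn_right"

    # Assumed mapping for illustration only; not taken from the paper.
    EVENT_TO_COMMAND = {
        "jaw_clench": Command.FORWARD,
        "left_eye_close": Command.TURN_LEFT,
        "right_eye_close": Command.TURN_RIGHT,
        "both_eyes_close": Command.STOP,
    }

    def decode(event: str) -> Command:
        """Translate a detected face-movement event into a motion command.
        Unknown events fall back to STOP as a safety default."""
        return EVENT_TO_COMMAND.get(event, Command.STOP)

    print(decode("jaw_clench").value)    # forward
    print(decode("glitch").value)        # stop
    ```

    Defaulting unrecognized events to STOP reflects a common safety convention in assistive-device interfaces, not a detail reported in the paper.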