
    A Customizable Camera-based Human Computer Interaction System Allowing People With Disabilities Autonomous Hands-Free Navigation of Multiple Computing Tasks

    Many people suffer from conditions that lead to deterioration of motor control and make access to the computer using traditional input devices difficult. In particular, they may lose control of hand movement to the extent that the standard mouse cannot be used as a pointing device. Most current alternatives use markers or specialized hardware to track and translate a user's movement to pointer movement. These approaches, for example wearable devices, may be perceived as intrusive. Camera-based assistive systems that use visual tracking of features on the user's body often require cumbersome manual adjustment. This paper introduces an enhanced computer-vision-based strategy in which features, for example on a user's face, viewed through an inexpensive USB camera, are tracked and translated to pointer movement. The main contributions of this paper are (1) enhancing a video-based interface with a mechanism for mapping feature movement to pointer movement, which allows users to navigate to all areas of the screen even with very limited physical movement, and (2) providing a customizable, hierarchical navigation framework for human computer interaction (HCI). This framework provides effective use of the vision-based interface system for accessing multiple applications in an autonomous setting. Experiments with several users show the effectiveness of the mapping strategy and its usage within the application framework as a practical tool for desktop users with disabilities. Funded by the National Science Foundation (IIS-0093367, IIS-0329009, 0202067).
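
    The mapping contribution can be illustrated with a short sketch. The following Python fragment is a hypothetical illustration, not the authors' code; the gain, exponent, and screen values are assumptions. It shows one common way to translate small tracked-feature displacements into pointer motion with a nonlinear gain, so that a limited range of physical movement can still reach the whole screen.

    import numpy as np

    def feature_to_pointer(pointer, delta, gain=40.0, expo=1.5, screen=(1920, 1080)):
        """Map a tracked-feature displacement (pixels/frame) to a new pointer position.

        A nonlinear gain amplifies larger movements more than small ones, so
        users with a limited range of motion can still reach the screen edges.
        All constants here are illustrative placeholders."""
        d = np.asarray(delta, dtype=float)
        speed = np.linalg.norm(d)
        if speed > 0:
            d = d / speed * gain * speed ** expo   # nonlinear amplification
        x = min(max(pointer[0] + d[0], 0), screen[0] - 1)  # clamp to screen
        y = min(max(pointer[1] + d[1], 0), screen[1] - 1)
        return (x, y)

    # Example: a 3-pixel feature movement moves the pointer much further.
    print(feature_to_pointer((960, 540), (3.0, -1.0)))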

    Interaction tasks and controls for public display applications

    Public displays are becoming increasingly interactive and a broad range of interaction mechanisms can now be used to create multiple forms of interaction. However, the lack of interaction abstractions forces each developer to create specific approaches for dealing with interaction, preventing users from building consistent expectations on how to interact across different display systems. There is a clear analogy with the early days of the graphical user interface, when a similar problem was addressed with the emergence of high-level interaction abstractions that provided consistent interaction experiences to users and shielded developers from low-level details. This work takes a first step in that same direction by uncovering interaction abstractions that may lead to the emergence of interaction controls for applications in public displays. We identify a new set of interaction tasks focused on the specificities of public displays; we characterise interaction controls that may enable those interaction tasks to be integrated into applications; and we create a mapping between the high-level abstractions provided by the interaction tasks and the concrete interaction mechanisms that can be implemented by those displays. Together, these contributions constitute a step towards the emergence of programming toolkits with widgets that developers could incorporate into their public display applications. The research has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under Grant agreement no. 244011 (PD-NET). Jorge Cardoso has been supported by "Fundação para a Ciência e a Tecnologia" (FCT) and "Programa Operacional Ciência e Inovação 2010", co-funded by the Portuguese Government and the European Union through the FEDER Programme, and by FCT training Grant SFRH/BD/47354/2008.
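
    As a rough illustration of what such an interaction control might look like to an application developer, the Python sketch below uses hypothetical names and mechanisms; the paper does not prescribe a concrete API. It separates a high-level interaction task (picking one option) from the concrete mechanisms a particular display happens to support.

    # Hypothetical sketch of an interaction control that decouples a high-level
    # task ("pick one option") from the concrete input mechanisms of a display.
    class OptionSelector:
        """High-level control: users pick one option through whatever
        mechanism (touch, SMS keyword, QR code, ...) the display supports."""
        def __init__(self, options):
            self.options = options
            self.handlers = []          # callbacks registered by the app

        def on_select(self, callback):
            self.handlers.append(callback)

        def deliver(self, mechanism, payload):
            """Called by the display runtime; normalizes mechanism-specific
            input (e.g. an SMS keyword) into an abstract selection event."""
            choice = payload.strip().lower()
            if choice in self.options:
                for cb in self.handlers:
                    cb(choice, mechanism)

    poll = OptionSelector({"coffee", "tea"})
    poll.on_select(lambda choice, via: print(f"selected {choice!r} via {via}"))
    poll.deliver("sms", "COFFEE ")      # same event regardless of mechanism
    poll.deliver("touch", "tea")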

    Neurophysiological and Behavioral Responses to Music Therapy in Vegetative and Minimally Conscious States

    Assessment of awareness for those with disorders of consciousness is a challenging undertaking, due to the complex presentation of the population. Debate surrounds whether behavioral assessments provide greater accuracy in diagnosis than neuro-imaging methods, and despite developments in both, misdiagnosis rates remain high. Music therapy may be effective in the assessment and rehabilitation of this population due to the effects of musical stimuli on arousal, attention, and emotion, irrespective of verbal or motor deficits. However, an evidence base is lacking as to which procedures are most effective. To address this, a neurophysiological and behavioral study was undertaken comparing electroencephalogram (EEG), heart rate variability, respiration, and behavioral responses of 20 healthy subjects with 21 individuals in vegetative or minimally conscious states (VS or MCS). Subjects were presented with live preferred music and improvised music entrained to respiration (procedures typically used in music therapy), recordings of disliked music, white noise, and silence. ANOVA tests indicated a range of significant responses (p ≤ 0.05) across healthy subjects corresponding to arousal and attention in response to preferred music, including concurrent increases in respiration rate with globally enhanced EEG power spectra responses (p = 0.05–0.0001) across frequency bandwidths. Whilst physiological responses were heterogeneous across patient cohorts, significant post hoc EEG amplitude increases for stimuli associated with preferred music were found for frontal midline theta in six VS and four MCS subjects, and frontal alpha in three VS and four MCS subjects (p = 0.05–0.0001). Furthermore, behavioral data showed a significantly increased blink rate for preferred music (p = 0.029) within the VS cohort. Two VS cases are presented with concurrent changes (p ≤ 0.05) across measures indicative of discriminatory responses to both music therapy procedures. A third MCS case study is presented highlighting how more sensitive selective attention may distinguish MCS from VS. The findings suggest that further investigation is warranted to explore the use of music therapy for prognostic indicators, and its potential to support neuroplasticity in rehabilitation programs.
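
    The frontal midline theta and frontal alpha effects reported above rest on standard band-power estimation. The Python sketch below is a generic illustration of that step, not the study's analysis pipeline; the sampling rate, band edges, and synthetic signal are assumptions. It estimates power in the theta and alpha bands of an EEG channel with Welch's method.

    import numpy as np
    from scipy.signal import welch

    fs = 256                                   # assumed sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    # Synthetic EEG: 6 Hz (theta) and 10 Hz (alpha) components plus noise.
    eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 10 * t) \
          + 0.3 * np.random.randn(t.size)

    def band_power(x, fs, lo, hi):
        """Integrate the Welch power spectral density over [lo, hi] Hz."""
        f, pxx = welch(x, fs=fs, nperseg=fs * 2)
        mask = (f >= lo) & (f <= hi)
        return np.trapz(pxx[mask], f[mask])

    theta = band_power(eeg, fs, 4, 7)          # frontal midline theta band
    alpha = band_power(eeg, fs, 8, 12)         # alpha band
    print(f"theta power: {theta:.3f}, alpha power: {alpha:.3f}")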

    Brain-Computer Interfacing for Wheelchair Control by Detecting Voluntary Eye Blinks

    The human brain is considered one of the most powerful quantum computers, and combining the human brain with technology can even outperform artificial intelligence. Using a Brain-Computer Interface (BCI) system, brain signals can be analyzed and programmed for specific tasks. This research work employs BCI technology for a medical application that gives paralyzed individuals the capability to interact with their surroundings solely using voluntary eye blinks. It makes the existing technology more practical by introducing a modular design with three physically separated components: a headset, a computer, and a wheelchair. Because the signal-to-noise ratio (SNR) of existing systems is too low to separate eye blink artifacts from the regular EEG signal, a ThinkGear module is used, which acquires the raw EEG signal through a single dry electrode. This chip offers advanced filtering technology with high noise immunity, along with an embedded Bluetooth module through which the acquired signal is transferred wirelessly to a computer. A MATLAB program captures voluntary eye blink artifacts from the brain waves and commands the movement of a miniature wheelchair via Bluetooth. To distinguish voluntary eye blinks from involuntary ones, blink strength thresholds are determined. A Graphical User Interface (GUI) designed in MATLAB displays the EEG waves in real time and lets the user control the movements of the wheelchair, which is specially designed to take commands from the GUI. The findings from the testing phase demonstrate the advantages of a modular design and the efficacy of using eye blink artifacts as the control element for brain-controlled wheelchairs. The work presented here gives a basic understanding of the functionality of a BCI system and provides eye blink-controlled navigation of a wheelchair for patients suffering from severe paralysis.
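
    The abstract does not give the detection algorithm, so the Python sketch below is only a plausible reconstruction of the general idea: blink artifacts appear as large deflections in frontal EEG, and only peaks above a "voluntary" strength threshold are turned into wheelchair commands. The thresholds and the synthetic signal are assumptions, and ThinkGear acquisition is not shown.

    import numpy as np
    from scipy.signal import find_peaks

    fs = 512                                    # ThinkGear raw EEG rate (Hz)
    sig = 10 * np.random.randn(fs * 5)          # synthetic background EEG
    sig[1 * fs] += 300                          # weak, involuntary-like blink
    sig[3 * fs] += 900                          # strong, voluntary-like blink

    VOLUNTARY_THRESHOLD = 600                   # assumed blink-strength cutoff

    # Any large deflection is a blink; only strong ones count as commands.
    peaks, props = find_peaks(sig, height=200, distance=int(0.3 * fs))
    for idx, height in zip(peaks, props["peak_heights"]):
        kind = "voluntary" if height > VOLUNTARY_THRESHOLD else "involuntary"
        print(f"blink at t={idx / fs:.2f}s, strength={height:.0f} -> {kind}")
        if kind == "voluntary":
            pass  # e.g., send a movement command to the wheelchair over Bluetooth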

    Prediction of Digital Eye Strain Due to Online Learning Based on the Number of Blinks

    Eye strain is a major concern, especially during continuous and prolonged online learning. Left unaddressed, it can lead to Computer Vision Syndrome, also known as Digital Eye Strain (DES), whose symptoms include headaches, blurred vision, dry eyes, and even neck and shoulder pain. The condition can be observed either directly, through excessive eye blinking, or indirectly, through the electrical activity of eye movements recorded by electrooculography (EOG). The blink signal extracted from the EOG, as a representation of eye strain, is the focus of this study. Data were acquired with an EOG sensor while participants carried out online learning activities. Four observation modes were recorded in succession: the eye viewing without blinking, blinking intentionally, closed, and viewing naturally. Observation intervals of 10 s, 20 s, and 30 s were used, with each interval repeated three times per mode. The acquired signal is processed by the proposed method and the result is labeled as the blinking signal. The number of blinks, CNT_PEAK, is determined by training on this signal while tuning its threshold and width. Depending on whether the blink count falls below or above 17, the system predicts the eye status in one of two categories: normal eye or eye strain (fatigue).
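
    The blink-count step maps naturally onto standard peak detection. The Python sketch below is an illustrative reconstruction, not the paper's code: the sampling rate, height (threshold), and width values are placeholders, only the cutoff of 17 blinks comes from the abstract, and the direction of the mapping (counts above 17 indicating strain) is an assumption the abstract leaves implicit.

    import numpy as np
    from scipy.signal import find_peaks

    fs = 100                                    # assumed EOG sampling rate (Hz)
    t = np.arange(0, 30, 1 / fs)                # one 30 s observation interval
    # Synthetic blink signal: 20 blink-like bumps on a noisy baseline.
    blink = sum(np.exp(-((t - c) ** 2) / 0.01) for c in np.linspace(1, 29, 20))
    sig = blink + 0.05 * np.random.randn(t.size)

    # CNT_PEAK: count blinks with a tuned threshold (peak height) and width,
    # mirroring the tuning described in the abstract; values are placeholders.
    peaks, _ = find_peaks(sig, height=0.5, width=3)
    cnt_peak = len(peaks)

    # Assumed mapping: blink counts above 17 indicate strain.
    status = "normal" if cnt_peak <= 17 else "eye strain / fatigue"
    print(f"CNT_PEAK={cnt_peak} -> {status}")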