    Toward a semi-self-paced EEG brain computer interface: decoding initiation state from non-initiation state in dedicated time slots.

    Brain-computer interfaces (BCIs) offer a broad class of neurologically impaired individuals an alternative means to interact with the environment. Many BCIs are "synchronous" systems, in which the system sets the timing of the interaction and tries to infer what control command the subject is issuing at each prompt. In contrast, in "asynchronous" BCIs the subjects pace the interaction and the system must determine when the subject's control command occurs. In this paper we propose a new BCI design that draws upon the strengths of both approaches: the subjects are externally paced, and the BCI determines when control commands are issued by decoding the subject's intention to initiate control in dedicated time slots. A single task with randomly interleaved trials was designed to test whether it could be used as a stimulus for inducing initiation and non-initiation states when the sensory and motor requirements of the two trial types are very nearly identical. We then studied the essential problem of discriminating between the initiation and non-initiation states, testing the ability of EEG spectral power to distinguish them. Among the four standard EEG frequency bands, beta-band power recorded over parietal-occipital cortices provided the best performance, achieving an average accuracy of 86% for the correct classification of initiation and non-initiation states. Moreover, delta-band power recorded over parietal and motor areas also performed well and could serve as an alternative feature for discriminating these two mental states. The results demonstrate the viability of the proposed design using conventional EEG features. Our proposal offers the potential to mitigate the signal detection challenges of fully asynchronous BCIs while providing greater flexibility to the subject than traditional synchronous BCIs.
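
    As a rough sketch of the kind of band-power decision described above: a periodogram band-power feature plus a per-subject threshold. The function names, the simple FFT periodogram, and the threshold rule are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def band_power(x, fs, band):
    """Mean periodogram power of x in [band[0], band[1]] Hz."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def classify_initiation(epoch, fs, threshold, band=(13.0, 30.0)):
    """Label an epoch as an initiation slot when its beta-band power
    exceeds a (hypothetical, per-subject) calibrated threshold."""
    return band_power(epoch, fs, band) > threshold
```

    In practice the threshold would be calibrated on labeled initiation/non-initiation epochs from each subject; a trained classifier over several band-power features would replace the single threshold.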

    Sensory System for Implementing a Human—Computer Interface Based on Electrooculography

    This paper describes a sensory system for implementing a human-computer interface based on electrooculography. An acquisition system captures electrooculograms and transmits them via the ZigBee protocol. The data acquired are analysed in real time on a microcontroller-based platform running the Linux operating system. The continuous wavelet transform and a neural network are used to process and analyse the signals, yielding highly reliable results in real time. To enhance system usability, the graphical interface is projected onto special eyewear, which is also used to position the signal-capturing electrodes.
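
    The wavelet stage of such a pipeline can be approximated with a plain NumPy continuous wavelet transform. The Ricker (Mexican-hat) wavelet, the scale grid, and the convolution-based evaluation are assumptions for illustration; the abstract does not specify the wavelet family or the network that consumes the features.

```python
import numpy as np

def ricker(n, a):
    """Discrete Ricker (Mexican-hat) wavelet: n samples, width parameter a."""
    t = np.arange(n) - (n - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1.0 - (t / a) ** 2) * np.exp(-(t ** 2) / (2.0 * a ** 2))

def cwt(signal, widths):
    """Scalogram: one row of wavelet responses per width, a feature map
    that a classifier (e.g. a small neural network) could consume."""
    out = np.empty((len(widths), len(signal)))
    for i, w in enumerate(widths):
        n = min(10 * int(w), len(signal))
        out[i] = np.convolve(signal, ricker(n, w), mode="same")
    return out
```

    The Ricker wavelet responds strongly near step-like edges, which is why a CWT is a natural front end for saccade detection in EOG.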

    A Comparison of a Brain-Computer Interface and an Eye Tracker: Is There a More Appropriate Technology for Controlling a Virtual Keyboard in an ALS Patient?

    The ability of people affected by amyotrophic lateral sclerosis (ALS), muscular dystrophy or spinal cord injuries to physically interact with the environment is usually reduced. In some cases, these patients suffer from locked-in syndrome (LIS), defined by the patient’s inability to make any movement but blinks and eye movements. The communication systems available to people in LIS are very limited, those based on eye tracking and brain-computer interfaces (BCI) being the most useful for these patients. A comparative study of both technologies in an ALS patient is carried out: an eye tracker and a visual P300-based BCI. The purpose of the study presented in this paper is to show that the choice of technology may depend on the user's preference. The evaluation of performance, workload and other subjective measures allows us to determine the usability of the systems. The results obtained suggest that, even if for this patient the BCI technology is more appropriate, the technology should always be tested and adapted for each user. Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech.

    Eye movements may cause motor contagion effects

    When a person executes a movement, the movement is more errorful while observing another person’s actions that are incongruent rather than congruent with the executed action. This effect is known as “motor contagion”. Accounts of this effect are often grounded in simulation mechanisms: increased movement error emerges because the motor codes associated with observed actions compete with the motor codes of the goal action. It is also possible, however, that the increased movement error is linked to eye movements executed simultaneously with the hand movement, because the oculomotor and manual-motor systems are highly interconnected. In the present study, participants performed a motor contagion task in which they executed horizontal arm movements while observing a model making either vertical (incongruent) or horizontal (congruent) movements under three conditions: no instruction, maintain central fixation, or track the model’s hand with the eyes. A significant motor contagion-like effect was found only in the ‘track’ condition. Thus, ‘motor contagion’ in the present task may be an artifact of simultaneously executed incongruent eye movements. These data are discussed in the context of simulation and associative-learning theories, and raise eye movements as a critical methodological consideration for future work on motor contagion.

    A Method of EOG Signal Processing to Detect the Direction of Eye Movements

    In this paper, a signal processing algorithm to detect eye movements is developed. The algorithm works with two kinds of inputs: the derivative and the amplitude level of the electrooculographic signal. The derivative is used to detect signal edges, and the amplitude level is used to filter noise. Depending on the movement direction, different kinds of events are generated. Events are associated with a movement and its route. A hit rate of 94% is reached. This algorithm has been used to implement an application that allows computer control using ocular movements. Junta de Andalucía p08-TIC-363.
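
    A minimal sketch of the derivative-plus-amplitude idea, assuming a single EOG channel with rightward deflections positive; the threshold values, event naming, and polarity convention are illustrative, not the paper's:

```python
import numpy as np

def detect_eye_events(eog, slope_thr, level_thr):
    """Detect saccade events in one EOG channel.

    An event starts where the sample-to-sample derivative exceeds
    slope_thr; the total amplitude change over the edge must also
    exceed level_thr, which acts as the noise gate described above.
    Returns a list of (start_index, direction) tuples."""
    d = np.diff(eog)
    events = []
    i = 0
    while i < len(d):
        if abs(d[i]) > slope_thr:
            # Walk to the end of the steep edge.
            j = i
            while j < len(d) and abs(d[j]) > slope_thr:
                j += 1
            delta = eog[j] - eog[i]
            if abs(delta) > level_thr:  # amplitude gate filters noise
                events.append((i, "right" if delta > 0 else "left"))
            i = j
        else:
            i += 1
    return events
```

    The same routine applied to the vertical channel (with "up"/"down" labels) would complete a basic 2-D gaze-event detector.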

    Bio-signals application in solution of human- machine interface

    The article deals with the field of human-machine interfaces. It focuses on the characteristics of bio-signals and their evaluation. Based on the results of the analysis of cerebral and cephalic bio-signals, it is possible to evaluate the immediate physical and mental condition of the operator of the control process. Timely analysis of the bio-signals makes it possible to avoid the negative consequences of incorrect decisions resulting from attention fatigue.

    Envelope filter sequence to delete blinks and overshoots

    Background: Eye movements have been used in control interfaces and as indicators of somnolence, workload and concentration. Different techniques can be used to detect them; we focus on the electrooculogram (EOG), in which two kinds of interference occur: blinks and overshoots. While both draw bell-shaped waveforms, blinks are caused by the eyelid, whereas overshoots occur due to target localization error and sit on the saccade. Both need to be extracted from the EOG to increase processing effectiveness. Methods: This paper describes offline and online processing implementations based on the lower envelope for removing bell-shaped noise; they are compared with a 300-ms median filter. The techniques were analyzed using two kinds of EOG data: signals produced by our own model, and real signals. Using a model signal allowed us to compare filtered outputs with ideal data, so that it was possible to quantify processing precision in removing noise caused by blinks, overshoots, and general interference. We analyzed the ability to delete blinks and overshoots, and waveform preservation. Results: Our technique had a high capacity for reducing interference amplitudes (>97%), even exceeding median filter (MF) results. However, the MF obtained better waveform preservation, with a smaller dependence on fixation width. Conclusions: The proposed technique is better at deleting blinks and overshoots than the MF in both model and real EOG signals.
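
    The lower-envelope idea can be sketched as a grey-scale morphological opening: a running minimum (which erases upward bells narrower than the window) followed by a running maximum (which restores the edges of wider saccade plateaus). The window width and this min/max formulation are assumptions for illustration; the authors' exact envelope filter sequence may differ.

```python
import numpy as np

def lower_envelope_filter(x, width):
    """Remove upward bell-shaped artifacts (blinks, overshoots) narrower
    than `width` samples while preserving wider saccade plateaus.

    Running minimum (erosion) deletes narrow positive bumps; running
    maximum (dilation) then restores the width of surviving plateaus."""
    n = len(x)
    half = width // 2
    eroded = np.array([x[max(0, i - half):i + half + 1].min() for i in range(n)])
    opened = np.array([eroded[max(0, i - half):i + half + 1].max() for i in range(n)])
    return opened
```

    A median filter of comparable width also removes narrow bells but rounds plateau edges, which matches the waveform-preservation trade-off reported in the Results.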

    Human Computer Interactions for Amyotrophic Lateral Sclerosis Patients


    EOG-Based Eye Movement Classification and Application on HCI Baseball Game

    © 2013 IEEE. Electrooculography (EOG) is considered the most stable physiological signal for developing human-computer interfaces (HCI) that detect eye-movement variations. EOG signal classification has gained traction in recent years as a way to overcome physical inconvenience in paralyzed patients. In this paper, a robust classification technique for eight directional eye movements is investigated by introducing a buffer concept along with the variation of the slope to avoid misclassification in EOG signals. Blink detection becomes complicated when the magnitude of the signals is considered; hence, a correction technique is introduced to avoid misclassification of oblique eye movements. A case study applies these correction techniques to an HCI baseball game to learn eye movements.
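
    The eight-direction scheme can be illustrated with a minimal two-channel direction mapper. The channel polarities (rightward/upward deflections positive), the dead-band threshold, and the direction names are assumptions for illustration; the paper's buffer and oblique-correction steps are not reproduced here.

```python
def classify_direction(dh, dv, thr):
    """Map horizontal (dh) and vertical (dv) EOG deflections to one of
    eight directions; deflections inside the +/- thr dead band count as
    no movement. Polarity is an assumed electrode convention."""
    h = 0 if abs(dh) < thr else (1 if dh > 0 else -1)
    v = 0 if abs(dv) < thr else (1 if dv > 0 else -1)
    names = {
        (0, 0): "none",
        (1, 0): "right", (-1, 0): "left",
        (0, 1): "up", (0, -1): "down",
        (1, 1): "up-right", (-1, 1): "up-left",
        (1, -1): "down-right", (-1, -1): "down-left",
    }
    return names[(h, v)]
```

    Oblique moves are where sign-based schemes misclassify most easily: a diagonal saccade whose weaker channel falls just inside the dead band collapses onto a cardinal direction, which is the kind of error the paper's correction technique targets.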