
    Controlling a Mouse Pointer with a Single-Channel EEG Sensor

    (1) Goals: The purpose of this study was to analyze the feasibility of using the information obtained from a one-channel electro-encephalography (EEG) signal to control a mouse pointer. We used a low-cost headset, with one dry sensor placed at the FP1 position, to steer a mouse pointer and make selections through a combination of the user’s attention level and the detection of voluntary blinks. There are two types of cursor movements: spinning and linear displacement. A sequence of blinks allows for switching between these movement types, while the attention level modulates the cursor’s speed. The influence of the attention level on performance was studied. Additionally, Fitts’ model and the evolution of the emotional states of participants, among other trajectory indicators, were analyzed. (2) Methods: Twenty participants distributed into two groups (Attention and No-Attention) performed three runs, on different days, in which 40 targets had to be reached and selected. Target positions and distances from the cursor’s initial position were chosen, providing eight different indices of difficulty (IDs). A self-assessment manikin (SAM) test and a final survey provided information about the system’s usability and the emotions of participants during the experiment. (3) Results: The performance was similar to some brain–computer interface (BCI) solutions found in the literature, with an averaged information transfer rate (ITR) of 7 bits/min. Concerning the cursor navigation, some trajectory indicators showed our proposed approach to be as good as common pointing devices, such as joysticks, trackballs, and so on. Only one of the 20 participants reported difficulty in managing the cursor and, according to the tests, most of them assessed the experience positively. Movement times and hit rates were significantly better for participants belonging to the attention group. (4) Conclusions: The proposed approach is a feasible low-cost solution for managing a mouse pointer.
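    The abstract summarizes performance with Fitts' indices of difficulty and an information transfer rate but does not reproduce the formulas. The snippet below is a minimal sketch (not the authors' code) of the Shannon formulation of Fitts' index of difficulty and the Wolpaw ITR; all numeric values in the demo are illustrative placeholders, not the study's data.

        import math

        def fitts_id(distance, width):
            """Shannon formulation of Fitts' index of difficulty, in bits."""
            return math.log2(distance / width + 1)

        def wolpaw_itr(n_choices, accuracy, selections_per_min):
            """Wolpaw information transfer rate in bits/min (accuracy in (0, 1])."""
            bits = math.log2(n_choices)
            if accuracy < 1:
                bits += (accuracy * math.log2(accuracy)
                         + (1 - accuracy) * math.log2((1 - accuracy) / (n_choices - 1)))
            return bits * selections_per_min

        # Illustrative distance/width pairs (pixels) -> indices of difficulty
        for d, w in [(200, 40), (400, 40), (200, 20), (400, 20)]:
            print(f"ID(D={d}, W={w}) = {fitts_id(d, w):.2f} bits")

        # Hypothetical numbers only: 4 possible commands, 90% accuracy, 6 selections/min
        print(f"ITR = {wolpaw_itr(4, 0.90, 6):.1f} bits/min")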

    Aerospace medicine and biology: A continuing bibliography with indexes (supplement 359)

    This bibliography lists 164 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System during Jan. 1992. Subject coverage includes: aerospace medicine and physiology, life support systems and man/system technology, protective clothing, exobiology and extraterrestrial life, planetary biology, and flight crew behavior and performance.

    Gaze-tracking-based interface for robotic chair guidance

    This research focuses on solutions to enhance the quality of life of wheelchair users, specifically by applying a gaze-tracking-based interface to the guidance of a robotized wheelchair. The interface was applied in two different approaches to the wheelchair control system. The first was an assisted control in which the user was continuously involved in controlling the movement of the wheelchair in the environment and the inclination of the different parts of the seat through gaze and eye blinks obtained with the interface. The second approach took the first steps towards applying the device to an autonomous wheelchair control in which the wheelchair moves autonomously, avoiding collisions, towards the position defined by the user. To this end, the basis for obtaining the gaze position relative to the wheelchair and for object detection was developed in this project, so that the optimal route along which the wheelchair should move can be calculated in the future. The integration of a robotic arm into the wheelchair to manipulate objects was also considered: this work identifies, among the detected objects, the object of interest indicated by the user's gaze, so that in the future the robotic arm could select and pick up the object the user wants to manipulate. In addition to the two approaches, an attempt was made to estimate the user's gaze without the software interface; in this case, gaze is obtained from pupil detection libraries, a calibration procedure, and a mathematical model that relates pupil positions to gaze. The results of these implementations are analysed in this work, including some limitations encountered, and future improvements are proposed with the aim of increasing the independence of wheelchair users.
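    The last approach mentioned above estimates gaze from pupil detection plus a calibration and a mathematical model relating pupil position to gaze. A common way to build such a model is a low-order polynomial regression fitted on calibration points; the sketch below illustrates that idea under the assumption of a quadratic mapping and made-up calibration data (it is not the project's actual model).

        import numpy as np

        def design_matrix(px, py):
            """Quadratic polynomial features of the pupil centre (x, y)."""
            return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

        def fit_calibration(pupil_xy, gaze_xy):
            """Least-squares fit of one coefficient vector per gaze axis."""
            A = design_matrix(pupil_xy[:, 0], pupil_xy[:, 1])
            coeffs, *_ = np.linalg.lstsq(A, gaze_xy, rcond=None)
            return coeffs                      # shape (6, 2)

        def pupil_to_gaze(pupil_xy, coeffs):
            A = design_matrix(pupil_xy[:, 0], pupil_xy[:, 1])
            return A @ coeffs

        # Hypothetical 9-point calibration: screen targets (px) and simulated pupil centres
        screen = np.array([[x, y] for y in (100, 540, 980) for x in (160, 960, 1760)],
                          dtype=float)
        rng = np.random.default_rng(0)
        pupil = screen * 0.03 + rng.normal(0, 0.2, screen.shape) + 300   # placeholder data

        coeffs = fit_calibration(pupil, screen)
        print(pupil_to_gaze(pupil[:3], coeffs))   # estimated gaze for first 3 calibration points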

    Development of EOG Based Human Machine Interface Control System for Motorized Wheelchair

    Rehabilitation devices are increasingly being used to improve the quality of life of differently abled people. Human Machine Interfaces (HMIs) have been studied extensively to control electromechanical rehabilitation aids using biosignals such as EEG, EOG, and EMG. Among these biosignals, EOG signals have been studied in depth due to the occurrence of a definite signal pattern. Persons suffering from extremely limited peripheral mobility, such as paraplegia or quadriplegia, usually retain the ability to coordinate eye movements. The current project focuses on the development of a prototype motorized wheelchair controlled by EOG signals. EOG signals were used to generate control signals for the movement of the wheelchair. As part of this work, an EOG signal acquisition system was developed. The acquired EOG signal was then processed to generate various control signals depending on the amplitude and duration of signal components. These control signals were then used to control the movements of the prototype motorized wheelchair model.
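    The processing step described above derives control signals from the amplitude and duration of EOG components. A minimal sketch of that idea, assuming a single horizontal EOG channel and placeholder thresholds (not the authors' implementation), is shown below.

        import numpy as np

        FS = 250                 # sampling rate in Hz (assumed)
        AMP_THRESH = 150e-6      # deflection amplitude threshold in volts (placeholder)
        MIN_DURATION = 0.15      # minimum deflection duration in seconds (placeholder)

        def classify_eog(window):
            """Map one horizontal-EOG window to 'LEFT', 'RIGHT' or 'NEUTRAL'."""
            window = window - np.median(window)          # crude baseline removal
            above = np.abs(window) > AMP_THRESH
            if above.sum() < MIN_DURATION * FS:          # too brief: treat as noise
                return "NEUTRAL"
            # The sign of the dominant deflection decides the direction command.
            return "RIGHT" if window[above].mean() > 0 else "LEFT"

        # Hypothetical demo: a 1 s window containing a 0.3 s positive deflection
        t = np.arange(0, 1, 1 / FS)
        demo = np.where((t > 0.4) & (t < 0.7), 300e-6, 0.0) \
               + np.random.normal(0, 5e-6, t.size)
        print(classify_eog(demo))      # expected: RIGHT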

    Multimodality with Eye tracking and Haptics: A New Horizon for Serious Games?

    The goal of this review is to illustrate the emerging use of multimodal virtual reality that can benefit learning-based games. The review begins with an introduction to multimodal virtual reality in serious games and a brief discussion of why cognitive processes involved in learning and training are enhanced in immersive virtual environments. We first outline studies that have used eye tracking and haptic feedback independently in serious games, and then review some innovative applications that have combined eye tracking and haptic devices in order to provide applicable multimodal frameworks for learning-based games. Finally, some general conclusions are identified and clarified in order to advance current understanding of multimodal serious game production as well as to explore possible areas for new applications.

    The influence of mental representations on eye movement patterns under uncertainty

    This thesis investigated eye movements (i.e. number of fixations, fixation duration) during learning in uncertain situations, i.e. when interacting with a technical system such as a ticket machine without being aware of how it functions. It was predicted that eye movements allow insights into the process of developing a mental representation under uncertainty. In order to induce uncertainty, a visual spatial search task with likely and unlikely target locations was developed. Participants were asked to predict the appearance of stimuli at target locations as accurately as possible by learning the underlying probability concept. In quick succession, they were asked to react as quickly as possible to changes of the stimuli. In total, five eye tracking experiments were gradually developed and conducted. In a first experiment, participants performed the newly developed visual spatial search task and learned the underlying probability concept of likely and unlikely target locations accurately. Eye movements became more focused, i.e. the number of fixations as well as fixation duration decreased significantly over the time course of the task with increasing learning and reduced uncertainty. The aim of the second experiment was to assess to what extent search difficulty affects the development of the mental representation. Therefore, target objects were presented on an unstructured white-gray patterned background. Results showed an overall higher number of fixations than in the first experiment; however, participants still developed an accurate mental representation of the probability concept. A third experiment was designed as a relearning experiment to study the effect of initial knowledge on the development of mental representations and thus on eye movements. Participants initially learned a probability concept and, in a second phase, learned a different concept of target-location associations. Thereby, eye movements indicated different phases of relearning. In a fourth experiment, the prediction and the reaction task were assessed separately to elucidate which of the two dominated the development of the mental representation. Results indicated that the developed mental representation of the visual spatial search task was mainly based on the prediction of the target stimuli and not on the reaction to changes of the target stimuli. In a last experiment, manipulating the degree of objective uncertainty by varying the probabilities of the probability concept did not lead to different eye movements; it seemed that the degree of subjective uncertainty was not affected by varying the probabilities. In conclusion, the results of the thesis demonstrate that eye movements indeed give insights into the development of mental representations under uncertainty. Eye movements informed about the learning stage, viz. the accumulation of information, independent of the content, as well as about the subjective uncertainty of the participants, viz. the usage of decision strategies and strategies to cope with uncertainty.
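    The dependent measures throughout the thesis are the number of fixations and fixation durations. These are typically extracted from raw gaze samples with a fixation detection algorithm; the sketch below uses a standard dispersion-threshold (I-DT) detector with assumed sampling rate and thresholds, purely to illustrate how such measures can be obtained (it is not the thesis' pipeline).

        import numpy as np

        FS = 500                  # gaze samples per second (assumed)
        MAX_DISPERSION = 1.0      # degrees of visual angle (placeholder)
        MIN_DURATION = 0.1        # minimum fixation duration in seconds (placeholder)

        def idt_fixations(x, y):
            """Return (start, end) sample indices of fixations found by I-DT."""
            def dispersion(a, b):
                return (x[a:b].max() - x[a:b].min()) + (y[a:b].max() - y[a:b].min())
            fixations, i, min_len = [], 0, int(MIN_DURATION * FS)
            while i + min_len <= len(x):
                j = i + min_len
                if dispersion(i, j) <= MAX_DISPERSION:
                    # Grow the window while the gaze stays within the dispersion limit.
                    while j < len(x) and dispersion(i, j + 1) <= MAX_DISPERSION:
                        j += 1
                    fixations.append((i, j))
                    i = j
                else:
                    i += 1            # slide past samples belonging to a saccade
            return fixations

        # Hypothetical gaze trace: two fixations separated by a saccade
        x = np.concatenate([np.full(200, 2.0), np.linspace(2, 10, 20), np.full(300, 10.0)])
        y = np.zeros_like(x) + np.random.normal(0, 0.05, x.size)
        fix = idt_fixations(x, y)
        print(len(fix), "fixations, mean duration",
              np.mean([(b - a) / FS for a, b in fix]), "s")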