20,645 research outputs found
Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions
In today's technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users' cognitive, emotional, and behavioral responses. An experiment was conducted in which participants performed a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users' perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users' self-reported levels of suspicion and trust, and in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.
BRAIN COMPUTER INTERFACE - Application of an Adaptive Bi-stage Classifier based on RBF-HMM
Brain Computer Interface (BCI) is an emerging technology that provides new output paths to communicate the user's intentions without using the normal output paths, such as muscles or nerves. To achieve this, BCI devices make use of classifiers which translate inputs from the user's brain signals into commands for external devices. This paper describes an adaptive bi-stage classifier. The first stage is based on Radial Basis Function neural networks, which provide sequences of pre-assignations to the second stage, which is based on three different Hidden Markov Models, each trained with pre-assignation sequences from one of the cognitive activities to be classified. Each segment of EEG signal is assigned to the HMM with the highest probability of generating its pre-assignation sequence. The algorithm is tested on real electroencephalographic signals from five healthy volunteers using cross-validation. The results allow us to conclude that it is possible to implement this algorithm in an on-line BCI device. The results also show that the percentage of correct classifications depends heavily on the user and on the setup parameters of the classifier.
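The second-stage decision rule described in this abstract (assign the EEG segment to the HMM most likely to have generated its pre-assignation sequence) can be sketched with a scaled forward algorithm. The two toy 2-state models and the binary symbol alphabet below are illustrative assumptions, not the paper's trained models:

```python
import numpy as np

def sequence_log_likelihood(pi, A, B, obs):
    """Scaled forward algorithm: log P(obs | discrete HMM (pi, A, B))."""
    alpha = pi * B[:, obs[0]]
    c = alpha.sum()
    log_p = np.log(c)
    alpha = alpha / c
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate, then weight by emission
        c = alpha.sum()
        log_p += np.log(c)              # accumulate scaling factors
        alpha = alpha / c
    return log_p

# Two toy HMMs standing in for models trained on different cognitive activities.
pi = np.array([0.5, 0.5])
A  = np.array([[0.9, 0.1],
               [0.1, 0.9]])
B0 = np.array([[0.8, 0.2],              # model 0 mostly emits symbol 0
               [0.7, 0.3]])
B1 = np.array([[0.2, 0.8],              # model 1 mostly emits symbol 1
               [0.3, 0.7]])

obs = [0, 0, 1, 0, 0]                   # pre-assignation sequence from stage one
scores = [sequence_log_likelihood(pi, A, B, obs) for B in (B0, B1)]
best = int(np.argmax(scores))           # the segment is assigned to this HMM
```

With the mostly-zero observation sequence, the model biased toward symbol 0 attains the higher log-likelihood and wins the assignment.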
A hierarchical architecture for recognising intentionality in mental tasks on a brain-computer interface
A brain-computer interface (BCI) based on motor imagery EEG uses information extracted from the electroencephalography signals generated by a person who intends to perform an action. One of the most important issues in current research is how to detect automatically whether the user intends to send a message to a certain device. This study presents a proposal, based on a hierarchically structured system, for recognising intentional and non-intentional mental tasks on a BCI system by applying machine learning techniques to the EEG signals. First-level clustering is performed to distinguish between intentional control (IC) and non-intentional control (NC) state patterns. Then, the patterns recognised as IC are passed on to a second stage where supervised learning techniques are used to classify them. In BCI applications, it is critical to correctly classify NC states with a low false positive rate (FPR) to avoid undesirable effects. Following the literature, we selected a maximum FPR of 10%. Under these conditions, our proposal achieved an average test accuracy of 66.6%, with an 8.2% FPR, on the BCI competition IIIa dataset. The main contribution of this paper is the hierarchical approach, based on machine learning paradigms, which performs intentional and non-intentional discrimination and, depending on the case, classifies the intended command selected by the user. This work was partially supported by the ERDF/Spanish Ministry of Science, Innovation and Universities - National Research Agency/PhysComp project, TIN2017-85409-P, and by the Department of Education, Universities and Research of the Basque Government (ADIAN research group, grant IT980-16).
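The two-level pipeline this abstract describes (unsupervised IC/NC separation, then supervised command classification for IC patterns) can be sketched as follows. The 2-D feature vectors, the cluster-on-energy heuristic for the first stage, and LDA for the second stage are all illustrative assumptions, not the paper's actual design:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
# Toy 2-D "EEG feature" vectors: non-control (NC) patterns near the origin,
# two intentional-control (IC) command classes at distinct offsets.
nc  = rng.normal(0.0, 0.4, size=(100, 2))
ic1 = rng.normal(0.0, 0.4, size=(100, 2)) + np.array([3.0,  3.0])
ic2 = rng.normal(0.0, 0.4, size=(100, 2)) + np.array([3.0, -3.0])
X = np.vstack([nc, ic1, ic2])

# Stage 1: unsupervised clustering on signal energy separates NC from IC.
norms = np.linalg.norm(X, axis=1, keepdims=True)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(norms)
nc_cluster = int(np.argmin(km.cluster_centers_.ravel()))

# Stage 2: supervised classifier decides which command an IC pattern encodes.
ic_X = np.vstack([ic1, ic2])
ic_y = np.array([0] * 100 + [1] * 100)
lda = LinearDiscriminantAnalysis().fit(ic_X, ic_y)

def classify(x):
    """Route a feature vector through the hierarchy: NC gate, then command."""
    if km.predict([[np.linalg.norm(x)]])[0] == nc_cluster:
        return "NC"
    return int(lda.predict([x])[0])
```

The NC gate only forwards patterns it recognises as intentional, which is how the hierarchy keeps the false positive rate on NC states under control.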
Predicting mental imagery based BCI performance from personality, cognitive profile and neurophysiological patterns
Mental-Imagery based Brain-Computer Interfaces (MI-BCIs) allow their users to send commands to a computer using their brain activity alone (typically measured by ElectroEncephaloGraphy, EEG), which is processed while they perform specific mental tasks. While very promising, MI-BCIs remain barely used outside laboratories because of the difficulty users encounter in controlling them. Indeed, although some users obtain good control performances after training, a substantial proportion remains unable to reliably control an MI-BCI. This large variability in user performance led the community to look for predictors of MI-BCI control ability. However, these predictors were only explored for motor-imagery based BCIs, and mostly for a single training session per subject. In this study, 18 participants were instructed to learn to control an EEG-based MI-BCI by performing 3 MI tasks, 2 of which were non-motor tasks, across 6 training sessions on 6 different days. Relationships between the participants' BCI control performances and their personality, cognitive profile and neurophysiological markers were explored. While no relevant relationships with neurophysiological markers were found, strong correlations between MI-BCI performances and mental-rotation scores (reflecting spatial abilities) were revealed. Also, a predictive model of MI-BCI performance based on psychometric questionnaire scores was proposed. A leave-one-subject-out cross-validation process revealed the stability and reliability of this model: it predicted participants' performance with a mean error of less than 3 points. This study determined how users' profiles impact their MI-BCI control ability and thus clears the way for designing novel MI-BCI training protocols adapted to the profile of each user.
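The leave-one-subject-out evaluation of a score-based performance predictor can be sketched as below. The two questionnaire features and the linear model are hypothetical stand-ins for the psychometric model proposed in the abstract, and the data are synthetic:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(42)
# Hypothetical questionnaire scores (e.g. mental rotation, anxiety), 18 subjects.
scores = rng.uniform(0, 100, size=(18, 2))
# Toy BCI performance with an assumed exact linear dependence on the scores.
perf = 2.0 * scores[:, 0] - 0.5 * scores[:, 1] + 50.0

# Leave-one-subject-out: fit on 17 subjects, predict the held-out one.
errors = []
for train_idx, test_idx in LeaveOneOut().split(scores):
    model = LinearRegression().fit(scores[train_idx], perf[train_idx])
    pred = model.predict(scores[test_idx])[0]
    errors.append(abs(pred - perf[test_idx][0]))
mae = float(np.mean(errors))   # mean absolute prediction error across subjects
```

Because every subject is held out exactly once, the mean error estimates how the predictor would generalise to a new user, which is the quantity the abstract reports.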
Brainwave Classification for EEG-based Neurofeedback
The aim of this project was to find a way to differentiate active and rested brain signals in a patient using tasks without bodily movement, to provide patients with severe motor disabilities a method of controlling robotic devices that enable them to move independently of a caretaker. Although many control methods exist for less severely motor-impaired patients, this method would improve quality of life for all patients by allowing movements to be controlled exclusively using the brain. The three steps of our project were to define the tasks and collect data, process the signals, and run the processed signals through a machine learning algorithm. In addition to the tasks not involving movement, the subject's eyes had to remain open, as closing one's eyes would not be practical as a control method. Different processing techniques were used to prepare the data and extract features for the training of the machine learning model for the classification task. Due to COVID-19, a limited amount of data was collected, resulting in inaccurate classification results. The "imagining-to-move" and "at rest" tasks that we designed for data collection appear to be most effective when focusing on the mu rhythms at 7 to 12 Hz from the central cortex, but much more data is needed to prove this point. These tasks, brain area, and frequency range would be ideal for future control-method research projects.
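The mu-rhythm focus mentioned in this abstract (7 to 12 Hz over the central cortex) suggests band-power features for the "imagining-to-move" versus "at rest" classification. A minimal sketch with Welch's method follows; the sampling rate and the synthetic single-channel signal are assumptions standing in for real recordings:

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Mean Welch PSD of x within the [lo, hi] Hz band."""
    f, pxx = welch(x, fs=fs, nperseg=fs)   # ~1 Hz frequency resolution
    mask = (f >= lo) & (f <= hi)
    return float(pxx[mask].mean())

fs = 250                                    # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
# Toy central-cortex channel: a 10 Hz mu oscillation plus broadband noise.
x = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

mu   = band_power(x, fs, 7, 12)             # mu band from the abstract
beta = band_power(x, fs, 20, 25)            # comparison band
```

A classifier would then be trained on such per-trial band powers; here the mu band clearly dominates the comparison band because the toy signal oscillates at 10 Hz.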
Data-driven multivariate and multiscale methods for brain computer interface
This thesis focuses on the development of data-driven multivariate and multiscale methods for brain computer interface (BCI) systems. The electroencephalogram (EEG), the most convenient means to measure neurophysiological activity due to its noninvasive nature, is mainly considered. The nonlinearity and nonstationarity inherent in EEG, and its multichannel recording nature, require a new set of data-driven multivariate techniques to estimate features more accurately for enhanced BCI operation. A further long-term goal is to enable an alternative EEG recording strategy that achieves long-term and portable monitoring.
Empirical mode decomposition (EMD) and local mean decomposition (LMD), fully data-driven adaptive tools, are considered to decompose the nonlinear and nonstationary EEG signal into a set of components which are highly localised in time and frequency. It is shown that the complex and multivariate extensions of EMD, which can exploit common oscillatory modes within multivariate (multichannel) data, can be used to accurately estimate and compare the amplitude and phase information among multiple sources, a key requirement for the feature extraction stage of a BCI system. A complex extension of local mean decomposition is also introduced and its operation is illustrated on two-channel neuronal spike streams. Common spatial patterns (CSP), a standard feature extraction technique for BCI applications, is also extended to the complex domain using augmented complex statistics. Depending on the circularity or noncircularity of a complex signal, the appropriate complex CSP algorithm can be chosen to produce the best classification performance between two different EEG classes.
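The standard (real-valued) CSP technique referenced above can be sketched as a generalised eigenvalue problem on the two class-average covariance matrices; the thesis's complex extension is not reproduced here, and the toy two-channel trials below are an illustrative assumption:

```python
import numpy as np
from scipy.linalg import eigh

def csp_filters(X1, X2):
    """Common spatial patterns for trials of shape (trials, channels, samples):
    solve C1 w = lambda (C1 + C2) w; filters sorted best-for-class-1 first."""
    C1 = np.mean([np.cov(trial) for trial in X1], axis=0)
    C2 = np.mean([np.cov(trial) for trial in X2], axis=0)
    vals, vecs = eigh(C1, C1 + C2)    # generalised eigenproblem, ascending
    return vecs[:, ::-1].T            # rows = spatial filters

rng = np.random.default_rng(0)
# Toy two-class, two-channel data: class 1 has high variance on channel 0,
# class 2 on channel 1 (a stand-in for two motor-imagery conditions).
X1 = rng.standard_normal((20, 2, 200)) * np.array([5.0, 1.0])[:, None]
X2 = rng.standard_normal((20, 2, 200)) * np.array([1.0, 5.0])[:, None]

W = csp_filters(X1, X2)
w0 = W[0]                             # filter maximising class-1 variance ratio
v1 = float(np.var(w0 @ X1[0]))        # variance of a filtered class-1 trial
v2 = float(np.var(w0 @ X2[0]))        # variance of a filtered class-2 trial
```

The log-variances of the filtered trials are the usual CSP features: the first filter yields high variance for one class and low variance for the other, making the classes linearly separable.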
Using these complex and multivariate algorithms, two cognitive brain studies are investigated towards a more natural and intuitive design of advanced BCI systems. Firstly, a Yarbus-style auditory selective attention experiment is introduced to measure the user's attention to one sound source among a mixture of sound stimuli, aimed at improving the usefulness of hearing instruments such as hearing aids. Secondly, emotion experiments elicited by taste and taste recall are examined to determine the pleasure and displeasure evoked by a food, for the implementation of affective computing. The separation between the two emotional responses is examined using real- and complex-valued common spatial pattern methods.
Finally, we introduce a novel approach to brain monitoring based on EEG recordings from within the ear canal, embedded in a custom-made hearing aid earplug. The new platform promises the possibility of both short- and long-term continuous use for standard brain monitoring and interfacing applications.