
    Emotional Brain-Computer Interfaces

    Research in brain-computer interfaces (BCIs) has increased significantly during the last few years. In addition to their initial role as assistive devices for the physically challenged, BCIs are now proposed for a wider range of applications. As in any HCI application, BCIs can benefit from adapting their operation to the emotional state of the user. BCIs have the advantage of having access to brain activity, which can provide significant insight into the user's emotional state. This information can be utilized in two manners. 1) Knowledge of the influence of the emotional state on brain activity patterns can allow the BCI to adapt its recognition algorithms, so that the intention of the user is still correctly interpreted in spite of signal deviations induced by the subject's emotional state. 2) The ability to recognize emotions can be used in BCIs to provide the user with more natural ways of controlling the BCI through affective modulation. Thus, controlling a BCI by recollecting a pleasant memory can be possible and can potentially lead to higher information transfer rates. These two approaches to emotion utilization in BCI are elaborated in detail in this paper in the framework of noninvasive EEG-based BCIs.
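    The first approach described above — keeping intent recognition robust under emotion-induced signal drift — can be pictured with a small, purely illustrative sketch (not from the paper). It keeps one decoder per detected emotional state, with a nearest-centroid rule standing in for a real EEG decoder; all class names, feature dimensions, and the additive "stress" shift are invented for the demo.

    ```python
    import numpy as np

    class StateAwareDecoder:
        """Hypothetical sketch: one intent decoder per emotional state."""

        def __init__(self):
            self.centroids = {}   # (emotion, intent) -> mean feature vector

        def fit(self, X, intents, emotions):
            for e in set(emotions):
                for i in set(intents):
                    rows = [x for x, it, em in zip(X, intents, emotions)
                            if it == i and em == e]
                    self.centroids[(e, i)] = np.mean(rows, axis=0)

        def predict(self, x, emotion):
            # Compare only against centroids learned under the same emotion,
            # so emotion-induced signal shifts do not corrupt intent decoding.
            keys = [k for k in self.centroids if k[0] == emotion]
            return min(keys, key=lambda k: np.linalg.norm(x - self.centroids[k]))[1]

    # Toy demo: "left"/"right" intent features shift additively under "stress".
    rng = np.random.default_rng(2)
    base = {"left": np.array([0.0, 0.0]), "right": np.array([3.0, 3.0])}
    shift = {"calm": 0.0, "stress": 5.0}
    X, intents, emotions = [], [], []
    for _ in range(200):
        i = rng.choice(["left", "right"])
        e = rng.choice(["calm", "stress"])
        X.append(base[i] + shift[e] + 0.3 * rng.standard_normal(2))
        intents.append(i)
        emotions.append(e)

    dec = StateAwareDecoder()
    dec.fit(X, intents, emotions)
    probe = base["left"] + shift["stress"]      # a "left" intent while stressed
    print(dec.predict(probe, "stress"))         # left
    ```

    A plain decoder trained only on calm data would place this stressed probe closer to the "right" centroid; conditioning on the recognized emotional state avoids that.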

    Affective Medicine: a review of Affective Computing efforts in Medical Informatics

    Background: Affective computing (AC) is concerned with emotional interactions performed with and through computers. It is defined as "computing that relates to, arises from, or deliberately influences emotions". AC enables investigation and understanding of the relation between human emotions and health, as well as the application of assistive and useful technologies in the medical domain. Objectives: 1) To review the general state of the art in AC and its applications in medicine, and 2) to establish synergies between the research communities of AC and medical informatics. Methods: Aspects related to the human affective state as a determinant of human health are discussed, coupled with an illustration of significant AC research and related literature output. Moreover, affective communication channels are described and their range of application fields is explored through illustrative examples. Results: The presented conferences, European research projects and research publications illustrate the recent increase of interest in the AC area by the medical community. Tele-home healthcare, AmI, ubiquitous monitoring, e-learning and virtual communities with emotionally expressive characters for elderly or impaired people are a few areas where the potential of AC has been realized and applications have emerged. Conclusions: A number of gaps can potentially be overcome through the synergy of AC and medical informatics. The application of AC technologies parallels the advancement of the existing state of the art and the introduction of new methods. The body of work and the projects reviewed in this paper attest to an ambitious and optimistic synergistic future for the affective-medicine field.

    A Robot is a Smart Tool: Investigating Younger Users' Preferences for the Multimodal Interaction of Domestic Service Robot

    The degree to which domestic service robots are accepted depends mainly on the user experience and on the delight the design brings to people. For robot design to follow the interaction trends of smart devices, researchers need insight into young people's acceptance of, and opinions on, emerging interaction styles. The core of this study is a user elicitation through which users' suggestions for commanding a robot in specific contexts are gathered. Accordingly, it sheds light on the features of user preferences for human-robot interaction. This study argues that younger users regard service robots merely as intelligent tools, and that this view is the direct cause of the observed interaction preferences. Keywords: Service robot, Interaction design, User preference

    Gesturing on the steering wheel, a comparison with speech and touch interaction modalities

    This paper compares an emergent interaction modality for the In-Vehicle Infotainment System (IVIS), i.e., gesturing on the steering wheel, with two more popular modalities in modern cars: touch and speech. We conducted a between-subjects experiment with 20 participants for each modality to assess the interaction performance with the IVIS and the impact on the driving performance. Moreover, we compared the three modalities in terms of usability, subjective workload and emotional response. The results showed no statistically significant differences between the three interaction modalities regarding the various indicators of driving task performance, while significant differences were found in measures of IVIS interaction performance: users performed fewer interactions to complete the secondary tasks with the speech modality, while, on average, a lower task completion time was registered with the touch modality. The three interfaces were comparable in all the subjective metrics.

    Using minimal number of electrodes for emotion detection using noisy EEG data

    Emotion is an important aspect of the interaction between humans. It is fundamental to human experience and rational decision-making. There is great interest in detecting emotions automatically. A number of techniques have been employed for this purpose using channels such as voice and facial expressions. However, these channels are not very accurate because they can be affected by users' intentions. Other techniques use physiological signals along with electroencephalography (EEG) for emotion detection. However, these approaches are not very practical for real-time applications because they ask the participants to reduce any motion and facial muscle movement, reject EEG data contaminated with artifacts, and rely on a large number of electrodes. In this thesis, we propose an approach that analyzes highly contaminated EEG data produced from a new emotion elicitation technique. We also use a feature selection mechanism to extract features that are relevant to the emotion detection task based on neuroscience findings. We reached an average accuracy of 51% for the joy emotion, 53% for anger, 58% for fear and 61% for sadness. We also applied our approach to smaller numbers of electrodes, ranging from 4 up to 25, and reached an average classification accuracy of 33% for the joy emotion, 38% for anger, 33% for fear and 37.5% for sadness using only 4 or 6 electrodes.
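    The reduced-electrode idea in this abstract — building a feature vector from only a handful of channels — can be sketched as follows. This is an illustrative outline, not the thesis's actual pipeline: the sampling rate, band edges, and electrode indices below are invented, and a plain FFT band-power estimate stands in for whatever features the thesis used.

    ```python
    import numpy as np

    def band_power(signal, fs, lo, hi):
        """Mean spectral power of `signal` in the [lo, hi) Hz band (via FFT)."""
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
        mask = (freqs >= lo) & (freqs < hi)
        return psd[mask].mean()

    def features_from_subset(eeg, fs, electrodes, bands):
        """Stack band powers from a chosen electrode subset into one vector.

        eeg: array of shape (n_electrodes, n_samples); `electrodes` are row indices.
        """
        return np.array([band_power(eeg[ch], fs, lo, hi)
                         for ch in electrodes for (lo, hi) in bands])

    # Example: one 2-second epoch at 128 Hz, 32 recorded channels, only 4 used.
    fs = 128
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((32, 2 * fs))   # placeholder for a real recording
    bands = [(4, 8), (8, 13), (13, 30)]       # theta, alpha, beta
    subset = [0, 7, 15, 23]                   # hypothetical 4-electrode pick
    x = features_from_subset(eeg, fs, subset, bands)
    print(x.shape)                            # 4 electrodes x 3 bands -> (12,)
    ```

    Any classifier can then be trained on such low-dimensional vectors; the point of the sketch is only that shrinking the electrode subset shrinks the feature space proportionally.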

    Classification of EEG signals of user states in gaming using machine learning

    In this research, brain activity of user states was analyzed using machine learning algorithms. When a user interacts with a computer-based system, including playing computer games like Tetris, he or she may experience user states such as boredom, flow, and anxiety. The purpose of this research is to apply machine learning models to Electroencephalogram (EEG) signals of three user states -- boredom, flow and anxiety -- to identify and classify the EEG correlates for these user states. We focus on three research questions: (i) How well do machine learning models like support vector machine, random forests, multinomial logistic regression, and k-nearest neighbor classify the three user states -- boredom, flow, and anxiety? (ii) Can we distinguish the flow state from other user states using machine learning models? (iii) What are the essential components of EEG signals for classifying the three user states? To extract the critical components of EEG signals, a feature selection method known as the minimum redundancy and maximum relevance (mRMR) method was implemented. An average accuracy of 85% is achieved for classifying the three user states by using the support vector machine classifier.
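    The selection step named in this abstract — minimum redundancy, maximum relevance — can be outlined with a simplified stand-in. The snippet below is not the thesis's implementation: it substitutes absolute Pearson correlation for mutual information (the usual mRMR criterion) and uses toy data with invented feature indices, keeping only the greedy relevance-minus-redundancy structure.

    ```python
    import numpy as np

    def greedy_mrmr(X, y, k):
        """Greedy minimum-redundancy / maximum-relevance feature selection.

        Uses |Pearson correlation| as a cheap stand-in for mutual information:
        relevance  = |corr(feature, label)|;
        redundancy = mean |corr| with features already chosen.
        Returns the column indices of the k selected features.
        """
        n_features = X.shape[1]
        relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                              for j in range(n_features)])
        selected = [int(np.argmax(relevance))]   # seed with most relevant feature
        while len(selected) < k:
            best, best_score = None, -np.inf
            for j in range(n_features):
                if j in selected:
                    continue
                redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                      for s in selected])
                score = relevance[j] - redundancy
                if score > best_score:
                    best, best_score = j, score
            selected.append(best)
        return selected

    # Toy data: 100 epochs, 20 features; column 3 tracks the 3-class label
    # (boredom/flow/anxiety as 0/1/2), column 10 is a noisy copy of column 3.
    rng = np.random.default_rng(1)
    y = rng.integers(0, 3, size=100)
    X = rng.standard_normal((100, 20))
    X[:, 3] = y + 0.1 * rng.standard_normal(100)
    X[:, 10] = X[:, 3] + 0.5 * rng.standard_normal(100)
    picked = greedy_mrmr(X, y, k=3)
    print(picked[0])   # 3: the most label-relevant column is chosen first
    ```

    The redundancy term is what keeps the near-duplicate column 10 from being picked immediately after column 3, which is the behavior that distinguishes mRMR from plain relevance ranking; an SVM (as in the abstract) would then be trained on the selected columns.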