
    A Survey on Emotion Recognition for Human Robot Interaction

    With recent developments in technology and advances in artificial intelligence and machine learning techniques, it has become possible for robots to acquire and display emotions as part of Human-Robot Interaction (HRI). An emotional robot can recognize the emotional states of humans, allowing it to interact more naturally with its human counterpart in different environments. In this article, a survey on emotion recognition for HRI systems is presented. The survey aims to achieve two objectives. First, it discusses the main challenges that researchers face when building emotional HRI systems. Second, it identifies the sensing channels that can be used to detect emotions and provides a literature review of recent research published within each channel, along with the methodologies used and results achieved. Finally, some existing emotion recognition issues and recommendations for future work are outlined.

    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, along with guidelines for affective MMI. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques is tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
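
The pipeline this abstract describes (summarizing biosignal windows into features, then mapping them to discrete emotion classes) can be sketched generically. The feature choices, the nearest-centroid classifier, and the electrode sites named below are illustrative assumptions only, not the chapter's actual framework:

```python
import numpy as np

EMOTIONS = ["neutral", "positive", "negative", "mixed"]  # classes from the abstract

def extract_features(eda, emg_channels):
    """Summarize one recording window: mean EDA level plus the RMS of each
    facial EMG channel (corrugator, zygomaticus, frontalis are typical
    sites, assumed here for illustration)."""
    feats = [np.mean(eda)]
    for emg in emg_channels:
        feats.append(np.sqrt(np.mean(np.square(emg))))
    return np.array(feats)

class NearestCentroid:
    """Toy classifier: each class is represented by its mean feature vector."""

    def fit(self, X, y):
        self.classes_ = sorted(set(y))
        self.centroids_ = np.array(
            [X[[i for i, lbl in enumerate(y) if lbl == c]].mean(axis=0)
             for c in self.classes_])
        return self

    def predict(self, X):
        # Distance from every sample to every class centroid; pick the nearest.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return [self.classes_[i] for i in d.argmin(axis=1)]
```

A real system would replace the toy classifier with the kind of model comparison the chapter performs, and would fuse the EDA and EMG streams in parallel, as the results recommend.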

    Framework for Electroencephalography-based Evaluation of User Experience

    Measuring brain activity with electroencephalography (EEG) is mature enough to assess mental states. Combined with existing methods, such a tool can be used to strengthen the understanding of user experience. We contribute a set of methods to continuously estimate the user's mental workload, attention, and recognition of interaction errors during different interaction tasks. We validate these measures in a controlled virtual environment and show how they can be used to compare different interaction techniques or devices, here by comparing a keyboard and a touch-based interface. Thanks to such a framework, EEG becomes a promising method for improving the overall usability of complex computer systems. Comment: in ACM CHI '16 - SIGCHI Conference on Human Factors in Computing Systems, May 2016, San Jose, United States.
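
As a generic illustration of the kind of continuous EEG workload estimate this framework targets, a common convention is the ratio of theta-band to alpha-band power, which tends to rise with mental workload. The band edges and the ratio itself are widely used assumptions, not the authors' actual method:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average power of `signal` within [low, high] Hz via an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def workload_index(eeg, fs):
    """Theta (4-8 Hz) / alpha (8-12 Hz) power ratio. Treated here as a rough
    workload proxy (an assumption, not a validated metric); the small epsilon
    guards against division by zero on synthetic signals."""
    return band_power(eeg, fs, 4, 8) / (band_power(eeg, fs, 8, 12) + 1e-12)
```

A deployed system would compute this over a sliding window per electrode and feed it, together with attention and error-recognition markers, into calibrated classifiers as the paper does.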

    Machine learning-based affect detection within the context of human-horse interaction

    This chapter focuses on the use of machine learning techniques within the field of affective computing, and more specifically on the task of emotion recognition within the context of human-horse interaction. Affective computing focuses on the detection and interpretation of human emotion, an application that could significantly benefit quantitative studies in the field of animal-assisted therapy. The chapter offers a thorough description, an experimental design, and experimental results on the use of physiological signals, such as electroencephalography (EEG), electrocardiography (ECG), and electromyography (EMG) signals, for the creation and evaluation of machine learning models that predict the emotional state of an individual during interaction with horses.
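
Of the physiological channels listed above, ECG is the one most often reduced to a single affect-relevant feature: heart rate derived from R-peak intervals. The threshold-based peak picking below is a hedged illustration only (real pipelines use filtered detectors such as Pan-Tompkins), and it does not reproduce the chapter's feature set or models:

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold=0.5):
    """Indices where the signal crosses `threshold` upward: a crude R-peak
    detector, sufficient for clean synthetic signals."""
    above = ecg >= threshold
    return np.where(above[1:] & ~above[:-1])[0] + 1

def mean_heart_rate(ecg, fs, threshold=0.5):
    """Mean heart rate in beats per minute from inter-peak (R-R) intervals."""
    peaks = detect_r_peaks(ecg, fs, threshold)
    rr = np.diff(peaks) / fs  # R-R intervals in seconds
    return 60.0 / rr.mean()
```

In a multimodal setup like the one described, such ECG features would be concatenated with EEG band powers and EMG amplitudes before model training and evaluation.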

    What we can and cannot (yet) do with functional near infrared spectroscopy

    Functional near infrared spectroscopy (NIRS) is a relatively new technique, complementary to EEG, for the development of brain-computer interfaces (BCIs). NIRS-based systems for detecting various cognitive and affective states, such as mental and emotional stress, have already been demonstrated in a range of adaptive human–computer interaction (HCI) applications. However, before NIRS-BCIs can be used reliably in realistic HCI settings, substantial challenges concerning signal processing and modeling must be addressed. Although many of those challenges have been identified previously, the solutions to overcome them remain scant. In this paper, we first review what can currently be done with NIRS; specifically, NIRS-based approaches to measuring cognitive and affective user states, as well as demonstrations of passive NIRS-BCIs. We then discuss some of the primary challenges these systems would face if deployed in more realistic settings, including detection latencies and motion artifacts. Lastly, we investigate the effects of some of these challenges on signal reliability via a quantitative comparison of three NIRS models. The hope is that this paper will actively engage researchers to facilitate the advancement of NIRS as a more robust and useful tool for the BCI community.