    Can a Humanoid Face be Expressive? A Psychophysiological Investigation

    Non-verbal signals expressed through body language play a crucial role in multi-modal human communication during social relations. Indeed, in all cultures, facial expressions are the most universal and direct signs of innate emotional cues. A human face conveys important information in social interactions, helping us to better understand our social partners and establish empathic links. Recent research shows that humanoid and social robots are becoming increasingly similar to humans, both aesthetically and expressively. However, their visual expressiveness remains a crucial aspect that must be improved for these robots to appear more realistic and to be intuitively perceived by humans as not different from themselves. This study concerns the capability of a humanoid robot to exhibit emotions through facial expressions. More specifically, emotional signs performed by a humanoid robot were compared with corresponding human facial expressions in terms of recognition rate and response time. The set of stimuli included standardized human expressions taken from an Ekman-based database and the same facial expressions performed by the robot. Furthermore, participants’ psychophysiological responses were explored to investigate whether interpreting robot versus human emotional stimuli induces differences. Preliminary results show a trend towards better recognition of expressions performed by the robot than of 2D photos or 3D models. Moreover, no significant differences in the subjects’ psychophysiological state were found during the discrimination of facial expressions performed by the robot in comparison with the same task performed with 2D photos and 3D models.

    Aerospace medicine and biology: A continuing bibliography with indexes (supplement 359)

    This bibliography lists 164 reports, articles, and other documents introduced into the NASA Scientific and Technical Information System during January 1992. Subject coverage includes: aerospace medicine and physiology, life support systems and man/system technology, protective clothing, exobiology and extraterrestrial life, planetary biology, and flight crew behavior and performance.

    How Can Physiological Computing Benefit Human-Robot Interaction?

    As systems grow more automated, the human operator is all too often overlooked. Although human-robot interaction (HRI) can be quite demanding in terms of cognitive resources, the mental states (MS) of operators are not yet taken into account by existing systems. Since human operators are not infallible, this omission can lead to hazardous situations. The growing number of neurophysiological and machine learning tools now allows for efficient monitoring of operators' MS, so sending feedback on MS in a closed-loop solution is within reach. Involving a consistent automated planning technique to handle such a process could be a significant asset. This perspective article provides the reader with a synthesis of the relevant literature with a view to implementing systems that adapt to the operator's MS in order to improve the safety and performance of human-robot operations. First, the need for this approach is detailed with respect to remote operation, an example of HRI. Then, several MS identified as crucial for this type of HRI are defined, along with relevant electrophysiological markers. A focus is placed on prime degraded MS linked to time-on-task and task demands, as well as collateral MS linked to system outputs (i.e. feedback and alarms). Lastly, the principle of symbiotic HRI is detailed, and one solution is proposed for including the operator state vector in the system, using a mixed-initiative decisional framework to drive the interaction.
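
    A minimal sketch of the closed-loop idea discussed above, assuming a hypothetical operator-state estimate (workload, fatigue) and a simple threshold policy that adapts the robot's autonomy level; the names, thresholds, and policy are illustrative assumptions, not the mixed-initiative framework proposed in the article.

```python
# Minimal closed-loop sketch: an estimated mental state drives an adaptation
# decision. Thresholds, fields, and the policy are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class OperatorState:
    workload: float   # 0..1, e.g. derived from EEG/cardiac markers
    fatigue: float    # 0..1, e.g. derived from time-on-task and blink rate

def adapt_autonomy(state: OperatorState, current_level: int) -> int:
    """Raise robot autonomy when the operator is overloaded or fatigued,
    lower it when the operator has spare capacity (levels 0..3)."""
    if state.workload > 0.8 or state.fatigue > 0.7:
        return min(current_level + 1, 3)
    if state.workload < 0.3 and state.fatigue < 0.3:
        return max(current_level - 1, 0)
    return current_level

# One step of the loop with a mock state estimate.
level = adapt_autonomy(OperatorState(workload=0.85, fatigue=0.4), current_level=1)
print(level)  # -> 2
```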

    Robotic-based well-being monitoring and coaching system for the elderly in their daily activities

    The increasingly ageing population and the tendency to live alone have led science and engineering researchers to search for health-care solutions. During the COVID-19 pandemic, the elderly were seriously affected, in addition to suffering from isolation and its associated psychological consequences. This paper provides an overview of the RobWell (Robotic-based Well-Being Monitoring and Coaching System for the Elderly in their Daily Activities) system, a system focused on artificial intelligence for mood prediction and coaching. The paper presents a general overview of the initially proposed system as well as preliminary results on the home-automation subsystem, autonomous robot navigation, and mood estimation through machine learning, prior to the final system integration, which will be discussed in future work. The main goal is to improve the elderly's mental well-being during their daily household activities. The system is composed of ambient intelligence with intelligent sensors, actuators, and a robotic platform that interacts with the user. A test smart-home system was set up in which the sensors, actuators, and robotic platform were integrated and tested. For the artificial intelligence applied to mood prediction, machine learning was used to classify several physiological signals into different moods. On the robotics side, it was concluded that the ROS autonomous navigation stack and its autodocking algorithm were not reliable enough for this task, while the robot’s autonomy was sufficient; semantic navigation, artificial intelligence, and computer vision alternatives are being sought. This research was funded by the Spanish Ministerio de Ciencia, Innovación y Universidades, Agencia Estatal de Investigación (AEI) and the European Regional Development Fund (ERDF) under project ROBWELL (RTI2018-095599-A-C22) and by the Wallenberg AI, Autonomous Systems and Software Program (WASP), funded by the Knut and Alice Wallenberg Foundation.
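
    As a rough illustration of the mood-estimation subsystem described above, the sketch below classifies windowed physiological features into mood classes with a generic scikit-learn pipeline; the feature names, mood labels, and random-forest classifier are assumptions for illustration, not the RobWell implementation.

```python
# Illustrative mood classification from physiological features, assuming
# windowed features such as mean heart rate, heart-rate variability, and
# electrodermal activity; synthetic data stands in for real recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))             # [mean_hr, hrv_rmssd, eda_level] per window
y = rng.integers(0, 3, size=200)          # 0 = negative, 1 = neutral, 2 = positive mood

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5) # 5-fold cross-validated accuracy
print(scores.mean())
```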

    Cross validation of bi-modal health-related stress assessment

    This study explores the feasibility of objective and ubiquitous stress assessment. Twenty-five post-traumatic stress disorder patients participated in a controlled storytelling (ST) study and an ecologically valid reliving (RL) study. The two studies were meant to represent an early and a late therapy session, and each consisted of a "happy" and a "stress-triggering" part. Two instruments were chosen to assess the stress level of the patients at various points in time during therapy: (i) speech, used as an objective and ubiquitous stress indicator, and (ii) the subjective unit of distress (SUD), a clinically validated Likert scale. In total, 13 statistical parameters were derived from each of five speech features: amplitude, zero-crossings, power, high-frequency power, and pitch. To model the emotional state of the patients, 28 parameters were selected from this set by means of a linear regression model and subsequently compressed into 11 principal components. The SUD and speech model were cross-validated using three machine learning algorithms. Between 90% (2 SUD levels) and 39% (10 SUD levels) correct classification was achieved. The two sessions could be discriminated in 89% (ST) and 77% (RL) of the cases. This report fills a gap between laboratory and clinical studies, and its results emphasize the usefulness of Computer Aided Diagnostics (CAD) for mental health care.
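
    The pipeline described above (statistics over speech features, dimensionality reduction, classification of discretized SUD levels) can be sketched roughly as follows; the hand-picked statistics, the PCA-plus-kNN pipeline, and the synthetic utterances are simplified assumptions rather than the study's exact 13-parameter, 11-component model.

```python
# Rough sketch: per-utterance statistics over speech features, compressed with
# PCA and classified into binary SUD levels. Simplified assumptions throughout.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def speech_statistics(signal: np.ndarray) -> np.ndarray:
    """Toy stand-in for the 13 statistics per feature: a few descriptive
    statistics of amplitude, power, and zero-crossings."""
    power = signal ** 2
    zero_crossing_rate = np.mean(np.abs(np.diff(np.sign(signal))) > 0)
    return np.array([signal.mean(), signal.std(),
                     power.mean(), power.std(), zero_crossing_rate])

rng = np.random.default_rng(1)
X = np.vstack([speech_statistics(rng.normal(size=n))   # one row per utterance
               for n in rng.integers(8000, 32000, size=120)])
y = rng.integers(0, 2, size=120)                        # binary SUD level (low / high)

model = make_pipeline(StandardScaler(), PCA(n_components=3), KNeighborsClassifier())
print(cross_val_score(model, X, y, cv=5).mean())
```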

    Affective Brain-Computer Interfaces


    Psychophysiological responses to eye contact with a humanoid robot: Impact of perceived intentionality

    Eye contact with a social robot has been shown to elicit similar psychophysiological responses to eye contact with another human. However, it is becoming increasingly clear that the attention- and affect-related psychophysiological responses differentiate between direct (toward the observer) and averted gaze mainly when viewing embodied faces that are capable of social interaction, whereas pictorial or pre-recorded stimuli have no such capability. It has been suggested that genuine eye contact, as indicated by the differential psychophysiological responses to direct and averted gaze, requires a feeling of being watched by another mind. Therefore, we measured event-related potentials (N170 and frontal P300) with EEG, facial electromyography, skin conductance, and heart rate deceleration responses to seeing a humanoid robot's direct versus averted gaze, while manipulating the impression of the robot's intentionality. The results showed that the N170 and the facial zygomatic responses were greater to direct than to averted gaze of the robot, and independent of the robot's intentionality, whereas the frontal P300 responses were more positive to direct than to averted gaze only when the robot appeared intentional. The study provides further evidence that the gaze behavior of a social robot elicits attentional and affective responses and adds that the robot's seemingly autonomous social behavior plays an important role in eliciting higher-level socio-cognitive processing.
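
    As an illustration of the kind of analysis reported above, the sketch below compares mean amplitude in an N170-like window between direct- and averted-gaze trials at a single electrode; the sampling rate, analysis window, and synthetic data are assumptions for illustration, not the study's actual EEG pipeline.

```python
# Compare mean ERP amplitude in a rough N170 window (130-200 ms post-stimulus)
# between direct- and averted-gaze trials; synthetic single-electrode epochs.
import numpy as np
from scipy import stats

sfreq = 500                                   # sampling rate in Hz (assumed)
times = np.arange(-0.2, 0.6, 1 / sfreq)       # epoch from -200 to 600 ms
win = (times >= 0.13) & (times <= 0.20)       # N170-like analysis window

rng = np.random.default_rng(2)
direct = rng.normal(size=(30, times.size))    # 30 direct-gaze trials x samples
averted = rng.normal(size=(30, times.size))   # 30 averted-gaze trials x samples

direct_amp = direct[:, win].mean(axis=1)      # per-trial mean amplitude in window
averted_amp = averted[:, win].mean(axis=1)
print(stats.ttest_ind(direct_amp, averted_amp))
```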

    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered by various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented, and guidelines for affective MMI are given. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques was tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directions for future research, the results emphasize the need for parallel processing of multiple biosignals.
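
    A hedged sketch of the kind of framework described above: feature-level fusion of one EDA and three facial EMG channels, classified into the four emotion classes; the per-channel features, segment sizes, and the SVM classifier are illustrative assumptions, not the chapter's exact method.

```python
# Feature-level fusion of one EDA and three facial EMG channels into four
# emotion classes (neutral, positive, negative, mixed); synthetic segments.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def channel_features(x: np.ndarray) -> list:
    # Simple per-channel descriptors: level, variability, and mean slope.
    return [x.mean(), x.std(), np.mean(np.diff(x))]

rng = np.random.default_rng(3)
segments = rng.normal(size=(84, 4, 1000))   # 84 segments x [EDA, EMG1, EMG2, EMG3] x samples
X = np.array([[f for ch in seg for f in channel_features(ch)] for seg in segments])
y = rng.integers(0, 4, size=84)             # 0=neutral, 1=positive, 2=negative, 3=mixed

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(cross_val_score(model, X, y, cv=5).mean())
```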

    A User’s Cognitive Workload Perspective in Negotiation Support Systems: An Eye-Tracking Experiment

    Responding to several research calls, I report promising results from an initial experiment that compares different negotiation support system approaches with respect to their potential to reduce a user’s cognitive workload. Using a novel, laboratory-based, non-intrusive objective measurement technique that derives the user’s cognitive workload from pupillary responses and eye movements, I experimentally evaluated a standard, a chat-based, and an argumentation-based negotiation support system and found that a higher assistance level of negotiation support systems indeed leads to lower cognitive workload for the user. In more detail, I found that an argumentation-based system that fully automates the generation of the user’s arguments significantly decreases the user’s cognitive workload compared to a standard system. In addition, I found that a negotiation support system implementing an additional chat function causes significantly higher cognitive workload for users compared to a standard system.
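
    To make the measurement idea concrete, the sketch below computes a simple pupil-based workload proxy as baseline-corrected mean pupil diameter over a task segment; the baseline window and the use of raw dilation (rather than the specific measure used in the experiment) are assumptions for illustration.

```python
# Simple pupil-based workload proxy: mean pupil diameter during a task segment
# relative to an initial baseline window; synthetic trace stands in for data.
import numpy as np

def workload_index(pupil_mm: np.ndarray, sfreq: float, baseline_s: float = 2.0) -> float:
    """Mean pupil dilation relative to the first `baseline_s` seconds."""
    n_base = int(baseline_s * sfreq)
    baseline = np.nanmean(pupil_mm[:n_base])
    return float(np.nanmean(pupil_mm[n_base:]) - baseline)

rng = np.random.default_rng(4)
trace = 3.0 + 0.2 * rng.standard_normal(60 * 60)   # 60 s of pupil diameter (mm) at 60 Hz
print(workload_index(trace, sfreq=60))
```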