EMOTIONAL SYNCHRONIZATION-BASED HUMAN-ROBOT COMMUNICATION AND ITS EFFECTS
This paper presents a natural and comfortable communication system between humans and robots based on synchronization with the human emotional state, using facial expression recognition. The system consists of three parts: human emotion recognition, robotic emotion generation, and robotic emotion expression. The robot recognizes human emotion through facial expressions, and the robotic emotion is generated and synchronized with the human emotion dynamically using a vector field of dynamics. The robot makes dynamically varying facial expressions to express its own emotions to the human. A communication experiment was conducted to examine the effectiveness of the proposed system. The authors found that subjects became much more comfortable after communicating with the robot with synchronized emotions, whereas they felt somewhat uncomfortable after communicating with the robot with non-synchronized emotions. During emotional synchronization, subjects communicated much more with the robot, and the communication time was double that during non-synchronization. Furthermore, in the case of emotional synchronization, subjects had much better impressions of the robot than in the case of non-synchronization. This study confirmed that emotional synchronization in human-robot communication can be effective in making humans comfortable and in making the robot much more favorable and acceptable to humans.
INTERNATIONAL JOURNAL OF HUMANOID ROBOTICS. 10(1):1350014 (2013), journal article
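The abstract above does not specify the vector field used for synchronization; a minimal sketch, assuming a simple linear attractor in a two-dimensional valence-arousal emotion space (the function name, state representation, and gain are all illustrative, not the paper's actual method):

```python
import numpy as np

def synchronize_emotion(robot_state, human_state, gain=0.5, dt=0.1):
    """One integration step of an attractor vector field that pulls the
    robot's emotion state (valence, arousal) toward the human's state."""
    robot_state = np.asarray(robot_state, dtype=float)
    human_state = np.asarray(human_state, dtype=float)
    # Linear attractor dynamics: d(robot)/dt = gain * (human - robot)
    return robot_state + dt * gain * (human_state - robot_state)

# Example: robot starts neutral; human is recognized as happy
# (high valence, mild arousal).
state = np.array([0.0, 0.0])
target = np.array([0.8, 0.3])
for _ in range(50):
    state = synchronize_emotion(state, target)
```

Because the update is a contraction toward the human state, the robot's emotion converges smoothly rather than jumping, which matches the "dynamically varying" expressions described above.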
Gaze Guidance Using a Facial Expression Robot
This paper describes gaze guidance with emotional expression by a head robot called Kamin-FA1. We propose to use not only the gaze control of the robot but also its facial expression to guide a human's gaze to a target. Based on joint attention with Kamin-FA1, the robot intuitively conveys the gaze target to the human. The robot has a facial expression function using a curved surface display. We examined the effect of emotional expression on gaze guidance in terms of accuracy and reaction speed, conducting experiments in which human gaze was measured during gaze guidance with emotional expression. The results showed that gaze guidance with emotional expression produced more accurate and quicker responses than guidance without it. In particular, the expression of surprise performed better in gaze guidance than the neutral expression. Furthermore, the emotional expressions of anger and surprise gave subjects the impression of a dangerous situation at the gaze target, while neutral and happy expressions gave the impression of a safe situation.
Advanced Robotics. 23(14):1831-1848 (2009), journal article
Choreographic and Somatic Approaches for the Development of Expressive Robotic Systems
As robotic systems are moved out of factory work cells into human-facing
environments, questions of choreography become central to their design,
placement, and application. With a human viewer or counterpart present, a
system will automatically be interpreted within context, style of movement, and
form factor by human beings as animate elements of their environment. The
interpretation by this human counterpart is critical to the success of the
system's integration: knobs on the system need to make sense to a human
counterpart; an artificial agent should have a way of notifying a human
counterpart of a change in system state, possibly through motion profiles; and
the motion of a human counterpart may have important contextual clues for task
completion. Thus, professional choreographers, dance practitioners, and
movement analysts are critical to research in robotics. They have design
methods for movement that align with human audience perception, can identify
simplified features of movement for human-robot interaction goals, and have
detailed knowledge of the capacity of human movement. This article provides
approaches employed by one research lab, specific impacts on technical and
artistic projects within, and principles that may guide future such work. The
background section reports on choreography, somatic perspectives,
improvisation, the Laban/Bartenieff Movement System, and robotics. From this
context methods including embodied exercises, writing prompts, and community
building activities have been developed to facilitate interdisciplinary
research. The results of this work are presented as an overview of a selection
of projects in areas like high-level motion planning, software development for
rapid prototyping of movement, artistic output, and user studies that help
understand how people interpret movement. Finally, guiding principles for other
groups to adopt are posited.
Comment: Under review at MDPI Arts Special Issue "The Machine as Artist (for
the 21st Century)",
http://www.mdpi.com/journal/arts/special_issues/Machine_Artis
Robotics 2010
Without a doubt, robotics has made incredible progress over the last decades. The vision of developing, designing, and creating technical systems that help humans achieve hard and complex tasks has led to an incredible variety of solutions. Few technical fields exhibit more interdisciplinary interconnections than robotics, a fact driven by the highly complex challenges robotic systems impose, especially the requirement for intelligent and autonomous operation. This book tries to give an insight into the evolutionary process that takes place in robotics. It provides articles covering a wide range of this exciting area. The progress of technical challenges and concepts may illuminate the relationship between developments that seem completely different at first sight. Robotics remains an exciting scientific and engineering field. The community looks ahead optimistically, toward future challenges and new developments.
KEER2022
Pre-title: KEER2022. Diversities. Resource description: 25 July 202
In Sync: Exploring Synchronization to Increase Trust Between Humans and Non-humanoid Robots
When we go for a walk with friends, we can observe an interesting effect:
From step lengths to arm movements - our movements unconsciously align; they
synchronize. Prior research found that this synchronization is a crucial aspect
of human relations that strengthens social cohesion and trust. Generalizing
from these findings in synchronization theory, we propose a dynamical approach
that can be applied in the design of non-humanoid robots to increase trust. We
contribute the results of a controlled experiment with 51 participants
exploring our concept in a between-subjects design. For this, we built a
prototype of a simple non-humanoid robot that can bend to follow human
movements and vary the movement synchronization patterns. We found that
synchronized movements lead to significantly higher ratings in an established
questionnaire on trust between people and automation but did not influence the
willingness to spend money in a trust game.
Comment: To appear in Proceedings of the 2023 CHI Conference on Human Factors
in Computing Systems (CHI 23), April 23-28, 2023, Hamburg, Germany. ACM, New
York, NY, USA, 14 pages
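The prototype above bends to follow human movements with varying synchronization patterns; the paper's control scheme is not given here, but one plausible minimal sketch is a first-order follower whose coupling strength sets how tightly the robot tracks the human (function names and the coupling values are illustrative assumptions):

```python
import math

def follow_movement(human_angles, coupling=0.8):
    """Generate robot bend angles tracking a human movement trace.
    coupling in (0, 1]: near 1.0 gives tight synchronization,
    near 0.0 gives sluggish, non-synchronized movement."""
    robot = 0.0
    out = []
    for h in human_angles:
        robot += coupling * (h - robot)  # first-order tracking step
        out.append(robot)
    return out

# A slow oscillation standing in for a recorded human movement.
human = [math.sin(0.1 * t) for t in range(100)]
synced = follow_movement(human, coupling=0.9)
unsynced = follow_movement(human, coupling=0.05)
```

Varying a single coupling parameter like this would let an experimenter produce the synchronized and non-synchronized conditions from one controller.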
Advances in Human-Robot Interaction
Rapid advances in the field of robotics have made it possible to use robots not just in industrial automation but also in entertainment, rehabilitation, and home service. Since robots will likely affect many aspects of human existence, fundamental questions of human-robot interaction must be formulated and, if at all possible, resolved. Some of these questions are addressed in this collection of papers by leading HRI researchers.
Facial Emotion Expressions in Human-Robot Interaction: A Survey
Facial expressions are an ideal means of communicating one's emotions or intentions to others. This overview focuses on human facial expression recognition as well as robotic facial expression generation. For human facial expression recognition, both recognition on predefined datasets and recognition in real time are covered. For robotic facial expression generation, both hand-coded and automated methods are covered, i.e., generating a robot's facial expressions by moving its features (eyes, mouth) either by hand-coding or automatically using machine learning techniques. There are already plenty of studies that achieve high accuracy for emotion expression recognition on predefined datasets, but the accuracy of facial expression recognition in real time is comparatively lower. As for expression generation in robots, while most robots are capable of making basic facial expressions, few studies enable robots to do so automatically. This overview discusses state-of-the-art research in facial emotion expressions during human-robot interaction, leading to several possible directions for future research.
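The hand-coded generation method the survey mentions amounts to a fixed mapping from a recognized emotion label to feature positions. A minimal sketch, where the feature names and numeric targets are entirely hypothetical (real robots have their own actuator interfaces):

```python
# Hypothetical normalized feature targets per emotion; a real robot
# would translate these into servo or display commands.
EXPRESSIONS = {
    "happy":    {"mouth_curve": 0.8,  "eye_open": 0.7, "brow_raise": 0.3},
    "sad":      {"mouth_curve": -0.6, "eye_open": 0.4, "brow_raise": -0.4},
    "surprise": {"mouth_curve": 0.2,  "eye_open": 1.0, "brow_raise": 0.9},
    "neutral":  {"mouth_curve": 0.0,  "eye_open": 0.6, "brow_raise": 0.0},
}

def express(emotion):
    """Return hand-coded feature targets for a recognized emotion,
    falling back to neutral for unrecognized labels."""
    return EXPRESSIONS.get(emotion, EXPRESSIONS["neutral"])
```

An automated method, by contrast, would learn this mapping (or the motions themselves) with machine learning instead of enumerating it by hand.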
Facial emotion expressions in human-robot interaction: A survey
Pre-print version of the survey above.
Comment: Accepted in International Journal of Social Robotics