7 research outputs found

    Gaze Detection and Analysis for Initiating Joint Activity in Industrial Human-Robot Collaboration

    Collaborative robots (cobots) are widely used in industrial applications, yet extensive research is still needed to enhance human-robot collaboration and the operator experience. A potential approach to improving the collaboration experience is to adapt cobot behavior based on natural cues from the operator. Inspired by the literature on human-human interaction, we conducted a wizard-of-oz study to examine whether a gaze towards the cobot can serve as a trigger for initiating joint activities in collaborative sessions. In this study, 37 participants engaged in an assembly task while their gaze behavior was analyzed. We employed a gaze-based attention recognition model to identify when the participants looked at the cobot. Our results indicate that in most cases (84.88%), the joint activity is preceded by a gaze towards the cobot. Furthermore, during the entire assembly cycle, the participants tended to look at the cobot around the time of the joint activity. To the best of our knowledge, this is the first study to analyze the natural gaze behavior of participants working on a joint activity with a robot during a collaborative assembly task. Comment: First draft for a paper submitted to Frontiers in Robotics and A
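The core measurement in the abstract above, the fraction of joint activities preceded by a gaze towards the cobot, can be sketched as a simple interval-overlap check. This is a hypothetical illustration, not the paper's code: the function name, the 5-second window, and the toy data are all assumptions.

```python
def fraction_preceded_by_gaze(gaze_intervals, activity_times, window=5.0):
    """gaze_intervals: list of (start, end) seconds during which the
    participant looked at the cobot; activity_times: joint-activity onsets.
    A gaze "precedes" an activity if any gaze interval overlaps the
    window [t - window, t] before the activity onset t."""
    preceded = 0
    for t in activity_times:
        if any(start <= t and end >= t - window
               for start, end in gaze_intervals):
            preceded += 1
    return preceded / len(activity_times) if activity_times else 0.0

# Toy data (not from the study): three gazes, three activity onsets.
gazes = [(1.0, 2.5), (10.0, 11.0), (30.0, 31.5)]
activities = [3.0, 12.0, 40.0]
print(fraction_preceded_by_gaze(gazes, activities))  # 2 of 3 activities
```

Applied to annotated session data, a value like the paper's 84.88% would be the output of this kind of computation over all recorded joint activities.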

    A Deep Learning Approach for Multi-View Engagement Estimation of Children in a Child-Robot Joint Attention Task

    In this work we tackle the problem of child engagement estimation while children freely interact with a robot in a friendly, room-like environment. We propose a deep-learning-based multi-view solution that takes advantage of recent developments in human pose detection. We extract the child's pose from different RGB-D cameras placed regularly in the room, fuse the results, and feed them to a deep neural network trained to classify engagement levels. The network contains a recurrent layer in order to exploit the rich temporal information in the pose data. The resulting method outperforms a number of baseline classifiers and provides a promising tool for better automatic understanding of a child's attitude, interest, and attention while cooperating with a robot. The goal is to integrate this model into next-generation social robots as an attention monitoring tool during various Child-Robot Interaction (CRI) tasks, both for Typically Developing (TD) children and children affected by autism (ASD).
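The multi-view fusion step described above can be sketched as a confidence-weighted average of per-joint 3D positions across cameras. This is a minimal sketch under assumptions: the data layout, the weighting scheme, and all names are invented for illustration, not taken from the paper (which feeds the fused poses to a recurrent network downstream).

```python
def fuse_views(views):
    """views: list of per-camera dicts, joint_name -> ((x, y, z), confidence).
    Returns joint_name -> fused (x, y, z), averaging each joint across the
    cameras that detected it, weighted by detection confidence."""
    fused = {}
    joints = set().union(*(v.keys() for v in views))
    for j in joints:
        obs = [v[j] for v in views if j in v]
        total = sum(c for _, c in obs)
        if total == 0:
            continue  # joint not reliably seen from any view
        fused[j] = tuple(
            sum(p[i] * c for p, c in obs) / total for i in range(3)
        )
    return fused

# Toy observations from two cameras (coordinates in metres):
cam1 = {"head": ((0.0, 1.6, 2.0), 0.9), "hand": ((0.3, 1.0, 2.1), 0.2)}
cam2 = {"head": ((0.1, 1.6, 2.0), 0.9)}
fused = fuse_views([cam1, cam2])  # head averaged, hand from cam1 only
```

A sequence of such fused skeletons is what a recurrent classifier would consume to estimate engagement over time.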

    Context change and triggers for human intention recognition

    In human-robot interaction, understanding human intention is important for smooth interaction between humans and robots. Proactive human-robot interaction is the current trend; it relies on recognising human intentions to complete tasks. Reasoning is performed based on the current human state, the environment and context, and human intention recognition and prediction. Many factors may affect human intention, including cues that are difficult to recognise directly from the action but may be perceived from changes in the environment or context. The changes that affect human intention are triggers and serve as strong evidence for identifying human intention. Therefore, detecting such changes and identifying such triggers is a promising approach to assist human intention recognition. This paper discusses the current state of the art in human intention recognition in human-computer interaction and illustrates the importance of context change and triggers for human intention recognition through a variety of examples.
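The trigger idea above can be illustrated loosely: a detected context change re-weights the system's beliefs over candidate intentions. Everything here (the events, intentions, and evidence weights) is invented for this sketch and is not the paper's model.

```python
TRIGGER_EVIDENCE = {
    # context change -> intentions it lends evidence to (multiplicative weight)
    "picked_up_mug": {"make_coffee": 3.0, "wash_dishes": 1.5},
    "opened_fridge": {"prepare_meal": 3.0},
}

def update_beliefs(beliefs, event):
    """Multiply each intention's probability by the evidence the trigger
    provides (default 1.0, i.e. no change), then renormalise."""
    weights = {i: p * TRIGGER_EVIDENCE.get(event, {}).get(i, 1.0)
               for i, p in beliefs.items()}
    total = sum(weights.values())
    return {i: w / total for i, w in weights.items()}

# Start with a uniform belief over three candidate intentions:
beliefs = {"make_coffee": 1/3, "wash_dishes": 1/3, "prepare_meal": 1/3}
beliefs = update_beliefs(beliefs, "picked_up_mug")
print(max(beliefs, key=beliefs.get))  # "make_coffee" is now most likely
```

The point mirrors the abstract: the action alone may be ambiguous, but the context change (the trigger) shifts the belief sharply toward one intention.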

    Behavioral patterns in robotic collaborative assembly: comparing neurotypical and Autism Spectrum Disorder participants

    Introduction: In Industry 4.0, collaborative tasks often involve operators working with collaborative robots (cobots) in shared workspaces. Many aspects of the operator's well-being in this environment still need in-depth research, and these aspects are expected to differ between neurotypical (NT) operators and operators with Autism Spectrum Disorder (ASD). Methods: This study examines behavioral patterns in 16 participants (eight neurotypical, eight with high-functioning ASD) during an assembly task in an industry-like, lab-based robotic collaborative cell, enabling the detection of potential risks to their well-being during industrial human-robot collaboration. Each participant worked on the task for five consecutive days, 3.5 h per day. During these sessions, six video clips of 10 min each were recorded for each participant. The videos were used to extract quantitative behavioral data with the NOVA annotation tool and were analyzed qualitatively using an ad-hoc observational grid. In addition, during the work sessions the researchers took unstructured notes of the observed behaviors, which were analyzed qualitatively. Results: The two groups differ mainly in behavior (e.g., prioritizing the robot partner, gaze patterns, facial expressions, multi-tasking, and personal space), adaptation to the task over time, and the resulting overall performance. Discussion: These results confirm that NT and ASD participants in a collaborative shared workspace have different needs and that the working experience should be tailored to the end-user's characteristics. The findings of this study represent a starting point for further efforts to promote well-being in the workplace. To the best of our knowledge, this is the first work comparing NT and ASD participants in a collaborative industrial scenario.