Robots Taking Initiative in Collaborative Object Manipulation: Lessons from Physical Human-Human Interaction
Physical Human-Human Interaction (pHHI) involves the use of multiple sensory
modalities. Communication through spoken utterances and gestures is well
studied; communication through force signals, however, is not well understood.
In this paper, we investigate the mechanisms humans employ to negotiate
through force signals, an integral part of successful collaboration. Our
objective is to use these insights to inform the design of controllers for
robot assistants; specifically, we want to enable robots to take the lead in
collaboration. To achieve this goal, we conducted a study observing how humans
behave during collaborative manipulation tasks. During our preliminary data
analysis, we discovered several new features that help us better understand
how the interaction progresses. From these features, we identified distinct
patterns in the data that indicate when a participant is expressing their
intent. Our study provides valuable insight into how humans collaborate
physically, which can help us design robots that behave more like humans in
such scenarios.
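The abstract does not specify how intent-expressing patterns were detected in the force data. Purely as an illustration of the general idea, a crude detector might flag moments when the interaction force magnitude crosses a threshold, with a refractory gap so one sustained push counts once; the function name, threshold, and gap are hypothetical, not the paper's method:

```python
import numpy as np

def detect_intent_onsets(force, threshold, min_gap):
    """Flag sample indices where |force| crosses `threshold`,
    skipping crossings within `min_gap` samples of the last one.
    A toy proxy for 'partner is expressing intent' events."""
    above = np.abs(force) > threshold
    onsets = []
    last = -min_gap  # allow an onset at index 0
    for i, is_above in enumerate(above):
        if is_above and i - last >= min_gap:
            onsets.append(i)
            last = i
    return onsets

# toy 1-D force trace: two distinct pushes
force = np.array([0.0, 0.1, 2.0, 2.1, 0.0, 0.0, 0.0, 0.0, 3.0, 0.1])
events = detect_intent_onsets(force, threshold=1.0, min_gap=3)
```

Real pHHI work would use richer features (force direction, rate of change, haptic context) rather than a single magnitude threshold.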
Human in the AI loop via xAI and Active Learning for Visual Inspection
Industrial revolutions have historically disrupted manufacturing by introducing automation into production. Increasing automation reshapes the role of the human worker. Advances in robotics and artificial intelligence open new frontiers of human-machine collaboration. Such collaboration can be realized by drawing on two sub-fields of artificial intelligence: active learning and explainable artificial intelligence. Active learning aims to devise strategies for obtaining the data that allows machine learning algorithms to learn best. Explainable artificial intelligence, in turn, aims to make machine learning models intelligible to humans. The present work first describes Industry 5.0, human-machine collaboration, and the state of the art in quality inspection, emphasizing visual inspection. It then outlines how human-machine collaboration could be realized and enhanced in visual inspection. Finally, some of the results obtained in the EU H2020 STAR project regarding visual inspection are shared, considering artificial intelligence, human digital twins, and cybersecurity.
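A common active-learning strategy of the kind this abstract alludes to is least-confidence uncertainty sampling: query the human inspector on the unlabeled images the model is least sure about. A minimal sketch, assuming the model already outputs class probabilities (the data and function name here are illustrative, not from the STAR project):

```python
import numpy as np

def uncertainty_sample(probs, k):
    """Return indices of the k samples with the lowest top-class
    probability (least-confidence uncertainty sampling)."""
    confidence = probs.max(axis=1)       # model's confidence per sample
    return np.argsort(confidence)[:k]    # least confident first

# toy predicted probabilities (defect / no-defect) for 5 unlabeled images
probs = np.array([
    [0.95, 0.05],  # confident
    [0.55, 0.45],  # uncertain
    [0.80, 0.20],
    [0.51, 0.49],  # most uncertain
    [0.99, 0.01],
])
query = uncertainty_sample(probs, k=2)  # indices to route to the human annotator
```

The selected samples are labeled by the human and added to the training set, closing the human-in-the-loop cycle the paper describes.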