A real-time human-robot interaction system based on gestures for assistive scenarios
Natural and intuitive human interaction with robotic systems is a key point in developing robots that assist people easily and effectively. In this paper, a Human Robot Interaction (HRI) system able to recognize gestures commonly employed in human non-verbal communication is introduced, and an in-depth study of its usability is performed. The system deals with dynamic gestures such as waving or nodding, which are recognized using a Dynamic Time Warping approach based on gesture-specific features computed from depth maps. A static gesture, pointing at an object, is also recognized. The pointed location is then estimated in order to detect candidate objects the user may be referring to. When the pointed object is unclear to the robot, a disambiguation procedure is performed by means of either a verbal or a gestural dialogue. This skill would allow the robot to pick up an object on behalf of a user who has difficulty doing so. The overall system, which is composed of a NAO robot, a Wifibot robot, a Kinect v2 sensor and two laptops, is first evaluated in a structured lab setup. Then, a broad set of user tests was completed, allowing correct performance to be assessed in terms of recognition rates, ease of use and response times.
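The Dynamic Time Warping recognition described in this abstract can be sketched as follows. This is a minimal illustrative implementation of classic DTW template matching on 1-D feature sequences; the feature values and gesture templates below are hypothetical, since the paper's actual gesture-specific features are computed from depth maps and are not detailed here.

```python
# Minimal sketch of DTW-based dynamic gesture classification:
# compare an observed feature sequence against stored templates and
# pick the template with the lowest warping cost.
import math

def dtw_distance(seq_a, seq_b):
    """Classic O(n*m) DTW between two 1-D feature sequences."""
    n, m = len(seq_a), len(seq_b)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# Hypothetical templates for two dynamic gestures.
templates = {"wave": [0, 1, 0, -1, 0, 1, 0],
             "nod":  [0, -1, -2, -1, 0]}
observed = [0, 1, 0, -1, 0, 1, 0, 1]  # an observed feature sequence
best = min(templates, key=lambda g: dtw_distance(observed, templates[g]))
```

DTW tolerates differences in gesture speed and length, which is why it suits dynamic gestures such as waving or nodding performed at varying tempos.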
Towards safety in physically assistive robots: eating assistance
Safety is one of the basic elements in building trust in robots. This paper studies remedies to unavoidable collisions, using robotic assistive feeding as an example task. First, we propose an attention mechanism so the user can control the robot using gestures and thus prevent collisions. Second, when unwanted contacts are unavoidable, we compare two safety strategies: active safety, using a force sensor to monitor maximum allowed forces; and passive safety, using compliant controllers. Experimental evaluation shows that the gesture mechanism is effective for controlling the robot. Also, the impact forces obtained with both methods are similar, so either can be used independently. Additionally, users who experienced impacts on purpose reported that the impact was not harmful.
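The active safety strategy described above can be sketched as a simple monitoring loop: stop motion as soon as the force sensor exceeds a maximum allowed value. The threshold, units and function names below are illustrative assumptions, not values taken from the paper.

```python
# Hedged sketch of active safety: a force sensor is polled and the robot
# is stopped when the reading exceeds an assumed maximum allowed force.
MAX_ALLOWED_FORCE_N = 15.0  # assumed safety limit in newtons (illustrative)

def active_safety_step(force_reading_n, stop_robot):
    """Return True if motion may continue; otherwise trigger a stop."""
    if force_reading_n > MAX_ALLOWED_FORCE_N:
        stop_robot()  # e.g. an emergency-stop callback on the controller
        return False
    return True

# Example: a contact spike beyond the limit halts motion.
events = []
ok = active_safety_step(20.0, stop_robot=lambda: events.append("stop"))
```

Passive safety, by contrast, needs no such monitoring step: a compliant controller limits contact forces mechanically or through low-stiffness control gains.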
Overcoming barriers and increasing independence: service robots for elderly and disabled people
This paper discusses the potential for service robots to overcome barriers and increase the independence of elderly and disabled people. It includes a brief overview of existing uses of service robots by disabled and elderly people and of advances in technology that will make new uses possible, and it provides suggestions for some of these new applications. The paper also considers the design and other conditions to be met for user acceptance. It further discusses the complementarity of assistive service robots and personal assistance, and considers the types of applications and users for which service robots are and are not suitable.
Conducting neuropsychological tests with a humanoid robot: design and evaluation
Socially assistive robots with interactive behavioral capabilities have been improving quality of life for a wide range of users by caring for the elderly, training individuals with cognitive disabilities, supporting physical rehabilitation, etc. While the interactive behavioral policies of most systems are scripted, we discuss here the key features of a new methodology that enables professional caregivers to teach a socially assistive robot (SAR) how to perform assistive tasks while giving proper instructions, demonstrations and feedback. We describe how socio-communicative gesture controllers, which control the speech, facial displays and hand gestures of our iCub robot, are driven by multimodal events captured from a professional human demonstrator performing a neuropsychological interview. Furthermore, we propose an original online evaluation method for rating the multimodal interactive behaviors of the SAR and show how such a method can help designers identify faulty events.
Multimodal Man-machine Interface and Virtual Reality for Assistive Medical Systems
The results of research on an intelligent multimodal man-machine interface and virtual reality tools for assistive medical systems, including computers and mechatronic systems (robots), are discussed. Gesture translation for people with disabilities, the learning-by-showing technology and a virtual operating room with 3D visualization are presented in this report and were announced at the International exhibition "Intelligent and Adaptive Robots–2005".