10 research outputs found

    Systematic analysis of video data from different human-robot interaction studies: A categorisation of social signals during error situations

    Human–robot interactions are often affected by error situations caused by either the robot or the human, so robots would profit from the ability to recognize when such situations occur. We investigated the verbal and non-verbal social signals that humans show when error situations occur in human–robot interaction experiments. To do so, we analyzed 201 videos from five human–robot interaction user studies with varying tasks, drawn from four independent projects. The analysis shows that there are two types of error situations: social norm violations and technical failures. Social norm violations are situations in which the robot does not adhere to the underlying social script of the interaction; technical failures are caused by technical shortcomings of the robot. The video analysis shows that participants use many head movements and very few gestures, and that they often smile when in an error situation with the robot. Participants also sometimes stop moving at the beginning of error situations. We further found that participants talked more during social norm violations and less during technical failures. Finally, participants use fewer non-verbal social signals (for example smiling, nodding, and head shaking) when they interact with the robot alone and no experimenter or other human is present. The results suggest that participants do not see the robot as a social interaction partner with comparable communication skills. Our findings have implications for builders and evaluators of human–robot interaction systems: builders should consider adding modules for the recognition and classification of head movements to the robot's input channels, and evaluators need to make sure that the presence of an experimenter does not skew the results of their user studies.
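
    Below is a minimal, hypothetical sketch (not taken from the paper) of how a robot's input pipeline might use the reported tendencies to guess the type of an ongoing error situation from observed social signals; the SignalWindow fields, thresholds, and labels are assumptions made for illustration only.

    # Hypothetical sketch: rule-of-thumb classification of an error situation
    # from observed social signals, loosely following the reported tendencies
    # (more talking during social norm violations, silence or freezing more
    # typical of technical failures). All names and thresholds are assumed.
    from dataclasses import dataclass

    @dataclass
    class SignalWindow:
        """Social signals observed in a short window after a suspected error."""
        head_movements: int   # nods and head shakes detected
        smiles: int           # smile events detected
        words_spoken: int     # words uttered by the participant
        froze: bool           # participant stopped moving at error onset

    def guess_error_type(window: SignalWindow) -> str:
        """Return a rough label for the suspected error situation."""
        if window.words_spoken >= 5:
            # Participants talked more when the robot broke the social script.
            return "social_norm_violation"
        if window.froze or window.words_spoken == 0:
            # Silence or freezing was more typical of technical failures.
            return "technical_failure"
        return "unclear"

    if __name__ == "__main__":
        sample = SignalWindow(head_movements=3, smiles=1, words_spoken=8, froze=False)
        print(guess_error_type(sample))  # -> social_norm_violation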

    User-centred design and evaluation of a tele-operated echocardiography robot

    We present the collected findings of a user-centred approach to developing a tele-operated robot for remote echocardiography examinations. During the three-year development of the robot, we involved users at all development stages to increase the usability of the system for the doctors. For requirement compilation, we conducted a literature review, observed two traditional examinations, arranged focus groups with doctors and patients, and conducted two online surveys. During development, we regularly involved doctors in usability tests to receive feedback on the robot’s user interface and hardware. For evaluation, we conducted two eye-tracking studies: in the first, doctors performed a traditional echocardiography examination; in the second, they conducted a remote examination with our robot. The results show that all doctors were able to successfully complete a correct ultrasonography examination with the tele-operated robot. Compared to a traditional examination, the doctors on average needed only a small amount of additional time to successfully examine a patient with our remote echocardiography robot. The results also show that the doctors fixate considerably more often, but with shorter fixation times, on the USG screen in the traditional examination than in the remote examination. We further found that some of the user-centred design methods we applied had to be adjusted to the clinical context and the hectic schedule of the doctors. Overall, our experience and results suggest that user-centred design methodology is well suited to developing medical robots and leads to a usable product that meets the end users’ needs.
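
    As a side note on the eye-tracking comparison above, the following is a minimal sketch, under assumed data formats, of how fixation count and mean fixation duration per area of interest (such as the USG screen) could be computed from a fixation log; it is an illustration, not the authors' analysis pipeline.

    # Hedged sketch (not the authors' pipeline): per-AOI fixation count and
    # mean fixation duration from a simple (aoi, duration_ms) fixation log.
    # The record format and function name are assumptions for illustration.
    from collections import defaultdict
    from statistics import mean

    def summarize_fixations(fixations):
        """fixations: iterable of (aoi_name, duration_ms) tuples."""
        per_aoi = defaultdict(list)
        for aoi, duration_ms in fixations:
            per_aoi[aoi].append(duration_ms)
        return {
            aoi: {"count": len(durations), "mean_duration_ms": mean(durations)}
            for aoi, durations in per_aoi.items()
        }

    if __name__ == "__main__":
        log = [("usg_screen", 180), ("usg_screen", 150), ("patient", 420)]
        print(summarize_fixations(log))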

    Transferring Human-Human Interaction Studies to HRI Scenarios in Public Space

    This paper presents the contextual analysis of the user requirements for a mobile navigation robot in public space. Three human-human interaction studies were conducted in order to gain a holistic understanding of public space as an interaction context for itinerary requests. All three human-human requirement studies were analyzed to derive guidelines for human-robot interaction. This empirical work aims to contribute by: (1) providing recommendations for a communication structure from a communication-studies perspective, (2) providing recommendations for navigation principles for human-robot interaction in public space from a socio-psychological and an HRI perspective, and (3) providing recommendations regarding (confounding) contextual variables from an HCI perspective.

    The Interactive Urban Robot: User-centered development and final field trial of a direction requesting robot

    In this article, we present the user-centered development of the service robot IURO. IURO’s goal is to find its way to a designated place in town without any prior map knowledge, solely by asking pedestrians for directions. We present the three-year development process, which involved a series of studies on the robot’s appearance, communication model, feedback modalities, and social navigation mechanisms. Our main contribution lies in the final field trial. With the autonomous IURO platform, we performed a series of six way-finding runs (over 24 hours of run-time in total) in the city center of Munich, Germany. The robot interacted with approximately 100 pedestrians, of which 36 interactions included a full route dialogue. A variety of empirical methods was used to explore the reactions of primary users (pedestrians who actually interacted with the robot) and secondary users (bystanders who observed others interacting). The gathered data provides insights into the usability, user experience, and acceptance of IURO and allowed us to derive recommendations for the development of other socially interactive robots.
