335 research outputs found

    Speech-Gesture Mapping and Engagement Evaluation in Human Robot Interaction

    A robot needs contextual awareness, effective speech production, and complementary non-verbal gestures for successful communication in society. In this paper, we present our end-to-end system that aims to enhance the effectiveness of non-verbal gestures. To achieve this, we identified the gestures most prominently used in performances by TED speakers, mapped them to their corresponding speech context, and modulated speech based on the attention of the listener. The proposed method used a Convolutional Pose Machine [4] to detect human gestures. Dominant gestures of TED speakers were used for learning the gesture-to-speech mapping, and their speeches were used for training the model. We also evaluated people's engagement with the robot by conducting a social survey. The robot monitored the effectiveness of its performance and improvised its speech pattern based on the attention level of the audience, which was calculated using visual feedback from the camera. The effectiveness of the interaction, as well as the decisions made during improvisation, was further evaluated based on head-pose detection and an interaction survey. Comment: 8 pages, 9 figures, under review in IRC 201
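The abstract does not specify how the audience's attention level was computed from head pose; the following is a minimal sketch of one plausible measure, assuming per-listener yaw/pitch head angles are available from a detector. Function names and thresholds are illustrative assumptions, not details from the paper.

```python
def attention_level(head_poses, yaw_limit=30.0, pitch_limit=20.0):
    """Fraction of detected heads oriented toward the robot.

    head_poses: list of (yaw, pitch) angles in degrees per listener;
    a head counts as attentive when both angles fall inside the limits.
    """
    if not head_poses:
        return 0.0
    attentive = sum(
        1 for yaw, pitch in head_poses
        if abs(yaw) <= yaw_limit and abs(pitch) <= pitch_limit
    )
    return attentive / len(head_poses)

# Three listeners, one looking well away from the robot.
poses = [(5.0, -3.0), (12.0, 8.0), (70.0, 0.0)]
level = attention_level(poses)  # 2 of 3 attentive, roughly 0.67
```

A system like the one described could then lower or raise the robot's speech modulation whenever this fraction crosses a chosen threshold.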

    The perception of emotion in artificial agents

    Given recent technological developments in robotics, artificial intelligence, and virtual reality, it is perhaps unsurprising that the arrival of emotionally expressive and reactive artificial agents is imminent. However, if such agents are to become integrated into our social milieu, it is imperative to establish an understanding of whether and how humans perceive emotion in artificial agents. In this review, we incorporate recent findings from social robotics, virtual reality, psychology, and neuroscience to examine how people recognize and respond to emotions displayed by artificial agents. First, we review how people perceive emotions expressed by an artificial agent through channels such as facial and bodily expressions and vocal tone. Second, we evaluate the similarities and differences in the consequences of perceived emotions in artificial compared to human agents. Beyond accurately recognizing the emotional state of an artificial agent, it is critical to understand how humans respond to those emotions. Does interacting with an angry robot induce the same responses in people as interacting with an angry person? Similarly, does watching a robot rejoice when it wins a game elicit similar feelings of elation in the human observer? Here we provide an overview of the current state of emotion expression and perception in social robotics, as well as a clear articulation of the challenges and guiding principles to be addressed as we move ever closer to truly emotional artificial agents.

    Humanoid Robot Soccer Locomotion and Kick Dynamics: Open Loop Walking, Kicking and Morphing into Special Motions on the Nao Robot

    Striker speed and accuracy in the RoboCup (SPL) international robot soccer league is becoming increasingly important as the level of play rises. Competition around the ball is now decided in a matter of seconds. Therefore, eliminating any wasted actions or motions is crucial when attempting to kick the ball. It is common to see a discontinuity between walking and kicking, where a robot will return to an initial pose in preparation for the kick action. In this thesis we explore the removal of this behaviour by developing a transition gait that morphs the walk directly into the kick back swing pose. The solution presented here is targeted towards the use of the Aldebaran walk for the Nao robot. It involves the design of a central pattern generator to allow for controlled steps with real-time accuracy, and a phase-locked loop method to synchronise with the Aldebaran walk so that precise step-length control can be activated when required. An open-loop trajectory mapping approach is taken to the walk, which is stabilized statically through the use of a phase-varying joint holding torque technique. We also examine the basic principles of open-loop walking, focusing on the commonly overlooked frontal plane motion. The act of kicking itself is explored both analytically and empirically, and solutions are provided that are versatile and powerful. In an appendix, the broader matter of striker behaviour (the process of goal scoring) is reviewed, and we present a velocity control algorithm that is very accurate and efficient in terms of speed of execution.
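The idea of coupling a central pattern generator to a phase-locked loop can be sketched with a single phase oscillator whose rate is nudged toward an external walk phase. This is a toy illustration under assumed gains, frequencies, and joint mapping, not the thesis's implementation for the Nao.

```python
import math

def cpg_step(phase, dt, freq, target_phase=None, gain=2.0):
    """Advance a single-oscillator central pattern generator by dt seconds.

    When target_phase is given, a proportional phase-locked-loop term pulls
    the oscillator toward the external walk phase, so step timing can be
    synchronised before triggering a transition such as a kick back swing.
    """
    dphase = 2.0 * math.pi * freq
    if target_phase is not None:
        # Wrap the phase error into [-pi, pi] before applying the PLL gain.
        err = (target_phase - phase + math.pi) % (2.0 * math.pi) - math.pi
        dphase += gain * err
    return (phase + dphase * dt) % (2.0 * math.pi)

def hip_pitch(phase, amplitude=0.25, offset=-0.1):
    """Map oscillator phase to a hip-pitch joint angle in radians."""
    return offset + amplitude * math.sin(phase)

# Lock an oscillator with a 0.9 Hz nominal frequency onto an external
# walk phase advancing at 1.0 Hz; the residual phase error stays bounded.
phase, ext, dt = 0.0, 0.0, 0.01
for _ in range(3000):
    ext = (ext + 2.0 * math.pi * 1.0 * dt) % (2.0 * math.pi)
    phase = cpg_step(phase, dt, freq=0.9, target_phase=ext)
```

With a purely proportional correction the tracking error settles to a small constant offset proportional to the frequency mismatch divided by the gain; an integral term would remove it, at the cost of slower lock-in.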

    Humanoid Robot handling Hand-Signs Recognition

    Recent advancements in human-robot interaction have led to tremendous improvements in humanoid robots, but they still lack social acceptance among people. Though verbal communication is the primary means of human-robot interaction, non-verbal communication, which is proven to be an integral part of human interaction, is not widely used in humanoid robots. This thesis aims to achieve human-robot interaction via non-verbal communication, especially hand-signs. It presents a prototype system that simulates hand-sign recognition on the NAO humanoid robot, and an online questionnaire is then used to examine people's opinions on the use of non-verbal communication to interact with a humanoid robot. The positive results derived from the study indicate people's willingness to use non-verbal communication as a means to communicate with humanoid robots, thus encouraging robot designers to use non-verbal communication to enhance human-robot interaction.

    Trust in collaborative robots: An experimental investigation of non-verbal cues in a virtual human-robot interaction setting

    This thesis reports the development of non-verbal HRI (human-robot interaction) behaviors on a robotic manipulator, evaluating the role of trust in collaborative assembly tasks. To this end, we developed four non-verbal HRI behaviors, namely gazing, head nodding, tilting, and shaking, on a UR5 robotic manipulator equipped with a Robotiq gripper, and used them under different degrees of user trust in the robot's actions. Specifically, we used a head-on-neck posture for the cobot, built from the last three links together with the gripper. The gaze behavior directed the gripper toward a desired point in space, alongside the head nodding and shaking behaviors. We designed a remote setup in which subjects interacted with the cobot via Zoom teleconferencing. In a simple collaborative scenario, the efficacy of these behaviors was assessed in terms of their impact on the formation of trust between the robot and the user, and on task performance. Nineteen people of varying ages and genders participated in the experiment. M.S. - Master of Science

    Impact of Iris Size and Eyelids Coupling on the Estimation of the Gaze Direction of a Robotic Talking Head by Human Viewers

    Primates, and in particular humans, are very sensitive to the eye direction of congeners. Estimating the gaze of others is one of the basic skills for inferring the goals, intentions, and desires of social agents, whether they are humans or avatars. When building robots, one should not only supply them with gaze trackers but also check the readability of their own gaze by human partners. We conducted experiments that demonstrate the strong impact of the iris size and the eyelid position of an iCub humanoid robot on gaze-reading performance by human observers. We comment on the importance of assessing a robot's ability to display its intentions via clearly legible and readable gestures.

    Touching a mechanical body: tactile contact with body parts of a humanoid robot is physiologically arousing

    A large literature describes the use of robots' physical bodies to support communication with people. Touch is a natural channel for physical interaction, yet it is not understood how principles of interpersonal touch might carry over to human-robot interaction. Ten students participated in an interactive anatomy lesson with a small, humanoid robot. Participants either touched or pointed to an anatomical region of the robot in each of 26 trials while their skin conductance response was measured. Touching less accessible regions of the robot (e.g., buttocks and genitals) was more physiologically arousing than touching more accessible regions (e.g., hands and feet). No differences in physiological arousal were found when just pointing to those same anatomical regions. Social robots can elicit tactile responses in human physiology, a result that signals the power of robots and should caution mechanical and interaction designers about the positive and negative effects of human-robot interaction.
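The comparison described, arousal across touch/point actions and body-region accessibility, amounts to grouping skin-conductance responses by condition and comparing group means. A minimal sketch follows; the trial values are made-up illustrative numbers, not the study's measurements.

```python
from statistics import mean

def mean_scr_by_condition(trials):
    """Mean skin-conductance response per (action, accessibility) group.

    trials: list of (action, accessibility, scr) tuples, where action is
    'touch' or 'point', accessibility labels the body region, and scr is
    the response amplitude in microsiemens.
    """
    groups = {}
    for action, accessibility, scr in trials:
        groups.setdefault((action, accessibility), []).append(scr)
    return {key: mean(values) for key, values in groups.items()}

# Made-up illustrative values only -- not data from the study.
trials = [
    ("touch", "low", 0.8), ("touch", "low", 0.9),
    ("touch", "high", 0.3), ("touch", "high", 0.4),
    ("point", "low", 0.3), ("point", "high", 0.3),
]
means = mean_scr_by_condition(trials)
```

The pattern reported in the abstract would appear here as a higher mean for touching low-accessibility regions than for any other condition, with pointing roughly flat across regions.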