61 research outputs found

    Trust in collaborative robots: An experimental investigation of non-verbal cues in a virtual human-robot interaction environment

    This thesis reports the development of non-verbal HRI (Human-Robot Interaction) behaviors on a robotic manipulator and evaluates the role of trust in collaborative assembly tasks. Towards this end, we developed four non-verbal HRI behaviors, namely gazing, head nodding, tilting, and shaking, on a UR5 robotic manipulator, and used them under different degrees of user trust in the robot's actions. Specifically, we used a fixed head-on-neck posture for the cobot, formed from the last three links together with the Robotiq gripper. The gaze behavior directed the gripper towards a desired point in space, and the head nodding and shaking behaviors were built on the same posture. We designed a remote setup in which subjects interacted with the cobot via Zoom teleconferencing. In a simple collaborative scenario, the efficacy of these behaviors was assessed in terms of their impact on the formation of trust between the robot and the user and on task performance. Nineteen people of varying ages and genders participated in the experiment. M.S. - Master of Science.
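    As an illustration of how such a gaze behavior can be realized, the sketch below computes the pan and tilt needed to point a head frame at a target point and oscillates the tilt joint to produce a nod. It is a minimal Python sketch, not the thesis implementation: the send_tilt callback, joint conventions, and nod parameters are assumptions, whereas the actual system used the last three joints of a UR5 together with a Robotiq gripper.

        import math
        import time

        def gaze_joint_angles(head_pos, target_pos):
            """Pan and tilt (radians) that point a head frame at head_pos towards
            target_pos. Assumes the pan axis is the world z-axis and the tilt axis
            is the head's local horizontal axis after panning."""
            dx = target_pos[0] - head_pos[0]
            dy = target_pos[1] - head_pos[1]
            dz = target_pos[2] - head_pos[2]
            pan = math.atan2(dy, dx)                     # rotate about z to face the target
            tilt = math.atan2(dz, math.hypot(dx, dy))    # raise or lower towards the target
            return pan, tilt

        def nod(send_tilt, amplitude_rad=0.2, cycles=2, rate_hz=20):
            """Head-nod trajectory: oscillate the tilt joint around its current value.
            send_tilt is a hypothetical callback that commands the tilt joint."""
            steps = cycles * rate_hz
            for i in range(steps):
                send_tilt(amplitude_rad * math.sin(2 * math.pi * i / rate_hz))
                time.sleep(1.0 / rate_hz)

    A head shake would follow the same pattern on the pan joint; the amplitude and rate here are placeholder values chosen only to make the sketch runnable.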

    Effects of a robot's head movements and their timing on human-robot interaction

    Master's thesis -- Seoul National University Graduate School: Interdisciplinary Program in Cognitive Science, College of Humanities, 2023. 2. Sowon Hahn. In recent years, robots with artificial intelligence capabilities have become ubiquitous in our daily lives. As intelligent robots interact closely with humans, their social abilities become increasingly important. In particular, nonverbal communication can enhance efficient social interaction between human users and robots, but robots are limited in how they can express such behavior, and delays in a robot's response can leave users in uncomfortable silence. In this study, we investigated how minimal head movements of a robot influence human-robot interaction. We designed a new robot with a simple-shaped body and a minimal head-movement mechanism, and conducted an experiment to examine participants' perception of the robot's different head movements and their timing. Participants were randomly assigned to one of three movement conditions: head nodding (A), head shaking (B), and head tilting (C). Each movement condition included two timing variables: head movement prior to the utterance and head movement simultaneous with the utterance. For all head movement conditions, participants' ratings of anthropomorphism, animacy, likeability, and perceived intelligence were higher than in the no-movement (utterance-only) condition. In terms of timing, when the robot performed the head movement prior to the utterance, perceived naturalness was rated higher than when the movement was simultaneous with the utterance. The findings demonstrate that head movements positively affect user perception of the robot, and that head movement prior to an utterance can make human-robot conversation feel more natural. By implementing head movements and appropriate movement timing, simple-shaped robots can achieve better social interaction with humans.
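    The two timing conditions described above can be pictured with a small scheduling sketch. The move_head and speak callbacks and the 0.5 s lead time are hypothetical; the thesis does not report its exact values in this abstract.

        import threading
        import time

        def run_trial(move_head, speak, timing="prior", lead_s=0.5):
            """Sketch of the two timing conditions:
            'prior'        -> head movement starts, then the utterance follows
            'simultaneous' -> head movement and utterance start together.
            move_head and speak are hypothetical robot-command callbacks."""
            if timing == "prior":
                move_head()
                time.sleep(lead_s)       # assumed lead time before speaking
                speak()
            elif timing == "simultaneous":
                t = threading.Thread(target=move_head)
                t.start()
                speak()
                t.join()
            else:
                raise ValueError("timing must be 'prior' or 'simultaneous'")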

    Persuasiveness of social robot ‘Nao’ based on gaze and proximity

    Social robots have widely infiltrated retail and public spaces. Under the umbrella of persuasive robots or persuasive technology, they are utilized across a wide range of scenarios to influence decision making, disseminate information, and act as a signage mechanism. While there have been several studies in this area, the effect of non-verbal behaviour on persuasive abilities is largely unexplored. Therefore, in this research, we report whether two key non-verbal attributes, namely proximity and gaze, can elicit persuasion, compliance, and specific personality appeals. For this, we conducted a 2 (eye gaze) x 2 (proximity) between-subjects experiment in which participants viewed a video-based scenario of the Nao robot. Our initial results did not reveal any significant effects of the non-verbal attributes. However, perceived compliance and persuasion were significantly correlated with knowledge, responsiveness, and trustworthiness. In conclusion, we discuss how the design of a robot could make it more convincing, as marketing and brand-promotion companies could use robots to enhance their advertising operations.

    Should Your Chatbot Joke? Driving Conversion Through the Humour of a Chatbot Greeting

    Despite the increasing number of companies employing chatbots for tasks that previously needed human involvement, researchers and managers are only now beginning to examine chatbots in customer–brand relationship-building efforts. Not much is known, however, about how managers could modify their chatbot greeting, especially by incorporating humour, to increase engagement and foster positive customer–brand interactions. This research investigates how humour in a chatbot welcome message influences customers' emotional attachment and conversion-to-lead through the mediating role of engagement. The findings of the experiment indicate that conversion-to-lead and emotional attachment rise when chatbots begin with a humorous (vs neutral) greeting. Engagement mediates this effect such that a humorous (vs neutral) greeting sparks engagement and thus makes users more emotionally attached and more willing to give their contact information to the brand. The study contributes to existing research on chatbots, combining and expanding previous work on human–computer interaction and, more specifically, human–chatbot interaction, as well as on the use of humour in conversational marketing contexts. It also provides managers with insight into how chatbot greetings can engage consumers and convert them into leads.

    Motion Generation during Vocalized Emotional Expressions and Evaluation in Android Robots

    Vocalized emotional expressions such as laughter and surprise often occur in natural dialogue interactions and are important factors to consider in order to achieve smooth robot-mediated communication. Miscommunication may be caused if there is a mismatch between the audio and visual modalities, especially in android robots, which have a highly humanlike appearance. In this chapter, motion generation methods are introduced for laughter and vocalized surprise events, based on analysis of human behaviors during dialogue interactions. The effectiveness of controlling different modalities of the face, head, and upper body (eyebrow raising, eyelid widening/narrowing, lip corner/cheek raising, eye blinking, head motion, and torso motion control) and of different motion control levels is evaluated using an android robot. Subjective experiments indicate the importance of each modality in the perception of motion naturalness (humanlikeness) and the degree of emotional expression.
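    A rough sketch of how a vocalized laughter event might be mapped to coordinated multi-modal motion commands is given below. The actuator-group names, onset offsets, and amplitude scalings are illustrative assumptions rather than the authors' parameters; the point is only that each modality receives its own timed command derived from the same audio event.

        def laughter_motion(duration_s, intensity):
            """Turn a detected laughter event into coordinated commands for several
            modalities. Returns a list of (actuator_group, onset_seconds, amplitude)."""
            commands = []
            commands.append(("lip_corner_raise",  0.00, intensity))         # smile onset
            commands.append(("cheek_raise",       0.05, intensity))
            commands.append(("eyelid_narrow",     0.05, 0.5 * intensity))
            commands.append(("head_pitch_bounce", 0.10, 0.3 * intensity))   # rhythmic nodding
            commands.append(("torso_bounce",      0.10, 0.2 * intensity))
            commands.append(("neutral_pose",      duration_s, 0.0))         # relax after the event
            return commands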

    High Social Acceptance of Head Gaze Loosely Synchronized with Speech for Social Robots

    This research demonstrates that robots can achieve socially acceptable interactions, using loosely synchronized head gaze-speech, without understanding the semantics of the dialog. Prior approaches used tightly synchronized head gaze-speech, which requires significant human effort and time to manually annotate synchronization events in advance, restricting interactive dialog, and requiring the operator to act as a puppeteer. This approach has two novel aspects. First, it uses affordances in the sentence structure, time delays, and typing to achieve autonomous synchronization of head gaze-speech. Second, it is implemented within a behavioral robotics framework derived from 32 previous implementations. The efficacy of the loosely synchronized approach was validated through a 93-participant 1 x 3 (loosely synchronized head gaze-speech, tightly synchronized head gaze-speech, no head gaze-speech) between-subjects experiment using the “Survivor Buddy” rescue robot in a victim management scenario. The results indicated that the social acceptance of loosely synchronized head gaze-speech is similar to tightly synchronized head gaze-speech (manual annotation), and preferred to the no head gaze-speech case. These findings contribute to the study of social robotics in three ways. First, the research overall contributes to a fundamental understanding of the role of social head gaze in social acceptance, and the production of social head gaze. Second, it shows that autonomously generated head gaze-speech coordination is both possible and acceptable. Third, the behavioral robotics framework simplifies creation, analysis, and comparison of implementations.
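    The idea of loose synchronization, driving gaze from sentence structure and time delays rather than from dialog semantics, can be sketched as follows. The look_at_listener, look_away, and speak callbacks and the delay values are hypothetical; they only illustrate the principle described in the abstract, not the system's actual behavioral rules.

        import re
        import time

        def loosely_synchronized_gaze(text, look_at_listener, look_away, speak):
            """Shift head gaze at sentence boundaries, with a short lead before each
            utterance, without any understanding of what the sentences mean."""
            sentences = re.split(r'(?<=[.!?])\s+', text.strip())
            for i, sentence in enumerate(sentences):
                if i % 2 == 0:
                    look_at_listener()   # periodically re-establish mutual gaze
                else:
                    look_away()          # brief aversion, as in natural turn-taking
                time.sleep(0.3)          # assumed lead so the gaze precedes the words
                speak(sentence)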

    Measuring, analysing and artificially generating head nodding signals in dyadic social interaction

    Social interaction involves rich and complex behaviours where verbal and non-verbal signals are exchanged in dynamic patterns. The aim of this thesis is to explore new ways of measuring and analysing interpersonal coordination as it naturally occurs in social interactions. Specifically, we want to understand what different types of head nods mean in different social contexts, how they are used during face-to-face dyadic conversation, and whether they relate to memory and learning. Many current methods are limited by time-consuming and low-resolution data, which cannot capture the full richness of a dyadic social interaction. This thesis explores how high-resolution data in this area can give new insights into the study of social interaction. Furthermore, we demonstrate the benefit of using virtual reality to artificially generate interpersonal coordination in order to test our hypotheses about the meaning of head nodding as a communicative signal. The first study aims to capture two patterns of head nodding signals – fast nods and slow nods – and to determine what they mean and how they are used across different conversational contexts. We find that fast nodding signals the receipt of new information and has a different meaning from slow nods. The second study investigates a link between memory and head nodding behaviour. This exploratory study provided initial hints that there might be a relationship, though further analyses were less clear. In the third study, we test whether interactive head nodding in virtual agents can be used to measure how much we like the virtual agent, and whether we learn better from virtual agents that we like. We find no causal link between memory performance and interactivity. In the fourth study, we perform a cross-experimental analysis of how the level of interactivity in different contexts (real, virtual, and video) impacts memory, and find clear differences between them.
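    One way to separate fast from slow nods in high-resolution head-motion data is to look at the dominant oscillation frequency of the head-pitch signal, as in the sketch below. The 2 Hz cutoff is an illustrative assumption, not the threshold used in the thesis.

        import numpy as np

        def classify_nod(pitch_deg, fs_hz, fast_cutoff_hz=2.0):
            """Label a head-pitch trace as a 'fast' or 'slow' nod by its dominant
            oscillation frequency. pitch_deg is a sequence of pitch samples in
            degrees, fs_hz the sampling rate."""
            pitch = np.asarray(pitch_deg, dtype=float)
            pitch = pitch - pitch.mean()                     # remove the static head pose
            spectrum = np.abs(np.fft.rfft(pitch))
            freqs = np.fft.rfftfreq(len(pitch), d=1.0 / fs_hz)
            dominant = freqs[np.argmax(spectrum[1:]) + 1]    # skip the DC bin
            return ("fast" if dominant >= fast_cutoff_hz else "slow"), dominant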

    Becoming Human with Humanoid

    Nowadays, our expectations of robots have significantly increased. The robot, which initially only did simple jobs, is now expected to be smarter and more dynamic. People want a robot that resembles a human (a humanoid) and has emotional intelligence, capable of action-reaction interactions. This book consists of two sections. The first section focuses on emotional intelligence, while the second section discusses the control of robotics. The contents of the book present the outcomes of research conducted by scholars in robotics to accommodate the needs of society and industry.