130 research outputs found

    How language of interaction affects the user perception of a robot

    Spoken language is the most natural way for a human to communicate with a robot. It may seem intuitive that a robot should communicate with users in their native language. However, it is not clear if a user's perception of a robot is affected by the language of interaction. We investigated this question by conducting a study with twenty-three native Czech participants who were also fluent in English. The participants were tasked with instructing the Pepper robot on where to place objects on a shelf. The robot was controlled remotely using the Wizard-of-Oz technique. We collected data through questionnaires, video recordings, and a post-experiment feedback session. The results of our experiment show that people perceive an English-speaking robot as more intelligent than a Czech-speaking robot (z = 18.00, p-value = 0.02). This finding highlights the influence of language on human-robot interaction. Furthermore, we discuss the feedback obtained from the participants via the post-experiment sessions and its implications for HRI design.
    Comment: ICSR 202
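The abstract reports the language effect as z = 18.00, p = 0.02 but does not name the statistical test. Assuming each participant rated perceived intelligence for both the English- and the Czech-speaking robot, a paired non-parametric comparison is one plausible analysis. The sketch below implements a Wilcoxon signed-rank test with a normal approximation on entirely hypothetical 1-to-5 ratings; it is illustrative only and not the authors' analysis code.

```python
import math

def wilcoxon_signed_rank(x, y):
    """Paired Wilcoxon signed-rank test (normal approximation, no tie
    correction). Returns (W, z, two-sided p). Zero differences are dropped."""
    diffs = [a - b for a, b in zip(x, y) if a != b]
    n = len(diffs)
    # Rank the absolute differences, assigning average ranks to ties.
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[order[j + 1]]) == abs(diffs[order[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    w_pos = sum(r for r, d in zip(ranks, diffs) if d > 0)
    w_neg = sum(r for r, d in zip(ranks, diffs) if d < 0)
    W = min(w_pos, w_neg)
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (W - mu) / sigma
    p = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return W, z, p

# Hypothetical per-participant intelligence ratings (not the study's data).
english = [5, 4, 5, 3, 5, 4, 5, 4, 4, 5]
czech   = [3, 3, 2, 2, 4, 2, 3, 1, 2, 3]
W, z, p = wilcoxon_signed_rank(english, czech)
print(f"W = {W}, z = {z:.2f}, p = {p:.4f}")
```

With every made-up participant rating the English-speaking robot higher, all signed ranks fall on one side and the test comes out significant; real Likert data would of course contain ties and reversals.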

    Preliminary validation of the European Portuguese version of the Robotic Social Attributes Scale (RoSAS)

    Background: People's perception of social robots is essential in determining their responses to and acceptance of this type of agent. Currently, there are few instruments validated for the European Portuguese population that measure the perception of social robots. Method: Our goal was to translate, validate, and evaluate the psychometric properties of the Robotic Social Attributes Scale (RoSAS) in European Portuguese. To achieve this goal, we conducted a validation study using a sample of 185 participants. We measured the temporal validity of the scale (over a two-week interval) and its divergent and convergent validity using the Portuguese Negative Attitudes towards Robots Scale (PNARS) and the Godspeed scales. Results: Our data analysis resulted in a shortened version of the Portuguese RoSAS with 11 items while retaining the original three-factor structure. The scale presented poor to acceptable levels of temporal reliability. We found a positive correlation between the warmth and competence dimensions. Further validation studies are needed to investigate the psychometric properties of this scale.
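Temporal (test-retest) reliability over an interval such as the two weeks used here is commonly quantified by correlating time-1 and time-2 scale scores. The study's raw data are not given, so the sketch below uses made-up RoSAS warmth scores and a plain Pearson correlation; the authors may well have used an intraclass correlation coefficient instead.

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical warmth-dimension scores for six participants, two weeks apart.
time1 = [3.2, 4.1, 2.8, 3.9, 3.0, 4.4]
time2 = [3.0, 4.3, 2.9, 3.6, 3.2, 4.2]
print(f"test-retest r = {pearson_r(time1, time2):.2f}")
```

A coefficient near 1 would indicate stable scores across sessions; "poor to acceptable" temporal reliability, as reported, corresponds to noticeably lower values.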

    A Systematic Literature Review of User Experience Evaluation Scales for Human-Robot Collaboration

    In the last decade, the field of Human-Robot Collaboration (HRC) has received much attention from both research institutions and industries. Robot technologies are in fact deployed in many different areas (e.g., industrial processes, people assistance) to support an effective collaboration between humans and robots. In this transdisciplinary context, User eXperience (UX) has inevitably to be considered to achieve an effective HRC, namely to allow the robots to better respond to the users' needs and thus improve the interaction quality. The present paper reviews the evaluation scales used in HRC scenarios, focusing on the application context and the evaluated aspects. In particular, a systematic review was conducted based on the following questions: (RQ1) which evaluation scales are adopted within HRI scenarios with collaborative tasks? and (RQ2) how are UX and user satisfaction assessed? The analysis of the retrieved records highlighted that UX aspects are not sufficiently examined in current HRC design practice, particularly in the industrial field. This is most likely due to a lack of standardized scales. To respond to this recognized need, a set of dimensions to be considered in a new UX evaluation scale was proposed.

    Evaluating interactions with a cognitively biased robot in a creative collaborative task


    Social robots: The influence of human and robot characteristics on acceptance

    Research in social robotics is focused on the development of robots that can provide physical and cognitive support in a socially interactive way. Whilst some studies have previously investigated the importance of user characteristics (age, gender, education, robot familiarity, mood) in the acceptance of social robots, as well as the influence a robot's displayed emotion (positive, negative, neutral) has on the interaction, these two aspects are rarely combined. Therefore, this study attempts to highlight the need to consider the influence that both human and robot attributes can have on social robot acceptance. Eighty-six participants completed implicit and explicit measures of mood before viewing one of three video clips containing a positive, negative or neutral social robot (Pepper), followed by questionnaires on robot acceptance and perception. Gender and education were not associated with acceptance; however, several constructs of the acceptance questionnaire significantly correlated with age and mood. For example, those younger and those experiencing sadness or loneliness were more dependent on the opinions of others (as measured by the social influence construct of the acceptance questionnaire). This highlights the importance of mood in the introduction of social robots into vulnerable populations. Robot familiarity also correlated with robot acceptance, with those more familiar finding the robot less useful and less enjoyable; this is important as robots become more prominent in society. Displayed robot emotion significantly influenced acceptance and perception, with the positive robot appearing more childlike than the negative and neutral robots, and the neutral robot the least helpful. These findings emphasise the importance of both user and robot characteristics in the successful integration of social robots.

    Natural language generation for social robotics: Opportunities and challenges

    In the increasingly popular and diverse research area of social robotics, the primary goal is to develop robot agents that exhibit socially intelligent behaviour while interacting in a face-to-face context with human partners. An important aspect of face-to-face social conversation is fluent, flexible linguistic interaction: as Bavelas et al. [1] point out, face-to-face dialogue is both the basic form of human communication and the richest and most flexible, combining unrestricted verbal expression with meaningful non-verbal acts such as gestures and facial displays, along with instantaneous, continuous collaboration between the speaker and the listener. In practice, however, most developers of social robots tend not to use the full possibilities of the unrestricted verbal expression afforded by face-to-face conversation; instead, they generally tend to employ relatively simplistic processes for choosing the words for their robots to say. This contrasts with the work carried out in Natural Language Generation (NLG), the field of computational linguistics devoted to the automated production of high-quality linguistic content: while this research area is also an active one, in general most effort in NLG is focussed on producing high-quality written text. This article summarises the state-of-the-art in the two individual research areas of social robotics and natural language generation. It then discusses the reasons why so few current social robots make use of more sophisticated generation techniques. Finally, an approach is proposed for bringing some aspects of NLG into social robotics, concentrating on techniques and tools that are most appropriate to the needs of socially interactive robots.
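The "relatively simplistic processes" the article refers to are typically canned strings, one fixed utterance per situation. Even a small step toward NLG, such as choosing among context-conditioned surface templates, adds variety. The sketch below is a toy illustration of that contrast; the intents, templates, and function names are invented and not taken from any robot platform or NLG toolkit.

```python
import random

# Canned output: one fixed string per intent, as in many deployed social robots.
CANNED = {"greet": "Hello! How can I help you?"}

# A minimal template-based generator: several surface forms per intent,
# with a slot filled from interaction context. Everything here is illustrative.
TEMPLATES = {
    "greet": [
        "Hello{name_part}! How can I help you?",
        "Hi{name_part}, what can I do for you today?",
    ],
}

def realise(intent, name=None, rng=random):
    """Pick a surface template for the intent and fill its name slot."""
    name_part = f", {name}" if name else ""
    return rng.choice(TEMPLATES[intent]).format(name_part=name_part)
```

Compared with `CANNED["greet"]`, repeated calls to `realise` vary the wording and can address the user by name, which is one small aspect of the flexibility that full NLG systems pursue (content selection, aggregation, referring-expression generation, and so on).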

    ๋กœ๋ด‡์˜ ์‹ ์ฒด ์–ธ์–ด๊ฐ€ ์‚ฌํšŒ์  ํŠน์„ฑ๊ณผ ์ธ๊ฐ„ ์œ ์‚ฌ์„ฑ์— ๋ฏธ์น˜๋Š” ์˜ํ–ฅ

    Master's thesis -- Seoul National University Graduate School: Department of Psychology, College of Social Sciences, February 2021. Sowon Hahn. The present study investigated the role of robots' body language in perceptions of social qualities and human-likeness in robots. In experiment 1, videos of a robot's body language varying in expansiveness were used to evaluate the two aspects. In experiment 2, videos of social interactions containing the body language from experiment 1 were used to further examine the effects of robots' body language on these aspects. Results suggest that a robot conveying open body language is evaluated higher on perceptions of social characteristics and human-likeness than a robot with closed body language. These effects were not found in the videos of social interactions (experiment 2), which suggests that other features play significant roles in evaluations of a robot. Nonetheless, the current research provides evidence of the importance of robots' body language in judgments of social characteristics and human-likeness. While measures of social qualities and human-likeness favor robots that convey open body language, post-experiment interviews revealed that participants expect robots to alleviate feelings of loneliness and empathize with them, which requires more diverse body language in addition to open body language. Thus, robot designers are encouraged to develop robots capable of expressing a wider range of motion. By enabling complex movements, more natural communication between humans and robots becomes possible, allowing humans to consider robots as social partners.

    HREyes: Design, Development, and Evaluation of a Novel Method for AUVs to Communicate Information and Gaze Direction

    We present the design, development, and evaluation of HREyes: biomimetic communication devices which use light to communicate information and, for the first time, gaze direction from AUVs to humans. First, we introduce two types of information displays using the HREye devices: active lucemes and ocular lucemes. Active lucemes communicate information explicitly through animations, while ocular lucemes communicate gaze direction implicitly by mimicking human eyes. We present a human study in which our system is compared to the use of an embedded digital display that explicitly communicates information to a diver by displaying text. Our results demonstrate accurate recognition of active lucemes for trained interactants, limited intuitive understanding of these lucemes for untrained interactants, and relatively accurate perception of gaze direction for all interactants. The results on active luceme recognition demonstrate more accurate recognition than previous light-based communication systems for AUVs (albeit with different phrase sets). Additionally, the ocular lucemes we introduce in this work represent the first method for communicating gaze direction from an AUV, a critical aspect of nonverbal communication used in collaborative work. With readily available hardware as well as open-source and easily re-configurable programming, HREyes can be easily integrated into any AUV with the physical space for the devices and used to communicate effectively with divers in any underwater environment with appropriate visibility.
    Comment: Under submission at ICRA2
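For illustration only: an "active luceme" can be thought of as a named light animation, i.e. a message mapped to a sequence of LED frames. The message names, colours, and timings below are hypothetical and are not taken from the HREyes implementation.

```python
# Each luceme maps a message to animation frames: (RGB colour, seconds) pairs.
# All names and animations here are invented for illustration.
LUCEMES = {
    "ascend": [((0, 0, 255), 0.2), ((0, 255, 255), 0.2)] * 3,  # blue/cyan pulse
    "danger": [((255, 0, 0), 0.1), ((0, 0, 0), 0.1)] * 5,      # fast red blink
}

def frames_for(message):
    """Return the animation frames for a known message, else None."""
    return LUCEMES.get(message)

def duration(message):
    """Total playback time of a luceme, in seconds."""
    return sum(t for _, t in LUCEMES.get(message, []))
```

Because the mapping is an explicit table, the "easily re-configurable programming" the abstract mentions would amount to editing entries like these; a real system would additionally drive the LED hardware at each frame boundary.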