3,452 research outputs found
No Grice: Computers that Lie, Deceive and Conceal
In the future our daily life interactions with other people, with computers, robots, and smart environments will be recorded and interpreted by computers or by intelligence embedded in environments, furniture, robots, displays, and wearables. These sensors record our activities, our behavior, and our interactions. Fusing and reasoning about such information makes it possible, using computational models of human behavior and activities, to provide context- and person-aware interpretations of human behavior and activities, including determination of attitudes, moods, and emotions. Sensors include cameras, microphones, eye trackers, position and proximity sensors, tactile or smell sensors, et cetera. Sensors can be embedded in an environment, but they can also move around, for example, when they are part of a mobile social robot or part of devices we carry around or that are embedded in our clothes or body.

Our daily life behavior and daily life interactions are recorded and interpreted. How can we use such environments, and how can such environments use us? Do we always want to cooperate with these environments; do these environments always want to cooperate with us? In this paper we argue that there are many reasons that users, or rather human partners, of these environments want to keep information about their intentions and their emotions hidden from these smart environments. On the other hand, their artificial interaction partners may have similar reasons not to give away all the information they have, or to treat their human partner as an opponent rather than as someone to be supported by smart technology.

This will be elaborated in this paper. We survey examples of human-computer interactions where there is not necessarily a goal to be explicit about intentions and feelings. In subsequent sections we look at (1) the computer as a conversational partner, (2) the computer as a butler or diary companion, (3) the computer as a teacher or a trainer acting in a virtual training environment (a serious game), (4) sports applications (which are not necessarily different from serious game or education environments), and (5) games and entertainment applications.
Social robot tutoring for child second language learning
An increasing amount of research is being conducted to determine how a robot tutor should behave socially in educational interactions with children. Both the human-human and human-robot interaction literature predict an increase in learning with increased social availability of a tutor, where social availability has verbal and nonverbal components. Prior work has shown that greater availability in the nonverbal behaviour of a robot tutor has a positive impact on child learning. This paper presents a study with 67 children that explores how social aspects of a tutor robot's speech influence their perception of the robot and their language learning in an interaction. Children perceive the difference in social behaviour between 'low' and 'high' verbal availability conditions, and improve significantly between a pre- and a post-test in both conditions. A longer-term retention test taken the following week showed that the children had retained almost all of the information they had learnt. However, learning was not affected by which of the robot behaviours they had been exposed to. It is suggested that in this short-term interaction context, additional effort in developing social aspects of a robot's verbal behaviour may not return the desired positive impact on learning gains.
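The abstract above reports significant improvement between a pre- and a post-test. One common way to quantify such improvement (not stated to be the measure used in this particular study) is Hake's normalized learning gain, the fraction of the possible improvement a learner actually realises:

```python
def normalized_gain(pre: float, post: float, max_score: float) -> float:
    """Hake's normalized gain: (post - pre) / (max - pre).

    Returns the fraction of the available headroom above the
    pre-test score that the learner actually gained.
    """
    if max_score <= pre:
        raise ValueError("pre-test score must be below the maximum score")
    return (post - pre) / (max_score - pre)

# Hypothetical example: a child scoring 4/10 before and 7/10 after a session
print(normalized_gain(4, 7, 10))  # → 0.5 (half of the possible gain realised)
```

Because the gain is normalized by each learner's headroom, it allows fairer comparison between children who started with very different pre-test scores.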
Journal of Communication Pedagogy, Complete Volume 4, 2021
This is the complete volume 4 of the Journal of Communication Pedagogy.
Tactile Interactions with a Humanoid Robot: Novel Play Scenario Implementations with Children with Autism
Acknowledgments: This work has been partially supported by the European Commission under contract number FP7-231500-ROBOSKIN. Open Access: This article is distributed under the terms of the Creative Commons Attribution License, which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited. The work presented in this paper was part of our investigation in the ROBOSKIN project. The project has developed new robot capabilities based on the tactile feedback provided by novel robotic skin, with the aim of providing cognitive mechanisms to improve human-robot interaction capabilities. This article presents two novel tactile play scenarios developed for robot-assisted play for children with autism. The play scenarios were developed against specific educational and therapeutic objectives that were discussed with teachers and therapists. These objectives were classified with reference to the ICF-CY, the International Classification of Functioning - version for Children and Youth. The article presents a detailed description of the play scenarios, and case study examples of their implementation in HRI studies with children with autism and the humanoid robot KASPAR.
Nonverbal immediacy as a characterisation of social behaviour for human-robot interaction
An increasing amount of research has started to explore the impact of robot social behaviour on the outcome of a goal for a human interaction partner, such as cognitive learning gains. However, it remains unclear from what principles the social behaviour for such robots should be derived. Human models are often used, but in this paper an alternative approach is proposed. First, the concept of nonverbal immediacy from the communication literature is introduced, with a focus on how it can provide a characterisation of social behaviour, and the subsequent outcomes of such behaviour. A literature review is conducted to explore the impact on learning of the social cues which form the nonverbal immediacy measure. This leads to the production of a series of guidelines for social robot behaviour. The resulting behaviour is evaluated in a more general context, where both children and adults judge the immediacy of humans and robots in a similar manner, and their recall of a short story is tested. Children recall more of the story when the robot is more immediate, which demonstrates an effect predicted by the literature. This study provides validation for the application of nonverbal immediacy to child-robot interaction. It is proposed that nonverbal immediacy measures could be used as a means of characterising robot social behaviour for human-robot interaction.
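Nonverbal immediacy measures of the kind mentioned above are, in the communication literature, typically operationalised as questionnaires whose Likert-rated items are summed, with negatively phrased items ("looks away", "monotone voice") reverse-scored so that a higher total always means higher immediacy. A minimal sketch of that scoring scheme, with hypothetical item names and scale not taken from this study:

```python
def immediacy_score(ratings: dict, reverse_items: set,
                    scale_min: int = 1, scale_max: int = 5) -> int:
    """Sum Likert-scale item ratings into an immediacy score.

    Items in `reverse_items` describe low-immediacy behaviours, so
    their ratings are flipped (rating -> min + max - rating) before
    summing, as is standard for reverse-scored questionnaire items.
    """
    total = 0
    for item, rating in ratings.items():
        if item in reverse_items:
            rating = scale_min + scale_max - rating
        total += rating
    return total

# Hypothetical observer ratings of a robot on a 1-5 scale
ratings = {"gestures": 4, "smiles": 5, "monotone_voice": 2, "looks_away": 1}
reverse = {"monotone_voice", "looks_away"}
print(immediacy_score(ratings, reverse))  # → 18  (4 + 5 + 4 + 5)
```

The design choice worth noting is the reverse-scoring step: without it, a robot that frequently looks away would be scored as *more* immediate, inverting the construct the scale is meant to capture.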
Towards the Safety of Human-in-the-Loop Robotics: Challenges and Opportunities for Safety Assurance of Robotic Co-Workers
The success of the human-robot co-worker team in a flexible manufacturing environment where robots learn from demonstration heavily relies on the correct and safe operation of the robot. How this can be achieved is a challenge that requires addressing both technical as well as human-centric research questions. In this paper we discuss the state of the art in safety assurance, existing as well as emerging standards in this area, and the need for new approaches to safety assurance in the context of learning machines. We then focus on robotic learning from demonstration, the challenges these techniques pose to safety assurance and indicate opportunities to integrate safety considerations into algorithms "by design". Finally, from a human-centric perspective, we stipulate that, to achieve high levels of safety and ultimately trust, the robotic co-worker must meet the innate expectations of the humans it works with. It is our aim to stimulate a discussion focused on the safety aspects of human-in-the-loop robotics, and to foster multidisciplinary collaboration to address the research challenges identified.
Developing a protocol and experimental setup for using a humanoid robot to assist children with autism to develop visual perspective taking skills
Visual Perspective Taking (VPT) is the ability to see the world from another person's perspective, taking into account what they see and how they see it, drawing upon both spatial and social information. Children with autism often find it difficult to understand that other people might have perspectives, viewpoints, beliefs and knowledge that are different from their own, which is a fundamental aspect of VPT. In this research we aimed to develop a methodology to assist children with autism in developing their VPT skills using a humanoid robot, and we present results from our first long-term pilot study. The games we devised were implemented with the Kaspar robot and, to our knowledge, this is the first attempt to improve the VPT skills of children with autism through playing and interacting with a humanoid robot. We describe in detail the standard pre- and post-assessments that we performed with the children in order to measure their progress, and also the inclusion criteria derived from the results for future studies in this field. Our findings suggest that some children may benefit from this approach to learning about VPT, which shows that this approach merits further investigation.