Participation : young spice
A brilliant companion to the critically acclaimed 'Spice it Up'. Fun participation activities for the under-11s.
Can my robotic home cleaner be happy? Issues about emotional expression in non-bio-inspired robots.
In many robotic applications a robot body must have a functional shape that cannot include bio-inspired elements, yet it is still important that the robot can express emotions, moods, or a character, to make it acceptable and to engage its users. Dynamic signals from movement can be exploited to provide this expression while the robot is acting to perform its task. A research effort has been started to find general emotion expression models for actions that could be applied to any kind of robot to obtain believable and easily detectable emotional expressions. Along this path, the need for a unified representation of emotional expression emerged. A framework to define action characteristics that can be used to represent emotions is proposed in this paper. Guidelines are provided to identify quantitative models and numerical values for parameters, which can be used to design and engineer emotional robot actions. A set of robots having different shapes, movement possibilities, and goals has been implemented following these guidelines. Thanks to the proposed framework, different models to implement emotional expression can now be compared in a sound way, and the question posed in the title can be answered in a justified way.
Machine Body Language: Expressing a Smart Speaker’s Activity with Intelligible Physical Motion
People’s physical movement and body language implicitly convey what they think and feel, what they are doing, or what they are about to do. In contrast, current smart speakers miss out on this richness of body language, relying primarily on voice commands alone. We present QUBI, a dynamic smart speaker that leverages expressive physical motion – stretching, nodding, turning, shrugging, wiggling, pointing and leaning forwards/backwards – to convey cues about its underlying behaviour and activities. We conducted a qualitative Wizard of Oz lab study in which 12 participants interacted with QUBI in four scripted scenarios. From our study, we distilled six themes: (1) mirroring and mimicking motions; (2) body language to supplement voice instructions; (3) anthropomorphism and personality; (4) audio can trump motion; (5) reaffirming uncertain interpretations to support mutual understanding; and (6) emotional reactions to QUBI’s behaviour. From this, we discuss design implications for future smart speakers.
Affective Communication for Socially Assistive Robots (SARs) for Children with Autism Spectrum Disorder: A Systematic Review
Research on affective communication for socially assistive robots has been conducted to enable physical robots to perceive, express, and respond emotionally. However, the use of affective computing in social robots has been limited, particularly when social robots are designed for children, and especially for those with autism spectrum disorder (ASD). Social robots are based on cognitive-affective models, which allow them to communicate with people following social behaviors and rules. However, interactions between a child and a robot may change or differ from those with an adult, or when the child has an emotional deficit. In this study, we systematically reviewed studies related to computational models of emotions for children with ASD. We used the Scopus, WoS, Springer, and IEEE-Xplore databases to answer different research questions related to the definition, interaction, and design of computational models supported by theoretical psychology approaches from 1997 to 2021. Our review found 46 articles; not all of the studies considered children or those with ASD.
This research was funded by VRIEA-PUCV, grant number 039.358/202