
    Systems overview of Ono: a DIY reproducible open source social robot

    One of the major obstacles in the study of HRI (human-robot interaction) with social robots is the lack of multiple identical robots that allow testing with large user groups. Often, the price of these robots prohibits using more than a handful. Many commercial robots do not possess all the features necessary for specific HRI experiments, and because the platforms are closed, large modifications are nearly impossible. While open source social robots do exist, they often use high-end components and expensive manufacturing techniques, making them unsuitable for easy reproduction. To address this problem, a new social robotics platform, named Ono, was developed. The design is based on the DIY mindset of the maker movement, using off-the-shelf components and more accessible rapid prototyping and manufacturing techniques. The modular structure of the robot makes it easy to adapt to the needs of an experiment, and by embracing the open source mentality, the robot can easily be reproduced or further developed by a community of users. The low cost, open nature, and DIY friendliness of the robot make it an ideal candidate for HRI studies that require a large user group.

    Emotional design and human-robot interaction

    Recent years have shown an increase in the importance of emotions in the design field, known as emotional design. Emotional design aims to elicit certain emotions (e.g., pleasure) or prevent others (e.g., displeasure) during human-product interaction; that is, it regulates the emotional interaction between the individual and the product (e.g., a robot). Robot design is a growing area in which robots interact directly with humans, and emotions are essential to that interaction. Therefore, this paper aims, through a non-systematic literature review, to explore the application of emotional design, particularly to Human-Robot Interaction. Robot design features that affect emotional design (e.g., appearance, emotional expression, and spatial distance) are introduced. The chapter ends with a discussion and a conclusion.

    Emotional Postures for the Humanoid-Robot Nao

    This paper presents the development of emotional postures for the humanoid robot Nao. The approach is based on adapting postures developed for a virtual human body model to the physical Nao robot. The paper describes the association between the joints of the human body model and the joints of the Nao robot and explains the transformation of postures. The non-correspondence between the joints of the actual physical robot and the joints of the human body model was a major challenge in this work. Moreover, the implementation of the postures on the robot was constrained by its physical structure and artificial mass distribution. Postures for the three emotions of anger, sadness, and happiness are studied. Thirty-two postures are generated for each emotion. Among them, the best five postures for each emotion are selected based on the votes of twenty-five external observers. The distribution of the votes indicates that many of the implemented postures do not convey the intended emotions. The emotional content of the selected best five postures is tested by the votes of forty observers. The intended emotions received the highest recognition rate for each group of selected postures. This study can be considered the last step of a general process for developing emotional postures for robots: the process starts with qualitative descriptions of human postures, continues with encoding those descriptions in quantitative terms, and ends with adapting the quantitative values to a specific robot. The present study demonstrates this last step.
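    The core of the adaptation step is a mapping from the joints of the virtual human body model to Nao's joints, with each target angle clamped to the robot's physical range. The Python sketch below illustrates that idea; the joint correspondences, scale factors, and limit values are illustrative assumptions, not the paper's mapping or the official joint limits.

        # Hypothetical sketch: mapping virtual-human joint angles onto Nao joints,
        # clamping to illustrative (not official) joint limits.
        import math

        # virtual-human joint -> (Nao joint, scale factor); purely illustrative
        JOINT_MAP = {
            "upper_arm_l_pitch": ("LShoulderPitch", 1.0),
            "upper_arm_l_roll":  ("LShoulderRoll", 1.0),
            "forearm_l_roll":    ("LElbowRoll", 1.0),
            "head_pitch":        ("HeadPitch", 1.0),
        }

        # illustrative limits in radians (consult the robot documentation for real values)
        NAO_LIMITS = {
            "LShoulderPitch": (-2.08, 2.08),
            "LShoulderRoll":  (-0.31, 1.32),
            "LElbowRoll":     (-1.54, -0.03),
            "HeadPitch":      (-0.67, 0.51),
        }

        def transform_posture(human_pose_deg):
            """Convert a virtual-human posture (degrees) to a clamped Nao posture (radians)."""
            nao_pose = {}
            for human_joint, angle_deg in human_pose_deg.items():
                if human_joint not in JOINT_MAP:
                    continue  # joints with no Nao counterpart are dropped
                nao_joint, scale = JOINT_MAP[human_joint]
                angle = math.radians(angle_deg) * scale
                lo, hi = NAO_LIMITS[nao_joint]
                nao_pose[nao_joint] = max(lo, min(hi, angle))  # respect joint limits
            return nao_pose

        # example: a slumped "sad" posture sketch
        print(transform_posture({"head_pitch": 30.0, "upper_arm_l_pitch": 80.0}))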

    Modification of Gesture-Determined-Dynamic Function with Consideration of Margins for Motion Planning of Humanoid Robots

    The gesture-determined-dynamic function (GDDF) offers an effective way to handle the control problems of humanoid robots. Specifically, GDDF is used to constrain the movements of the dual arms of humanoid robots and steer specific gestures to conduct demanding tasks under certain conditions. However, the scheme still has a deficiency: through experiments, we found that the joints of the dual arms, which can be regarded as redundant manipulators, could slightly exceed their limits at the joint-angle level. The performance depends directly on the parameters designed beforehand for the GDDF, which leaves the method poorly adaptable to practical applications. In this paper, a modified scheme of GDDF with consideration of margins (MGDDF) is proposed. The MGDDF scheme is based on the quadratic programming (QP) framework, which is widely applied to solving the redundancy-resolution problems of robot arms. Moreover, three margins are introduced in the proposed MGDDF scheme to avoid joint limits. With these margins, the joints of the manipulators will not exceed their limits, and the potential damage that exceeding the limits might cause is avoided. Computer simulations conducted in MATLAB further verify the feasibility and superiority of the proposed MGDDF scheme.
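    At the velocity level, a margin-aware redundancy-resolution step of this general kind can be posed as a small QP: minimize the joint-velocity norm subject to the end-effector velocity task, with bounds that keep each joint inside limits shrunk by a margin. The Python sketch below is a minimal illustration of that formulation, not the paper's MGDDF scheme (which introduces three margins and is simulated in MATLAB); the Jacobian, limits, and margin value are made-up numbers.

        # Minimal sketch: velocity-level redundancy resolution as a QP with
        # joint-limit margins, solved with SciPy's SLSQP. Illustrative only.
        import numpy as np
        from scipy.optimize import minimize

        def solve_step(J, q, xdot_des, q_lo, q_hi, margin=0.05, dt=0.01):
            """Find joint velocities that track xdot_des while keeping q inside
            [q_lo + margin, q_hi - margin] after one integration step."""
            n = q.size
            # bounds on qdot so that q + qdot*dt stays inside the shrunken limits
            lb = (q_lo + margin - q) / dt
            ub = (q_hi - margin - q) / dt
            res = minimize(
                fun=lambda qd: 0.5 * qd @ qd,      # minimum-norm joint velocity
                x0=np.zeros(n),
                jac=lambda qd: qd,
                bounds=list(zip(lb, ub)),
                constraints={"type": "eq",         # end-effector velocity task
                             "fun": lambda qd: J @ qd - xdot_des},
                method="SLSQP",
            )
            return res.x

        # toy redundant 3-joint planar arm with a 2-D task; numbers are placeholders
        J = np.array([[-0.5, -0.2, -0.1], [0.8, 0.3, 0.1]])
        q = np.array([0.1, 1.4, -0.3])
        qdot = solve_step(J, q, xdot_des=np.array([0.05, 0.0]),
                          q_lo=np.array([-1.5, -1.5, -1.5]),
                          q_hi=np.array([1.5, 1.5, 1.5]))
        print(qdot)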

    Motion Generation during Vocalized Emotional Expressions and Evaluation in Android Robots

    Vocalized emotional expressions such as laughter and surprise often occur in natural dialogue interactions and are important factors to consider in order to achieve smooth robot-mediated communication. Miscommunication may be caused by a mismatch between the audio and visual modalities, especially in android robots, which have a highly humanlike appearance. In this chapter, motion generation methods are introduced for laughter and vocalized surprise events, based on analysis of human behaviors during dialogue interactions. The effectiveness of controlling different modalities of the face, head, and upper body (eyebrow raising, eyelid widening/narrowing, lip corner/cheek raising, eye blinking, head motion, and torso motion control) and different motion control levels is evaluated using an android robot. Subjective experiments indicate the importance of each modality in the perception of motion naturalness (humanlikeness) and the degree of emotional expression.
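    As a rough illustration of how an event-driven controller of this kind might dispatch commands to the separate modalities, the sketch below maps a detected vocal event and its intensity to per-modality targets. The actuator names, target values, and timings are invented placeholders, not the chapter's actual control parameters.

        # Illustrative sketch only: dispatching a detected vocal event (laughter or
        # surprise) to per-modality control targets for face, head, and torso.
        from dataclasses import dataclass

        @dataclass
        class ModalityCommand:
            actuator: str      # e.g. "eyelid", "lip_corner", "head_pitch", "torso_pitch"
            target: float      # normalized position or angle (placeholder units)
            duration_s: float  # ramp time to reach the target

        def generate_motion(event: str, intensity: float) -> list[ModalityCommand]:
            """Map a vocalized emotional event to coordinated face/head/torso commands."""
            intensity = max(0.0, min(1.0, intensity))
            if event == "laughter":
                return [
                    ModalityCommand("lip_corner",  0.4 + 0.6 * intensity, 0.2),  # cheek/lip raise
                    ModalityCommand("eyelid",      0.6 - 0.4 * intensity, 0.2),  # eyelid narrowing
                    ModalityCommand("head_pitch", -0.10 * intensity,      0.3),  # slight tilt back
                    ModalityCommand("torso_pitch", 0.05 * intensity,      0.5),  # torso motion onset
                ]
            if event == "surprise":
                return [
                    ModalityCommand("eyebrow",     0.8 * intensity, 0.1),  # eyebrow raise
                    ModalityCommand("eyelid",      1.0,             0.1),  # eyelid widening
                    ModalityCommand("head_pitch",  0.05,            0.2),  # small head retreat
                ]
            return []

        print(generate_motion("laughter", intensity=0.7))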

    Psychophysiological responses to eye contact with a humanoid robot: Impact of perceived intentionality

    Eye contact with a social robot has been shown to elicit psychophysiological responses similar to eye contact with another human. However, it is becoming increasingly clear that the attention- and affect-related psychophysiological responses differentiate between direct (toward the observer) and averted gaze mainly when viewing embodied faces that are capable of social interaction, whereas pictorial or pre-recorded stimuli have no such capability. It has been suggested that genuine eye contact, as indicated by the differential psychophysiological responses to direct and averted gaze, requires a feeling of being watched by another mind. Therefore, we measured event-related potentials (N170 and frontal P300) with EEG, facial electromyography, skin conductance, and heart rate deceleration responses to seeing a humanoid robot's direct versus averted gaze, while manipulating the impression of the robot's intentionality. The results showed that the N170 and the facial zygomatic responses were greater to direct than to averted gaze of the robot, independent of the robot's intentionality, whereas the frontal P300 responses were more positive to direct than to averted gaze only when the robot appeared intentional. The study provides further evidence that the gaze behavior of a social robot elicits attentional and affective responses, and adds that the robot's seemingly autonomous social behavior plays an important role in eliciting higher-level socio-cognitive processing.
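    The reported pattern (a direct-gaze advantage in the frontal P300 only when the robot appeared intentional) corresponds to a gaze-by-intentionality interaction in a 2 x 2 within-subject design. The sketch below shows how such a design could be analyzed with a repeated-measures ANOVA; the data are random placeholders and the column names are assumptions, not the study's actual analysis pipeline.

        # Sketch of the 2 x 2 within-subject design implied by the abstract
        # (gaze: direct/averted x intentionality: intentional/non-intentional),
        # analyzed with a repeated-measures ANOVA on placeholder data.
        import numpy as np
        import pandas as pd
        from statsmodels.stats.anova import AnovaRM

        rng = np.random.default_rng(0)
        rows = []
        for subject in range(1, 21):
            for gaze in ("direct", "averted"):
                for intent in ("intentional", "non_intentional"):
                    # placeholder amplitude; a real analysis would use measured frontal P300
                    rows.append({"subject": subject, "gaze": gaze, "intent": intent,
                                 "p300": rng.normal(0.0, 1.0)})
        df = pd.DataFrame(rows)

        # The effect "direct > averted only when intentional" would appear as a
        # significant gaze x intent interaction in this model.
        res = AnovaRM(df, depvar="p300", subject="subject",
                      within=["gaze", "intent"]).fit()
        print(res)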

    Toward Context-Aware, Affective, and Impactful Social Robots


    Communication

    From a traditional engineering perspective, communication is about effecting control over a distance, and its primary concern is the reliability of transmission. This chapter reviews communication in nature, describing its evolution from the perspective of the selfish gene. Communication in nature is ubiquitous and generally honest, and arises as much from collaboration as from manipulation. We show that context and relevance allow effective communication with little information transfer, particularly between organisms with similar capacities and goals. Human language differs fundamentally from the non-verbal communication we share with other animals; robots may need to accommodate both. We document progress in AI capacities to generate synthetic emotion and to sense and classify human emotion. Communication in contemporary biomimetic systems occurs between robots in swarm robotics, but also between robot and human in both autonomous and collaborative systems. We suggest increased future emphasis on capacities to receive and comprehend signs, and on the pragmatic utility of communication and cooperation.