    A motion system for social and animated robots

    This paper presents an innovative motion system that is used to control the motions and animations of a social robot. The social robot Probo is used to study Human-Robot Interactions (HRI), with a special focus on Robot Assisted Therapy (RAT). When used for therapy it is important that a social robot is able to create an "illusion of life" so as to become a believable character that can communicate with humans. The design of the motion system in this paper is based on insights from the animation industry. It combines operator-controlled animations with low-level autonomous reactions such as attention and emotional state. The motion system has a Combination Engine, which combines motion commands that are triggered by a human operator with motions that originate from different units of the cognitive control architecture of the robot. This results in an interactive robot that seems alive and has a certain degree of "likeability". The Godspeed Questionnaire Series is used to evaluate the animacy and likeability of the robot in China, Romania and Belgium.
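    The abstract describes a Combination Engine that merges operator-triggered motion commands with motions from autonomous units such as attention and emotion. As a rough illustration of that idea (not the authors' implementation; the names and the weighted-average scheme below are assumptions), such an engine could blend per-degree-of-freedom targets from several weighted sources:

        # Hypothetical sketch: blend operator-triggered and autonomous motion
        # targets per degree of freedom using a weighted average.
        def combine(dof_names, sources):
            """sources: list of (weight, {dof: target_angle_deg}) pairs."""
            blended = {}
            for dof in dof_names:
                total_weight, value = 0.0, 0.0
                for weight, targets in sources:
                    if dof in targets:
                        total_weight += weight
                        value += weight * targets[dof]
                blended[dof] = value / total_weight if total_weight else 0.0
            return blended

        # Example: an operator animation plus an autonomous emotion unit.
        operator = (0.7, {"neck_pan": 20.0, "eyebrow_left": 5.0})
        emotion = (0.3, {"eyebrow_left": 15.0, "eyebrow_right": 15.0})
        print(combine(["neck_pan", "eyebrow_left", "eyebrow_right"],
                      [operator, emotion]))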

    Development of duplex eye contact framework for human-robot inter communication

    Funding Information: This work was supported in part by the National Research Foundation of Korea grant funded by the Korean Government (Ministry of Science and ICT) under Grant NRF 2020R1A2B5B02002478, in part by Sejong University through its Faculty Research Program, and in part by the Directorate of Research and Extension (DRE), Chittagong University of Engineering and Technology. Peer reviewed. Publisher PDF.

    Meetings and Meeting Modeling in Smart Environments

    In this paper we survey our research on smart meeting rooms and its relevance for augmented reality meeting support and for the real-time or off-line generation of meetings in virtual reality. The research reported here forms part of the European 5th and 6th framework programme projects Multi-Modal Meeting Manager (M4) and Augmented Multi-party Interaction (AMI). Both projects aim at building a smart meeting environment that is able to collect multimodal captures of the activities and discussions in a meeting room, with the aim of using this information as input to tools that allow real-time support, browsing, retrieval and summarization of meetings. Our aim is to research (semantic) representations of what takes place during meetings in order to allow generation, e.g. in virtual reality, of meeting activities (discussions, presentations, voting, etc.). Being able to do so also allows us to look at tools that provide support during a meeting and at tools that allow those who cannot be physically present to take part in a virtual way. This may lead to situations where the differences between real meeting participants, human-controlled virtual participants and (semi-)autonomous virtual participants disappear.
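    The goal of (semantic) representations of meeting activities suggests an event-centred record that browsing, summarization or virtual-reality replay tools could consume. The following is a hypothetical sketch of such a record; the field names are assumptions, not the M4/AMI schema:

        # Hypothetical meeting-event record; field names are assumptions.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class MeetingEvent:
            start: float                   # seconds from the start of the meeting
            end: float
            kind: str                      # e.g. "discussion", "presentation", "vote"
            participants: List[str]        # participant identifiers
            modalities: List[str] = field(default_factory=list)
            summary: str = ""

        # A browsing or summarization tool could then filter events, e.g. all votes.
        events = [
            MeetingEvent(0.0, 310.5, "presentation", ["p1"], ["audio", "slides"]),
            MeetingEvent(310.5, 420.0, "vote", ["p1", "p2", "p3"], ["audio"]),
        ]
        votes = [e for e in events if e.kind == "vote"]
        print(votes)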

    Feel the beat: using cross-modal rhythm to integrate perception of objects, others, and self

    For a robot to be capable of development, it must be able to explore its environment and learn from its experiences. It must find (or create) opportunities to experience the unfamiliar in ways that reveal properties valid beyond the immediate context. In this paper, we develop a novel method for using the rhythm of everyday actions as a basis for identifying the characteristic appearance and sounds associated with objects, people, and the robot itself. Our approach is to identify and segment groups of signals in individual modalities (sight, hearing, and proprioception) based on their rhythmic variation, then to identify and bind causally-related groups of signals across different modalities. By including proprioception as a modality, this cross-modal binding method applies to the robot itself, and we report a series of experiments in which the robot learns about the characteristics of its own body.
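    As a rough illustration of the cross-modal binding idea (an assumption-laden sketch, not the paper's algorithm), one could estimate the dominant rhythm of each signal per modality and then bind signals across modalities whose periods agree within a tolerance:

        import numpy as np

        def dominant_period(signal, fs):
            """Dominant period (s) of a signal, via the strongest non-DC FFT peak."""
            spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            peak = np.argmax(spectrum[1:]) + 1      # skip the DC bin
            return 1.0 / freqs[peak]

        def bind(tracks, fs, tol=0.1):
            """Group (modality, signal) tracks whose dominant periods match within tol."""
            groups = []
            for name, signal in tracks:
                period = dominant_period(signal, fs)
                for group in groups:
                    if abs(group[0][1] - period) < tol:
                        group.append((name, period))
                        break
                else:
                    groups.append([(name, period)])
            return groups

        # Example: an object waved in view while it rattles, at a shared rhythm.
        fs = 30.0
        t = np.arange(0.0, 10.0, 1.0 / fs)
        tracks = [("vision", np.sin(2 * np.pi * 1.0 * t)),
                  ("audio", np.sin(2 * np.pi * 1.0 * t + 0.3)),
                  ("proprioception", np.sin(2 * np.pi * 1.0 * t))]
        print(bind(tracks, fs))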

    Cyclic Pitch Control for the Reduction of Ultimate Loads on Wind Turbines

    In this paper we study the use of individual blade pitch control as a way to reduce ultimate loads. This load alleviation strategy exploits the fact that cyclic pitching of the blades generally reduces the average loading of a wind turbine, at least for some components such as the main bearing, the yaw bearing, or the tower. When ultimate loads are generated during shutdowns, the use of cyclic pitch results in reduced peak loads: as the machine starts from a less stressed condition, its response to an extreme gust or other event produces lower loading on its components. This can be seen as a preventative load mitigation strategy: the load reduction is obtained not by detecting and reacting to an extreme event, but simply by unloading the machine so that, should an extreme event occur, the result is less severe. The effect of peak load mitigation by preventative cyclic pitch is investigated with reference to a multi-MW wind turbine, using high-fidelity aeroelastic simulations in a variety of operating conditions.
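    As an illustration of what cyclic pitching means in practice (a sketch under assumptions, not the controller studied in the paper), a once-per-revolution cyclic individual pitch command adds a sinusoid of blade azimuth on top of the collective pitch; the amplitude and phase below are arbitrary:

        import math

        def cyclic_pitch(collective_deg, azimuth_deg, amplitude_deg=2.0,
                         phase_deg=0.0, n_blades=3):
            """Pitch command (deg) for each blade: collective + 1P cyclic term."""
            pitches = []
            for b in range(n_blades):
                blade_azimuth = azimuth_deg + b * 360.0 / n_blades
                pitches.append(collective_deg + amplitude_deg
                               * math.cos(math.radians(blade_azimuth + phase_deg)))
            return pitches

        # Example: 5 deg collective pitch with the rotor at 30 deg azimuth.
        print(cyclic_pitch(5.0, 30.0))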

    Nonverbal communication in virtual reality: Nodding as a social signal in virtual interactions

    Nonverbal communication is an important part of human communication, including head nodding, eye gaze, proximity and body orientation. Recent research has identified specific patterns of head nodding linked to conversation, namely mimicry of head movements at a 600 ms delay and fast nodding when listening. In this paper, we implemented these head nodding behaviour rules in virtual humans and tested whether these behaviours lead to increases in trust and liking towards the virtual humans. We used Virtual Reality technology to simulate a face-to-face conversation, as VR provides a high level of immersiveness and social presence, very similar to face-to-face interaction. We then conducted a study in which participants took part in conversations with two virtual humans, rated the virtual characters' social characteristics, and completed an evaluation of their implicit trust in the virtual humans. Results showed more liking for and more trust in the virtual human whose nodding behaviour was driven by realistic behaviour rules. This supports the psychological models of nodding and advances our ability to build realistic virtual humans.
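    The two behaviour rules mentioned above (mimicry at a 600 ms delay and fast nodding when listening) can be sketched as a simple per-frame controller. This is a minimal, assumption-laden illustration; the gain, nod frequency and amplitude are invented, not values from the paper:

        import math
        from collections import deque

        class NoddingController:
            def __init__(self, frame_rate=60, delay_s=0.6, mimic_gain=0.5,
                         listen_nod_hz=2.0, listen_nod_amp_deg=4.0):
                n = int(delay_s * frame_rate)           # ~600 ms delay line
                self.buffer = deque([0.0] * n, maxlen=n)
                self.mimic_gain = mimic_gain
                self.listen_nod_hz = listen_nod_hz
                self.listen_nod_amp = listen_nod_amp_deg
                self.dt = 1.0 / frame_rate
                self.t = 0.0

            def step(self, partner_pitch_deg, listening):
                """Head pitch (deg) for this frame from the two nodding rules."""
                self.t += self.dt
                delayed = self.buffer[0]                # partner pitch ~600 ms ago
                self.buffer.append(partner_pitch_deg)
                pitch = self.mimic_gain * delayed       # delayed mimicry
                if listening:                           # fast nodding when listening
                    pitch += self.listen_nod_amp * math.sin(
                        2.0 * math.pi * self.listen_nod_hz * self.t)
                return pitch

        controller = NoddingController()
        print(controller.step(partner_pitch_deg=10.0, listening=True))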

    Machine Analysis of Facial Expressions

    No abstract