    Developing enhanced conversational agents for social virtual worlds

    In this paper, we present a methodology for the development of embodied conversational agents for social virtual worlds. The agents provide multimodal communication with their users, including speech interaction. Our proposal combines techniques from artificial intelligence, natural language processing, affective computing, and user modeling. A statistical methodology has been developed to model the system's conversational behavior, which is learned from an initial corpus and improved with the knowledge acquired from successive interactions. In addition, the selection of the next system response is adapted to the information stored in users' profiles and to the emotional content detected in the users' utterances. Our proposal has been evaluated with the successful development of an embodied conversational agent placed in the Second Life social virtual world. The avatar includes the different models and interacts with the users who inhabit the virtual world in order to provide academic information. The experimental results show that the agent's conversational behavior adapts successfully to the specific characteristics of users interacting in such environments. Work partially supported by the Spanish CICyT projects under grants TRA2015-63708-R and TRA2016-78886-C3-1-R
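The corpus-learned, user-adapted response selection described in the abstract can be sketched roughly as follows. This is a hedged illustration only: the class, state, and response names are hypothetical, and the paper's statistical model is certainly richer than frequency counts with ad-hoc re-weighting.

```python
from collections import Counter, defaultdict

class StatisticalDialogManager:
    """Toy sketch: response frequencies learned from a corpus, then
    re-weighted by a user profile and a detected emotion at selection time."""

    def __init__(self):
        # counts[state][response] = frequency observed so far
        self.counts = defaultdict(Counter)

    def observe(self, state, response):
        """Learn from a corpus turn (and later from live interactions)."""
        self.counts[state][response] += 1

    def select(self, state, profile_bias=None, emotion=None):
        """Pick the highest-scoring response, adapted to the user."""
        scored = {}
        for response, n in self.counts[state].items():
            score = float(n)
            if profile_bias:                        # e.g. prefer terse replies
                score *= profile_bias.get(response, 1.0)
            if emotion == "frustrated" and response.startswith("sorry"):
                score *= 3.0                        # boost apologetic responses
            scored[response] = score
        return max(scored, key=scored.get) if scored else None

dm = StatisticalDialogManager()
dm.observe("ask_schedule", "here is the timetable")
dm.observe("ask_schedule", "here is the timetable")
dm.observe("ask_schedule", "sorry, please repeat")
dm.select("ask_schedule")                        # corpus majority wins
dm.select("ask_schedule", emotion="frustrated")  # emotion re-weighting wins
```

The design point is the separation the abstract describes: learning (the counts) is updated continuously from interactions, while adaptation (profile and emotion weights) is applied only at selection time.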

    A virtual diary companion

    Chatbots and embodied conversational agents exhibit turn-based conversation behaviour. In current research we almost always assume that each utterance of a human conversational partner should be followed by an intelligent and/or empathetic reaction of the chatbot or embodied agent. They are assumed to be alert, trying to please the user. There are other applications which have not yet received much attention and which require a more patient or relaxed attitude, waiting for the right moment to provide feedback to the human partner. Being able and willing to listen is one of the conditions for being successful. In this paper we offer some observations on listening-behaviour research and introduce one of our applications, the virtual diary companion.

    Computational models of social and emotional turn-taking for embodied conversational agents: a review

    The emotional involvement of participants in a conversation shows not only in the words they speak and in the way they speak and gesture, but also in their turn-taking behavior. This paper reviews research into computational models of embodied conversational agents. We focus on models for turn-taking management and (social) emotions. We are particularly interested in how, in these models, the emotions of the agent itself and those of the others influence the agent's turn-taking behavior, and, vice versa, how the turn-taking behavior of the partner is perceived by the agent itself. The system of turn-taking rules presented by Sacks, Schegloff and Jefferson (1974) is often a starting point for computational turn-taking models of conversational agents, but emotions have their own rules besides the "one-at-a-time" paradigm of the SSJ system. It turns out that, almost without exception, computational models of turn-taking behavior that allow "continuous interaction" and "natural turn-taking" do not model the underlying psychological, affective, attentional and cognitive processes; they are restricted to rules in terms of a number of superficially observable cues. On the other hand, computational models for virtual humans that are based on a functional theory of social emotion do not contain explicit rules on how social emotions affect turn-taking behavior or how the emotional state of the agent is affected by the turn-taking behavior of its interlocutors. We conclude with some preliminary ideas on what an architecture for emotional turn-taking should look like, and we discuss the challenges in building believable emotional turn-taking agents.
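The SSJ "one-at-a-time" turn-allocation rules that the review takes as a starting point can be sketched as an ordered rule check at a transition-relevance place (TRP). The function and argument names are illustrative; the emotional modulation the review calls for is deliberately absent here, which is exactly the gap the paper discusses.

```python
def next_speaker(current, selected=None, self_selectors=()):
    """Apply the SSJ (Sacks, Schegloff & Jefferson, 1974) rules in order
    at a transition-relevance place.

    current        -- the speaker holding the turn
    selected       -- next speaker picked by the current speaker, if any
    self_selectors -- parties who try to self-select, in starting order
    """
    if selected is not None:
        return selected             # Rule 1a: current speaker selects next
    if self_selectors:
        return self_selectors[0]    # Rule 1b: first starter self-selects
    return current                  # Rule 1c: current speaker may continue
```

Because the rules fire strictly in this order, emotional behavior such as an angry interruption before the TRP simply has no place in the scheme, which is one way to see why the reviewed models bolt emotion on via surface cues instead.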

    Learning emotions in virtual environments

    A modular hybrid neural network architecture, called SHAME, for emotion learning is introduced. The system learns from annotated data how the emotional state is generated and changes due to internal and external stimuli. Part of the modular architecture is domain independent and part must be adapted to the domain under consideration. The generation and learning of emotions is based on the event appraisal model. The architecture is implemented in a prototype consisting of agents trying to survive in a virtual world. An evaluation of this prototype shows that the architecture is capable of generating natural emotions and furthermore that training of the neural network modules in the architecture is computationally feasible.
    Keywords: hybrid neural systems, emotions, learning, agents
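The event-appraisal idea behind this architecture can be illustrated with a minimal sketch: an event is appraised against the agent's goals, and the appraisal nudges a decaying emotional state. All names, goal values, and the decay constant are assumptions for illustration; SHAME learns this mapping with trained neural network modules rather than hand-set weights.

```python
DECAY = 0.8  # illustrative: how fast emotions fade between events

def appraise(event, goals):
    """Map an event to a desirability score given the agent's goals."""
    return goals.get(event, 0.0)

def update_emotion(state, event, goals):
    """Decay the current state, then add the appraisal of the new event:
    desirable events feed joy, undesirable events feed distress."""
    score = appraise(event, goals)
    return {
        "joy": state["joy"] * DECAY + max(score, 0.0),
        "distress": state["distress"] * DECAY + max(-score, 0.0),
    }

# A surviving-in-a-virtual-world agent, as in the prototype described above
goals = {"found_food": 1.0, "took_damage": -1.0}
state = {"joy": 0.0, "distress": 0.0}
state = update_emotion(state, "found_food", goals)
state = update_emotion(state, "took_damage", goals)
```

The domain split the abstract mentions maps naturally onto this sketch: the update rule is domain independent, while the `goals` table (the appraisal of concrete events) is what must be adapted per domain.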

    On combining the facial movements of a talking head

    We present work on Obie, an embodied conversational agent framework. An embodied conversational agent, or talking head, consists of three main components. The graphical part consists of a face model and a facial muscle model. Besides the graphical part, we have implemented an emotion model and a mapping from emotions to facial expressions. The animation part of the framework focuses on the temporal combination of different facial movements. In this paper we propose a scheme for combining facial movements on a 3D talking head.
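One common way to combine facial movements that overlap in time is to let each movement contribute target values per animation parameter and merge overlapping contributions by a weighted average. The sketch below is an assumption about that general technique, not the Obie scheme itself; parameter names and the blending rule are illustrative.

```python
def blend(movements, t):
    """Blend all movements active at time t into one parameter frame.

    Each movement is (start, end, params, weight), where params maps an
    animation parameter name to its target value for that movement.
    """
    totals, weights = {}, {}
    for start, end, params, weight in movements:
        if start <= t <= end:  # movement is active at time t
            for name, value in params.items():
                totals[name] = totals.get(name, 0.0) + weight * value
                weights[name] = weights.get(name, 0.0) + weight
    # normalize: weighted average per parameter
    return {name: totals[name] / weights[name] for name in totals}

movements = [
    (0.0, 2.0, {"brow_raise": 1.0}, 1.0),                # surprise
    (1.0, 3.0, {"brow_raise": 0.0, "smile": 0.8}, 1.0),  # smile pulls brows down
]
frame = blend(movements, 1.5)  # both movements active: brows average out
```

Averaging is only one choice; priority or muscle-level constraints are alternatives when simple averaging produces implausible faces.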

    Designing and Implementing Embodied Agents: Learning from Experience

    In this paper, we provide an overview of part of our experience in designing and implementing some of the embodied agents and talking faces that we have used for our research into human-computer interaction. We focus on the techniques that were used and evaluate them with respect to the purpose that the agents and faces were to serve and the costs involved in producing and maintaining the software. We discuss the function of this research and development in relation to the educational programme of our graduate students.