26,987 research outputs found

    Do Embodied Conversational Agents Know When to Smile?

    We survey the role of humor in particular domains of human-to-human interaction with the aim of seeing whether it is useful for embodied conversational agents to integrate humor capabilities in their models of intelligence, emotions and interaction (verbal and nonverbal). Therefore, we first look at the current state of the art in research on embodied conversational agents, affective computing, and verbal and nonverbal interaction. We adhere to the 'Computers Are Social Actors' paradigm and assume that human conversational partners of embodied conversational agents assign human properties to these agents, including humor appreciation.

    Conversational Agents, Humorous Act Construction, and Social Intelligence

    Humans use humour to ease communication problems in human-human interaction and, in a similar way, humour can be used to solve communication problems that arise with human-computer interaction. We discuss the role of embodied conversational agents in human-computer interaction and we have observations on the generation of humorous acts and on the appropriateness of displaying them by embodied conversational agents in order to smoothen, when necessary, their interactions with a human partner. The humorous acts we consider are generated spontaneously. They are the product of an appraisal of the conversational situation and the possibility to generate a humorous act from the elements that make up this conversational situation, in particular the interaction history of the conversational partners.

    Evaluating embodied conversational agents in multimodal interfaces

    Based on cross-disciplinary approaches to Embodied Conversational Agents, evaluation methods for such human-computer interfaces are structured and presented. An introductory systematisation of evaluation topics from a conversational perspective is followed by an explanation of social-psychological phenomena studied in interaction with Embodied Conversational Agents, and how these can be used for evaluation purposes. Major evaluation concepts and appropriate assessment instruments – established and new ones – are presented, including questionnaires, annotations and log-files. An exemplary evaluation and guidelines provide hands-on information on planning and preparing such endeavours.

    Computers that smile: Humor in the interface

    When we consider research on the role of human characteristics in the user interface of computers, it is certainly not the case that no attention has been paid to the role of humor. However, when we compare efforts in this area with efforts and experiments that attempt to demonstrate the positive role of general emotion modelling in the user interface, we must conclude that this attention is still low. As we all know, sometimes the computer is a source of frustration rather than a source of enjoyment. And indeed we see research projects that aim at recognizing a user’s frustration rather than his enjoyment. However, rather than detecting frustration, and maybe reacting to it in a humorous way, we would like to prevent frustration by making interaction with a computer more natural and more enjoyable. For that reason we are working on multimodal interaction and embodied conversational agents. In the interaction with embodied conversational agents, verbal and nonverbal communication are equally important. Multimodal emotion display and detection are among our advanced research issues, and the investigation of the role of humor in human-computer interaction is one of them.

    Elckerlyc goes mobile - Enabling natural interaction in mobile user interfaces

    The fast growth of computational resources and speech technology available on mobile devices makes it possible to engage users of these devices in a natural dialogue with service systems. These systems are sometimes perceived as social agents, and this perception can be supported by presenting them on the interface by means of an animated embodied conversational agent. To take full advantage of the power of embodied conversational agents in service systems, it is important to support real-time, online and responsive interaction with the system through the embodied conversational agent. The design of responsive animated conversational agents is a daunting task. Elckerlyc is a model-based platform for the specification and animation of synchronised multi-modal responsive animated agents. This paper presents a new lightweight PictureEngine that allows this platform to run in mobile applications. We describe the integration of the PictureEngine in the user interface of two different coaching applications and discuss the findings from user evaluations. We also conducted a study to evaluate an editing tool for the specification of the agent’s communicative behaviour. Twenty-one participants had to specify the behaviour of an embodied conversational agent using the PictureEngine. We may conclude that this new lightweight back-end engine for the Elckerlyc platform makes it easier to build embodied conversational interfaces for mobile devices.

    Conversational Agents for depression screening: a systematic review

    Objective: This work explores the advances in conversational agents aimed at the detection of mental health disorders, and specifically the screening of depression. The focus is put on those based on voice interaction, but other approaches are also tackled, such as text-based interaction or embodied avatars. Methods: PRISMA was selected as the systematic methodology for the analysis of existing literature, which was retrieved from Scopus, PubMed, IEEE Xplore, APA PsycINFO, Cochrane, and Web of Science. Relevant research addresses the detection of depression using conversational agents, and the selection criteria utilized include their effectiveness, usability, personalization, and psychometric properties. Results: Of the 993 references initially retrieved, 36 were finally included in our work. The analysis of these studies allowed us to identify 30 conversational agents that claim to detect depression, specifically or in combination with other disorders such as anxiety or stress disorders. As a general approach, screening was implemented in the conversational agents taking as a reference standardized or psychometrically validated clinical tests, which were also utilized as a gold standard for their validation. Questionnaires such as the Patient Health Questionnaire or the Beck Depression Inventory, which are used in 65% of the articles analyzed, stand out. Conclusions: The usefulness of intelligent conversational agents allows screening to be administered to different types of profiles, such as patients (33% of relevant proposals) and caregivers (11%), although in many cases a target profile is not clearly defined (66% of solutions analyzed). This study found 30 standalone conversational agents, but some proposals were explored that combine several approaches for a more enriching data acquisition. The interaction implemented in most relevant conversational agents is text-based, although the evolution is clearly towards voice integration, which in turn enhances their psychometric characteristics, as voice interaction is perceived as more natural and less invasive. Agencia Estatal de Investigación | Ref. PID2020-115137RB-I0

    Developing enhanced conversational agents for social virtual worlds

    In this paper, we present a methodology for the development of embodied conversational agents for social virtual worlds. The agents provide multimodal communication with their users in which speech interaction is included. Our proposal combines different techniques related to artificial intelligence, natural language processing, affective computing, and user modeling. A statistical methodology has been developed to model the system's conversational behavior, which is learned from an initial corpus and improved with the knowledge acquired from the successive interactions. In addition, the selection of the next system response is adapted considering information stored in the users' profiles and also the emotional contents detected in the user's utterances. Our proposal has been evaluated with the successful development of an embodied conversational agent which has been placed in the Second Life social virtual world. The avatar includes the different models and interacts with the users who inhabit the virtual world in order to provide academic information. The experimental results show that the agent's conversational behavior adapts successfully to the specific characteristics of users interacting in such environments. Work partially supported by the Spanish CICyT Projects under grant TRA2015-63708-R and TRA2016-78886-C3-1-R

    Artificial Companion: building an impacting relation

    In this paper we show that we are witnessing an evolution from traditional human-computer interactions to a kind of intense exchange between the human user and a new generation of virtual or real systems (Embodied Conversational Agents (ECAs) or affective robots), bringing the interaction to another level, the "relation level". We call these systems "companions", that is to say, systems with which the user wants to build a kind of life-long relationship. We thus argue that we need to go beyond the concepts of acceptability and believability of systems, to get closer to the human, and to look for an "impact" concept. We will see that this problem is shared between the communities of researchers in the Embodied Conversational Agents (ECAs) and affective robotics fields. We put forward a definition of an "impacting relation" that will enable believable interactive ECAs or robots to become believable impacting companions.