    A Chatbot that Uses a Multi-agent Organization to Support Collaborative Learning

    This work investigates and applies a multi-agent system to assist in the coordination of group tasks, specifically in educational environments where interaction occurs indirectly, that is, asynchronously. The system has a web interface integrated with a chatbot for more natural interaction. The chatbot communicates with the multi-agent system responsible for the organization of the group: it holds information about the groups' tasks and members, as well as restrictions that can be imposed according to the group's organization, and it returns the requested information in natural language through the chatbot. The approach was validated in a practical undergraduate software engineering course, in which students assessed the functionalities and usability of the system while working in groups to develop software collaboratively; the system was used to assist students in a real project. The assessment found that the system was able to support the development of the group tasks, providing quick and consistent responses to the students' requests.
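
    As an illustration of the kind of coordination such a system performs, the following minimal Python sketch shows a group-organization component that stores tasks, members, and a restriction, and answers a chatbot request with a natural-language reply. It is not the authors' implementation; all class and function names are hypothetical.

    # Minimal sketch of a group-organization component queried by a chatbot.
    # Hypothetical names; not the multi-agent system described in the paper.
    from dataclasses import dataclass, field

    @dataclass
    class Task:
        name: str
        assignee: str
        done: bool = False

    @dataclass
    class GroupOrganization:
        members: list[str]
        tasks: list[Task] = field(default_factory=list)
        max_tasks_per_member: int = 3  # example of a restriction the organization can impose

        def assign(self, task_name: str, member: str) -> str:
            if member not in self.members:
                return f"{member} is not part of this group."
            open_tasks = [t for t in self.tasks if t.assignee == member and not t.done]
            if len(open_tasks) >= self.max_tasks_per_member:
                return f"{member} already has {len(open_tasks)} open tasks; the group restriction forbids more."
            self.tasks.append(Task(task_name, member))
            return f"Task '{task_name}' was assigned to {member}."

        def status(self) -> str:
            pending = [t for t in self.tasks if not t.done]
            return f"The group has {len(pending)} pending task(s): " + ", ".join(t.name for t in pending)

    # Usage: the chatbot forwards a parsed request and relays the answer to the student.
    group = GroupOrganization(members=["Alice", "Bob"])
    print(group.assign("Write unit tests", "Alice"))
    print(group.status())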

    GAIML: A New Language for Verbal and Graphical Interaction in Chatbots

    Natural and intuitive interaction between users and complex systems is a crucial research topic in human-computer interaction. A major direction is the definition and implementation of systems with natural language understanding capabilities. Interaction in natural language is often performed by means of systems called chatbots. A chatbot is a conversational agent with a proper knowledge base that is able to interact with users. A chatbot's appearance can be very sophisticated, with 3D avatars and speech processing modules; however, the interaction between the system and the user is still performed only through textual areas for inputs and replies. An interaction that adds graphical widgets to natural language could be more effective, and conversely, a graphical interaction that also involves natural language can increase the user's comfort compared to using graphical widgets alone. In many applications multi-modal communication is to be preferred when the user and the system have a tight and complex interaction. Typical examples are cultural heritage applications (intelligent museum guides, picture browsing) or systems that provide the user with integrated information taken from different and heterogeneous sources, as in the case of the iGoogle™ interface. We propose to mix the two modalities (verbal and graphical) to build systems with a reconfigurable interface, which is able to change with respect to the particular application context. The result of this proposal is the Graphical Artificial Intelligence Markup Language (GAIML), an extension of AIML that allows merging both interaction modalities. A suitable chatbot system called Graphbot is presented to support this language. With this language it is possible to define personalized interface patterns that are the most suitable ones in relation to the data types exchanged between the user and the system, according to the context of the dialogue.
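
    As a rough illustration of the underlying idea (choosing the interface pattern from the type of data exchanged in the dialogue), the Python sketch below renders a reply either as plain text, as a graphical widget description, or as a mix of both. The structures are made up for illustration; this is not GAIML markup or Graphbot code.

    # Illustrative sketch only: selecting a verbal or graphical reply pattern
    # based on the data type produced by the dialogue. Not actual GAIML syntax.
    from typing import Any

    def render_reply(data: Any) -> dict:
        """Return a reply description that the interface layer can realize."""
        if isinstance(data, list) and all(isinstance(x, dict) and "image" in x for x in data):
            # Image results: a gallery widget is more effective than reading out file names.
            return {"mode": "graphical", "widget": "gallery", "items": data}
        if isinstance(data, list):
            # List-like data: show a selectable list plus a short verbal summary.
            return {"mode": "mixed", "widget": "list", "items": data,
                    "text": f"I found {len(data)} matching entries."}
        # Simple values: a plain verbal answer is enough.
        return {"mode": "verbal", "text": str(data)}

    print(render_reply("The museum opens at 9:00."))
    print(render_reply([{"image": "vase.jpg"}, {"image": "fresco.jpg"}]))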

    COSMA - multi-participant NL interaction for appointment scheduling

    We discuss the use of NL systems in the domain of appointment scheduling. Appointment scheduling is a problem faced daily by many people and organizations, and it is typically solved using communication in natural language. In general, cooperative interaction between several participants is required, whose calendar data are distributed rather than centralized. In this distributed multi-agent environment, the use of NL systems makes it possible for machines and humans to cooperate in solving scheduling problems. We describe the COSMA (Cooperative Schedule Management Agent) system, a secretarial assistant for appointment scheduling. A central part of COSMA is the reusable NL core system DISCO, which serves, in this application, as an NL interface between an appointment planning system and the human user. COSMA is fully implemented in Common Lisp and runs on Unix workstations. Our experience with COSMA shows that it is a plausible and useful application for NL systems. However, the appointment planner was not designed for NL communication and thus makes strong assumptions about the sequencing of domain actions and about the error-freeness of the communication. We suggest that further improvements of the overall COSMA functionality, especially with regard to flexibility and robustness, be based on a modified architecture.
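
    To make the underlying scheduling task concrete, the Python sketch below finds an hourly slot that is free in every participant's calendar. It only illustrates the problem that the NL interface mediates; it is not part of COSMA or DISCO, and the calendars and names are invented.

    # Illustration of the core problem behind an appointment assistant:
    # find a slot (on an hourly grid) that is free for every participant.
    # Not COSMA/DISCO code; the data below is invented.

    def free_common_slot(busy, day_start=9, day_end=17, duration=1):
        """busy maps each participant to a list of (start_hour, end_hour) meetings."""
        for hour in range(day_start, day_end - duration + 1):
            slot = (hour, hour + duration)
            if all(not (slot[0] < b_end and b_start < slot[1])
                   for meetings in busy.values()
                   for b_start, b_end in meetings):
                return slot
        return None

    calendars = {
        "alice": [(9, 10), (13, 15)],
        "bob": [(9, 11), (16, 17)],
    }
    slot = free_common_slot(calendars)
    print(f"Proposed appointment: {slot[0]}:00-{slot[1]}:00" if slot else "No common slot found.")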

    "Hey Model!" -- Natural User Interactions and Agency in Accessible Interactive 3D Models

    While developments in 3D printing have opened up opportunities for improved access to graphical information for people who are blind or have low vision (BLV), 3D prints can provide only limited detailed and contextual information. Interactive 3D printed models (I3Ms) that provide audio labels and/or a conversational agent interface can potentially overcome this limitation. We conducted a Wizard-of-Oz exploratory study to uncover the multi-modal interaction techniques that BLV people would like to use when exploring I3Ms, and investigated their attitudes towards different levels of model agency. These findings informed the creation of an I3M prototype of the solar system. A second user study with this model revealed a hierarchy of interaction, with BLV users preferring tactile exploration, followed by touch gestures to trigger audio labels, and then natural language to fill in knowledge gaps and confirm understanding. Comment: Paper presented at ACM CHI 2020: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, ACM, New York, April 2020.
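
    The reported hierarchy of interaction can be pictured as a simple dispatch policy; the Python sketch below is purely illustrative (not the study prototype) and answers a touch gesture with the touched component's audio label, falling back to a conversational answer for spoken questions.

    # Illustrative dispatch for an interactive 3D model (I3M): touch gestures trigger
    # audio labels, and natural-language questions fill remaining knowledge gaps.
    # Hypothetical example; not the prototype from the study.

    AUDIO_LABELS = {
        "mars": "Mars, the fourth planet from the Sun.",
        "saturn": "Saturn, known for its prominent ring system.",
    }

    FACTS = {
        ("mars", "moons"): "Mars has two small moons, Phobos and Deimos.",
    }

    def handle_interaction(event):
        if event["type"] == "touch":
            # Preferred after tactile exploration: a short audio label for the touched part.
            return AUDIO_LABELS.get(event["part"], "No label available for this part.")
        if event["type"] == "speech":
            # Natural language as a fallback for questions the labels do not answer.
            return FACTS.get((event["part"], event["topic"]),
                             "I don't have that information; try asking about another part.")
        return "Unrecognized interaction."

    print(handle_interaction({"type": "touch", "part": "saturn"}))
    print(handle_interaction({"type": "speech", "part": "mars", "topic": "moons"}))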