
    Data Mining to Support Human-Machine Dialogue for Autonomous Agents

    Abstract. Next-generation autonomous agents will be expected to converse with people to achieve their mutual goals. Human-machine dialogue, however, is challenged by noisy acoustic data and by people's preference for more natural interaction. This paper describes an ambitious project that embeds human subjects in a spoken dialogue system. It collects a rich and novel data set, including spoken dialogue, human behavior, and system features. During data collection, subjects were restricted to the same databases, action choices, and noisy automated speech recognition output as a spoken dialogue system. This paper mines that data to learn how people manage the problems that arise during dialogue under such restrictions. Two distinct approaches to successful, goal-directed dialogue are identified in this way, and supervised learning on the resulting data predicts appropriate dialogue choices. The resultant models can then be incorporated into an autonomous agent that seeks to assist its user.
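
    To make the abstract's claim about supervised learning concrete, the sketch below trains a simple classifier to predict a dialogue action from features of a turn. It is only an illustration under assumed features (ASR confidence, database match count, turn count) and assumed action labels; none of these names, data points, or model choices come from the paper itself.

    ```python
    # Hypothetical sketch: predict a dialogue action (e.g., confirm, ask-repeat,
    # answer) from features of the noisy ASR output and dialogue state.
    # Feature names, labels, and example data are illustrative, not from the paper.
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Each turn is described by a small feature dictionary (assumed features).
    turns = [
        {"asr_confidence": 0.32, "db_matches": 0,  "turns_so_far": 2},
        {"asr_confidence": 0.91, "db_matches": 1,  "turns_so_far": 3},
        {"asr_confidence": 0.55, "db_matches": 12, "turns_so_far": 1},
        {"asr_confidence": 0.85, "db_matches": 4,  "turns_so_far": 5},
    ]
    # Dialogue action the human subject chose at that turn (illustrative labels).
    actions = ["ask-repeat", "answer", "narrow-query", "confirm"]

    # Vectorize the feature dicts and fit a simple multiclass classifier.
    model = make_pipeline(DictVectorizer(sparse=False),
                          LogisticRegression(max_iter=1000))
    model.fit(turns, actions)

    # Predict the appropriate action for a new, unseen turn.
    new_turn = {"asr_confidence": 0.40, "db_matches": 7, "turns_so_far": 2}
    print(model.predict([new_turn])[0])
    ```

    In practice, any classifier trained on the collected corpus of human choices could play this role; the point is that each recorded turn supplies a feature vector and the human's action supplies the label.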