
    Lip Synchronization for ECA Rendering with Self-Adjusted POMDP Policies

    Recent advances in virtual reality have enabled the creation of autonomous agents that help users retrieve and process digital information or carry out requested tasks. Known as embodied conversational agents (ECAs), these intelligent agents bridge the physical and virtual worlds by communicating with the user through natural verbal and non-verbal channels. To provide a positive user experience, an ECA must not only appear human-like but also correctly identify the user's intention so that it can assist the user appropriately. This thesis continues our research group's work on improving POMDP-based dialogue management by applying machine learning to the POMDP's belief state history. It integrates a technique that matches the ECA's lip movements to the rendered audio alongside the automatically selected emotion. Finally, it conducts experiments that use machine learning techniques to adjust POMDP policies and compares their effectiveness in terms of dialogue length and successful intention discovery rate.
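
    As background for the POMDP-based dialogue management mentioned above, the sketch below shows a minimal discrete belief-state update over a small set of user intentions. The intention labels, transition matrix T, and observation model O are hypothetical placeholders for illustration only, not the models or policies used in the thesis.

```python
import numpy as np

# Hypothetical intention set for illustration.
intentions = ["check_email", "set_reminder", "ask_weather"]

# T[a][s, s']: probability the user's intention shifts from s to s'
# after the agent takes dialogue action a (here a single "ask" action).
T = {"ask": np.array([[0.90, 0.05, 0.05],
                      [0.05, 0.90, 0.05],
                      [0.05, 0.05, 0.90]])}

# O[a][s', o]: probability of observing keyword o given true intention s'
# after action a (a toy speech-understanding model).
observations = ["email", "reminder", "weather"]
O = {"ask": np.array([[0.8, 0.1, 0.1],
                      [0.1, 0.8, 0.1],
                      [0.1, 0.1, 0.8]])}

def belief_update(belief, action, obs_idx):
    """Standard Bayesian POMDP belief update: predict with T, weight by O, normalize."""
    predicted = belief @ T[action]                 # prior over next intentions
    weighted = predicted * O[action][:, obs_idx]   # likelihood of the observation
    return weighted / weighted.sum()

# Start with a uniform belief, then update after an utterance matching "reminder".
belief = np.full(len(intentions), 1.0 / len(intentions))
belief = belief_update(belief, "ask", observations.index("reminder"))
print(dict(zip(intentions, belief.round(3))))
```

    The belief state history accumulated from updates like this is the kind of data the thesis describes feeding into machine learning to adjust the POMDP policy.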