In this work we propose a talking head system for animating facial expressions using a template face generated from Partial Differential Equations (PDEs). It uses a set of preconfigured curves to compute an internal template face surface. This surface is then used to associate various facial features with a given 3D face object. Motion retargeting is then used to transfer the deformations in these areas from the template to the target object. The procedure continues until all the expressions in the database have been calculated and transferred to the target 3D human face object. Additionally, the system interacts with the user through an artificial intelligence (AI) chatterbot, which generates a response to a given text. Speech and facial animation are synchronized using the Microsoft Speech API, where the response from the AI bot is converted to speech.
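The core idea of a PDE-based template surface is that the face patch is obtained by solving an elliptic boundary-value problem whose boundary conditions are the preconfigured curves. The published PDE method typically uses a fourth-order equation with derivative boundary conditions; the sketch below is a deliberately simplified second-order analogue (a weighted Laplace equation relaxed by Jacobi iteration), with hypothetical function and parameter names, intended only to illustrate how four boundary curves can determine an interior surface.

```python
import numpy as np

def pde_surface(top, bottom, left, right, a=1.0, iters=5000):
    """Generate an interior surface patch from four boundary curves by
    relaxing the elliptic PDE  X_uu + a^2 * X_vv = 0  (Jacobi iteration).

    Each curve is an (n, 3) array of 3D points; all four must share the
    same sample count n, and the corner points are assumed consistent.
    NOTE: this is a simplified second-order sketch, not the fourth-order
    PDE with derivative conditions used in the actual method.
    """
    n = len(top)
    X = np.zeros((n, n, 3))
    # Pin the four boundary curves (Dirichlet conditions).
    X[0, :], X[-1, :] = top, bottom
    X[:, 0], X[:, -1] = left, right
    w = a * a  # smoothing-ratio parameter between the u and v directions
    for _ in range(iters):
        # Jacobi update from the 5-point finite-difference stencil.
        X[1:-1, 1:-1] = (
            X[:-2, 1:-1] + X[2:, 1:-1]
            + w * (X[1:-1, :-2] + X[1:-1, 2:])
        ) / (2.0 + 2.0 * w)
    return X

# Example: four straight edge curves of a flat unit patch.
t = np.linspace(0.0, 1.0, 20)
z = np.zeros(20)
surface = pde_surface(
    top=np.stack([t, z, z], axis=1),
    bottom=np.stack([t, np.ones(20), z], axis=1),
    left=np.stack([z, t, z], axis=1),
    right=np.stack([np.ones(20), t, z], axis=1),
)
```

Once such a template surface exists, its known parameterization is what makes it possible to associate facial feature regions with corresponding regions on an arbitrary target face mesh.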