    Personality-Driven Gaze Animation with Conditional Generative Adversarial Networks

    We present a generative adversarial learning approach to synthesize the gaze behavior of a given personality. We train the model on an existing data set comprising eye-tracking data and personality traits of 42 participants performing an everyday task. Given the values of the Big-Five personality traits (openness, conscientiousness, extroversion, agreeableness, and neuroticism), our model generates time series data consisting of gaze target, blinking times, and pupil dimensions. We use the generated data to synthesize the gaze motion of virtual agents in a game engine.

    Comment: 7 pages, 5 figures
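
    As a rough illustration of how a generator might be conditioned on Big-Five trait values, here is a minimal PyTorch-style sketch; the layer sizes, sequence length, and per-timestep feature layout are assumptions chosen for illustration, not the architecture used in the paper.

    # Minimal sketch of a trait-conditioned generator for gaze time series.
    # All dimensions and the feature layout are illustrative assumptions.
    import torch
    import torch.nn as nn

    class GazeGenerator(nn.Module):
        def __init__(self, noise_dim=64, trait_dim=5, seq_len=100, feat_dim=4):
            super().__init__()
            self.seq_len = seq_len
            self.feat_dim = feat_dim
            # Condition on the Big-Five trait vector by concatenating it
            # with the noise input before the fully connected layers.
            self.net = nn.Sequential(
                nn.Linear(noise_dim + trait_dim, 256),
                nn.ReLU(),
                nn.Linear(256, 512),
                nn.ReLU(),
                nn.Linear(512, seq_len * feat_dim),
            )

        def forward(self, noise, traits):
            # noise: (batch, noise_dim); traits: (batch, 5) Big-Five scores.
            x = torch.cat([noise, traits], dim=1)
            out = self.net(x)
            # Reshape to a time series, e.g. gaze target (x, y), blink flag,
            # pupil size per timestep (hypothetical feature layout).
            return out.view(-1, self.seq_len, self.feat_dim)

    # Hypothetical usage: sample gaze behavior for one personality profile.
    gen = GazeGenerator()
    traits = torch.tensor([[0.7, 0.4, 0.9, 0.5, 0.2]])  # O, C, E, A, N in [0, 1]
    noise = torch.randn(1, 64)
    series = gen(noise, traits)  # shape: (1, 100, 4)

    In a conditional GAN setup, the discriminator would receive the same trait vector alongside the real or generated time series, so both networks learn the association between personality and gaze behavior.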