Current body animation systems for Interactive Virtual Humans are mostly procedural or key-frame based. Although such methods give the animation system a high degree of flexibility, it is often not possible to create animations as realistic as those obtained with a motion capture system. Simply substituting motion-captured animation segments for key-framed gestures is not a good solution, since virtual human animation systems also specify gesture parameters that affect style, such as expressing emotion or stressing part of a speech sequence. In this paper, we describe an animation system that allows for the synthesis of realistic communicative body motions according to an emotional state, while still retaining the flexibility of procedural gesture synthesis systems. These motions are constructed as a blend of idle motions and gesture animations. From an animation specified for only a few joints, the dependent joint motions are calculated automatically and in real time. Realistic balance shifts adapted from motion capture data are generated on the fly, resulting in a fully controllable body animation that is adaptable to individual characteristics and directly playable on different characters at the same time.
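The abstract mentions constructing motions as a blend of idle motions and gesture animations. A minimal sketch of what such a per-joint blend could look like, assuming joint rotations are stored as unit quaternions and blended by spherical linear interpolation (the function and parameter names here are illustrative, not the paper's actual API):

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = sum(a * b for a, b in zip(q0, q1))
    # Flip one quaternion if needed so interpolation takes the shorter arc.
    if dot < 0.0:
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:
        # Nearly parallel: fall back to a normalized linear interpolation.
        q = tuple(a + t * (b - a) for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in q))
        return tuple(c / n for c in q)
    theta = math.acos(dot)
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))

def blend_pose(idle_pose, gesture_pose, weight):
    """Blend two poses (dicts mapping joint name -> quaternion).

    weight = 0 yields the pure idle pose, weight = 1 the pure gesture pose;
    intermediate weights mix the two, e.g. driven by an emotional-state parameter.
    """
    return {joint: slerp(idle_pose[joint], gesture_pose[joint], weight)
            for joint in idle_pose}
```

Blending at the quaternion level avoids gimbal problems that arise when interpolating Euler angles directly, which is one reason skeletal animation systems typically store joint rotations as quaternions.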