The Society for the Study of Artificial Intelligence and the Simulation of Behaviour
Abstract
In this paper we describe secondary behaviour: behaviour that is generated autonomously for an avatar. The user controls various aspects of the avatar's behaviour, but a truly expressive avatar must produce more complex behaviour than a user could specify in real time. Secondary behaviour provides some of this expressive behaviour autonomously. However, although it is produced autonomously, the behaviour must be appropriate to the actions that the user is controlling (the primary behaviour) and must correspond to what the user wants. We describe an architecture that achieves these two aims by tagging the primary behaviour with messages to be sent to the secondary behaviour, and by allowing the user to design various aspects of the secondary behaviour before starting to use the avatar. We have implemented this general architecture in a system that adds gaze behaviour to user-designed actions.
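The message-tagging architecture described above can be sketched in outline. This is a minimal illustration, not the paper's implementation: all names here (`PrimaryAction`, `GazeBehaviour`, `perform`, the tag strings) are hypothetical, and it shows only the two ideas of tags carried by primary actions and user-designed responses in the secondary behaviour.

```python
from dataclasses import dataclass, field

@dataclass
class PrimaryAction:
    """A user-controlled action, tagged with messages for secondary behaviour.
    Hypothetical sketch; names are illustrative, not from the paper."""
    name: str
    tags: list = field(default_factory=list)  # messages emitted when the action runs

class GazeBehaviour:
    """Autonomous secondary behaviour, configured by the user before use."""
    def __init__(self, responses):
        self.responses = responses  # user-designed mapping: tag -> gaze response
        self.log = []

    def receive(self, tag):
        # React only to tags for which the user designed a response.
        if tag in self.responses:
            self.log.append(self.responses[tag])

def perform(action, secondary):
    """Run a primary action and forward its tags to the secondary behaviour."""
    for tag in action.tags:
        secondary.receive(tag)
    return action.name

gaze = GazeBehaviour({"greet": "look_at_partner", "walk": "scan_ahead"})
wave = PrimaryAction("wave", tags=["greet"])
perform(wave, gaze)
print(gaze.log)  # ['look_at_partner']
```

The point of the sketch is the decoupling: the primary action only carries tags, while the user's advance design of the `responses` mapping decides what secondary behaviour those tags actually produce.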