ANI1: Affective Body Animation

VIEWW (Virtual worlds for well-being)

Users of virtual worlds are normally represented by their avatars. In 3D virtual environments, these avatars are mostly animated virtual characters, and other computer-controlled entities are represented through animated characters as well. In many current systems, these characters can only be controlled through very basic means, such as a small set of pre-recorded motions or a few different facial expressions. To create more involved experiences in virtual worlds, it is essential that virtual characters can express their emotions (such as happiness) and physical state (such as tiredness) much more convincingly. To achieve this, not only should the visualization of these aspects be realistic, but users should also be able to steer these aspects of their avatars in an easy and natural way.

In this work package we will develop an integrated framework in which motion and emotional expressions are combined into a generic approach for affective character animation. To this end we will develop new algorithms that automatically compute synchronized facial and body motions expressing a variety of emotions and physical states, focusing on stronger expressions such as laughing, crying, shouting, and heavy breathing. We will also develop a mechanism through which users can steer the animation of their avatars via a simple interface, such as a few sensors placed on the user's arms and legs, driving an animation engine that translates these signals into corresponding avatar motions.
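To make the intended pipeline concrete, the sketch below illustrates the two ideas in simplified form: an emotion-intensity parameter that blends a neutral body pose toward an expressive key pose (the same parameter would also drive the facial animation for synchrony), and a mapping from a limb-mounted sensor reading to a joint angle. All names, data types, and the linear blending scheme are illustrative assumptions, not the framework's actual design.

```python
import math
from dataclasses import dataclass

# Hypothetical data type; the actual framework's interfaces are not
# specified in this work-package description.
@dataclass
class Pose:
    """Joint angles (radians) keyed by joint name."""
    joints: dict

def blend_poses(neutral: Pose, expressive: Pose, intensity: float) -> Pose:
    """Linearly interpolate from a neutral pose toward an expressive
    key pose (e.g. a 'laughing' pose), scaled by an emotion intensity
    in [0, 1]. A full system would synchronize facial blend-shape
    weights with the same intensity parameter."""
    t = max(0.0, min(1.0, intensity))
    return Pose({
        name: (1.0 - t) * angle
              + t * expressive.joints.get(name, angle)
        for name, angle in neutral.joints.items()
    })

def sensor_to_joint_angle(accel_y: float, accel_z: float) -> float:
    """Map one limb-mounted accelerometer reading to a flexion angle
    via the sensor's tilt: a deliberately crude stand-in for the
    planned animation engine."""
    return math.atan2(accel_y, accel_z)

# Usage: drive a 'laughing' pose at 70% intensity, then override one
# joint with a sensed arm angle.
neutral = Pose({"elbow_r": 0.1, "shoulder_r": 0.0, "spine": 0.0})
laughing = Pose({"elbow_r": 1.2, "shoulder_r": 0.4, "spine": -0.2})
frame = blend_poses(neutral, laughing, intensity=0.7)
frame.joints["elbow_r"] = sensor_to_joint_angle(accel_y=0.8, accel_z=0.6)
print(frame)
```

The key design point the sketch tries to capture is that a single affect parameter steers both the body motion and (in the full system) the facial expression, while sensor input can take over individual joints so that the user's own movements remain visible in the avatar.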

WP Leader: