In many virtual worlds, multiple characters are present in the same scene and interact with each other. Many social activities involve multiple parties that adapt their behaviour to one another; examples are dancing, talking, or walking in a group. Since one of the main goals of virtual environments is to create a social experience, it is crucial that characters and avatars in these environments move according to such social rules.
In this work package we investigate new techniques for computing realistic movements and animations for such socially-driven multi-character animation. This will involve new algorithms for navigation and collision avoidance among groups of closely interacting characters, models and algorithms for mirroring behaviour and for emotional interplay, and techniques to adapt the animations of other characters to match the avatar's movements.
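To illustrate the kind of local navigation this work package targets, the sketch below shows a minimal social-force-style update step: each agent steers toward its goal while being pushed away from neighbours that come within a given radius. This is an illustrative simplification, not the algorithm to be developed; all function and parameter names (`avoidance_step`, `radius`, `speed`) are assumptions for the example.

```python
import math

def avoidance_step(agents, goals, speed=0.1, radius=1.0, dt=1.0):
    """One update step of a simplified social-force model (illustrative).

    Each agent moves at constant speed toward its goal while being
    repelled by neighbours closer than `radius`.
    """
    new_positions = []
    for i, (x, y) in enumerate(agents):
        gx, gy = goals[i]
        # Attraction toward the goal, normalised to unit length.
        dx, dy = gx - x, gy - y
        dist = math.hypot(dx, dy) or 1.0
        vx, vy = dx / dist, dy / dist
        # Repulsion from each nearby neighbour, stronger when closer.
        for j, (ox, oy) in enumerate(agents):
            if i == j:
                continue
            rx, ry = x - ox, y - oy
            d = math.hypot(rx, ry)
            if 0 < d < radius:
                push = (radius - d) / radius
                vx += push * rx / d
                vy += push * ry / d
        # Re-normalise so every agent moves at constant speed.
        norm = math.hypot(vx, vy) or 1.0
        new_positions.append((x + speed * dt * vx / norm,
                              y + speed * dt * vy / norm))
    return new_positions
```

A real multi-character system would replace this with velocity-based methods (e.g. reciprocal velocity obstacles) and couple the steering to full-body animation, which is precisely the gap the proposed research addresses.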
The social interplay between characters is observed by the user through a virtual camera. The location and movement of the camera strongly influence the affective experience of the user as well as the effectiveness of his or her actions. In this work package we will develop new techniques to automatically control the camera based only on global characteristics of the group's current activity (e.g. dancing, talking, walking together), thus enhancing the user experience.
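As a minimal example of camera control driven by a global group characteristic, the sketch below frames a group of characters by placing the camera behind the group centroid, pulled back far enough that the group's bounding radius fits the horizontal field of view. It is a sketch under stated assumptions; the function name `frame_group` and all parameters (`fov_deg`, `margin`, `height`) are hypothetical, and the proposed techniques would additionally react to the activity type and camera motion over time.

```python
import math

def frame_group(positions, fov_deg=60.0, margin=1.2, height=1.6):
    """Place a camera so a group of characters fits in view (illustrative).

    `positions` are (x, y) ground-plane positions of the characters.
    Returns a (camera, look_at) pair of 3D points: the camera sits behind
    the group centroid along -y, at roughly eye height, at a distance
    where the padded bounding radius subtends half the field of view.
    """
    n = len(positions)
    cx = sum(p[0] for p in positions) / n
    cy = sum(p[1] for p in positions) / n
    radius = max(math.hypot(p[0] - cx, p[1] - cy) for p in positions)
    radius = max(radius, 0.5)  # never move closer than for a single character
    # Pull back so margin * radius fits within half the horizontal FOV.
    dist = margin * radius / math.tan(math.radians(fov_deg) / 2.0)
    camera = (cx, cy - dist, height)
    look_at = (cx, cy, height)
    return camera, look_at
```

The same centroid-and-radius summary generalises to other activities: a conversation might bias the camera toward the current speaker, while a walking group would add a velocity term so the camera leads the motion.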