
VIEWW (Virtual worlds for well-being)


Well-being is important, for both the elderly and the young. The pace of life is ever increasing, disconnecting us from other people, causing us to neglect our physical fitness, and leaving us too little time to experience life. Virtual worlds play an increasingly important role in our lives as places where people meet and make friends. They influence the way we live, learn, communicate, heal, and entertain. When designed and applied appropriately, they can have a strong positive influence on our well-being. Current virtual worlds, however, exhibit poor affect and therefore do not offer a rich, emotional experience. In particular, we need to increase the affective impact of virtual characters and provide enhanced interfaces between the real and virtual world with which users can steer their avatars and express their emotions.

The ICT challenges of this project are:

1. Create a model of a complex state such as well-being. It is unclear how to combine information from different observables to measure well-being. We aim to use sensors to measure aspects such as smiling, relaxed behavior, and friendly touch, but how to combine these into a reproducible representation of such a notion is an open question. The challenge is to combine these signals into one model for an entity without ground truth.

2. Create virtual worlds. At this moment little is known about which elements of a virtual world contribute most to a specific form of behavior. The ICT challenge is to determine the kind of information that best supports the targeted change of behavior.

3. Introduce emotional aspects into virtual worlds. Most games are limited to large body movements, leaving small and complicated facial movements aside. The ICT challenge is to understand which small expressions on an avatar's face will be associated with the right emotion. Moreover, the real challenge is not only expressing an emotion in a virtual world, but understanding what induces a specific emotion. These challenges will be addressed before and while creating new technology.
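To make challenge 1 concrete, the fusion problem could be sketched as follows. This is a minimal illustration, not the project's method: the sensor names, sample values, and the choice of equal-weight averaging over standardised signals are all assumptions made for the example; without ground truth, any weighting scheme would itself be a research outcome.

```python
# Illustrative sketch: fusing heterogeneous well-being signals
# (smile frequency, posture relaxation, friendly-touch events) into
# a single score without ground truth. All names and numbers are
# hypothetical.

def zscore(values):
    """Standardise a signal so readings from different sensors become comparable."""
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5 or 1.0
    return [(v - mean) / std for v in values]

def fuse(signals):
    """Average the standardised signals per time step.

    With no ground truth available, equal weights are a neutral starting
    point; weights could later be tuned from inter-signal agreement.
    """
    standardised = [zscore(s) for s in signals]
    steps = len(standardised[0])
    return [sum(s[t] for s in standardised) / len(standardised)
            for t in range(steps)]

# Three toy sensor streams sampled over five sessions.
smiles = [2, 5, 3, 8, 6]                 # smiles per minute
relaxation = [0.2, 0.6, 0.4, 0.9, 0.7]   # posture relaxation score
touches = [0, 2, 1, 3, 2]                # friendly-touch events

wellbeing = fuse([smiles, relaxation, touches])
```

Because each standardised signal sums to zero, the fused score is a relative measure: it ranks sessions against each other rather than against an absolute scale, which matches the "no ground truth" setting.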

The overarching question is how to create affective virtual worlds in which users can have rich and rewarding experiences. The scientific objectives are to research new technology to measure the emotional state and intended behavior of users, to mimic this in their avatars, and to develop algorithms to express emotion and social interaction in virtual characters. The techniques will be generically applicable. At the same time, we will direct the results through demonstrators of well-being that encourage and coach people. The research questions are:

a. How do we create animations (body and face) for virtual characters to express emotions and physical states and to mimic social interaction, and how can this be based on biomechanical models?
b. How do we extract semantic parameters from visual and musical performances to model avatar behavior?
c. How can we create effective tangible interfaces for users and for controlling avatars?
d. How can we effectively use these enabling technologies to monitor, stimulate, and coach people to improve their well-being?
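Research question (a) and ICT challenge 3 can be illustrated with a toy nearest-prototype scheme: describe an avatar's face as a handful of action-unit activations and match it against emotion prototypes. The action units, prototype values, and distance metric below are illustrative assumptions, not a validated facial model.

```python
# Illustrative sketch: associating small facial movements with an emotion.
# Each face is a dict of FACS-style action-unit activations in [0, 1].
# The prototypes are hypothetical, chosen only for this example.

EMOTION_PROTOTYPES = {
    "joy":      {"lip_corner_up": 1.0, "eye_widen": 0.3},
    "sadness":  {"brow_raise": 0.4, "lip_corner_down": 1.0},
    "surprise": {"brow_raise": 1.0, "eye_widen": 1.0},
}

def classify(observed):
    """Return the prototype emotion closest (squared distance) to the observed activations."""
    def distance(proto):
        units = set(observed) | set(proto)
        return sum((observed.get(u, 0.0) - proto.get(u, 0.0)) ** 2
                   for u in units)
    return min(EMOTION_PROTOTYPES, key=lambda e: distance(EMOTION_PROTOTYPES[e]))

# A slight smile with slightly widened eyes.
face = {"lip_corner_up": 0.8, "eye_widen": 0.2}
emotion = classify(face)
```

The same mapping could run in the other direction, blending prototype activations to animate an avatar; the harder question raised in the text, what *induces* an emotion in the user, is not addressed by such a mapping.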