With Philips (Video Processing and Analysis Group), this WP will focus on human behavior analysis and interaction. The group contributes to new products in line with the "sense & simplicity" brand promise, by making them adaptive and responsive to the people using them. Sensory signals include body gestures and posture, facial expressions, and eye gaze, among others.
The concrete objective of this project is to investigate principled models and algorithms for constructing socially aware systems that support human behavior analysis. So far, human behavior analysis has typically relied on only one of the above information sources.
This WP fuses the different modalities to go beyond standard, single-modality human analysis, enabling multi-modal semantic (human behavior) analysis in video. Potential uses of the results are to recognize emotional aspects of dance such as tension, hope, and joy (PIL 1), to enable humans to perform selected sports or playful activities in a virtual world with real-life effects (PIL 2), and to measure gait and motion under various possible stimuli (PIL 3).
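To make the multi-modal fusion idea concrete, the sketch below shows one common approach, decision-level (late) fusion, in which per-modality classifier outputs are combined by a weighted average. The modality names, class counts, and weights here are purely illustrative assumptions, not specifications from the project.

```python
import numpy as np

def fuse_modalities(scores: dict, weights: dict) -> np.ndarray:
    """Weighted average of per-modality class probabilities (late fusion).

    scores  -- maps modality name to a probability vector over behavior classes
    weights -- maps modality name to its (hypothetical) reliability weight
    """
    total = sum(weights[m] for m in scores)
    fused = sum(weights[m] * scores[m] for m in scores) / total
    return fused

# Hypothetical per-modality probabilities over three behavior classes.
scores = {
    "gesture": np.array([0.6, 0.3, 0.1]),
    "face":    np.array([0.5, 0.4, 0.1]),
    "gaze":    np.array([0.2, 0.5, 0.3]),
}
weights = {"gesture": 0.5, "face": 0.3, "gaze": 0.2}

fused = fuse_modalities(scores, weights)
print(fused)           # combined class probabilities
print(fused.argmax())  # index of the most likely behavior class
```

Late fusion is only one design choice; feature-level (early) fusion, which concatenates modality features before classification, is an equally plausible route for the semantic analysis envisioned here.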
The aim is to research and develop new technology to measure the emotional state and behavior of humans, and to address the challenge of constructing socially aware systems for humans. To achieve this, we develop semantic identification technology and services to support human behavior analysis and interaction. Content-based (video) human analysis will be applied to understand human behavior in different scenarios, particularly physical wellbeing and fitness.