TAI: Tangible Interfaces

VIEWW (Virtual worlds for well-being)

Everyday objects and wearables are becoming equipped with sensors that register contact, pressure, heat, motion and so on: they become sensitive to the user’s touch. At the same time, they incorporate actuators that can produce such tangible sensations themselves.

In this work package we are concerned with studying, creating, programming and evaluating human-media interaction with smart (tactile) interfaces. In particular, we want to create natural feedback loops between users and objects by combining different forms of control: explicit command-and-control interaction as well as implicit sensing of the user’s experience and actions.

One central theme in this work package will be to investigate new input devices that are sensitive to the way they are “touched” by users. We will research the hardware, algorithms and programming-language primitives that translate raw data captured by the interface into useful interaction parameters and intelligent feedback.
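
As a purely illustrative sketch (not project code), the Python fragment below shows one way raw pressure samples from a touch-sensitive surface might be summarised into such interaction parameters; the sampling rate, thresholds and gesture labels are hypothetical placeholders, not decisions of the work package.

```python
"""Illustrative sketch: raw pressure samples -> interaction parameters.
All constants and labels are hypothetical placeholders."""

from statistics import mean

SAMPLE_RATE_HZ = 100          # assumed sensor sampling rate
PRESSURE_THRESHOLD = 0.15     # assumed minimum pressure that counts as contact


def summarise_touch(samples):
    """Map a raw pressure trace (0.0-1.0 per sample) to interaction parameters."""
    contact = [s for s in samples if s > PRESSURE_THRESHOLD]
    if not contact:
        return {"type": "none"}

    duration_s = len(contact) / SAMPLE_RATE_HZ
    intensity = mean(contact)

    # Crude classification: short contact is a tap, sustained light contact a
    # stroke, sustained firm contact a press. Real devices would need far
    # richer features (position, velocity, contact area, ...).
    if duration_s < 0.2:
        kind = "tap"
    elif intensity < 0.5:
        kind = "stroke"
    else:
        kind = "press"

    return {"type": kind, "duration_s": duration_s, "intensity": intensity}


if __name__ == "__main__":
    firm_press = [0.0, 0.3, 0.7, 0.8, 0.8, 0.7, 0.2, 0.0] * 5
    print(summarise_touch(firm_press))
```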

The other side of the interaction coin concerns output. Systems (commercial or otherwise) that produce tangible output are not as widespread, and only a few tactile sensations are stimulated. Some haptic devices exist for training (but these are expensive and difficult to customize), and there are several prototype systems that make use of vibration, but sound and vision remain the dominant output modalities. This work package will therefore also study the effects of using several modes of tactile stimulation.

Besides questions of interpreting the input (sensing and understanding touch) and of the effects of output (understanding tactile stimulation), this also requires studying the intelligence (artificial emotional, social and rational intelligence) needed to decide on the appropriate timing and type of feedback. Making the objects and the environment smart should result in adaptive and intuitive interfaces.
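
Purely as an illustration of such a decision layer, the sketch below assumes a summarised touch event (like the output of summarise_touch() above) and a user-state label coming from an implicit-sensing component, and picks a feedback mode, delay and strength with hypothetical rules; none of the names, states or thresholds are prescribed by the work package.

```python
"""Illustrative sketch: deciding the type and timing of tactile feedback.
The state labels and rules are hypothetical."""

from dataclasses import dataclass


@dataclass
class Feedback:
    mode: str        # e.g. "vibration", "warmth", "none"
    delay_s: float   # how long to wait before actuating
    strength: float  # normalised 0.0-1.0


def choose_feedback(touch: dict, user_state: str) -> Feedback:
    """Pick a feedback action for a summarised touch event.

    `touch` is assumed to look like {"type": ..., "intensity": ...};
    `user_state` is assumed to come from implicit sensing (e.g. "calm",
    "stressed")."""
    if touch.get("type") == "none":
        return Feedback(mode="none", delay_s=0.0, strength=0.0)

    if user_state == "stressed":
        # Gentle, slightly delayed warmth rather than an abrupt buzz.
        return Feedback(mode="warmth", delay_s=0.5, strength=0.3)

    # Default: mirror the intensity of the user's touch with vibration.
    return Feedback(mode="vibration", delay_s=0.0,
                    strength=min(1.0, touch.get("intensity", 0.0)))


if __name__ == "__main__":
    print(choose_feedback({"type": "press", "intensity": 0.8}, "calm"))
    print(choose_feedback({"type": "press", "intensity": 0.8}, "stressed"))
```
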
The work will proceed through a number of focused user studies, in collaboration with the work packages on scenarios and demos, the design and implementation of smart prototypes, and field studies of their use. The resulting knowledge will be used in various gaming and interaction systems, in particular in the pilot work packages, and to extend the programming language for tangible interfaces.

We start with a scan of state-of-the-art examples of interaction and with observations of users performing various tasks related to the demonstrators in their work packages. We then investigate hardware and software prototypes that are refined over the course of a few iterations. Meaningful ranges of parameter values will be determined, and software will be written that enables applications to make use of the new interfaces.
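
To illustrate what such application-facing software might look like, the sketch below shows a minimal, hypothetical event-dispatch layer through which applications could subscribe to interpreted touch events instead of dealing with raw sensor data; all class, method and event names are assumptions made for the example.

```python
"""Illustrative sketch: an application-facing layer for tangible interfaces.
All names are hypothetical."""

from typing import Callable, Dict, List

TouchEvent = dict          # e.g. {"type": "tap", "intensity": 0.4}
Handler = Callable[[TouchEvent], None]


class TangibleInterface:
    """Dispatches interpreted touch events to interested applications."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Handler]] = {}

    def on(self, event_type: str, handler: Handler) -> None:
        """Register a handler for one event type ("tap", "stroke", ...)."""
        self._handlers.setdefault(event_type, []).append(handler)

    def dispatch(self, event: TouchEvent) -> None:
        """Called by the sensing layer once raw data has been interpreted."""
        for handler in self._handlers.get(event["type"], []):
            handler(event)


if __name__ == "__main__":
    surface = TangibleInterface()
    surface.on("tap", lambda e: print("tap received:", e))
    surface.dispatch({"type": "tap", "intensity": 0.4})
```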

WP Leader: