systems in two-dimensional environments, there are alternatives
that place the user at the center of human-computer interaction
and arrange the technology so that the system adapts to the user's
needs, rather than the user having to adjust to the system environment.
New user interfaces often use three-dimensional display techniques
combined with special input devices that allow the user to interact
with these environments.
Modern interaction styles take place in more than one modality at
a time and are therefore called 'multi-modal interactions'. To
support this kind of interaction, one has to gather information
from several different input devices and combine that input to build
commands.
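The idea of combining input from several devices into a single command can be sketched as follows. This is a hypothetical illustration, not the framework described here: the event types, the time-window fusion rule, and the "put-that-there" style command are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical events from two modalities: speech and pointing.
@dataclass
class SpeechEvent:
    text: str        # e.g. "delete that"
    timestamp: float

@dataclass
class PointEvent:
    target: str      # identifier of the object under the pointer
    timestamp: float

def fuse(speech: SpeechEvent, point: PointEvent, window: float = 1.0):
    """Combine a spoken verb with a pointed-at target into one command,
    provided the two events occurred within a short time window."""
    if abs(speech.timestamp - point.timestamp) <= window:
        verb = speech.text.split()[0]   # crude verb extraction
        return {"action": verb, "object": point.target}
    return None

# Speech at t=10.2 and a pointing gesture at t=10.5 fuse into one command:
cmd = fuse(SpeechEvent("delete that", 10.2), PointEvent("file-42", 10.5))
# cmd == {"action": "delete", "object": "file-42"}
```

Real multi-modal systems use far richer fusion strategies, but the core task is the same: temporally aligning events from independent devices and merging them into a single application-level command.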
The presentation deals with a new approach to handling the problems
arising from multi-modal interaction from a software-engineering point
of view. The framework resulting from this approach guarantees high
reusability of the implemented software components and allows easy
integration of new input devices with both existing and new applications,
freeing application programmers from low-level implementation work.
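One common way such a framework decouples applications from device details is a device-abstraction layer: each device is wrapped in an adapter that emits events in a uniform form. The sketch below is an assumption about how this could look; the class names and event format are invented for illustration and are not taken from the presented framework.

```python
from abc import ABC, abstractmethod

class InputDevice(ABC):
    """Uniform interface every device adapter implements, so the
    application never touches device-specific code."""
    @abstractmethod
    def poll(self) -> dict:
        """Return the next input event in a device-independent form."""

class MouseAdapter(InputDevice):
    def poll(self) -> dict:
        # A real adapter would read from the driver; here a canned event.
        return {"modality": "pointer", "x": 120, "y": 45}

class SpeechAdapter(InputDevice):
    def poll(self) -> dict:
        return {"modality": "speech", "text": "open file"}

def gather(devices):
    """Framework layer: collect events from all registered devices.
    Adding a new device only means writing one new adapter class."""
    return [d.poll() for d in devices]

events = gather([MouseAdapter(), SpeechAdapter()])
```

Under this design, integrating a new input device with an existing application is a matter of registering one more adapter, which is the kind of reusability the abstract claims.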