Campus Event Calendar

Event Entry

What and Who

Multiperspective and Multipresence in Computer Interfaces: Stereographic Mixed Reality

Michael Cohen
University of Aizu
AG4 Group Meeting
AG 4  
AG Audience
English

Date, Time and Location

Thursday, 27 March 2003
13:00
45 Minutes
46.1 - MPII
019
Saarbrücken

Abstract

Recently the study of computer-human interfaces has emerged as a
"full citizen" of the computer science domain, broadly spanning
research across various foci, including multimedia, networks, and
ubiquitous computing, and spawning its own conferences, journals,
and degree programs. Anticipating some emerging user interfaces, this
talk will survey some trends in mobile telephony (especially
regarding the mobile internet), wearable/intimate multimedia computing,
handheld/nomadic/portable interfaces, and embedded systems like
multimedia furniture and spatially immersive displays. Representative
instances will be cited, and groupware selection functions and their
predicate-calculus notation will be reviewed.

Then I will introduce a program inspired by such ideas. Previous
research by our group introduced the idea of stereographic panoramic
browsing via our "VR4U2C" ("virtual reality for you to see") client,
subsequently extended to allow not only zooming, a simple
magnification-driven manipulation of the image, but also dollying,
which permits limited viewpoint motion. Dollying is enabled by
multiple panoramic captures and in turn enables looming as well as
viewpoint-selective occlusion and revealing of background scene
objects. We have integrated this client with a sibling client based
on Java3D, allowing real-time visualization of the dollying and
juxtaposition of multiperspective (ego- and exocentric) stereographic
CG (computer graphic) and IBR (image-based rendering) displays.
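
To make the distinction concrete, here is a minimal, self-contained Java
sketch (a hypothetical illustration only, not the VR4U2C or Java3D code;
the class name, distances, and magnification factor are invented for the
example). It contrasts zooming, which rescales everything in the projected
image uniformly, with dollying, which moves the viewpoint and so changes
relative angular sizes, the effect behind looming and viewpoint-selective
occlusion and revealing.

// Hypothetical sketch, not VR4U2C code: zooming magnifies the whole image
// uniformly, while dollying moves the viewpoint, changing parallax and so
// which background objects a nearer object occludes or reveals.
public class ZoomVsDolly {

    // Apparent angular size (radians) of an object of width w at distance d,
    // scaled by a simple magnification factor `zoom` (1.0 = no zoom).
    static double apparentSize(double w, double d, double zoom) {
        return 2.0 * Math.atan(w / (2.0 * d)) * zoom;
    }

    public static void main(String[] args) {
        double near = 2.0, far = 10.0, width = 1.0;

        // Zooming by 2x: both objects grow by the same factor, so their size
        // ratio (and hence the occlusion relationship) is unchanged.
        double zoomRatio = apparentSize(width, near, 2.0)
                         / apparentSize(width, far, 2.0);

        // Dollying one unit toward the scene: the near object grows faster
        // than the far one (looming), so the ratio, and what it hides, changes.
        double dollyRatio = apparentSize(width, near - 1.0, 1.0)
                          / apparentSize(width, far - 1.0, 1.0);

        System.out.printf("near/far size ratio after zoom:  %.2f%n", zoomRatio);
        System.out.printf("near/far size ratio after dolly: %.2f%n", dollyRatio);
    }
}

Running this prints roughly 4.90 for the zoom case (the same ratio as with no
zoom at all) and roughly 8.35 after the dolly, which is the sense in which
only dollying, not zooming, changes the viewpoint.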

Such extended and enriched interfaces, especially coupled with
position-tracking systems, encourage multipresence: the simultaneous
inhabiting of multiple spaces by sources and sinks, allowing, for
instance, a user to monitor several aligned spaces at once
(conferences, entertainment, navigation, warnings, etc.). Our group
is developing narrowcasting commands, conference selection functions
for adjusting groupware situations in which users have such
multipresence.
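
As a hedged illustration of what such a conference selection function might
look like in predicate-calculus style (the talk reviews the actual notation;
the attribute names mute and solo below are only example exclude/include
predicates for sources), a source x could be deemed audible unless it is
explicitly muted, or some peer is soloed while x is not:

\[
\mathrm{audible}(x) \;\Longleftrightarrow\; \neg\,\mathrm{mute}(x) \,\wedge\, \bigl( \exists y\, \mathrm{solo}(y) \Rightarrow \mathrm{solo}(x) \bigr)
\]

A symmetric rule over analogous attributes could govern sinks, which is the
kind of adjustment the narrowcasting commands expose.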

Bio:
~~~~
Michael Cohen received an Sc.B. in EE from Brown University, an M.S.
in CS from the University of Washington, and a Ph.D. in EECS from
Northwestern University. He has worked at the Air Force Geophysics
Lab (Hanscom Field, Massachusetts), the Weizmann Institute (Rehovot,
Israel), Teradyne (Boston, Massachusetts), BBN (Cambridge,
Massachusetts, and Stuttgart, Germany), Bellcore (Morristown and Red
Bank, New Jersey), the Human Interface Technology Lab (Seattle,
Washington), and the Audio Media Research Group at the NTT Human
Interface Lab (Musashino and Yokosuka, Japan).

He is currently an Associate Professor in the Human Interface Lab at
the University of Aizu (Aizu-Wakamatsu, Japan), where he teaches
Information Theory, Acoustic Modeling, and Computer Music. He is
researching advanced user interfaces for virtual reality and
telepresence systems, including networked multimodal spatial media
(stereographic and/or panoramic, spatial audio, force-feedback). He
is a member of the ACM, IEEE Computer Society, 3D-Forum, TUG (TeX
Users Group), and VRSJ (Virtual Reality Society of Japan). Cohen is
the author or coauthor of two patents, four book chapters, and over a
hundred technical publications.

Contact

Alexander Belyaev