Hand Gesture-based User Interface in Ubiquitous Virtual Reality
Youngkyoon Jang
Max-Planck-Institut für Informatik - D4
Talk
Youngkyoon Jang has been a Research Associate at the Augmented Human Research Center at KAIST, South Korea, since August 2015. He obtained his PhD from KAIST in 2015. His research explores novel natural user interface technologies that aim to overcome challenges in human–computer interaction in wearable AR/VR environments, specializing in understanding human behavior, understanding scenes, and identifying users based on gesture interaction, mobile and wearable computing, and visual computing.
Ubiquitous VR (UVR) research aims to develop new computing paradigms for “AR/VR Life in Ubiquitous Smart Spaces”. With recent advances in IoT and AR/VR technologies, the concept of UVR is expected to be realized sooner than anticipated. This talk covers (1) recent AR/VR research trends in the context of UVR, and (2) a spatio-temporal classifier specifically designed for hand gesture estimation, targeting VR object selection and manipulation from an egocentric viewpoint. The talk focuses on hand gesture-based UI, which plays a key role in novel human–machine interfaces. Estimating hand posture and gesture from an egocentric viewpoint is highly challenging, mainly due to self-occlusion (missing visual information). We have tackled these problems through various novel ideas built on top of cutting-edge techniques. We conclude the talk with some future directions, including facial landmark detection and the combination of the two modalities to understand the user’s intention.
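The talk does not specify the internals of the classifier, but the general idea of a spatio-temporal gesture classifier can be illustrated with a minimal sketch: each gesture sample is a short sequence of tracked hand positions, combined into one feature vector of per-frame positions (spatial) plus frame-to-frame displacements (temporal), classified here with a simple nearest-centroid rule. All names, the toy gestures, and the classifier choice below are illustrative assumptions, not the method presented in the talk.

```python
# Hypothetical sketch of a spatio-temporal hand-gesture classifier.
# A gesture sample is a list of (x, y) fingertip positions, one per
# egocentric camera frame; NOT the actual classifier from the talk.
from statistics import mean

def features(seq):
    """Concatenate spatial (per-frame positions) and temporal
    (frame-to-frame displacement) components into one vector."""
    spatial = [c for pos in seq for c in pos]
    temporal = [b - a for p0, p1 in zip(seq, seq[1:])
                for a, b in zip(p0, p1)]
    return spatial + temporal

def centroid(vecs):
    """Component-wise mean of a list of equal-length vectors."""
    return [mean(col) for col in zip(*vecs)]

def dist2(u, v):
    """Squared Euclidean distance."""
    return sum((a - b) ** 2 for a, b in zip(u, v))

def train(samples):
    """samples: {label: [sequence, ...]} -> {label: centroid}."""
    return {lbl: centroid([features(s) for s in seqs])
            for lbl, seqs in samples.items()}

def classify(model, seq):
    """Assign the label whose centroid is nearest in feature space."""
    f = features(seq)
    return min(model, key=lambda lbl: dist2(model[lbl], f))

# Toy data: a "swipe" moves along +x, a "pinch" stays in place.
train_data = {
    "swipe": [[(0, 0), (1, 0), (2, 0)], [(0, 1), (1, 1), (2, 1)]],
    "pinch": [[(1, 1), (1, 1), (1, 1)], [(2, 0), (2, 0), (2, 0)]],
}
model = train(train_data)
print(classify(model, [(0, 2), (1, 2), (2, 2)]))  # prints "swipe"
```

In practice the spatial features would come from a hand pose estimator (e.g. joint keypoints rather than a single fingertip), and the classifier would be a learned model; the centroid rule here only serves to make the spatial-plus-temporal feature idea concrete.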