In this talk I will give an overview of two of my recent works, showing how a real-time 3D range sensor can be used to accurately reconstruct digital models together with their motion and deformation, and present a practical system for live facial puppetry. For the first project, I will highlight specific techniques such as deformation-adaptive non-rigid registration and stable tracking in long sequences using a plastic-elastic deformation model, and present a bi-resolution framework that distinguishes between transient and persistent details. In the second part, I will introduce Face/Off, the first real-time markerless facial expression retargeting system based on 3D range data.
The presented projects are the result of two collaborations:
1. Robust Single-View Geometry and Motion Reconstruction, with Bart Adams (KU Leuven/Stanford), Leonidas J. Guibas (Stanford), and Mark Pauly (ETH Zurich), submitted to ACM SIGGRAPH Asia 2009
2. Face/Off: Live Facial Puppetry, with Thibaut Weise (ETH Zurich), Luc Van Gool (ETH Zurich/KU Leuven), and Mark Pauly (ETH Zurich), submitted to ACM SIGGRAPH/Eurographics SCA 2009