Campus Event Calendar

Event Entry

New for: D1, D2, D3, D4, D5

What and Who

From the Silver Screen to the Stadium: Next Gen Motion Capture

Chris Bregler
Courant Institute - New York University
Talk

Chris Bregler is an Associate Professor of Computer Science at NYU's Courant Institute and director of the NYU Movement Lab. He received his M.S. and Ph.D. in Computer Science from U.C. Berkeley and his Diplom from Karlsruhe University. Prior to NYU he was on the faculty at Stanford University and worked for several companies, including Hewlett-Packard, Interval, Disney Feature Animation, and Lucasfilm's ILM. His research and commercial projects push the envelope for motion capture in science and entertainment and have resulted in numerous publications, patents, and awards from the National Science Foundation, Sloan Foundation, Packard Foundation, Electronic Arts, Microsoft, Google, U.S. Navy, U.S. Air Force, and other sources. He has been named a Stanford Joyce Faculty Fellow, Terman Fellow, and Sloan Research Fellow. He received the Olympus Prize for achievements in computer vision and pattern recognition and was awarded the IEEE Longuet-Higgins Prize for "Fundamental Contributions in Computer Vision that have withstood the test of time". Among his non-academic achievements, he was the executive producer of Squidball.net, which required building the world's largest real-time motion capture volume and a massive multi-player motion game that holds several world records with The Motion Capture Society. He was the chair of the SIGGRAPH Electronic Theater and Animation Festival. He has been active in the visual effects industry, for example as the lead developer of ILM's Multitrack system, which has been used in many feature film productions. His work has also been featured in mainstream media such as the New York Times, Los Angeles Times, New York Observer, WIRED, Business Week, Variety, Hollywood Reporter, NBC, CBS, Discovery/Science Channel, and many other outlets.
AG 1, AG 3, AG 5, SWS, AG 2, AG 4, RG1, MMCI  
AG Audience
English

Date, Time and Location

Monday, 1 August 2011
14:00
60 Minutes
E1 4
019
Saarbrücken

Abstract

This talk will cover several research projects centered on the use
of vision and motion capture for animation, recognition, and gaming.
These span human movements as diverse as subtle eye blinks,
lip motions, spine deformations, walks, and dances, captured from
subjects ranging from politicians to baseball pitchers, as well as the
production of the largest motion capture game to date. The technical
content of the talk focuses on the trade-off between data-driven and
crowd-sourced models of human motion versus analytically derived and
perceptually driven models built with dancers, animators, linguists,
and other domain experts. This is demonstrated by sub-pixel tracking
in Hollywood productions, reading the body language of public figures
and academics, visualizing the pitches of NY Yankees pitcher Mariano
Rivera, training pose-metric spaces with the help of fans of a Dutch
hip-hop band, and the making of crowd mocap games in various cultures.

Contact

Christian Theobalt

Christian Theobalt, 08/01/2011 11:23
Christian Theobalt, 07/26/2011 13:52 -- Created document.