2020 CONFERENCE

Inspiring Keynote

Future Directions in LED Lighting Reproduction

Paul Debevec – Google VR 

At SIGGRAPH 2002, we showed how to light actors with images of virtual sets displayed on RGB color LED lighting arrays surrounding the actor to create realistic composites into virtual backgrounds, a technique that has become part of today’s most innovative virtual production projects. The future work section of our paper laid out nine avenues for improving the system: 1) Simulating the Sun, 2) Increasing the Lighting Resolution, 3) Simulating Spatially-Varying Light, 4) Recording and Projecting Light Fields, 5) Increasing the Size of the Stage, 6) Simulating Shadows, 7) Improving Matting, 8) Improving Color Rendition with Multispectral Lighting, and 9) Creating Cinematographer-Friendly Tools for Lighting Modification. In this keynote, I’ll show how far we’ve come in each of these areas, as well as in post-production relighting, and what remains to be done to make LED Lighting Reproduction the most powerful virtual production tool it can be.

Personal Profile
Paul Debevec
Senior Staff Engineer – Google VR 

Paul is an Adjunct Research Professor of Computer Science at the USC Institute for Creative Technologies, where he founded the Vision and Graphics Laboratory, and a Senior Staff Engineer at Google. Debevec’s computer graphics research has been recognized with ACM SIGGRAPH’s first Significant New Researcher Award in 2001 for “Creative and Innovative Work in the Field of Image-Based Modeling and Rendering”, a Scientific and Engineering Academy Award in 2010 for “the design and engineering of the Light Stage capture devices and the image-based facial rendering system developed for character relighting in motion pictures”, and the SMPTE Progress Medal in 2017 in recognition of his “achievements and ongoing work in pioneering techniques for illuminating computer-generated objects based on measurement of real-world illumination and their effective commercial application in numerous Hollywood films”. In 2014, he was profiled in The New Yorker magazine’s “Pixel Perfect: The Scientist Behind the Digital Cloning of Actors”. In 2019, Paul received a second Academy Award for Scientific and Technical Achievement for the invention of the Polarized Spherical Gradient Illumination facial appearance capture method, used to create digital actors in motion pictures such as Avatar, Blade Runner 2049, and Gemini Man.

Company Presentation
USC Institute for Creative Technologies

At the University of Southern California Institute for Creative Technologies (ICT), leaders in the artificial intelligence, graphics, virtual reality and narrative communities are working to advance immersive techniques and technologies to solve problems facing service members, students and society. ICT brings film and game industry artists together with computer and social scientists to study and develop immersive media for military training, health therapies, education and more. Research projects explore and expand how people engage with computers, through virtual characters, video games and simulated scenarios. ICT is a recognized leader in the development of virtual humans who look, think and behave like real people. ICT’s groundbreaking research and advanced technology demonstrations are both making an impact today and paving the way for what is possible in the future.
