
Snap Research creates a new way to digitize and render assets for Augmented Reality

August 8, 2022
NeROIC at SIGGRAPH 2022

This week, Snap’s Research team is presenting an academic paper at SIGGRAPH, the premier conference for computer graphics and interactive techniques.

The NeROIC research paper details a new way to seamlessly create digital assets for augmented reality development from photos and videos of objects sourced from online collections (such as Google Images).

This cutting-edge technology is called Neural Object Capture and Rendering from Online Image Collections, and it removes the need for objects to be photographed and captured in a physical studio, which today is a cumbersome part of the digital asset creation process.

So, how does it work? Our researchers sourced several images and videos of an object taken from different angles, in this example the Nefertiti Bust, and used the NeROIC method to digitize it as an asset, eliminating the need for multi-view studio capture.
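For the technically curious, the sketch below illustrates the general idea behind this family of techniques: fitting a neural radiance field to posed photos of an object and rendering it by compositing color and density along camera rays. This is a minimal, hypothetical PyTorch example, not Snap's NeROIC implementation, which additionally handles the varying cameras, lighting, and backgrounds found in online photos; every name in the sketch (TinyRadianceField, render_rays, the placeholder ray batch) is illustrative.

# Illustrative sketch only: a tiny NeRF-style volume renderer trained on
# posed photos. Names and the random placeholder data are hypothetical;
# NeROIC's real pipeline also estimates camera poses, lighting, and materials.
import torch
import torch.nn as nn

class TinyRadianceField(nn.Module):
    """Maps a 3D point to an RGB color and a volume density."""
    def __init__(self, hidden=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),  # 3 color channels + 1 density
        )

    def forward(self, points):                 # points: (N, 3)
        out = self.mlp(points)
        rgb = torch.sigmoid(out[:, :3])        # colors in [0, 1]
        sigma = torch.relu(out[:, 3])          # non-negative density
        return rgb, sigma

def render_rays(field, origins, directions, n_samples=64, near=0.5, far=3.0):
    """Composite colors along each ray with standard volume rendering."""
    t = torch.linspace(near, far, n_samples)                                   # (S,)
    points = origins[:, None, :] + t[None, :, None] * directions[:, None, :]   # (N, S, 3)
    rgb, sigma = field(points.reshape(-1, 3))
    rgb = rgb.reshape(-1, n_samples, 3)
    sigma = sigma.reshape(-1, n_samples)
    delta = (far - near) / n_samples
    alpha = 1.0 - torch.exp(-sigma * delta)                                    # opacity per sample
    trans = torch.cumprod(torch.cat([torch.ones_like(alpha[:, :1]),
                                     1.0 - alpha + 1e-10], dim=-1), dim=-1)[:, :-1]
    weights = alpha * trans                                                    # contribution of each sample
    return (weights[..., None] * rgb).sum(dim=1)                               # (N, 3) rendered colors

# Training loop: fit the field so that rendered rays match pixel colors.
field = TinyRadianceField()
optimizer = torch.optim.Adam(field.parameters(), lr=5e-4)
for step in range(1000):
    # Placeholder batch of rays; a real pipeline would sample these from the
    # photos, with camera poses recovered by structure-from-motion.
    origins = torch.zeros(1024, 3)
    directions = nn.functional.normalize(torch.randn(1024, 3), dim=-1)
    target_rgb = torch.rand(1024, 3)
    pred_rgb = render_rays(field, origins, directions)
    loss = ((pred_rgb - target_rgb) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Once the field is trained, the same render_rays routine can be pointed at novel camera poses to produce new views of the captured object.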

This approach could unlock the ability for an AR creator or developer to digitize any object in the world, as long as photos or videos of it are available in an online image collection. This is a significant step towards the goal of creating a library of AR digital assets to overlay computing on the world.

SNAP AT SIGGRAPH

This year's SIGGRAPH conference will take place at the Vancouver Convention Centre, where Snap will host a networking event for students and professionals working in computer graphics. If you are interested in connecting with Snap, sign up here. The Snap team will also present a Custom Landmarker course and moderate a panel on Privacy, Safety and Wellbeing: Solutions for the Future of AR and VR.

Editor's Note: This is also posted to Snap's Newsroom Blog.