SIGGRAPH 2022 – the pre-eminent computer graphics conference – was held in Vancouver a couple of weeks ago, and with it came a slew of USD-centric announcements such as this one from NVIDIA about the Metaverse. It’s really impressive to see how computer graphics hardware and software companies are rallying behind this open source initiative.
Autodesk is no exception: having open-sourced a web-based USD viewer earlier this year, the company's big USD-centric SIGGRAPH announcement this time around was the open-sourcing of Aurora, a real-time path tracing renderer for USD.
For those who don’t know how path tracing works, here’s a fun explainer courtesy of Disney:
Aurora makes use of your GPU to provide noise-free, interactive rendering with very low latency. It does this by harnessing the hardware-accelerated ray tracing that's now available in most modern GPUs (you may have heard of NVIDIA’s implementation of this under the moniker RTX). The real-time denoising is performed by NVIDIA’s Real-Time Denoisers (NRD) library, which also works on GPUs from other vendors.
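To get an intuition for why a path tracer needs a denoiser at all, here’s a toy Monte Carlo sketch in Python – purely my own illustration, not Aurora’s code. Each “sample” stands in for one random light path through a hypothetical scene (a diffuse surface under a hemispherical sky); averaging more samples converges on the true pixel value, but at the low sample counts an interactive renderer can afford per frame, the estimate is noisy – and that residual noise is what a real-time denoiser like NRD cleans up.

```python
import math
import random

def sample_radiance(rng):
    """One random 'light path' for a toy scene: a diffuse surface
    under a hemispherical sky (a made-up stand-in, not Aurora)."""
    # Cosine-weighted bounce direction off the surface.
    theta = math.acos(math.sqrt(rng.random()))
    # The sky is bright overhead, dim near the horizon.
    return 1.0 if theta < math.pi / 3 else 0.2

def render_pixel(num_samples, seed=0):
    """Monte Carlo estimate of a pixel's radiance: the average over
    many random paths. The error shrinks like 1/sqrt(num_samples)."""
    rng = random.Random(seed)
    return sum(sample_radiance(rng) for _ in range(num_samples)) / num_samples

# Few samples per pixel (interactive frame rates): noisy estimates...
noisy = [render_pixel(8, seed=s) for s in range(5)]
# ...many samples: converges towards the true value (0.8 for this scene).
converged = render_pixel(100_000)
```

With only eight samples, the five estimates typically scatter visibly around the true value of 0.8, while the 100,000-sample estimate lands very close to it – that gap, between what’s affordable per frame and what looks clean, is exactly what the denoising pass papers over.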
The Aurora rendering engine is really intended for in-viewport rendering of products and mechanical designs where you’re interacting with the scene (or camera) in some way: it’s not aimed at high-quality “final frame” production rendering. That doesn’t mean Aurora doesn’t generate visually impressive results, however – in fact just the opposite!
Now let’s get a sense of the quality of image Aurora can generate. To start with, here’s a still generated by Aurora. The model in this image is courtesy of Roberto Ziche.
But to do the system justice, we need to see some real-time rendered video – that’s where Aurora shines.
Here’s a video I first saw at our recent, internal TechX event in Atlanta – the final day included an internal graphics summit, where Mauricio Vives took us through the impressive work he and the rest of the Aurora team have been doing. The video shows a demo application hosting Aurora to render a fairly complex scene with various reflective materials, lit by a dynamic environment background. The scene is rendered noise-free at 1080p at upwards of 20 frames per second.
This particular video is actually quite old now – it was captured back in November 2021 – so it’s worth sharing Mauricio’s announcement tweet that includes a more recent one:
“I have been working on a real-time GPU path tracing renderer called "Aurora" at #Autodesk, and I am happy to share that it will be made open source soon! See the announcement here, with more details to come: https://t.co/Naxoj5M1RQ #SIGGRAPH2022”
— Mauricio (@pixnblox), August 9, 2022
For those interested in taking a closer look at Aurora, it’s already available as a preview feature in Inventor 2023, called GPU Ray Tracing.
Aurora’s source code will soon be shared on GitHub: if you’re thinking of integrating Aurora in your own product, then it’s worth noting that it will be shared as source rather than as a pre-built component. That doesn’t mean it will be hard to integrate, should you already be using USD for graphics display: Aurora includes a Hydra render delegate so that it can be used with Hydra scene delegates already implemented in many applications.
I’m really encouraged to see Autodesk engaging ever more deeply in open source activities such as this, and – in particular – driving forward the use of USD as an industry standard. The Aurora technology will clearly benefit our own products and customers, but this kind of work helps “lift all boats” in the industry, too. Expect to see more of this in the future!
Many thanks to Mauricio Vives, whose TechX talk was the source of much of the content in this post. And congratulations to Mauricio and the rest of the Aurora team on this impressive work!