Some of you may have heard about the launch last week of a new Autodesk product focused on generating interactive 3D environments from Revit models. It’s called Autodesk LIVE and sits under the Autodesk LIVE Design umbrella (at least I think I have that the right way around).
Autodesk LIVE Design is all about making it easier to apply best-in-class visualization technology from our Media & Entertainment division – who, after all, collectively know a few things about high-end rendering and animation – to the world of design and engineering.
Here’s a video describing how Autodesk LIVE – formerly known as Project Expo, for those of you who know it – works:
The “one click” publishing workflow from Revit sends the required content to the cloud for processing and then downloads the generated 3D scene for local exploration. The local environment is based on Autodesk Stingray – our game engine tailored for design visualization – and gives you all the tools you need to explore this high-quality 3D environment on your PC (or to take it on the road with your Windows or iPad tablet).
This is all very cool, but what’s ultimately most interesting to me is what this technology brings (or will bring) to the AR/VR table. Today, people who want to take CAD data into an AR or VR environment – often one based on Unity or Unreal, both of which are really tailored for game content rather than design visualization – have to spend an inordinate amount of time making the content work in those environments. And by “inordinate” we’re talking orders of magnitude more than you’d expect: I’ve heard customers talk about hundreds of hours to get a basic model ready for VR or other real-time 3D use.
Now I’m not saying Autodesk LIVE Design is going to solve all of these problems overnight (or even in the immediate future), but the promise is certainly there: at some point this technology is really going to reduce the pain of going from CAD to AR and VR.