On Friday I had an hour or so to swing by Autodesk’s Munich office and check out the new VR Center of Excellence. I was on my way to the AEC Hackathon at TUM – Munich’s Technical University – which was kicking off during the early evening.
As I’ve found from our much more modest VR room at Autodesk Neuchatel, even simpler VR systems can take quite a bit of maintenance, and so Friday in Munich is generally reserved for the team to fix issues that have cropped up during the week. And the Munich room has way more moving parts – as you’ll see – so frankly I’m impressed that Friday is enough!
So it was that Raphael Boehm and Simon Nagel kindly arranged for me to get a demo on Friday afternoon, despite Simon having to leave the office early and Raphael having lots going on himself. I really appreciate the effort they both put in to make this happen.
This post contains snaps and videos so that people can get a more tangible sense of at least a few of the demos I was given.
A number of the demos rely on technology that has been around for quite some time: optical tracking of physical objects using retroreflective paint – the kind that embeds small glass spheres and is used to bounce headlights back from safety materials. The first demo tracked the movement of a clay model, allowing you to see the scene augmented with the equivalent digital model. You can use a tracked camera object – the one you can see placed on the leftmost car image in the photo below – to view the model from different angles.
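I'm not privy to the actual pipeline Autodesk uses here, but conceptually the tracking step boils down to something like the following numpy sketch: the optical tracker reports a rigid pose for the marker cluster on the clay model each frame, and a fixed calibration transform maps the digital model into that pose. All the names and numbers below are hypothetical, purely for illustration:

```python
import numpy as np

def pose_matrix(rotation: np.ndarray, position: np.ndarray) -> np.ndarray:
    """Build a 4x4 rigid transform from a 3x3 rotation and a translation."""
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = position
    return m

# Hypothetical pose reported by the optical tracker for the clay model:
# the reflective markers give its rotation and position in room coordinates.
tracked_rotation = np.eye(3)                  # no rotation, for illustration
tracked_position = np.array([0.5, 0.0, 1.2])  # 0.5 m right, 1.2 m forward

# The digital model is authored in its own local space; a fixed calibration
# transform aligns its origin with the marker cluster on the clay model.
calibration = pose_matrix(np.eye(3), np.array([0.0, -0.05, 0.0]))

# Each frame, the digital model is placed wherever the tracker says the clay
# model is: world = tracked_pose @ calibration @ local.
model_to_world = pose_matrix(tracked_rotation, tracked_position) @ calibration

local_point = np.array([0.1, 0.2, 0.0, 1.0])  # a vertex in model space
print(model_to_world @ local_point)           # its position in the room
```

The same maths applies to the tracked camera object: its pose simply drives a virtual camera instead of a model.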
The same tracking mechanism was also used for another experience based on quite old-school display technology: a 3D stereo TV, but one showing realtime stereoscopic rendering at 60 FPS, delivered by a rendering cluster running Autodesk VRED.
These are the polarised glasses that allowed the viewing position to be tracked.
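I don't know how VRED implements this internally, but the standard way to turn a tracked glasses position into a correct stereo image on a fixed screen is an off-axis ("generalized perspective") projection, recomputed per eye per frame. Here's a minimal numpy sketch of that idea – the screen dimensions and head position are made-up numbers:

```python
import numpy as np

def off_axis_frustum(eye, ll, lr, ul, near=0.1):
    """Asymmetric view frustum for an eye at a tracked position, looking
    through a fixed physical screen. ll/lr/ul are the screen's lower-left,
    lower-right and upper-left corners in room coordinates; returns
    glFrustum-style (left, right, bottom, top) extents at the near plane."""
    vr = (lr - ll) / np.linalg.norm(lr - ll)   # screen right axis
    vu = (ul - ll) / np.linalg.norm(ul - ll)   # screen up axis
    vn = np.cross(vr, vu)                      # screen normal, towards viewer

    va, vb, vc = ll - eye, lr - eye, ul - eye  # eye-to-corner vectors
    d = -np.dot(va, vn)                        # eye-to-screen distance

    return (np.dot(vr, va) * near / d,   # left
            np.dot(vr, vb) * near / d,   # right
            np.dot(vu, va) * near / d,   # bottom
            np.dot(vu, vc) * near / d)   # top

# Hypothetical numbers: a 1 m-wide TV and a head tracked 1.5 m in front of it.
ll = np.array([-0.5, 0.3, 0.0])
lr = np.array([ 0.5, 0.3, 0.0])
ul = np.array([-0.5, 0.9, 0.0])
head = np.array([0.1, 0.6, 1.5])
ipd = 0.064  # typical interpupillary distance in metres

# One frustum per eye, recomputed every frame as the glasses move.
for eye in (head - [ipd / 2, 0, 0], head + [ipd / 2, 0, 0]):
    print(off_axis_frustum(eye, ll, lr, ul))
```

This is why tracking the glasses matters: as your head moves, the frustum skews so that the screen behaves like a window onto the 3D scene rather than a flat picture.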
There were a number of fun, visualization-related toys and gadgets on display.
The main event was the VR system, of course. This differs from our own system in a few key ways: there's a green screen – allowing for realtime compositing of the participant into the scene – as well as a number of tracked items, such as a tripod representing a virtual table and the camera that's used to capture video of what's going on.
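The compositing step itself comes down to chroma keying the camera feed against the rendered scene. I have no idea what the Munich setup actually uses for this, but here's a deliberately naive OpenCV sketch of the core idea – the hue thresholds are illustrative, not calibrated values:

```python
import cv2
import numpy as np

def composite_green_screen(camera_frame, rendered_frame):
    """Naive chroma key: replace green-ish pixels in the camera feed with
    the rendered VR scene. Real systems add spill suppression and soft
    mattes; this shows only the core idea."""
    hsv = cv2.cvtColor(camera_frame, cv2.COLOR_BGR2HSV)
    # Pixels whose hue falls in the green band become the background matte.
    matte = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))
    matte3 = cv2.merge([matte] * 3)
    # Keep the participant where the matte is empty, the render elsewhere.
    return np.where(matte3 == 0, camera_frame, rendered_frame)

# Usage: feed in synchronized frames from the tracked camera and the engine.
cam = np.zeros((720, 1280, 3), dtype=np.uint8)         # stand-in camera frame
render = np.full((720, 1280, 3), 128, dtype=np.uint8)  # stand-in render
out = composite_green_screen(cam, render)
```

Because the physical camera is itself tracked, the rendered background can be drawn from exactly the same viewpoint as the real footage, which is what makes the composite line up.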
To get a better sense of how the system works – along with the realtime compositing – you can watch the following two videos. The first was with me trying the system – and was taken by Raphael:
The second was taken by me with Raphael going into more depth in his use of the system:
One detail you may notice… the VR room has been modelled with high fidelity and is shown inside the VR session. How cool is that?
The experience was really very impressive. I would have liked to see the video of the participant composited directly inside the 3D scene – which is apparently altogether possible – but we weren't able to get that piece working on the day. As you can see from the photo of the various feeds below, there are lots of moving parts!
Before leaving, I had to try on the Infinitus Prime – a 5K horizontal resolution VR headset intended for the enterprise space. That said, I just checked their website (closed) and Twitter account (deleted), so it's quite possible they've since either been acquired or gone out of business.
I didn’t actually try it running – I mainly wanted to get a photo alongside the other dummies. Yes, I thought I’d get in there first and say it before anyone else did. :-)
Thanks again to both Simon and Raphael for making this happen – it really was a very special experience!