I’m heading back across to the Bay Area on Wednesday for 10 days. There seems to be a pattern forming to my trips across: I’ll spend the first few days in San Francisco – in this case attending internal strategy meetings in our 1 Market office – and then head up after the weekend to San Rafael to work with the members of the AutoCAD engineering team based up there. I’ll still probably head back into SF for the odd day the following week, but that’s fine: I really like commuting by ferry from Larkspur to the Embarcadero.
The weekend I’m spending in the Bay Area is looking to have a slightly different shape this time, though. Rather than just catching up with old friends (which I still hope to do), I’ve signed up for the VR Hackathon, an event that looks really interesting. I was happy to find out about it – and that it falls exactly during my stay. I’ve even roped a few colleagues into coming along.
Looking at the “challenges” posted for the hackathon, it seemed worth taking a look at web and mobile VR, as these look like the two that I’m most likely to be able to contribute towards. Which led me to reach out to Jim Quanci and Cyrille Fauvel, over in the ADN team, to see what’s been happening with respect to VR platforms such as Oculus Rift and Google Cardboard.
It turns out the ADN team has invested in a few Oculus Rift Developer Kits, but was looking for someone to spend some time fooling around with integrating the new WebGL-based Autodesk 360 viewer with Google Cardboard. And as “fooling around” is my middle name, I signed up enthusiastically. :-)
For those of you who haven’t been following the VR space lately, I think it’s fair to say that Facebook put the cat amongst the pigeons when they acquired Oculus. Google’s competitive response was very interesting: at this year’s Google I/O they announced Google Cardboard, a simple View-Master-like mount for a smartphone that can be used for AR or VR.
A few notes about the design: there are two lenses that focus the smartphone’s display – which is split in half in landscape mode, with one half for each eye – and there’s a simple magnet-based button on the left as well as an embedded NFC tag to tell the phone when to launch the Cardboard software. The rear camera has also been left clear in case you need its input for a “reality feed” in the case of AR or perhaps some additional information to help with VR.
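That magnet button, incidentally, works without any electrical connection to the phone: sliding the magnet deflects the phone’s magnetometer, and the software watches for the resulting spike in the measured field. As a purely illustrative sketch – the function name, threshold and sampling approach are all my own assumptions, not Cardboard’s actual implementation – the detection logic boils down to something like this:

```javascript
// Hypothetical sketch of Cardboard-style magnet "click" detection.
// Real code would sample the device's compass sensor continuously;
// here we just scan a series of field magnitudes (in microtesla)
// for a sharp jump relative to the resting baseline.
function detectMagnetClick(magnitudes, threshold) {
  if (magnitudes.length < 2) return false;
  const baseline = magnitudes[0];
  // A "click" is any sample that deviates well beyond the resting field.
  return magnitudes.some(function (m) {
    return Math.abs(m - baseline) > threshold;
  });
}
```

So a series of steady readings with one big spike – say when the magnet slides past the sensor – would register as a click, while normal sensor noise would not.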
Aside from the smartphone, the whole package can be made for a few dollars (assuming a certain economy of scale, of course) with the provided instructions. Right now you can pick them up pre-assembled for anywhere between $15 and $30 – still cheap for the capabilities provided. Which has led to the somewhat inevitable nickname of “Oculus Thrift”. :-)
The point Google is making, of course, is that you don’t need expensive, complex kit to do VR: today’s smartphones have a lot of the capabilities needed, in terms of processing power, sensors and responsive, high-resolution displays.
When looking into the possibilities for supporting Cardboard from a software perspective, there seem to be two main options: the first is to create a native Android app using Google’s SDK; the second is to create a web-app such as those available on the Chrome Experiments site.
Given the web-based nature of the Autodesk 360 viewer, it seemed to make sense to follow the latter path. Jim and Cyrille kindly pointed me at an existing integration of Cardboard with Three.js/WebGL, which turned out to be really useful. But we’ll look at some specifics more closely in the next post.
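The heart of any such integration is rendering the scene twice, once per eye, into the two halves of the landscape display. As a minimal sketch – the function name is mine, and a real implementation (such as Three.js’s StereoEffect) would also offset the camera for each eye – the viewport split looks something like this:

```javascript
// Hypothetical sketch: for a side-by-side stereo display, each eye
// gets one half of the landscape screen. The returned rectangles are
// the kind of thing you'd pass to a WebGL renderer's setViewport()
// before rendering the scene for that eye.
function eyeViewports(screenWidth, screenHeight) {
  const half = Math.floor(screenWidth / 2);
  return {
    left:  { x: 0,    y: 0, width: half, height: screenHeight },
    right: { x: half, y: 0, width: half, height: screenHeight }
  };
}
```

The render loop then draws the model into each rectangle in turn, with the two lenses doing the work of fusing the halves into a single stereoscopic image.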
During the rest of the week – and I expect to post each day until Thursday, at least, so check back often – I’ll cover the following topics:
- Creating a stereoscopic viewer for Google Cardboard using the Autodesk 360 viewer
- Adding tilt support for model navigation and enabling fullscreen mode
- Supporting multiple models
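To give a flavour of the tilt-navigation topic before we get to it properly: in a web-app the phone’s orientation arrives via the browser’s deviceorientation event, as three angles in degrees, which then need converting to radians for the camera. A trivial sketch, with axis names and mappings that are purely my own assumptions for illustration:

```javascript
// Hypothetical sketch: converting deviceorientation angles (degrees)
// into camera rotation (radians) for tilt-based navigation. The
// mapping of alpha/beta/gamma to yaw/pitch/roll is illustrative –
// a real integration would need to account for screen orientation.
function tiltToRotation(alphaDeg, betaDeg, gammaDeg) {
  const toRad = Math.PI / 180;
  return {
    yaw:   alphaDeg * toRad,  // rotation about the vertical axis
    pitch: betaDeg  * toRad,  // front-back tilt
    roll:  gammaDeg * toRad   // left-right tilt
  };
}
```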
If I manage to get my hands on the pre-release Leap Motion SDK for Android then I’ll try to integrate that, too, at some point. Mounting a Leap Motion controller to the back of the goggles allows you to use hand gestures for additional (valuable) input in a VR environment… I’m thinking this may end up being the “killer app” for Leap Motion (not mine specifically, but VR in general).