The 2nd VR Hackathon, which took place in San Francisco over the weekend, was an absolute blast. It was held at Galvanize, a co-working space about a 15-minute walk from our 1 Market Street office. The venue was great: plenty of space and a fair amount of natural light (very important for those of us getting over our jetlag).
There were fewer people at this second event – inevitably, as it happened over the Memorial Day weekend – but there was nonetheless a great energy in the room. At the core of our team – which we named “VR Party” – were myself, Lars Schneider and Oleg Dedkow, both of whom flew across from our Potsdam office to participate. A few other people expressed interest in joining the team, but in the end it was just us – although Jim Quanci lent a hand on the last day with testing and feedback.
Our “hack” – which I talked about previously – was to make VR a collaborative experience: to have someone curate and control the VR session for a number of consumers. Communicating design information is a really important activity for all parts of our industry, and I think VR could well become a great enabling tool.
We ended up with a “presenter” page, which allows you to open and view models via the View & Data API.
The embedded QR code allows an arbitrary number of people to open up “participant” pages on the devices of their choice (ideally using Google Cardboard to see the page in 3D):
All the events you perform – apart from changing the viewpoint, which we deliberately keep local – get propagated to all of the connected clients via Web Sockets. So if you isolate geometry in the presenter window, all the viewers see the same thing.
The same is true for exploding the model…
… and even for sectioning!
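The relay pattern behind this is simple enough to sketch. The real hack used Web Sockets for the transport; below, an in-memory hub stands in for the socket server so the shape of the idea is visible without a network. The event names and payloads are assumptions on my part, although `isolate()`, `explode()` and `setCutPlanes()` are genuine viewer methods.

```javascript
// Sketch of the presenter -> participants relay. In the real app the
// hub would be a Web Socket server; here it's in-memory for clarity.

class SessionHub {
  constructor() {
    this.participants = [];
  }
  join(onMessage) {
    this.participants.push(onMessage);
  }
  // Everything the presenter does - isolate, explode, section - is
  // serialized and pushed to every connected participant.
  broadcast(type, payload) {
    const message = JSON.stringify({ type, payload });
    for (const send of this.participants) send(message);
  }
}

// A participant applies each incoming message to its local viewer.
// Camera changes are never broadcast, so each participant keeps
// control of their own viewpoint.
function applyToViewer(viewer, message) {
  const { type, payload } = JSON.parse(message);
  switch (type) {
    case 'isolate': viewer.isolate(payload.ids); break;
    case 'explode': viewer.explode(payload.scale); break;
    case 'section': viewer.setCutPlanes(payload.planes); break;
    default: break; // ignore anything we don't recognise
  }
}
```

On the presenter side, the viewer's isolate/explode/section events would be captured and fed into `hub.broadcast(...)`; viewpoint events are simply never put on the wire.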
The experience was actually really compelling – perhaps even more so than we'd expected.
We had a bit of a scare as we entered the last hour or so of the competition: we had foolishly introduced some instability, late in the game, as we attempted to crack the issue of communicating zoom level (which is harder than you might think when people are looking at models from arbitrary directions). Thankfully we pulled it all back together in the closing minutes – thanks to some seriously impressive Git repo manipulation from Lars – enough for the demo to blow the socks off the judges, at least.
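One way to tackle that zoom problem – a sketch of the general approach, not necessarily what we shipped – is to broadcast only the camera-to-target distance and let each participant keep their own viewing direction. Zooming then means sliding the local camera along its existing view ray:

```javascript
// Apply a shared zoom level (distance from target) to a local camera
// without disturbing its viewing direction. The camera shape here
// ({ position, target } with x/y/z fields) is an assumption for
// illustration, not the viewer's actual camera type.
function applyZoom(camera, distance) {
  const { position, target } = camera;
  // Unit vector from the target back towards the current camera.
  const dx = position.x - target.x;
  const dy = position.y - target.y;
  const dz = position.z - target.z;
  const len = Math.sqrt(dx * dx + dy * dy + dz * dz);
  return {
    target,
    position: {
      x: target.x + (dx / len) * distance,
      y: target.y + (dy / len) * distance,
      z: target.z + (dz / len) * distance,
    },
  };
}
```

Because only the scalar distance crosses the wire, two participants looking at the model from opposite sides both end up the same distance away, each along their own direction.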
Which meant we ended up coming away with the award for the “Best Web-based VR Project”:
We were thrilled with the result – not even the award, especially, but we felt we came up with something that was actually really useful. The code is all on GitHub, but we need to do a little more work – some clean-up, plus support for multiple sessions – before sharing a link to the live site.
A big shout out to teammates Lars and Oleg (and honorary team member, Jim), as well as to Damon for organising another great hackathon. Can’t wait for the next one!