Earlier this year I came to know about a “local” company called senseFly. (When I say local, I mean they’re also in the French-speaking area of Switzerland, which is a relatively small region of a small country… I suppose that would be considered local by some standards. :-)
senseFly started as a spin-off of the prestigious Ecole Polytechnique Fédérale de Lausanne (EPFL). (EPFL is one of the two top technical universities in Switzerland, the other being Eidgenössische Technische Hochschule Zürich (ETH Zürich)). They specialise in making unmanned aerial vehicles (UAVs) – drones – and now belong to the Parrot group (the makers of the AR.Drone, although senseFly drones are clearly targeted at professionals rather than consumers).
I contacted senseFly some months ago, as I was curious about the possibility of combining the use of their drones with Autodesk technology such as ReCap Photo. senseFly already have a close technology partnership with Pix4D – another EPFL spin-off focused on 3D reconstruction – but I thought it would be interesting to see how our technology works with drone-captured data.
Before talking about that, here’s a quick video of a project that’s been in the news here in Switzerland over the last few weeks: senseFly – partnering with a few other companies – has mapped the iconic Matterhorn using a number (a swarm, according to Gizmodo, but that seems to be overstating it slightly) of their eBee drones:
Cool stuff. It seems a shame that two teams had to climb to the top to release the drones, but I imagine it’s only a matter of time before they can make it up there under their own power. And the climbers certainly seemed to enjoy it, anyway.
So, back to the topic of using ReCap Photo to post-process image data captured by UAV. senseFly provided me with a link to their publicly accessible datasets, which I then passed on to members of our Reality Capture group.
The first person to pick up the gauntlet and let ReCap Photo loose on the senseFly data was Wes Newman, who’s based in Fayetteville, Arkansas. Here are some photos of the model that he generated using ReCap Photo and visualized with Project Memento:
Next up was Pierre Diebolt, from our Sophia Antipolis office in the south of France. Pierre went ahead and ran all four of senseFly’s datasets through ReCap Photo, generating the results below (also created with the help of Project Memento):
If the images seem a little greyed out on the left, look a little closer: Pierre is showing them transitioning from the underlying mesh on the left to the fully textured model on the right.
I was really impressed by these results: I’d frankly been expecting it to work less well with imagery generated using a downward-facing, UAV-mounted camera (I had thought that having a front-facing camera was really a requirement for capturing vertical faces… it’s possible that’s still true of buildings, but the quarry’s walls look to have been captured very well). But then ReCap Photo has surprised me before, such as when capturing a car (something I also thought to be next to impossible).
With any luck we’ll be able to get hold of the ~2,200 photos taken of the Matterhorn. Now that would be a really interesting project to run through ReCap Photo… :-)