My world seems to be filled with robots these days, whether I'm looking at how they can be used in architecture and construction, animating them inside Forge, or watching them 3D-print steel bridges (I'm at MX3D again this week). It makes me think I should probably dust off my HoloLens app for making robots dance in mixed reality: once we're using data inside the Forge viewer-powered Dasher 360 to show the position of a robot at a particular moment in time, it's hardly a huge leap to show that in XR. Anyway, I should get to the point…
Given this trend in my activities, it seemed a good time to talk about another project that piqued my interest recently. I met Evangelos (Evan) Pantazis briefly when I visited IaaC during the Forge accelerator in Barcelona. Evan delivered the third presentation of the evening session at IaaC – the one we had to miss due to our dinner reservation, as mentioned in the post I just linked to. Because the topic was so interesting to me, Evan and I managed to find a time to talk about it properly (by then Evan had found his way to LA, so this was harder than it might sound).
Evan has been doing some really interesting research into how the emergent behaviours of swarms of relatively dumb robots – Bristlebots, which are typically Hexbugs with minor modifications – might be harnessed in construction. He did a portion of this research while resident at Autodesk's BUILD Space in Boston.
Here’s an intro video to the Bristlebot concept:
This one looks at how Bristlebots perform in different environments:
And this one looks at how computer vision can be used to detect the positions of the Bristlebots on the ground from an overhead camera:
This last one is especially relevant when considering how a central “intelligence” might be used to guide the bots in their work: this could be done using simple beams of light to “attract” the bots, or even to encode more elaborate instructions (this is my speculation… I think Evan’s intention, overall, is not to have to build too many smarts into his bots ;-).
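To make the computer-vision piece a little more concrete, here's a minimal sketch of how overhead tracking might work using OpenCV. To be clear, this is my own illustration rather than anything from Evan's actual pipeline: it assumes each bot carries a brightly coloured marker, and the HSV colour range, blob-size threshold and camera index are all placeholders you'd tune for a real setup.

```python
# A minimal sketch of overhead Bristlebot tracking with OpenCV (4.x).
# Assumption: each bot carries a bright orange marker on its back; the
# colour range below is a placeholder, not a measured value.
import cv2
import numpy as np

LOWER = np.array([5, 120, 120])   # hypothetical HSV lower bound for the markers
UPPER = np.array([20, 255, 255])  # hypothetical HSV upper bound

def detect_bots(frame):
    """Return the (x, y) pixel centroids of marker-coloured blobs."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    # Remove speckle noise before looking for connected blobs
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 50:  # ignore blobs too small to be a marker
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

cap = cv2.VideoCapture(0)  # index 0 assumed to be the overhead camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    for x, y in detect_bots(frame):
        cv2.circle(frame, (int(x), int(y)), 10, (0, 255, 0), 2)
    cv2.imshow("Bristlebot tracker", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Once you have per-frame centroids like this, a central controller could compare them against target positions and decide where to shine its guiding light – which is presumably roughly what's happening in the video above, albeit with a more sophisticated pipeline.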
If you’re interested in learning more about Evan’s research, be sure to head on over to his research group’s page. I’ll definitely be keeping tabs on what he gets up to!
Update:
Evan has provided an additional video that shows how light can be used to influence the bots’ behaviour: