Robots have been on my mind quite a bit, lately. They were everywhere at last month’s Autodesk University, for instance: whether it was the 3D scanning robot wandering around the conference (and even showing up at the Blogger’s Social), Bot & Dolly’s impressively choreographed robot at the opening keynote, or the LEGO Mindstorms kits in the Exhibit Hall.
And Google has been on a crazy buying spree, picking up robotics companies – they’ve apparently acquired at least 8 during the past 12 months, including Bot & Dolly (announced while AU was still in progress, curiously enough) and Boston Dynamics, maker of the more-than-a-little-scary BigDog and WildCat military robots.
In a strange coincidence (to them, anyway ;-), my kids ended up getting a programmable robot – the Swiss-invented Thymio II – for Christmas and we had a great deal of fun fooling around with it during the break.
The Thymio II was developed largely at EPFL in Lausanne (an institution I’ve mentioned recently), and is a very cool little device intended to make robotics much more accessible to a broader public as well as to spark people’s creativity.
So what can you do with this robot? To start with, it comes with a variety of sensors that allow you to control its behaviour or have it act as a Braitenberg vehicle, moving around autonomously, reacting to its environment.
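A Braitenberg vehicle is really just sensors wired (almost) directly to motors, with interesting-looking behaviour emerging from the wiring. Here’s a minimal Python sketch of the idea – the sensor ranges, gains and function names are purely illustrative, not the Thymio’s actual API:

```python
def braitenberg_step(prox):
    """One control step for an obstacle-avoiding Braitenberg vehicle.

    prox: five front proximity readings, from 0 (nothing sensed) to
    roughly 4500 (very close) -- the kind of range the Thymio reports.
    Returns (left, right) motor targets.
    """
    base = 200  # cruising speed when nothing is sensed
    # An obstacle on the left speeds up the left wheel and slows the
    # right one, steering the robot away to the right (and vice versa).
    weights = [2, 1, 0, -1, -2]  # leftmost .. rightmost sensor
    left = right = base
    for reading, w in zip(prox, weights):
        turn = (reading // 50) * w
        left += turn
        right -= turn
    return left, right

print(braitenberg_step([0, 0, 0, 0, 0]))     # nothing sensed: straight ahead
print(braitenberg_step([3000, 0, 0, 0, 0]))  # obstacle at far left: veer right
```

The nice thing about this style of control is that there’s no map and no planning – the “personality” comes entirely from the signs and magnitudes of those weights.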
To get into the robot’s specific capabilities, let’s talk about its built-in behaviours, each of which is colour-coded – the glow from the robot’s various LEDs makes it obvious which is currently active. For example, there are 5 depth sensors spaced around the front of the robot, allowing it to follow your hand (green), move away from it (red) or explore a maze (yellow). There are two ground sensors that stop it from falling off the edge of a table but also allow it to follow a path (cyan). The robot has capacitive touch buttons on top that let you give it directions, as well as an infra-red sensor that makes it possible to do the same via remote control (purple). And it even has a microphone that allows it to react to the clapping of hands (blue).
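Summarising the above as a simple lookup – the descriptions are just my transcription of the behaviours listed, not official documentation:

```python
# LED colour -> built-in behaviour, as described above.
BEHAVIOURS = {
    "green":  "follows your hand (front depth sensors)",
    "red":    "moves away from your hand (front depth sensors)",
    "yellow": "explores a maze, avoiding obstacles",
    "cyan":   "follows a path; ground sensors keep it on the table",
    "purple": "takes directions via touch buttons or IR remote",
    "blue":   "reacts to the clapping of hands (microphone)",
}

for colour, what in BEHAVIOURS.items():
    print(f"{colour:>6}: {what}")
```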
If you want to go beyond the built-in behaviours you can program your own, whether using the visual programming environment for simple ones or the Aseba programming language for ones that are more complex (and you can use the source for the ones mentioned above as a starting point, of course).
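For those who haven’t seen it, Aseba programs are event-driven: you write handlers that fire when a button is pressed or a sensor updates. Here’s a rough Python analogue of that model – the event names mirror Aseba’s style, but the dispatch machinery is my own invention for illustration:

```python
# Registry of event handlers, keyed by event name.
handlers = {}

def onevent(name):
    """Register a handler for a named event, Aseba-style."""
    def register(fn):
        handlers[name] = fn
        return fn
    return register

@onevent("button.forward")
def go_forward(state):
    state["motor.left"] = state["motor.right"] = 300

@onevent("button.center")
def stop(state):
    state["motor.left"] = state["motor.right"] = 0

def fire(event, state):
    """Dispatch an event to its handler, if one is registered."""
    if event in handlers:
        handlers[event](state)

robot = {"motor.left": 0, "motor.right": 0}
fire("button.forward", robot)
print(robot)  # both motor targets now set to 300
```

On the real robot this all happens inside Aseba Studio, of course, with the handlers compiled down and running on the Thymio itself.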
I was obviously fairly disappointed that I didn’t actually need to write code to get the robot working out of the box, but still. As the robot has anchoring points for LEGO in various places (too cool!), it soon became a transporter for Star Wars mini-figures, which led me to modify the “explorer” behaviour (the yellow mode mentioned above) to glow with the various colours of lightsabers in the Star Wars movies: blue, green, purple and red.
Beyond that, though, I’ve been thinking about the possibilities of interfacing – admittedly somewhat indirectly – the Thymio II with AutoCAD. The robot has an API to read from and write to an inserted microSD card, so I’m thinking about two possible integrations:
- Generate a file from AutoCAD that the robot could then read to determine a path to move along, effectively turning the robot into a very basic pen plotter.
- In something akin to “explorer” mode, capture the data coming from the robot’s accelerometer and depth sensors and write it all – probably along with the wheel movements – to the microSD card. In AutoCAD we’d have some code that interprets this data and shows a visual representation of the maze (or whatever), generating the path followed but perhaps also a (2D) point cloud from the depth sensor data – basically turning the robot into a very basic remote sensing device.
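To make those two ideas a touch more concrete, here’s a quick Python sketch of both directions of the data flow. The “x y” path format and the “heading distance” log format are pure invention on my part – a real integration would need to settle on a convention, with the AutoCAD side presumably written in .NET and the robot side in Aseba:

```python
import math

def write_path_file(polyline, filename="thymio_path.txt"):
    """Idea 1: dump a polyline (e.g. extracted from an AutoCAD
    drawing) as simple 'x y' lines for the robot to read back from
    the microSD card and drive along, pen-plotter style."""
    with open(filename, "w") as f:
        for x, y in polyline:
            f.write(f"{x} {y}\n")

def points_from_log(lines):
    """Idea 2: turn a hypothetical log of 'heading distance'
    records -- one per depth-sensor hit -- into 2D points for
    display as a basic point cloud back in AutoCAD."""
    points = []
    for line in lines:
        heading, dist = map(float, line.split())
        a = math.radians(heading)
        points.append((dist * math.cos(a), dist * math.sin(a)))
    return points

write_path_file([(0, 0), (100, 0), (100, 50)])
print(points_from_log(["0 100", "90 50"]))
# first hit lies 100 units straight ahead, the second 50 units off to the side
```

The second function glosses over dead reckoning entirely – combining the wheel movements with the sensor hits to place points in a shared coordinate system is where the real fun (and the real error accumulation) would be.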
These are just some preliminary thoughts, for now. I’d be happy to hear suggestions from this blog’s readership… please go ahead and post a comment! :-)