Yesterday I presented a condensed sneak peek of my upcoming AU sessions on Kinect Fusion and Leap Motion to colleagues in the Neuchâtel office (I’ll do the same at our office in Gümligen next week).
It was good to make sure my various demos are working well in time for AU, as well as to create some awareness of the possibilities presented by these two technologies. We attempted a live capture of a chair using Kinect Fusion, which actually came out surprisingly well: the Kinect’s USB cable popped out as I was trying to scan the chair’s back, but the command still completed successfully, even if the scan came through without the too-reflective-to-capture chrome legs:
Completely coincidentally, timing-wise, the Leap Motion controller has just started being sold in Germany and Switzerland.
And in somewhat related news, my friend and colleague Brian Pene has been featured in a very slick video showing the integration between Leap Motion and Maya:
Nice job, Brian and team! :-)
I’ve been looking at some new technology this afternoon (new to me, anyway). My good friend Simos very generously gave me a Texas Instruments SensorTag Development Kit when he came over for last weekend’s Halloween Party.
It’s basically an awesome Bluetooth 4.0 sensor array in a keyring-sized package that you can connect to from a variety of environments: there are standard apps for iOS & Android, and people have managed to get it working with Raspberry Pi and Windows (although curiously it seems Windows is pretty challenging, configuration-wise).
The sensors include an accelerometer, a thermometer, a hygrometer, a magnetometer, a barometer and a gyroscope. All powered by a single coin cell battery, which should keep it going for years. Bluetooth 4.0 (also known as Bluetooth Low Energy (BLE) or Bluetooth Smart) is just that efficient: a far cry from the technology driving my current (abandoned) Bluetooth headset that discharges in hours.
I’ll be looking at this over the coming months, to see whether it’s possible to connect one of these babies to AutoCAD and manipulate the view based on data coming from the accelerometer and gyroscope. Having a physical object to move around may actually be more ergonomically satisfying than waving your hands in space. We’ll see.
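If I do get that far, the first step will presumably be turning raw accelerometer samples into orientation angles that could drive a view. Here’s a minimal sketch of that idea in Python — the function name and axis conventions are my own assumptions for illustration, not anything from the SensorTag SDK or the AutoCAD API:

```python
import math

def accel_to_orientation(ax, ay, az):
    """Convert a raw accelerometer reading (in g) into pitch and roll
    angles (in degrees), using the gravity vector as the reference.

    Pitch is rotation about the device's X axis; roll is rotation
    about its Y axis. Yaw can't be recovered from gravity alone --
    that's where the gyroscope (or magnetometer) would come in.
    """
    pitch = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    roll = math.degrees(math.atan2(-ax, az))
    return pitch, roll

# A SensorTag lying flat and still should read roughly (0, 0, 1) g,
# giving pitch and roll of about zero; tilting it changes the angles.
pitch, roll = accel_to_orientation(0.0, 0.0, 1.0)
```

Those angles could then be fed into whatever view-manipulation mechanism the host application exposes, with some smoothing to filter out sensor jitter.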