We hit a major milestone in our research into smart infrastructure this week. After a massive push over the last 3-4 weeks (which was itself built on work done over several years), we were able to deploy a system that measures – and reports in realtime – the performance and usage of the world’s first 3D-printed steel bridge.
To give you a quick sense of the results of this work, here’s an image of Dasher 360 showing the 3D model of the bridge, with skeletons walking across it and the bridge’s accelerometer readings displayed as a heatmap. All of which can be animated, too, of course. :-)
I’ve talked about MX3D various times in the past, most recently in my previous post. Today’s post focuses on Autodesk’s involvement in this incredible project. I also feel the need to make it clear that despite the fact that I’m posting a bunch of cool videos on my blog and social media accounts, most of the hard work was really done by others in the team: my role in this was relatively minor.
From what I understand from our friends at MX3D, the idea for the bridge project originally came up during a brainstorming session (probably over beer) the night before a pitch to Autodesk. Originally the bridge was going to be generatively designed and then 3D-printed in-situ by industrial welding robots. While the goalposts shifted somewhat – MX3D went with a more artistic design, and it wasn’t practical to have welding bots working 24/7 in Amsterdam’s red-light district – the project itself definitely remained a moonshot: the real challenge of being able to create a structurally sound bridge from a sequence of robotic welds was anything but trivial. And while the collaboration with Autodesk around generative design didn’t play out as first hoped, we were really happy to support the project with technology to give the bridge “smarts”. So a different group of Autodesk people started interfacing with MX3D (more on this group later).
Making the bridge smart involved designing and installing a sensor network in the bridge. While this would ideally have been planned for during printing of the bridge, the reality was that, for this initial creation, the knowledge needed to build the sensor conduits in from the beginning just wasn’t there. Hopefully next time!
Other partners of MX3D (most notably Imperial College London and FORCE Technology) were involved in the design and installation of the network of 70 or so sensors, although Autodesk clearly collaborated on it, too. Our primary role in the enterprise was always going to be around acquisition, processing and visualization of the sensor data.
It was only during the recent 2-week period when a number of us descended on the MX3D facility in Amsterdam that the sensor network was finally installed and connected to our back-end. The team that went consisted of Alex Tessier, Jacky Bibliowicz, Hali Larsen, Josh Cameron, Merry Wang, Pan Zhang, Liviu Calin, Mike Lee and myself. Some of us were there for less time than others: Hali, Merry and I had to leave after a few days, but the rest were at MX3D for 2 full weeks of 15+ hour days. Simon Breslav supported the team remotely from Toronto.
Here’s a breakdown of some of the tasks that we had to complete to make the bridge smart (along with who did what):
- Complete and test the wiring of the sensors installed on the bridge.
- This was a big job: it’s one thing to wire in and solder a bunch of sensors, but there are a lot of (metaphorical) moving parts needed to get that data usable.
- Alex, Josh & Mike
- Data acquisition, including spinning up a local cluster hosting our back-end time-series database.
- For Dutch Design Week, in particular, we didn’t expect to have the bandwidth required to use the cloud for sensor data storage.
- Jacky, Josh & Alex
- Create a 3D model with accurately-positioned sensors.
- This was much harder than you might think: we’re currently working with a theoretical model of this non-traditional bridge’s design, not an “as-built” model of it.
- Integrate a computer vision pipeline to generate 3D skeletons positioned in the calibrated 3D space of the bridge.
- We used 20 known points around the bridge to help with the calibration and to build a transformation between the coordinate spaces.
- Pan & Liviu
- On a side note, I really want to call this team Pan & Zoom, but I really ought to ask Liviu’s permission before assigning him a new nickname. ;-)
- Develop a dashboard showing the realtime data coming off the bridge.
- We needed to use websockets to deliver realtime data to the browser and show the various sensors streaming data into a series of dynamic graphs.
- Upgrades to Dasher 360 to work with the bridge data.
- Display of both historical and realtime skeleton data.
- Surface shading of sensor data onto the bridge itself.
- Simon & Kean
- Document the whole process to create a video.
- Interview the participants, capture video of the work in process using various cameras and a drone.
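On the calibration front: given a set of reference points with known coordinates in both camera space and bridge-model space, you can fit a transform between the two spaces by least squares and then apply it to every incoming skeleton joint. Here’s a minimal 2D sketch in Python – the point values, parameters and function names are mine for illustration, not from the actual pipeline:

```python
import math

def fit_similarity_2d(src, dst):
    """Least-squares fit of a 2D similarity transform (uniform scale +
    rotation + translation) mapping src points onto dst points.  Model:
        x' = a*x - b*y + tx
        y' = b*x + a*y + ty
    Returns the parameters (a, b, tx, ty)."""
    # Accumulate the normal equations (A^T A) p = A^T r for the 4 unknowns.
    ata = [[0.0] * 4 for _ in range(4)]
    atr = [0.0] * 4
    for (x, y), (xp, yp) in zip(src, dst):
        rows = [([x, -y, 1.0, 0.0], xp),   # equation for x'
                ([y,  x, 0.0, 1.0], yp)]   # equation for y'
        for row, rhs in rows:
            for i in range(4):
                atr[i] += row[i] * rhs
                for j in range(4):
                    ata[i][j] += row[i] * row[j]
    return solve4(ata, atr)

def solve4(m, v):
    """Tiny Gauss-Jordan elimination with partial pivoting for a 4x4 system."""
    n = 4
    aug = [m[i][:] + [v[i]] for i in range(n)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        for r in range(n):
            if r != col:
                f = aug[r][col] / aug[col][col]
                for c in range(col, n + 1):
                    aug[r][c] -= f * aug[col][c]
    return [aug[i][n] / aug[i][i] for i in range(n)]

def apply_similarity(p, params):
    a, b, tx, ty = params
    x, y = p
    return (a * x - b * y + tx, b * x + a * y + ty)

# Hypothetical reference markers seen in both coordinate spaces: here the
# "bridge space" points are generated from a known transform, so the fit
# should recover it.  Then we map a skeleton joint into bridge space.
true_params = (1.2 * math.cos(0.3), 1.2 * math.sin(0.3), 4.0, -2.0)
camera_pts = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)]
bridge_pts = [apply_similarity(p, true_params) for p in camera_pts]
params = fit_similarity_2d(camera_pts, bridge_pts)
joint_in_bridge = apply_similarity((0.25, 0.75), params)
```

The real pipeline works in 3D, of course, but the principle is the same: fit once from the known markers, then apply the resulting transform to every joint position that streams in.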
Other important players in the project were Alec Shuldiner, Tristan Randall and Azam Khan. Along with Alex, Alec has been the project’s strongest internal champion inside Autodesk: it’s largely thanks to Alec and Alex that the project to instrument the bridge in Pier 9 happened. (In case it wasn’t obvious, the Pier 9 bridge was all about us learning how to do the MX3D sensor network and computer vision pipeline.) Alec has also been the de facto project manager for the sensor network piece, keeping all involved parties on the same page and chasing things along when needed. Tristan has been across to Amsterdam a number of times to laser-scan the MX3D bridge: we’re hoping we’ll be able to integrate Tristan’s as-built model into a future version of the digital twin. Azam runs the Complex Systems Research team, to which most of the above people – with the exception of Alec, Tristan and Liviu – belong. It’s thanks to Azam’s vision and support that we were able to put so much of our time and energy into this project.
Anyway, time to get back to the tech... In a nutshell, the data streaming from the bridge gets batched up and stored in our local time-series database – with roll-ups calculated for different levels of detail – with the latest values pushed out via websockets. Mike’s awesome realtime dashboard – which some people viewed via provided tablets as they walked across the bridge, to see how it responded – would subscribe to these. I used a similar mechanism (albeit for larger amounts of data encoded in a JSON string) to get realtime skeleton data and display that inside Dasher 360, along with the more traditional database access for historical (i.e. non-realtime) data.
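To illustrate the roll-up idea: raw readings get bucketed into coarser, fixed-size time intervals, with aggregates stored per bucket so clients can query at whatever level of detail suits their zoom level. A rough Python sketch – the interval choices and the particular aggregates are my own assumptions, not the actual schema:

```python
from collections import defaultdict

def roll_up(samples, interval_s):
    """Aggregate (timestamp_s, value) samples into fixed-size buckets,
    keeping min/max/mean per bucket -- one 'level of detail'."""
    buckets = defaultdict(list)
    for ts, value in samples:
        # Bucket key is the start time of the interval the sample falls in.
        buckets[int(ts // interval_s) * interval_s].append(value)
    return {
        start: {
            "min": min(vals),
            "max": max(vals),
            "mean": sum(vals) / len(vals),
        }
        for start, vals in buckets.items()
    }

# Accelerometer-style samples at ~10 Hz (fake values), rolled up to
# two levels of detail: 1-second and 10-second buckets.
raw = [(t / 10, (t % 7) * 0.1) for t in range(100)]  # 10 s of data
lod_1s = roll_up(raw, 1)
lod_10s = roll_up(raw, 10)
```

A dashboard zoomed out to a week of data would then read from a coarse level, while the realtime view subscribes to the raw stream directly.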
It was cool to start to see large groups of people moving around on the 3D bridge – first with historical data.
One of the real “oh wow” moments was seeing the skeletons of people walking the bridge in realtime:
Another was when we managed to see various types of historic data of people walking across the bridge coordinated in the same animation: for instance skeleton information with people’s bodies walking on the surface of the bridge that’s been shaded with a heatmap of accelerometer readings. Too cool.
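As for the surface shading itself: each mesh vertex needs a value interpolated from the nearby sensors before being mapped to a colour ramp. Inverse-distance weighting is one common scheme for this kind of interpolation – I’m using it below purely as an illustration, with made-up sensor positions and readings, rather than describing Dasher 360’s actual shading code:

```python
def idw_value(vertex, sensors, power=2.0):
    """Inverse-distance-weighted estimate of a scalar field at `vertex`,
    from (position, reading) sensor pairs."""
    num = den = 0.0
    for pos, reading in sensors:
        d2 = sum((a - b) ** 2 for a, b in zip(vertex, pos))
        if d2 == 0.0:
            return reading  # vertex sits exactly on a sensor
        w = 1.0 / d2 ** (power / 2.0)
        num += w * reading
        den += w
    return num / den

# Hypothetical accelerometer readings at three points on the deck,
# sampled at two mesh vertices (2D for brevity).
sensors = [((0.0, 0.0), 0.2), ((4.0, 0.0), 0.8), ((2.0, 3.0), 0.5)]
on_sensor = idw_value((0.0, 0.0), sensors)  # coincides with a sensor
mid = idw_value((2.0, 1.0), sensors)        # blended from all three
```

Doing this per vertex and feeding the results through a colour ramp gives you a heatmap like the one in the image at the top of the post.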
I’m looking forward to including some visuals from the bridge during my Forge DevCon class in Las Vegas. Be sure to sign up if you’re planning on being there for the Forge DevCon, held just prior to AU 2018!