Now that the dust has settled on another Autodesk University (the best one ever?), it’s a good time to revisit what brought our team to AU this year. I realize I’ve posted a lot of pictures of people wearing helmets and high-visibility vests without really explaining what it was all about.
Our team at Autodesk Research has been very focused on understanding the human experience of the built environment, with the ultimate goal of making our software tools better at helping our customers design human-centric spaces: spaces that emphasize occupant well-being in its various forms.
Humans are tricky: we’re all different. It’s possible to look at broad themes that make humans thrive, but there are always limits. For instance, we all love biophilic design, at least until a plant starts blooming and someone who’s allergic to its pollen has a sneezing fit. We’re exploring how these broad themes can be applied during design, but that’s a separate area of our research: AU was about something we call “experiential walkthroughs”, which we’re using to capture our own subjective experiences.
Experiential walkthroughs use technology to help capture the human experience in an architectural or urban space. We fit people out with helmets that contain an array of sensors: a 360° camera mounted on top, a selfie camera focused on the participant’s face, and a set of environmental sensors for ambient temperature, etc. We also give the participant a phone that prompts them with questions intermittently, and even asks them to solve a task or two. Capturing this qualitative data is a critical component of the process: sensors will only tell part of the story.
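To give a feel for the “intermittent prompting” part, here’s a minimal sketch of how such an experience-sampling schedule might work. The intervals, jitter and question text are all illustrative assumptions, not what our capture app actually uses:

```typescript
// Minimal experience-sampling sketch: prompt the participant at roughly
// regular intervals, with random jitter so prompts don't feel predictable.
// All timings and questions below are illustrative assumptions.

type Prompt = { askedAt: Date; question: string };

const QUESTIONS = [
  "How comfortable do you feel right now?",
  "What in your surroundings is drawing your attention?",
];

const BASE_INTERVAL_MS = 5 * 60 * 1000; // ~5 minutes between prompts
const JITTER_MS = 60 * 1000;            // +/- up to 1 minute of jitter

function nextDelay(): number {
  return BASE_INTERVAL_MS + (Math.random() * 2 - 1) * JITTER_MS;
}

function schedulePrompts(onPrompt: (p: Prompt) => void): void {
  const tick = () => {
    const question = QUESTIONS[Math.floor(Math.random() * QUESTIONS.length)];
    onPrompt({ askedAt: new Date(), question });
    setTimeout(tick, nextDelay());
  };
  setTimeout(tick, nextDelay());
}

// A real capture app would raise a phone notification and record the
// participant's answer alongside the sensor streams; here we just log.
schedulePrompts((p) => console.log(`[${p.askedAt.toISOString()}] ${p.question}`));
```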
Over the summer we performed a series of “walkshops” in Toronto, asking study participants to capture their experience of The Bentway, an urban space beneath the Gardiner Expressway that has been redeveloped as a community space.
We gathered around 550 GB of data during these walkshops and synthesized it into an exhibit called From Steps to Stories, which I talked about in this recent post, and which we presented in Toronto with our partners at The Bentway the week before AU.
This original exhibit used 6 vertically mounted 55” monitors to display the data, with a 27” touch screen serving as the input device.
I wish I could have travelled to the opening of the exhibit in Toronto, but the timing would have meant multiple transatlantic trips in the space of a few weeks, or an extremely long trip taking in AU as well. By all accounts it went really well, though.
When we brought the exhibit to AU 2023, we used 3 horizontally mounted 55” monitors with a Surface Pro 9 tablet (which had actually driven the 27” display in Toronto) as the input device.
From a technical perspective, both displays (the main display and the controller) are actually web pages: the main display in Toronto apparently needed Chrome to have a special flag enabled to scale to a really high resolution. The “server” (the PC running the exhibit, which has the various 55” monitors merged into a single extended screen) also runs a process checking for WebSocket input from the controller page. So as the touch screen is used to drive the UI, the main page changes what it’s showing. Simple and neat.
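For those curious about the plumbing, here’s a minimal sketch of how such a controller-to-display relay might look, assuming a Node.js server using the “ws” package. The port and message shape are invented for illustration; this isn’t the exhibit’s actual protocol:

```typescript
// Minimal controller-to-display WebSocket relay, using the "ws" package
// (npm install ws). Port and message format are assumptions.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket: WebSocket) => {
  socket.on("message", (data) => {
    // Relay whatever the controller page sends to every other connected
    // client, i.e. the page spanning the merged 55" monitors.
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});

// In the browser, the controller page might send navigation events like:
//   const ws = new WebSocket("ws://localhost:8080");
//   ws.onopen = () => ws.send(JSON.stringify({ action: "showStory", id: 3 }));
// ...and the main display page would react to them in its onmessage handler.
```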
Here’s a video of the exhibit in action. The controller is at the bottom, the main display at the top.
While people could explore the data captured at The Bentway during AU 2023, we also wanted to capture data about the Autodesk University exhibit hall experience and provide insights from that data to the AU team.
Over the course of the 3 days, we had approximately 40 people don data capture helmets and walk across the expo to the “Sustainability Forest” and back, sharing their feedback on the experience via a smartphone (while we also captured their individual route choices, etc.).
While the visualization of this data could well be interesting, in much the same way as it has been for The Bentway, the larger opportunity is to start modeling the human experience of space via some form of machine learning. This type of dataset will eventually prove very helpful when designing future spaces.
The Autodesk Research team involved in the creation of this project:
Ray Wang, Frederik Brudy, Bon Aseniero, Sebastian Herrera, Mike Lee, Jacky Bibliowicz, Ellen Hlozan, Matthew Spremulli, Pan Zhang, Liviu Calin, Lorenzo Villaggi, Brian Lee, Jamie Nicholson, Dagmara Szkurlat and myself.
Many thanks to others from Autodesk Research who also contributed to the running of walkshops and our exhibit at AU:
Elliott Montgomery, Lily Prasuethsut, Athena Moore, Alanna Mongkhounsavath, Yi Wang, Allin Groom and Qian Zhou.