Delivering Real-time Data

Earlier this year, Michael Vander Ploeg and I set out on a journey to discover what we can do as architects to deliver valuable data to our clients. DLR Group invested several thousand dollars to launch what was meant to be a personal discovery effort for Michael and me. It led to the realization that we have much more to offer our clients than design solutions or BIM delivery methods. We have data – tons of it – and it's ready for deployment. We had, in essence, created a system for streaming real-time data.

Welcome to Data Streams.

We are thinking about data in different ways. While we can always use it to help diagnose a problem, we can also use the data we collect to validate design. That effort alone can lead to workflows that are more automated and less error-prone. How, you might ask? Visit our blog (www.datastreamspdg.com/blog) for answers and to learn more about our story.

Traditional BIM efforts are still leading the AECO industry down a path of repeated errors, outdated methodologies, and rinse-and-repeat design. The focus of BIM has become so ingrained in the tools that it slows down the people using them. The promise of BIM was a fully integrated effort by all stakeholders; instead, it has put ever more pressure on A/E firms to stay current on the available tools while simultaneously guaranteeing a building designed with no errors. That doesn't leave much time for design when I'm constantly training staff on the latest updates. To get to the next step, we have to go beyond BIM and beyond digital limits.

This is where data comes in. I had the opportunity to speak at Autodesk University again last year (2015), and I stopped by the Flux booth.1 I was talking with Anthony Buckley-Thorp about file types, storage locations on servers, and emails when he said, "Look Ryan, we've just been calling everything data from here out." It struck me then that he was right. Any file can be distilled into its components and sent from one program to another. It has been that way for a while.2 So now what? Data has a fluid ability to arm our designers with knowledge about a particular problem. Wouldn't it be nice to have something to leverage when it comes to designing that auditorium or that shopping mall? Right now the effort relies on the experience of either a junior designer or a senior designer to dream up what a space should look like. At best, given the time available, they may come up with one or two design solutions – solutions to what they see as the problems, with minimal time allowed for research. You might design a solution, but it might not address the problem you are actually facing. As Jordan Brandt of Autodesk put it, "Generative design is when you state the goals of your problem and have the computer create design iterations for you."3

What if you could create a virtually unlimited number of design solutions, in less time, by encoding the goals and objectives of the design team and the client? Let the expert select the best computer-generated design that meets those expectations, then develop that scheme with the time you've saved. That data I was talking about earlier is the validation you need to prove the design meets those goals. Google and Facebook do this all the time, so why don't we?
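To make the generate-and-select idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the goals (a target floor area and a minimum window-to-wall ratio), the parameter ranges, and the scoring weights are illustrative stand-ins for real client objectives, not values from any project or tool mentioned above.

```python
import random

# Hypothetical design goals (illustrative only): target floor area in m^2
# and a minimum window-to-wall ratio supplied by the design team/client.
GOALS = {"target_area": 500.0, "min_wwr": 0.30}

def generate_candidate(rng):
    """Produce one random design option as a toy parameter set."""
    return {
        "area": rng.uniform(300.0, 700.0),  # gross floor area, m^2
        "wwr": rng.uniform(0.10, 0.60),     # window-to-wall ratio
    }

def score(candidate, goals):
    """Lower is better: distance of a candidate from the stated goals."""
    area_penalty = abs(candidate["area"] - goals["target_area"])
    # Heavily penalize candidates that fall below the minimum
    # window-to-wall ratio; meeting it costs nothing extra.
    wwr_penalty = max(0.0, goals["min_wwr"] - candidate["wwr"]) * 1000.0
    return area_penalty + wwr_penalty

def best_design(goals, iterations=1000, seed=42):
    """Generate many options and rank them against the goals;
    the designer then develops the top-scoring scheme by hand."""
    rng = random.Random(seed)
    candidates = [generate_candidate(rng) for _ in range(iterations)]
    return min(candidates, key=lambda c: score(c, goals))

winner = best_design(GOALS)
```

Real generative-design systems use far richer geometry and smarter search than random sampling, but the shape of the loop is the same: state the goals, let the computer iterate, and spend the human time on the scheme that scores best.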

Data pushed into the design stream can be used for validation as well as design enhancement. This gives designers more time to pick the right solution, without the pressure to get it right the first time, every time. Closing the loop is critical moving forward. While there will always be a need for BIM in the process, it's big data's time to shine.

1 – www.flux.io

2 – https://en.wikipedia.org/wiki/History_of_free_and_open-source_software

3 – http://www.wired.com/2015/09/bizarre-bony-looking-future-algorithmic-design/ (Wired, September 2015)