All posts by Ryan Cameron

Visualizing Building-Related Data

Do you ever feel like you’re designing in the dark? You know what you’re doing, but at every stop along the way you find yourself fact-checking, re-crunching numbers, or realizing too late the impact one area of your work was having on another. A typical leader might share some of their wisdom and point you to what they call “insight” to help you through that step. In a closed process, that moment is really the inevitable, the predictable, and that’s a really good thing. Predictability is something we can account for in our process and, in many cases, not only automate but automate past by the time it arrives.

What does it take to put that kind of insight into practice?

Knowing is half the battle. Asking better questions will help you re-frame “Are we able to do this, this, and this?” as “How can this insight help solve the problem?” After all, a major component of architecture is problem solving. With that in mind, let me take us on a short journey through tasks and supertasks. The term “supertask” itself was coined by J.F. Thomson (1954).

A supertask is a task that consists of infinitely many component steps but which, in some sense, is completed in a finite amount of time (see Zeno’s paradox: the runner covers half the distance, then a quarter, then an eighth, and those infinitely many steps still add up to a finite whole). If this sounds a bit like architecture and building design, I assure you the concept is the same. Rather than dealing with quantum mechanics or general relativity, architects, engineers, and designers are after the many solutions to varying problems with varying degrees of complexity.

Think for a moment about all the tasks an architect or engineer must work through just in the design phase. Let’s focus on form finding and the functional placement of spaces. Data from the previous phase, programming, must be translated and manifest itself as something physical at some point. We use digital tools to speed this process up, but not to truly aid the process itself. How many of these tasks give visual feedback to the person crafting the project?

In the video below, I walk through a visual depiction of just a few things I look for in a project: am I over or under on program square footage, which rooms am I accounting for, which levels contain the most program spaces, how does that compare to other projects I’ve done, and finally, how do all those spaces relate to one another?
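To make the first of those checks concrete, here is a minimal sketch of the kind of program comparison the video visualizes: designed areas against programmed targets, totaled by level. The room names, areas, and targets below are hypothetical; in practice the list would come from a Revit schedule export or a planning database such as dRofus.

```python
from collections import defaultdict

# (room name, level, designed area in sq ft) -- hypothetical sample data
designed_rooms = [
    ("Patient Room 101", "Level 1", 280),
    ("Patient Room 102", "Level 1", 295),
    ("Exam Room 201",    "Level 2", 120),
    ("Waiting Area",     "Level 1", 650),
]

# Program targets in sq ft, keyed by room name
program_targets = {
    "Patient Room 101": 300,
    "Patient Room 102": 300,
    "Exam Room 201":    110,
    "Waiting Area":     600,
}

area_by_level = defaultdict(float)
for name, level, area in designed_rooms:
    target = program_targets.get(name)
    if target is None:
        print(f"{name}: not in the program -- is it accounted for?")
        continue
    delta = area - target
    status = "over" if delta > 0 else "under"
    print(f"{name}: {area} sf designed vs {target} sf programmed ({status} by {abs(delta)} sf)")
    area_by_level[level] += area

# Which levels carry the most program space?
for level, total in sorted(area_by_level.items()):
    print(f"{level}: {total:.0f} sf of program space")
```

The same totals, charted instead of printed, are essentially what the dashboards in the video show.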

 

Without a visual description of what is happening, the project is left with a few choices.

  • Go forward as normal and accept the outcome.
  • Spend time manually calculating things out.
  • Constantly ask for help or comments.

Would you rather see what is going on in the project, or go about the process as usual? If you’re interested in how you store, access, and analyze your data, then you’re ready to take the next evolutionary step toward better design through increased collaboration, new techniques, and time for reflection. Future designers will need all three if they are to create a better world.

When we illuminate data, we give ourselves a way to understand it and better relate to it. If we uncover hidden patterns in both human use and building optimization, we can better understand how to design the spaces we occupy. How are you simplifying data analysis on your projects?

Why Innovate? 3 Reasons to Lead Innovation Efforts

Creating something new that started as a paper scratch or a cloudy idea is endlessly interesting to me. Whether it’s an Android app, a design workflow, or a sophisticated data collection system, I enjoy the process of creating. The process tells a story: decision-making and wrong turns along the way that can trigger questions and new ways of thinking. There is knowledge to be gained from the process. Between the birth of an idea and the final product, the times (and technology) change so fast that it can affect the usefulness of the product you’re creating. How you use that knowledge to improve your idea, or let it evolve, is what’s relevant. There is no pre-determined order of operations on the road to innovation, only a single, irregular path and a recording of the story that no one was expecting. Capturing the moments where creativity leads you to the next innovative thing is ultimately more rewarding than the product, workflow, or manifestation you end up creating. So what makes an idea profound enough to be innovative?

Before we get to that, let’s look at three reasons in response to “Why Innovate?”.

Competition

One of the obvious reasons to innovate is to stay ahead of your competition. In the proverbial “better mousetrap” scenario, we find ourselves constantly trying to find better ways to solve problems. While this doesn’t mean you should innovate for the sake of innovation, it wouldn’t hurt to start thinking beyond what works now, using the refreshing new tools that arrive on a monthly or yearly basis. What this means is that your creativity has to match your own leadership abilities. As Tim Brown, CEO of IDEO, put it:

“Creative leadership isn’t about leaders simply becoming more creative. It’s about individuals leading for creativity. That means you, as a leader, must unlock the creative potential of your organization, no matter the industry. It’s your job to set the conditions for your organization to generate, embrace, and execute on new ideas. It’s a competitive imperative that will keep you ahead in the marketplace.” 1

Constant Change

Another reason to innovate is that change is constant; not only that, it accelerates. Peter Drucker is often credited with the phrase, “The best way to predict the future is to create it.” What he really means is: make change or be changed. Preparing for change, to me, means making change happen. To put that into practice, move away from generic vision statements and toward goals that capture an innovative vision.

Meaning

The final reason to innovate, for me, is meaning. Does your product or process actually produce the effects you claim? Would you use it yourself? Are you making more work or less? By now you probably have a product in mind that you’d like to change. You likely use it enough to describe, with some comfort, what you use it for and how you could make it more innovative. Fostering your innovative talent will not only help it evolve but will ultimately make it more meaningful for you in the end.

Great design for a great cause

Unless your innovative product motivates potential customers to pay more; saves them money, frustration, or time; or provides some larger societal benefit (health, community, etc.), it is not creating value. 2 Ask yourself often: are we just making a product perform better, or are we actually accomplishing something bigger together? Why are we making this product, this software, or this workflow?

Technological innovation is a huge creator of economic value and a driver of competitive advantage. So how do I go about creating innovation? My first step is to jump into my sandbox. No, not an actual sandbox filled with sand. For the uninitiated, the term “sandbox” 3 generally refers to a testing environment that isolates untested theories and experiments from the production environment, a safe space kept from interfering with real-world work. You can probably imagine it’s similar to a scientist experimenting in a lab.

So let’s go back to the original question: what makes an idea profound enough to be innovative? There can be many answers, so please feel free to leave your comments below. I’ll try to answer it this way: the idea must have a positive impact, something that can be felt not only on a local level but on a global level as well. Ideas like the solar tiles, home battery systems, and electric cars from Tesla CEO Elon Musk fundamentally change the way we consume energy. Granted, right now the electric car is shifting our resources from oil to coal, but that won’t always be the case. Just like Tesla’s, your idea has to be planted in the ground in order for it to grow, be cultivated, and reach its potential.

You’ve got to start somewhere. What are your 3 reasons to lead innovation efforts?

Sources

1 Tim Brown, CEO of IDEO – www.medium.com/ideo-stories

2 Eric Ries – The Lean Startup

3 Sandbox (software development) – https://en.wikipedia.org/wiki/Sandbox_(software_development)

Lessons Learned From A Beta-Tester

It started as pure curiosity for me. I wanted to be in the know about the future of the software I’d been using for so many years. When I was working at a start-up architecture firm (now defunct), they allowed me the freedom to pursue pre-releases of programs like Revit. I credit this interest to working with David Patera, now Managing Director at dRofus and, at the time, a recent UNL Architecture grad like myself. David always seemed to have a leg up on our competitors by knowing what the next release of many software packages was capable of. My first experience of this was seeing the revolutionary GUI change from Revit 2009 to Revit 2010. We were able to experiment with the new “Ribbon” interface, as Autodesk called it, long before it was released. So when it did come to market we were not only used to it but were perceived as experts by our consultants and peers. Why wouldn’t we be? We had been using it for seven months before the release, and nearly 20 months before many firms adopted it. Speed, a high comfort level, and no-cost training were the payoffs.

Lesson Learned: Early adoption has its downsides too. One of those lessons is that it helps to load beta software on a separate hard-drive partition or a completely separate computer. I can count on both hands how many times I accidentally opened a project in a pre-release only to realize I couldn’t save it back to our actual production version of the software. Ctrl+Alt+Del “End Process” and I became close friends.

In the world of architecture, you also travel down the path of graphics. This was my introduction to web design and a slew of online web development kits, one of those being CubeGL. In 2011-12 I was running a research program on the size and proportion of patient rooms. It was selected as a topic for me to present at the Healthcare Design Conference 2013 in Orlando. http://www.healthcaredesignmagazine.com/blogs/jennifer-kovacs-silvis/are-patient-rooms-too-big

With the help of my wife Ashley, who happens to be an expert web designer, we were able to experiment with the idea of interactive 3D rooms on a webpage. You should see what she’s up to now! This was years before the advent of Google Cardboard (released June 2014). We used an experimental “spherical” room generator plugin for WordPress to develop multiple sizes, layouts, and proportions of patient rooms, and also let survey participants type comments. http://patientroomdesigns.com/

Lesson Learned: This technology only really worked with modern browsers such as Chrome, Safari, and Firefox. At the time, most everyone was still using Internet Explorer. Choose your technology wisely and make sure you have a plan for delivering cutting-edge concepts to your users. Our workaround was a pop-up that appeared when the link was opened in Explorer, prompting the user to download Chrome. That was a challenge of its own, so we eventually got it working in Internet Explorer as well.

Now back to architecture. One of the big changes in architecture is the advent of rapid prototyping, or “3D printing” as it’s known. At UNL I was one of the first, if not the first, students to 3D print a competition project that I had recently won (AIAS 2006 – Boston Kiosk). Time and time again my model failed to be “watertight,” meaning a closed solid mass with no gaps water could seep into. This led to a lot of deep research on how I could streamline that process and learn about the many different types of 3D printers and the cost, time, and materials it takes to produce a print.

Over and over again I got better at understanding the intrinsic details of how to model and what to avoid. Eventually I applied to be selected for Autodesk’s Ember 3D printer. I tweeted directly at Autodesk CEO Carl Bass with a 140-character argument for why I should be selected. Needless to say, it worked. This 3D printer is awesome! It will print a toothbrush, bristles and all, if you model it to that precision. Minimum feature size was always my enemy in 3D printing. Most printers on the market won’t easily let you go below 1/32” or 0.03125”. That’s tiny, but a 2×4 aluminum mullion at 1/8” scale comes out to roughly 0.02” on its narrow face, smaller still. In the past I had to build them thicker for a “printer model” export and then remember to change them back after the export completed. With the Ember, that is not the case.
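For reference, here is a rough sketch of that check (the 1/32” minimum is an assumed figure for a typical printer, not the Ember’s spec): does a scaled-down element fall below the printer’s minimum feature size, and if so, how thick does it need to be in the “printer model” export?

```python
SCALE = 1 / 96            # 1/8" = 1'-0" architectural scale
MIN_FEATURE_IN = 1 / 32   # assumed printer minimum feature size, in inches

def printed_size(real_inches: float, scale: float = SCALE) -> float:
    """Size of the element on the printed model, in inches."""
    return real_inches * scale

def thickness_for_export(real_inches: float) -> float:
    """Real-world thickness to temporarily model so the print survives."""
    if printed_size(real_inches) >= MIN_FEATURE_IN:
        return real_inches               # already printable as-is
    return MIN_FEATURE_IN / SCALE        # thicken to the printer minimum

mullion = 2.0  # the 2" face of a 2x4 mullion
print(f"Prints at {printed_size(mullion):.4f} in; "
      f"model it as {thickness_for_export(mullion):.1f} in for the export.")
# -> Prints at 0.0208 in; model it as 3.0 in for the export.
```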

Lesson Learned: Just because something looks complete and comes with plenty of great examples doesn’t mean you’ll have the same success. Remember, you are beta-testing: not all the features are available, some are known to be broken, and some you will have to update yourself. Take on these challenges only if you have the time to commit to them. The reward here was much more exposure to the world of 3D printing and real hands-on experience breaking things and talking with experts from the industry.

There are many more examples of this, including creating our own beta-test platform at DLR Group for Data Streams. Will I ever get tired of beta-testing? Well, let’s just say I’ve graduated from beta to alpha. I’m looking forward to the alpha testing of Autodesk’s generative design platform, Project Fractal.

Let me know your thoughts and the beta-projects you’re working on! Together we can share our experiences to help make dreams a reality!

Coding is the New Sketching

It wasn’t that long ago I found myself developing Android Apps for a startup company I created called Architect Machines.  The idea was simple.  I needed tools to help me do my job that didn’t exist on the market place. These machines (software apps) would help keep me organized and efficient.  I had a lot of licenses, certification numbers and renewal dates to remember so I wrote a free app called Member ID to help remind me of that information. Between it and PhotoDoc I had around 10,000 downloads (1,550 active users). By comparison to other apps out there, it failed.  My intent wasn’t to make a buck but to create something I could use and others as well.  Little did I know that would set me on a journey to explore computer programming for architecture further.  While I am not a true programmer, I’m a licensed architect with a background in engineering. My ability to put together a building far outweighs my knowledge of computer programming, but I did learn how to code.  This led to tinkering with Arduino sketches (ironically how I thought of the title quote) and that led to Raspberry Pi development and as many of you know, Data Streams w/ co-Founder Michael Vander Ploeg.

Then I saw this fantastic little tool for Revit called Dynamo at Autodesk University 2013. You can take pre-built nodes and wire them together on an open canvas to manipulate data in Revit, similar to Grasshopper for Rhino. Like most code, it’s rule-based, and it’s great at math: two things I find myself doing a lot of in the fields of architecture, engineering, and building science. In reality, a lot of what we all do, especially the repetitive tasks, is math-based. What we think of as human thought for some types of design criteria is really a series of calculations that address possible outcomes. Take, for instance, a parking garage. There’s probably an assortment of ways to “make” one work, but it really gets narrowed down to two or three foundations pretty quickly. At the end of the day you’re looking for a parking stall count, how big the structure is on the site, and maybe a height restriction thrown in. The link below is one simplified scenario (two-story only) that lets designers specify those parameters quickly to get faster answers to the team and ultimately to the client; a sketch of the underlying math follows. https://www.dynamoreach.com/share/5793c57a1cea5a924f4682fb Why would we spend 20+ hours modeling multiple variations in Revit or SketchUp when we have a tool like this?
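Here is a simplified, hypothetical version of the rule-based core a graph like that automates. The stall and aisle dimensions are common rules of thumb rather than values from the shared Dynamo graph, and real layouts also need ramps, circulation, and code checks:

```python
STALL_WIDTH_FT = 9.0
STALL_DEPTH_FT = 18.0
AISLE_WIDTH_FT = 24.0   # two-way drive aisle between facing stall rows

def stalls_per_level(site_width_ft: float, site_depth_ft: float) -> int:
    """Estimate stalls on one level using double-loaded parking bays."""
    bay_depth = 2 * STALL_DEPTH_FT + AISLE_WIDTH_FT   # two rows + aisle
    bays = int(site_depth_ft // bay_depth)
    stalls_per_row = int(site_width_ft // STALL_WIDTH_FT)
    return bays * 2 * stalls_per_row

def garage_stalls(site_width_ft, site_depth_ft, levels=2,
                  max_height_ft=None, floor_to_floor_ft=10.5):
    """Two-story (or more) count, trimmed by an optional height restriction."""
    if max_height_ft is not None:
        levels = min(levels, int(max_height_ft // floor_to_floor_ft))
    return levels * stalls_per_level(site_width_ft, site_depth_ft)

print(garage_stalls(270, 180, levels=2, max_height_ft=40))  # -> 360 stalls
```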

Taking this one step further, we ask ourselves what drives the size of the parking structure. Is it stadium attendance? Then, with a little further coding, we can have that stadium population data drive the parking data. No one even needs to open the parking garage model until the stadium options are ready for computation; see the sketch below. This idea slides right into generative design.
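Continuing the sketch above with made-up ratios (car occupancy and the share of attendees arriving by car are assumptions for illustration), attendance can drive the parking requirement directly:

```python
def required_stalls(attendance: int, persons_per_car: float = 2.5,
                    arrive_by_car_share: float = 0.85) -> int:
    """Very rough demand model: attendees arriving by car / car occupancy."""
    return int(attendance * arrive_by_car_share / persons_per_car)

def levels_needed(attendance: int, stalls_per_level: int) -> int:
    """How many garage levels the stadium option implies."""
    demand = required_stalls(attendance)
    return -(-demand // stalls_per_level)   # ceiling division

print(levels_needed(attendance=8000, stalls_per_level=500))  # -> 6 levels
```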

“Coding is the new sketching.” We start to see a process of extreme speed and accuracy when we cleverly code our design outcomes to meet expectations. While I’m not discouraging napkin sketches, another way of putting it is that you state the goals of your problem and have the computer create design iterations for you; as the designer, you select the best option. The old way consisted of the designer having a dream and drawing it. Coding is one way to extend your idea downstream instead of stopping it on paper.

At RTC North America 2016 I was able to participate as an expert panelist for a discussion of FormIt 360, Dynamo, and the future workflow of design. What I have been witnessing over the past few AUs and RTCs is a revolution in parametric-driven design. We can expect our machines to help us with architecture in ways we didn’t think possible even five years ago. With the technology available now and on the way, the future looks bright!

Please share your thoughts and questions!  Is coding the future of design?  How does it impact your process and what workflows have you developed to provide better results for you and your clients?

Delivering Real-time Data

Earlier this year, Michael Vander Ploeg and I set out on a journey to discover what we can do as architects to deliver valuable data to our clients. DLR Group invested several thousand dollars to initialize what was meant to be a personal discovery effort for Michael and me. This led to the realization that we have much more to offer our clients than just design solutions or BIM delivery methods. We have data, tons of it, and it’s ready for deployment. We, in essence, created a system for streaming real-time data.

Welcome to Data Streams.

We are thinking about data in different ways. While we can always use it to help diagnose a problem, we can also use the data we collect to validate design. That effort alone can lead to other steps that are more automated and less error-prone. How, you might ask? Come visit our blog (www.datastreamspdg.com/blog) for more answers and to learn about our story.

Traditional BIM efforts are still leading the AECO industry down the path of repeated errors, outdated methodologies, and rinse-and-repeat design efforts. The focus of BIM has become so ingrained in the tools themselves that it slows down the people using them. The promise of BIM was a fully integrated effort by all stakeholders, yet it has led to more and more pressure on A/E firms to stay up to date on the available tools while simultaneously guaranteeing a building designed with no errors. That doesn’t leave much time to put design into the equation when I’m constantly training staff on the latest updates. To get to the next step, we have to go beyond BIM and beyond digital limits.

This is where data comes in. I had the opportunity to speak at Autodesk University again last year (2015), and I stopped by the Flux booth.1 I talked with Anthony Buckley-Thorp about file types, storage locations on servers, and emails, and he said, “Look Ryan, we’ve just been calling everything data from here out.” It struck me then that he was right. Any file can be distilled down into its components and sent from one program to another; it has been that way for a while.2 So now what? Data has this fluid ability to arm our designers with knowledge about a particular problem. Wouldn’t it be nice to have something to leverage when it comes to designing that auditorium or that shopping mall? Right now the effort relies on the experience of either a junior or a senior designer to dream up what a space should look like. At best, given the time available, they may come up with one or two design solutions: solutions to what they see as the problems, with minimal time allowed for research. You might design a solution, but it might not be for the problem you are addressing. As Jordan Brandt of Autodesk put it, “Generative design is when you state the goals of your problem and have the computer create design iterations for you.” 3

What if you could create an almost infinite number of design solutions, in less time, by implementing the goals and objectives from the design team and client? Let the expert select the best computer-generated design that meets those expectations and then develop that scheme with the time you’ve saved. That data I was talking about earlier? It is the validation you need to prove the design meets those goals. Google and Facebook do this all the time, so why don’t we?
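In code, the loop looks something like this toy sketch. The goals, parameter ranges, and scoring are invented for illustration; nothing here is a Flux or Project Fractal API, just the pattern of generate, score, and shortlist for a designer to review:

```python
import random

GOALS = {"target_area_sf": 12000, "max_depth_ft": 60}

def generate_candidate():
    """One hypothetical massing option: width, depth, number of floors."""
    return {
        "width_ft": random.uniform(80, 200),
        "depth_ft": random.uniform(40, 80),
        "floors": random.randint(1, 4),
    }

def score(option):
    """Lower is better: miss on target area plus a penalty for excess depth."""
    area = option["width_ft"] * option["depth_ft"] * option["floors"]
    area_miss = abs(area - GOALS["target_area_sf"])
    depth_penalty = max(0.0, option["depth_ft"] - GOALS["max_depth_ft"]) * 100
    return area_miss + depth_penalty

candidates = sorted((generate_candidate() for _ in range(1000)), key=score)
for option in candidates[:3]:          # the designer reviews the short list
    print({k: round(v, 1) for k, v in option.items()}, round(score(option), 1))
```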

Data pushed into the design stream can be used for validation as well as design enhancement. This gives designers more time to pick the right solution without the stress of having to get it right the first time, every time. Closing the loop is critical moving forward. While there will always be a need for BIM in the process, it’s big data’s time to shine.

1 – www.flux.io

2 – https://en.wikipedia.org/wiki/History_of_free_and_open-source_software

3 – http://www.wired.com/2015/09/bizarre-bony-looking-future-algorithmic-design/    (Wired Magazine Sept. 2015)

Autodesk University 2014

Autodesk University 2014 (Dec. 2014)

Well, so much for not presenting at conferences ever again (see last year’s posts of me saying I was exhausted from conferencing). My session this year is titled “Cloudy with a Chance of Design,” a playful rewording of the movie Cloudy with a Chance of Meatballs. The session will be a roundtable discussion loosely based on the idea of the movie. At first, cloud technology was great: we could render, store, and sample all varieties of goods, all from the cloud. But much like in the movie, things have gotten a little out of whack, and now we need protocols in place for handling all the information before it gets out of control. This roundtable will discuss what firms have been doing to implement the different Autodesk cloud tools and other techniques used to keep things under control. It’s raining design, so join me and Bill Allen (BIM Manager with OZ Architects) for a legendary session! (Yes, legendary, according to four participants.) The feedback score was 9.32/10 this year, which would have qualified for the Top 10. That is, if AU had done the Top 10 award this year, which they didn’t.

BIM Workshops 2014

Another great year speaking at BIM Workshops! I felt more at ease with only one presentation this year. Of course, that presentation was an expanded portion of my 2D-to-4D presentation from last year. I felt I needed to explain a few steps I took when going from Revit to 3ds Max, as well as document my experience moving rapidly between the two programs.

My impression of this year’s conference has changed. We now have an emcee, an expanded staff, and expanded speakers. I was fortunate enough to give the introductions for two up-and-coming speakers, Bill Allen and Nathan Miller. Both are very well versed in Rhino and Dynamo, and both had excellent presentations. I can’t thank Steven Shell enough for my introduction. He really raised the bar with his dialog, which must have helped sway the attendance, since I was rated a Top 5 speaker at the Central States BIM Workshops! Like I said, another great year speaking!

Unlike last year, I recorded these sessions, and they can be found on my YouTube channel at the links below. Enjoy!

MR Rendering and iRay Video

Lighting Video

Workflow Importing Video

Materials Video

Healthcare Design Conference 2013 | Orlando, FL

Healthcare Design 2013

Wow, words cannot describe how busy I will be this year. My side project, The Effects of Proportion on Personalized Patient Rooms, has been accepted as a session at this year’s HCD in Orlando. All the hard work of the surveys (still ongoing) and meetings, and not having to do an IRB, has begun to pay off. My friend Dr. Brad Stastny has agreed to help me develop all the necessary tools and data-scrubbing required to put the polish on this presentation. Wish us luck!

 


Ryan Cameron – Top 5 Speakers at Central States BIM Workshops 2014

 

BIM Workshops 2014 (Aug. 2014)

Whew, only one session this year! Revit + 3dsMax: A Winning Combination, Rendering Techniques. In the session I take the class through the workflow I’ve been using from Revit into 3ds Max for high-end rendering. In fact, I recorded the session, mistakes and all, and put it here on my YouTube channel.

Materials:

https://www.youtube.com/watch?v=UQshhZueE_o

Mental Ray and iRay Rendering Techniques:

https://www.youtube.com/watch?v=7ySI9vPzIac

Workflow:

https://www.youtube.com/watch?v=KHVmD6qe7js

Lighting:

https://www.youtube.com/watch?v=gi7VYbwUbeU

 

 

Feedback score….<drum roll>  Top 5 speaker!

 


Autodesk University 2013

Autodesk University 2013 – Las Vegas (Dec. 2013)

I’m not sure what rabbit’s foot I found next to a four-leaf clover patch, but I’ve been accepted to speak at AU this year. My session is Volumetric Modeling for Your BIM Workflow. It will cover anything from massing families to COBie data to BIM execution plans and everything in between.

Volumetric Modeling for Your BIM Workflow

What an exciting event, and I’m glad to be a part of it. For those of you wondering, I’m a speaker at this year’s AU! I’ve attended in years past, but this will be my first (successful) attempt at conducting a roundtable discussion. My co-host, Birgitta Foster, is an expert on the National BIM Standard and will help guide the session through its many facets.

Central States Revit Workshop 2013

 

BIM Workshops 2013  (Aug. 2013) (formerly Revit Workshops)

Another fantastic year at the Central States BIM Workshops, formerly known as the Revit Workshops. Success! I’ve been accepted for TWO sessions at BIM Workshops 2013: one on implementing Autodesk Showcase with Revit, covering the quick development techniques I’ve been using for design options (and touching on its augmented reality app), and another on a non-conventional approach to construction documents using a slew of software and workflows.


So let’s talk for a moment about Autodesk Showcase as a stand-alone product. Having the chance to experiment with anything new is always exciting, and Showcase is no letdown. A variety of tools are available for the fast formulation of designs. I step through several of these with a Revit model linked into the program. Right away you’ll notice we’re in a 3D perspective view, something Revit didn’t truly master until the second update for 2015.

Showcasing your Revit Model


The major presentation will be my Goodbye 2D Drawings, Hello 4D techniques session, using a sampling of the software involved. I will have to use two different projectors at the same time and present half of it from an Android tablet. Let’s hope it all goes well, since this will involve a lot of moving parts!

Goodbye 2D – Hello 4D pptx

 

To model or not to model

Oftentimes I come across MEP firms that ask, “Why should we do this in Revit? Our process is just fine the way it is.” To which I ask, “How many of your projects have errors because you didn’t coordinate routing?” All of them. The question isn’t why, but how, to use Revit for engineering. Do you need to model every bell housing, strap brace, and bolt? You don’t; the manufacturer you are specifying is already modeling them. Raise the intelligence of your staff by having them establish relationships with the architect’s walls, floors, and ceilings by drawing in 3D. You’ll be glad you did.

Accountability of Smart Objects

For the architects and engineers out there, the benefits of BIM should be bearing fruit by now. If not, you might still be in the “production only” level of thinking. For the contractors and FM directors out there, what this means is full accountability of all the smart objects in a room, including the rooms themselves. Each object represents not only the real-world item but all the attributes associated with it, such as location, cost, and power requirements. All components shown are smart models exported directly from Revit, just waiting to be accounted for in your company’s tracking software; a sketch of that hand-off follows.
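As a minimal sketch of what that hand-off can look like (the object names, fields, and values are illustrative, not a specific Revit or COBie schema), each smart object is just structured data an FM tracking system can ingest:

```python
import csv
import io
from dataclasses import dataclass, asdict

@dataclass
class SmartObject:
    """A building component plus the attributes the FM team cares about."""
    name: str
    room: str
    level: str
    cost_usd: float
    power_watts: float

objects = [
    SmartObject("AHU-1", "Mechanical 010", "Level 1", 48000.0, 7500.0),
    SmartObject("Exam Light EL-3", "Exam 204", "Level 2", 1200.0, 60.0),
]

# Flatten the objects to CSV, ready for import into a tracking system
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=list(asdict(objects[0]).keys()))
writer.writeheader()
for obj in objects:
    writer.writerow(asdict(obj))
print(buffer.getvalue())
```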

Bone Creek Museum of Agrarian Art

Bone Creek currently has a national presence on the art scene. Located in rural David City, Nebraska, it boasts the nation’s largest single collection of agrarian-themed artwork. My team and I produce fundraising materials as well as architectural “what-ifs” for the board.

Herman Miller Young Architect Award 2010

Have I got a story for you. Much to my amazement, I received an invitation to become a Herman Miller Young Intern Architect at Healthcare Design 2010. But wait, there was a catch. A few weeks went by and I hadn’t heard from anyone at AIA, AAH, or Herman Miller. After buying my airline tickets and booking my hotel, I received a call from my contact at HM: there had been some confusion, and I had not, in fact, been awarded. Goodbye.

Well, that seemed odd. I had an email indicating I had won, plus a follow-up phone call shortly after that. I had bought tickets and everything, since it was all reimbursable. I’d have to say I felt a little cast aside… but why? What was the mix-up? I sent everything I had on to Doug (the managing principal at the firm I was working at) and he got in touch with someone. Since things had already been purchased and weren’t refundable, he convinced them to follow through on their original invitation in the email. Just like that, I was going to HCD10 in Vegas! I still wonder sometimes what was said between Doug and the HM folks. Much to my disappointment, though, the only time I was listed as a recipient was at the conference. A large photo of everyone was taken and put up on HM’s healthcare site, but for the most part I cannot find any source citing me as a recipient. That’s a bit of an issue when you tell people you’re an HM Young Architect Fellow. At least I still have my badge from the conference.

Herman Miller Scholarship

 

Autodesk University 2008

Mind blown. Coming right out of college with a Master of Architecture, I thought I had a good grasp on the industry, the software, and where it was all heading. Although the recession had just started, most of that year’s AU was already set in stone, so I got the full conference experience without all the cheap-outs.

I had the opportunity to take classes I would never have been able to see if it weren’t for this chance happening. I also got to sit in on a lecture by, and meet, none other than Burt Rutan!

http://www.core77.com/posts/11989/autodesk-university-2008-burt-rutan-on-innovation-11989

Most of these links have since been removed, but they serve as a good reminder of my experience:

A sampling of blog posts in no particular order: