Grey Fowles

We aim for simplicity in design. Most architects would love to present a building as a single concept. But even Mies van der Rohe, who loved simplicity, often said, “The devil is in the details.” His design of the Seagram Building is considered the first masterpiece of modern tall building design.

But buildings are more than just art. They are complex mechanical and biomechanical systems that require detailed understanding to minimize their environmental impact.

New Building Simulations

Our current modeling systems account for the complexities of building variables only to a minimal degree. To truly understand the good, better, and best scenarios for reducing environmental impact, we need to rethink current strategies and develop new modeling systems that more accurately reflect the many variables that can be incorporated into building simulation.

Weather prediction is a useful example of a simulation system. On the surface, it is all about colorful maps and engaging TV personalities.

In reality, the science of weather prediction has progressed to a level of complexity that a layperson has little hope of truly understanding. These added details in prediction have, however, resulted in a huge increase in accuracy and confidence in weather forecasting.

Many of the problems with accuracy and trust in our current building simulation programs have already been addressed by the weather forecasting and simulation experts in their own modeling practices.

Currently, ASHRAE/LEED guidelines dominate the building energy simulation methodology. Upon reflection, these guidelines have some drawbacks that can and should be dealt with.

Drawbacks in Simulation Models

The first of these drawbacks we will call singularity. For energy modeling, one of these singularities is the TMY2 file. TMY, or Typical Meteorological Year, is a set of data that approximates the hourly average weather conditions in a given location.

This file contains only one year’s worth of data and is “static”: it does not account for any variation in the weather on a day-to-day or year-to-year basis. It would be like a weather forecaster running a simulation with a single set of inputs and getting the same constant result every time: either a 100% chance of rain on Tuesday or a 0% chance of rain on Tuesday.

Because there are millions of variables, the forecaster runs the simulation 100 times with slight but realistic changes to the inputs each time. If roughly 60 of those 100 simulations forecast rain, then the prediction is a 60% chance of rain.
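As a minimal sketch of that counting logic, here is what an ensemble run looks like in code. The forecast_rain() function and the size of the perturbation are invented stand-ins for a real forecast model, used only to show how many perturbed runs turn into a probability.

import random

def forecast_rain(precip_signal):
    # Toy stand-in for one forecast run: "rain" if the perturbed
    # signal crosses a threshold. A real model is vastly more complex.
    return precip_signal > 0.5

def ensemble_rain_probability(base_signal=0.52, runs=100):
    rainy = 0
    for _ in range(runs):
        # Each run gets a slight but realistic change to its inputs.
        perturbed = base_signal + random.gauss(0, 0.05)
        if forecast_rain(perturbed):
            rainy += 1
    return rainy / runs  # e.g. 60 rainy runs out of 100 -> 60% chance of rain

print(f"chance of rain: {ensemble_rain_probability():.0%}")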

Building science is just as dynamic, so why are we not accounting for small variations the way the U.S. Weather Service does? Accounting for these variations and mitigating the uncertainty of the model is not as impossible a task as it may seem. Computing power has progressed so far that running these extra simulations will not take significantly more time.
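The same ensemble idea could wrap around any building energy engine. The sketch below is illustrative only: annual_energy_use() is a hypothetical stand-in for a full hourly simulation tool (eQUEST, EnergyPlus, and the like), and the input ranges are assumed values, not measured data. The point is that the output becomes a range with percentiles rather than a single static number.

import random
import statistics

def annual_energy_use(weather_scale, infiltration_ach, plug_load_w_m2):
    # Hypothetical stand-in for a full simulation engine; a real
    # workflow would launch the actual modeling tool here.
    return 900_000 * weather_scale + 120_000 * infiltration_ach + 4_000 * plug_load_w_m2

results = []
for _ in range(100):
    # Perturb the inputs around their nominal values instead of
    # running a single "static" case.
    weather = random.gauss(1.00, 0.04)       # a few percent hotter or colder than the TMY year
    infiltration = random.uniform(0.2, 0.6)  # air changes per hour, uncertain in practice
    plug_loads = random.gauss(8.0, 1.5)      # W/m2, varies with occupancy
    results.append(annual_energy_use(weather, infiltration, plug_loads))

deciles = statistics.quantiles(results, n=10)
print(f"median annual energy use: {statistics.median(results):,.0f} kWh")
print(f"10th-90th percentile range: {deciles[0]:,.0f} - {deciles[-1]:,.0f} kWh")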

A second drawback that should be addressed is the idealized average versus the real world. The modeling programs, as we use them now, are populated with manufacturer guidelines and optimal output/productivity values.

To change this ideal input, which we all know is not realistic for the life of the building, the program user has to go through a great deal of effort. There is a “best case” and a “worst case,” but in reality we usually end up somewhere in the middle.
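One modest step toward real-world inputs is to let equipment performance drift over the modeled life of the building instead of holding it at the nameplate optimum. The comparison below is illustrative only; the chiller COP, degradation rate, and cooling load are assumed numbers, not manufacturer data.

# All numbers below are assumptions for the sake of the example.
NAMEPLATE_COP = 6.0          # chiller efficiency at the manufacturer's rating point
ANNUAL_DEGRADATION = 0.01    # assume roughly 1% performance loss per year in service
COOLING_LOAD_KWH = 500_000   # assumed annual cooling load delivered
YEARS = 20                   # modeled life of the building

ideal_energy = YEARS * COOLING_LOAD_KWH / NAMEPLATE_COP

real_energy = 0.0
for year in range(YEARS):
    cop = NAMEPLATE_COP * (1 - ANNUAL_DEGRADATION) ** year
    real_energy += COOLING_LOAD_KWH / cop

print(f"20-year chiller energy at nameplate COP:  {ideal_energy:,.0f} kWh")
print(f"20-year chiller energy with degradation:  {real_energy:,.0f} kWh")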

Changing these ideal inputs goes against our inherent human confirmation bias, the urge to prove ourselves right, but it has to be done for the accuracy and integrity of our end product.

Conducting Multiple Simulations

Both of these drawbacks can be addressed, in part, by developments in computing speed and programming capability. A basic building model simulation requires a fraction of the time it did 10 years ago. Increasing complexity and running multiple simulations should not be seen as a burden, but rather as a way of elevating the long-term positive predictive value of our work.

Increasing the predictive accuracy of our simulations builds confidence among our clients and could improve the correlation between the LEED rating system and how a building actually performs over its lifetime.

Additionally, the client, owner and architect will all gain a greater and more acute understanding of how their product will perform. This can in turn lead to smarter, greener, and more profitable decisions for the life of the building.

Energy modeling for long-term occupancy and building viability should be an ultimate goal of environmental building and green design. As Mies van der Rohe also said, “We must be as familiar with the functions of our building as with the materials. We must learn what a building can be, what it should be, and also what it must not be…”

To truly know our buildings inside and out, to know what they should be and what they must not be, we cannot ignore the devil just because the details are mundane and lack gloss.

Grey Fowles is an Associate Green Building Consultant, CDT, LEED® AP in Paladino’s Seattle Office


2 Comments

  1. As an energy modeler I agree with some points and respectfully disagree with others. As computing capabilities improve it is often beneficial to take advantage of them. I agree that a sensitivity analysis on certain parameters can provide more (or less) confidence in the results. I’m not as optimistic about the additional time to do a sensitivity analysis being insignificant. eQUEST is relatively fast, but many other building energy modeling programs (like EnergyPlus and Trane TRACE) still take quite some time to run for each simulation. In addition to the computer run time, the labor time to make tweaks to an energy simulation should not be underestimated either.

    We also need to be careful about putting a lot of effort into reducing the uncertainty of one input when there are other inputs that might have much larger uncertainties, like occupancy and infiltration.

    The reason for doing an energy model also affects how detailed to make the energy model. If you are trying to predict utility consumption, then a lot of detail is needed. If you are trying to evaluate the savings from a potential upgrade, then not much detail is needed for aspects of the building that are unrelated to the upgrade (like maybe exterior lighting) and will simply subtract out in the difference.

    Energy models can certainly be used as a learning tool to understand how buildings dynamically respond to changes in specific parameters. As with Mies van der Rohe, we all have to deal with the tension of keeping things simple yet taking advantage of complexity when it’s beneficial.

    1. Keith, thank you for your feedback. I understand your concerns, and I know all about the labor and time it takes to tweak simulations in the current programs. I am thinking toward the future of energy modeling, and the possibility of, and need for, better programming.

      We need this improvement not to reduce the uncertainty in one input, but in a host of inputs. Our clients increasingly seek more accuracy with less future uncertainty when making multi-million-dollar decisions. We as an industry should look to the future and ask ourselves where we want to be, so that we have a say in how the technology develops.

