Company Spotlight: Rex and Vibrantcy

Rex and Vibrantcy

Why am I at Vibrantcy? After a full career in the consulting engineering business, why keep working? And, why work at Vibrantcy?

·       Because they asked.

·       Matt and Colin seem to find value in my experience and skills. Being in such an environment is a validation and a celebration of my career.

·       In the world of sports, lots of coaches … just keep on working. I can relate to that.

·       It feels good to still be engaged.

·       This is a special little firm. There is passion here. Passion for our clients. Passion for the work. Passion for the well-being of our staff.

Vibrantcy is unique. This firm has the potential to become very good. Matt Higgins has a highly-developed (and practical) talent for energy simulation work and energy-related commissioning challenges. Colin Evans has a passion for sustainable design and is an experienced and capable mechanical design engineer.

A word or two about “sustainable design”. I share with the staff at Vibrantcy a love for the physical environment and a desire to be a part of a sustainable future. I have been pro-environment my entire life. The first time I realized that I was a part of this minority was at a young age, probably about 10. I have been involved with pro-environment groups and activities throughout my life. It was not very popular 30 years ago to be a pro-environment engineer... there were not very many of us. Today, it is exciting to see the younger generation with a much greater commitment for a sustainable future. I am pleased about that and optimistic that good things will come from their leadership. At Vibrantcy, I see a sincere commitment to working for a sustainable future. I am impressed by the passion that lives here.

Rex history. Even though you didn’t ask, I’m going to tell you a bit about my personal history. Feel free to check out now if you are not interested. Perhaps one or two of you will continue reading.

My engineering career started in Cincinnati, at General Electric, designing jet engines. The obvious(?) next choice was, as you might guess, to work at Johns Hopkins University Applied Physics Lab in Washington D.C. There I worked on the Navy Polaris and Poseidon nuclear submarines, sometimes riding in the top of the sail with the captain. An interesting job, interacting with lots of intriguing people, many of them very impressive, indeed.

After that curious beginning, I decided to enter the crazy world of consulting engineering. And, more than 30 years later, after various roles ranging from design engineer to firm owner to university engineer to firm owner to design engineer, here I am at Vibrantcy.

What am I doing at Vibrantcy?  I am gratified to be here … happy that the owners of this young company seem to appreciate that this old mechanical engineer may have something of value to bring to the table. At this point in my career, it feels good. It is validating.  

I’m like Allstate Insurance… I’ve “been there…seen that” regarding quite a few things in my career.

I occupy a unique position here, which is the token old guy. Just kidding, but, apparently, something about me has struck a chord with Matt and Colin. I strive to share some of what I have learned over the years. I strive to add value to this firm as it grows and prospers.

To that end, a part of my role here is to provide a bit of QA/QC for projects. And a bit of design. A bit of assist in several things. I hope to help keep them out of trouble. I hope to help them expand their visions. I want to help them to succeed.

It is refreshing and a new challenge to work with younger professionals who seem to place value on my experience and my contribution.

More history:

I’ve been blessed to be a part of some special design projects. Four come to mind.

·       The new Riley Hospital for Children at the IU Medical Center, an amazingly capable and compassionate place for very ill children, located in my hometown of Indianapolis, IN. My wife’s aunt, as a very young girl, was the fourth child to be admitted to the original version of this wonderful hospital.

·       Master Facility Plan for Butler University Fieldhouse. This basketball cathedral, built in 1929, and considered to be one of the very finest places in the country to watch college basketball, has been a part of my life since I was 8 years old. I played basketball there in high school and was awestruck, like the boys from Hickory High in the movie “Hoosiers”.

·       The Birck Nanotechnology Center at Purdue University, a $56M research facility for multiple disciplines. As the University Engineer at Purdue, it was challenging and exciting to be working with so many very talented individuals on this very important state-of-the-art research facility.

·       The GSA Land Port of Entry, a new $60M LEED Gold facility in Columbus, NM. My role was to be part of the review team representing the GSA. Again, the challenge and the stimulation was to work with all the many layers of federal agencies and design consultants assembled to design and build this new border facility.

I have basked in the glow of some great clients, like the longtime University Architect for Indiana University … who, when asked by another university to comment on my capabilities, said the following: “he’s not any good, but he is better than everyone else”.

I’ve enjoyed some fun gigs in my career:

·       Riding on and working on Navy submarines was a highlight.

·       I feel most fortunate to have served for 6 years on the State of Indiana Commission for Fire Prevention and Building Safety, which is responsible for establishing and enforcing the building codes for the state. What a unique and fascinating experience. I learned so much, was a part of so many intriguing code issues, and got to work with so many different fascinating people. I loved the experience.

Wrapping it up: As you can surmise, I am making an effort to celebrate my career. I have been fortunate. But I’m not ready to pack it in. So, I’m also celebrating my present gig, which is the opportunity to work with this outstanding young firm named Vibrantcy.

Rex brief Bio: Please read on, at the risk of being swept over by a wave of boredom, and only if you feel you may want to know a bit more about my professional history.

Rex is a registered mechanical engineer with over 30 years of experience. His career has included successful roles in engineering and design, project management, staff management, client advocacy, and owner representation. His background includes extensive higher education and healthcare experience. He has excellent skills in planning, problem solving, and working on complex projects.

Rex has extensive technical expertise in large and complex HVAC systems, including building central air systems, chilled water and steam systems, automatic temperature controls, and mechanical codes. He has worked on dozens of projects that have incorporated sustainable features, including many LEED projects.

Rex's dedication to excellence is evident in his work. His projects are consistently responsive to the needs of the client, and they have an excellent record of cost management.

Selected Significant Projects:

VA Hospital OR Expansion ($10M, 5-phase renovation), Albuquerque, NM

GSA Land Port of Entry ($60M LEED Gold), Columbus, NM, Review Team for GSA

GSA Montoya Building (100KSF), Santa Fe, NM, Master Facility Plan

PHS Lincoln County Memorial Hospital Master Facility Plan and Expansion Study (120KSF)

UNM Replacement Hospital Phase I (250KSF, LEED Silver), Albuquerque, NM

Commissioning of Indiana University, Multi-Discipline Sciences Building (LEED Silver)

Commissioning of Wishard New Replacement Hospital ($600M, 1.2 MSF, LEED Silver)

Cameron Community Memorial Hospital ($35M replacement critical access hospital)

Indiana University Jacobs Music Studio Building ($40M new LEED Gold building)

Utilities Master Plan for Indiana University and IUPUI main campuses

Mechatecture – The art and science of mechanical systems that make a positive impact on the architectural experience of a building.

Here at Vibrantcy we call ourselves “a mechatecture group”.  This simple combination of the words Mechanical + Architecture does not mean that we provide both mechanical and architectural design services, but that we strive to create mechanical solutions that are complementary and integral to the architectural elements of a building.  We are a group of engineers and energy analysts who bring the same concept of character to building systems that architects bring to the style and appearance of a building.  People interact with buildings using all five senses, so the resulting experience is as much affected by temperature as it is by colors and textures.  We want to change the way that people interact with mechanical systems within a building, so that energy efficiency can be achieved while increasing the comfort, habitability, and maintainability of the building as a whole.  We can realize these goals by using the integrated design process, applying aesthetic design consideration to all typical building systems, and implementing mechanized building elements.

Part of this concept of mechatecture is an extension of the Integrated Design Process (IDP), which is a holistic approach to high performance building design where collaboration between disciplines begins very early in the design process.  To move beyond typical building systems and successfully implement non-traditional methods of heating and cooling, the entire team needs to be on board while the design is still flexible.  All too often a building is completely designed before handing it over to the engineers to place the equipment and systems wherever they might fit.  The IDP has been adopted by the latest versions of LEED certifications in order to better identify design improvements that can be incorporated without adding significant cost to the project.  It can be thought of as the low-hanging fruit to avoid costly systems that may be needed to work around some building element that can no longer be adjusted late in the design.  A full explanation and commentary on the integrated design process is a good topic for another blog post (maybe even by a guest blog writer, any takers?).

The construction industry dictates that using custom materials will inevitably increase the cost of a project.  The success of most design architects depends on utilizing typical materials in a unique way to produce their desired design effect.  We have a limited number of different Lego building blocks to choose from, with the goal of making the product look like it was not built completely out of Legos.  Lighting fixture designers and suppliers have come a long way in offering unique solutions, but we still see so many of the same fixture types because they are the most affordable and produced in large quantities.  Uniquely successful lighting is often achieved by designing the ceilings or walls around a basic fixture to actually hide the fixture from view and allow the light to shine from an unusual cove or shaped ceiling.  This is an adaptation of the building around a system, using conventional materials.  With HVAC systems I see less innovation in using ductwork or diffusers to enhance a design, but with thoughtful consideration we can learn to use the basic building blocks in unique ways and adapt the building around mechanical systems.  Can you spot the HVAC building block in the photo below on the left?

 

Photos By: Colin Evans   Location: Mayo Clinic in Scottsdale, AZ


The third concept behind mechatecture is the realization that not all architectural solutions are static, and that in order for buildings to evolve to the next level of efficiency there will be an increase in building components that are actually mechanized and automated.  Traditionally, there are not many parts of a building that people can see that actually move.  Most examples of this would be considered artistic sculptures, which aren’t necessarily responding to or affecting the environment.  As part of the design process the team might identify a structure or building component that needs to move in order to have the desired effect.  The most common example of this would be mechanized solar shading, which lets in sun and light when needed and deflects it another direction when unwanted.  This example is very much an architectural element that has energy efficiency advantages, requires a scientific approach to optimize its movements, and has moving parts to be designed, built, and maintained.  Whose scope does this fall under?  The most likely answer in this case is that the supplier of the product would take most of the responsibility.

But what if the design team came up with a solution that was less common, or had never been done before?  A solution that utilizes standard materials and simple mechanics to achieve a unique result could easily fall through the cracks without a team willing to take full responsibility for something that is outside the box.  The simple idea of natural ventilation relies on the coordinated movements of building elements in response to a multitude of sensors inside and outside of the building.  The components and controls are ordinary, but the sequences of operation are unique to the specific micro-climate of the building.

Here at Vibrantcy we are taking on buildings’ greatest challenges with passion.  We can’t expect to remain in the traditional role on a traditional team and still achieve high-performance results.  We are educating ourselves about cutting-edge techniques in order to share with colleagues, clients, and staff what we have learned about better living in our built environment.  Our buildings can become an extension of our natural environment only when we take the chance to learn from others, instead of depending on our own limited experience and staying within our comfort zone.

Electric and hybrid cars – welcome to the new norm?

Ever since watching the documentary “Who Killed the Electric Car?” 1, I’ve been curious as to where the electric car industry would be at this point if the plug on the GM EV1, and other Electric Vehicles (EVs), hadn’t been pulled in the mid-1990s. Even so, hybrids have been popping up more frequently, and EVs – such as Teslas – have had a positive impact within the market, with a clear demand for them.

The film points to some conspiracy theories - which I’m inclined to believe - regarding the influence that oil companies and their lobbyists have on our government’s decision making, such as making it difficult to build public charging stations. On the other side, automakers worried that the new type of vehicle would reduce revenue from things such as maintenance and tune-ups. In reality, the push for more efficient vehicles has been mandated by government regulations to reduce emissions, and any work toward that goal takes away from building cheaper cars to sell for higher profits, which is what car companies are about. Even with the attempt to halt the emergence of EVs two decades ago, from 2005-2014 an average of 352,306 hybrid vehicles were sold per year, according to the USDOT.2 And the future looks bright for both hybrids and EVs; in March of 2016, 325,000 people put down a $1,000 deposit for Tesla’s new Model 3, which won’t even reach buyers until late 2017. While this doesn’t tell us that everyone is on board to buy EVs, it points toward enthusiasm about the technology and the positive impact it may have.

Perhaps the biggest sign that hybrids and electric vehicles are here to stay is the fact that racing series have been adopting these technologies in the past few years. Formula 1 brought in the hybrid era in 2014. The switch from naturally aspirated 2.4L V8s to turbocharged 1.6L V6s mated to multiple Energy Recovery Systems has brought an unprecedented increase in efficiency. These energy systems harvest wasted energy from the brakes and the turbocharger and power the 120kW electric motors that provide an additional jolt throughout the lap. Earlier this year, Andy Cowell, Managing Director of Mercedes AMG High Performance Powertrains, mentioned that these small 1.6L V6s are producing over 900bhp and have an astounding thermal efficiency of 50%.3 By comparison, these new engines deliver more horsepower than the 3.0L V10s of a bygone era, and make the 30% thermal efficiency of regular gasoline engines seem a bit outdated. We’re talking about a relative increase in efficiency of roughly 66% in only three years of the new engine era, with more to come in the new era of engine regulations for 2017.
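The efficiency comparison above is simple arithmetic; a quick sketch, using only the 30% and 50% figures quoted in the paragraph:

```python
# Relative efficiency gain of the F1 hybrid power units: moving from
# ~30% thermal efficiency (a regular gasoline engine) to ~50% is a
# (0.50 - 0.30) / 0.30 relative improvement -- about two-thirds.
old_eff, new_eff = 0.30, 0.50
relative_gain = (new_eff - old_eff) / old_eff
print(f"{relative_gain:.1%}")  # 66.7%
```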

Another series making its way into the racing world is Formula E. While this series hasn’t really grabbed the attention of the masses, racing series are used by automakers to design and develop new technologies and increase their impact on road-going vehicles. These all-electric formula-style race cars are equipped with 200kW motors and a 28kWh lithium-ion battery that lasts half of the race. Power limits will be increased in the coming years. Currently the drivers are required to use two cars to complete the race distance, meaning they switch cars halfway through. One of the goals is to get rid of the need to change cars. What makes this interesting is that Ferrari CEO Sergio Marchionne recently mentioned that Ferrari is considering joining in the future, once the series has matured and when it makes sense that this all-electric technology will help to improve their road cars.4

While the hybrid and EV percentage in the automotive market is still very small, it is safe to say that the development of these technologies is here to stay, which will in turn make more consumers warm up to the idea. When you hear that Ferrari, the automaker known for making some of the world’s most famous and powerful naturally aspirated gasoline engines, is talking about having their full automotive lineup go hybrid by 20195, then that tells all petrol-heads that the shift is real. Not to mention the many hybrid supercars already on the market from the likes of Porsche, McLaren, and Audi.

 

1.     Who Killed the Electric Car?, Directed by Chris Paine, Sony Pictures Home Entertainment, 2006.

2.     “Table 1-19: Sales of Hybrid Vehicles in the United States”, United States Department of Transportation,

http://www.rita.dot.gov/bts/sites/rita.dot.gov.bts/files/publications/national_transportation_statistics/html/table_01_19.html

3.     “Fuel Thermal Efficiency”,

http://www.formula1-dictionary.net/thermal_efficiency.html#index_top_page

4.     “Ferrari outlines requirements for possible Formula E entry in future”, Autosport, http://www.autosport.com/news/report.php/id/127039

5.     “Marchionne Says All Ferraris Will Be Hybrid Starting in 2019”, Road & Track, http://www.roadandtrack.com/new-cars/future-cars/news/a31483/marchionne-ferrari-hybrid-2019/

Post-Occupancy Services are worth the Investment:

Vibrantcy’s post-construction services comprise a significant portion of the company’s workload via one-year post-occupancy retro-commissioning (RCx) and energy measurement and verification (M&V). The diagnostics required to perform these services vary based upon building type and systems complexity; however, in all cases building automation system (BAS) trends are set up and heavily scrutinized.

 

When performing RCx services, BAS data is evaluated against weather data, design intent, energy consumption, base energy loads, and peak energy demands. Correlations among these datasets are drawn based upon the usage patterns of facilities, giving our engineers the ability to recommend small operational and set-point changes that keep a building within design parameters. In many cases these post-occupancy services lead to improved thermal comfort and reduced HVAC run-time.

 

More specifically, in recent projects for Central NM Community College, RCx services were performed for 450,000 square feet of facilities, resulting in over 80 energy conservation measures. In one post-implementation instance, a central chiller was observed to run 20% less than in pre-RCx assessments, extending the useful life of the chiller plant and associated equipment.

 

Similarly, Vibrantcy’s M&V services have helped facility owners understand the value of capital investments one year post-occupancy. Using data very much like the RCx datasets, M&V requires detailed energy model calibration, using design-phase models to understand how a facility is truly operating. In most instances Vibrantcy’s design-phase energy models are within a 15-20% margin of accuracy; through calibration these models become much more accurate (within 5-10%). Using as-built drawings, submittals, ASIs, and other construction-phase information, calibrated models allow building owners to determine the replicability of design features for future projects.
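As a minimal illustration of those accuracy margins, the error of a model against metered consumption is just a percent difference (the kWh figures below are hypothetical, not project data):

```python
# Percent error of an energy model against metered consumption --
# the "margin of accuracy" discussed above. All values are hypothetical.
def model_error_pct(modeled_kwh: float, metered_kwh: float) -> float:
    """Absolute percent error of the model relative to the meter."""
    return abs(modeled_kwh - metered_kwh) / metered_kwh * 100

# A design-phase model vs. the same model after calibration:
print(round(model_error_pct(1_180_000, 1_000_000), 1))  # 18.0
print(round(model_error_pct(1_070_000, 1_000_000), 1))  # 7.0
```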

 

In one recent instance, measured and quantified documentation of before-and-after performance indicated an annual savings of $20,086 from the building systems chosen for a Colorado high school. Had the school implemented the alternative systems evaluated during design instead, calibrated energy models indicate it could have saved an additional $13,087.

 

Measurable energy savings are best determined when sub-meters are implemented as part of construction and are accessible subsequent to occupancy. Diligent specifications are necessary to develop a useful monitoring and data-archival system, which can produce extremely useful performance data for inclusion in both RCx and M&V analysis. In some cases sub-hourly sub-meter data can be used to populate calibrated energy models, which is helpful when performing guaranteed-savings calculations through regression analysis.
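A minimal sketch of the regression idea mentioned above: fit energy use against outdoor temperature from pre-retrofit data, then use the fit to predict what the building would have used, so measured use can be compared against that baseline. The data points here are fabricated for illustration only:

```python
# Ordinary least squares fit of daily kWh vs. outdoor temperature,
# the kind of baseline regression used in guaranteed-savings M&V.
def fit_line(xs, ys):
    """Return (intercept a, slope b) for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical pre-retrofit data: daily kWh vs. outdoor temp (degF).
temps = [60, 70, 80, 90]
kwh = [500, 700, 900, 1100]
a, b = fit_line(temps, kwh)
# Predicted baseline use on an 85 degF day: a + b * 85
print(a + b * 85)  # 1000.0
```

Post-retrofit metered use below that predicted baseline is the measured savings.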

 

In either case, RCx and M&V services help building owners and managers gain a deeper understanding of actual systems performance, often spurring small operational recommendations that save significant amounts of energy. We’ve recently completed our 43rd RCx analysis for PNM customers, and would love to share our findings with you!

 

Matthew Higgins, Founder & Chief Analyst

CEM, HBDP, LEED-AP (BD+C), MBA

Can we cool our spaces without heating our planet?

Setting foot outside in Albuquerque any given day this summer has served as a constant reminder that each of the last 14 months has broken records as the hottest of its kind in history. Indeed, there is little doubt that 2016 will go down as the hottest year on record. However, perhaps the one positive piece of atmospheric news this year was the announcement that the hole in the ozone layer over the Antarctic is finally beginning to heal. The Montreal Protocol worked.

Originally signed in 1987, the Montreal Protocol was the first global legislation to preserve the quality of Earth’s atmosphere. The agreement was signed by 197 countries in response to the discovery of the gaping hole in the ozone layer caused primarily by chlorofluorocarbon (CFC) emissions. At the time, CFCs were the most widely used refrigerants, with R-12 (Freon) being the best known of these. The ozone layer is a critical part of the atmosphere because it absorbs a great deal of incoming UV-B radiation. Without this protection, skin-cancer rates would soar, marine life would suffer, and crop yields would decrease. The protocol created a framework to phase out ozone-depleting chemicals, beginning with a ban on CFCs.

Although the biggest culprit had been dealt with, CFCs are not the only halogen-containing refrigerants that contribute heavily to global warming. Hydrochlorofluorocarbons (HCFCs) were the cheapest alternative to CFCs, and had a lower ozone-depleting potential. The most common HCFC refrigerant is R-22, and although no new systems using R-22 have been manufactured in the US since 2010, it is still important enough for Wiley to include complete thermodynamic tables for the refrigerant in my Fundamentals of Engineering Thermodynamics textbook. R-22 will no longer be manufactured after 2020.

The hole in the ozone is healing and the most ozone-depleting chemicals have been banned and phased out, so why are the current refrigerants still under attack from environmentalists? The answer lies in the global warming potential (GWP) of these refrigerants. A GWP is assigned to a gas with carbon dioxide as the reference, and many commonly used refrigerants have GWPs in the triple and quadruple digits (i.e., if emitted, these gases contribute to global warming hundreds to thousands of times more intensely than carbon dioxide). For reference, R-22 has a GWP of 1810, R-134a has a GWP of 1430, and the widely used R-410A has a GWP of 2088. The latter two refrigerants are hydrofluorocarbons (HFCs), which are used in most commercial and residential air conditioning applications today.
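The GWP arithmetic is direct: CO2-equivalent emissions are simply leaked mass times GWP. A small sketch, using the GWP values cited above (the 2 kg leak is a hypothetical figure, not a typical system value):

```python
# CO2-equivalent emissions from a refrigerant leak: mass (kg) x GWP.
# GWP values as cited in the text; the leak size is hypothetical.
GWP = {"R-22": 1810, "R-134a": 1430, "R-410A": 2088}

def co2_equivalent_kg(refrigerant: str, leaked_kg: float) -> float:
    """Return the CO2-equivalent mass (kg) of a refrigerant leak."""
    return GWP[refrigerant] * leaked_kg

# A 2 kg R-410A leak warms like roughly 4.2 metric tons of CO2.
print(co2_equivalent_kg("R-410A", 2.0))  # 4176.0
```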

Industry leaders, such as Emerson in a whitepaper published in 2014, claim that “one good option for air conditioning applications is to stay with HFC options such as R-410A until an economically viable alternative becomes available.”1 The justification for this is a metric called the Total Equivalent Warming Impact (TEWI), which takes into account not only direct emission of refrigerant, but also the effects of system efficiency and the source of electricity. Furthermore, the total contribution of HFCs to global emissions is less than 3%.1 However, the bottom line is that it is more expensive and often more dangerous to create air conditioning systems using alternative refrigerants with efficiencies similar to those achieved with HFC refrigerants. That argument appears to be losing effectiveness, as the US is already proposing an amendment to phase down HFCs to 15% of baseline by the early 2030s. World climate leaders recently met in Vienna to discuss this next phase, which could set the stage for a 2016 amendment to the Montreal Protocol.2
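TEWI can be sketched in simplified form as direct refrigerant emissions plus indirect emissions from the electricity that runs the system. This is a rough illustration of the concept only, not Emerson's calculation, and every input figure below is hypothetical:

```python
# Simplified TEWI (Total Equivalent Warming Impact), in kg CO2-equivalent:
# direct leakage emissions plus indirect emissions from electricity use.
# All figures below are hypothetical illustrations, not measured data.
def tewi(gwp: float, annual_leak_kg: float, annual_kwh: float,
         grid_kg_co2_per_kwh: float, years: int) -> float:
    direct = gwp * annual_leak_kg * years                # refrigerant released
    indirect = annual_kwh * grid_kg_co2_per_kwh * years  # powering the system
    return direct + indirect

# For a modest R-410A system, the indirect (efficiency-driven) term
# typically dominates -- the crux of the "stay with HFCs" argument.
print(tewi(gwp=2088, annual_leak_kg=0.5, annual_kwh=5000,
           grid_kg_co2_per_kwh=0.5, years=15))  # 53160.0
```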

So where does the air conditioning industry proceed from here? Halogen-free refrigerants do exist. In fact, carbon dioxide (a.k.a. R-744) itself is one of the oldest refrigerants. A major hurdle in developing systems with CO2 as the refrigerant is the 30-50% efficiency hit incurred by using CO2 over HFCs in a simple thermodynamic cycle. However, CO2 does have favorable heat transfer coefficients, and one effective way that systems can be designed with CO2 as the working fluid is to employ transcritical operation. CO2 does not condense at higher pressures like HFC refrigerants do, so a gas cooler replaces the condenser in transcritical cycles. Initially at least, this thermodynamic cycle modification will require additional spending on R&D and a higher first cost of systems. Ammonia can also be used as a refrigerant and is attractive because it has no direct GWP. Unfortunately, ammonia’s toxicity makes it unsuitable for residential air conditioning applications. Finally, traditional hydrocarbons can be used as refrigerants, and at this point in history they are still cheap and abundant. However, their extreme flammability makes them a tough sell to commercial operators.

Since 1987, the air conditioning industry has successfully continued to deal with regulations on their most common refrigerants while still improving efficiency. I commend the industry for innovating in system design rather than pushing back against regulations. It could be considered ironic, but it appears that the gas, which we have been pumping into the atmosphere since the industrial revolution thereby warming the planet, could soon be the standard working fluid coursing through thermodynamic cycles to provide cool air through our diffusers.

About the Author

Jeffrey Sward recently graduated from the University of New Mexico with a Bachelor of Science in Mechanical Engineering, and will continue his studies at Cornell University in the coming fall. As a New Mexico native, Jeffrey naturally enjoys backpacking, fishing, snowboarding, skiing, running, and most other things outdoors, and he uses this as justification for his interest in climate change and energy efficiency.

References

1 Refrigerants for Residential and Commercial Air Conditioning Applications. Tech. Emerson Climate Technologies, 2014. Electronic. http://www.emersonclimate.com/Documents/Resources/2007ECT-136.pdf

2 Ragendran, Rajan. Refrigerant and Energy Regulation Updates. Rep. Emerson Climate Technologies, 2015. Electronic. http://www.emersonclimate.com/en-us/About_Us/industry_stewardship/E360/Documents/Anaheim-Presentations/e360-anaheim-refrigerant-and-energy-regulations-update.pdf

3 Moniz, Ernest, and Gina McCarthy. "A "Cool" Way to Combat Climate Change under the Montreal Protocol." Energy.gov. US Department of Energy, 20 July 2016. Web. 21 July 2016.

Data Center Efficiency Case Studies

There is a common misunderstanding that increasing data center operating temperatures will degrade IT equipment reliability and service availability. In fact, airflow management optimization, rearrangement of air vent tiles, and wider operational temperature ranges can all enable an overall reduction in energy consumption. In addition, operational cost savings of 4-5% can be achieved for every 1° F. increase in server inlet temperature.
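Treating the 4-5% rule of thumb as compounding per degree, the cumulative effect of a setpoint increase can be sketched quickly (the baseline cost is a hypothetical figure):

```python
# Compounded cooling-cost savings from raising server inlet temperature,
# using the 4-5% per 1 degF rule of thumb cited above (4% assumed here).
# The $100,000 baseline annual cooling cost is hypothetical.
def cooling_cost(baseline_cost: float, degrees_raised: int,
                 savings_per_degree: float = 0.04) -> float:
    """Annual cooling cost after raising the inlet setpoint."""
    return baseline_cost * (1 - savings_per_degree) ** degrees_raised

# Raising the setpoint 6 degF (e.g. 72 -> 78) at 4% per degree:
print(round(cooling_cost(100_000, 6), 2))  # 78275.78
```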

Currently, the optimal operational temperature for IT equipment in most data centers consists of a thermal range of 68-72° F. Guidelines published by ASHRAE in 2008 define agreed-upon Recommended and Allowable thermal and humidity ranges for IT equipment. The Recommended range is 65-80° F. with up to 60% relative humidity (RH). The Allowable range is 50-95° F. with 20-80% RH. The difference between the two is the acceptance of potential reliability risks within the Allowable range, such as corrosion and electrostatic discharge (ESD). These reliability risks can occur as a result of low or high humidity levels; however, effective operational protocols that combine cooling operation both above and below 68° F. could ensure no change in the overall likelihood of equipment failure.
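A small sketch of how a monitoring script might classify a server-inlet reading against the two ranges cited above (the threshold logic is a simplified reading of those ranges, not an ASHRAE tool):

```python
# Classify a server-inlet reading against the Recommended and Allowable
# envelopes quoted above. Simplified illustration of the two ranges.
RECOMMENDED = (65, 80)   # degF, up to 60% RH
ALLOWABLE = (50, 95)     # degF, 20-80% RH

def classify(temp_f: float, rh_pct: float) -> str:
    if RECOMMENDED[0] <= temp_f <= RECOMMENDED[1] and rh_pct <= 60:
        return "recommended"
    if ALLOWABLE[0] <= temp_f <= ALLOWABLE[1] and 20 <= rh_pct <= 80:
        return "allowable"
    return "out of range"

print(classify(72, 45))  # recommended
print(classify(88, 70))  # allowable
print(classify(98, 70))  # out of range
```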

There are no known increases or decreases in temperature based on server utilization, either during or outside of school hours. This means that there could be excess cooling loads that are not necessary, and potential areas of concern, such as hot spots, that have not been resolved. By examining data center improvement case studies conducted by Google, Intel, and Cisco, a similar solution can be implemented in order to achieve optimal efficiency and energy savings.

Google has a smaller data room, classified as a Point of Presence (POP), which improved its power usage effectiveness (PUE) from 2.4 to 1.5 after a series of small-scale improvements. The PUE is a metric used to identify the efficiency of energy delivery from a building to the IT equipment located indoors. With an ideal PUE of 1.0, there is no facility overhead energy (energy used by cooling, lighting, and power distribution), so every watt of power goes directly to the IT equipment. The POP configuration consisted of double-conversion uninterruptible power supplies (UPSs) with an IT load of 250 kW, four 111 kW computer room air conditioners (CRACs) with a direct expansion (DX) cooling coil, and racks that contained third-party equipment such as network routers, power supplies, load balancers, and optical switches. The initial analysis showed that the data room was overcooled, overpowered, and overused.
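PUE is just the ratio of total facility power to IT power, so the case-study figures can be sanity-checked directly (the total facility kW values are implied by the PUE and the 250 kW IT load, not stated in the case study):

```python
# PUE = total facility power / IT equipment power; 1.0 is the ideal.
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    return total_facility_kw / it_load_kw

# With the POP's 250 kW IT load, a PUE of 2.4 implies 600 kW entering
# the building; at a PUE of 1.5, only 375 kW (implied figures).
print(pue(600, 250))  # 2.4
print(pue(375, 250))  # 1.5
```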

The baseline airflow model of Google’s POP indicated an inefficient distribution of cold air over machines before reaching the hot aisle.


After installing temperature monitors, thermal models using computational fluid dynamics (CFD) were created to simulate airflow. Next, air vent tiles were rearranged so that airflow on the side of the room with high-power racks matched the IT power there. The temperature and relative humidity settings were then raised from 71° F at 40% RH to 81° F at 20-80% RH. Other improvements included refrigerator curtains to block off the ends of cold aisles, blanking plates and side panels to block cold air from passing through the racks, 48" sheet-metal box extensions on all CRAC air returns, and a new CRAC controller to enable dynamic control of all CRACs. The $25,000 capital investment for these improvements led to yearly energy savings of over 670 MWh and $67,000; the financial payback of each individual improvement was under a year.
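The sub-year payback follows directly from the figures above, as a quick check shows:

```python
def simple_payback_years(capital_cost: float, annual_savings: float) -> float:
    """Simple payback: years of savings needed to repay the investment."""
    return capital_cost / annual_savings

# Aggregate figures from the Google POP case study above:
# $25,000 invested, $67,000 saved per year.
years = simple_payback_years(25_000, 67_000)
print(round(years * 12, 1))  # -> 4.5 (months)
```

Even taken as a bundle, the improvements pay for themselves in roughly four and a half months, consistent with each individual measure paying back in under a year.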

After implementing changes to the Google POP, the final airflow model indicated an efficient cooling of machines and the network room.

Intel also tested and evaluated servers in a 3,000-square-foot data center to see how high-temperature ambient (HTA) operating conditions affect power consumption. The primary goal was to find the highest potential cooling power savings and to identify the effects of continuously operating IT equipment at high temperatures. The baseline server inlet air temperature of 61-64° F was raised to set points of 75° F, 82° F, 91° F, and 100° F at server loads of 25, 50, 75, and 100% utilization. The results showed the following: 1) power consumption began to increase on all server models at temperatures above 82° F; 2) the optimal set point was found at or below 82° F; 3) server functionality was not affected by HTA conditions of up to 100° F; and 4) potential yearly power savings were at least 9% of total power. Intel also identified the components of an overall data center power efficiency strategy: HTA design practices, air containment, airside economizers for cooling efficiency, software tools that provide analytics and dynamic management to optimize the server environment, and power-efficient servers.

Four labs at Cisco's San Jose building achieved energy savings of 13-21% by raising set points from 68-72° F to 76-80° F. The chilled water set point was also increased from 44 to 48° F. Note that Cisco equipment has been tested to withstand 104° F, while non-Cisco equipment can withstand 95° F. The building's average power consumption was 2,700 kW total, with 800 kW for cooling and 1,800 kW for IT. Similar to Google's POP, Cisco installed power meters and temperature sensors, ran simulation models, and implemented solutions to optimize airflow and cooling efficiency. Blanking panels were installed to reduce short-circuiting air, and floor grilles were placed where needed in the labs to achieve uniform aisle temperatures. Eliminating cold and hot areas also made it possible to raise the temperature set point without causing overheating issues. As a result, overall energy savings were at least 910,000 kWh.

It is evident that increasing the temperature set points in a data center can enable greater energy savings without compromising the functionality, reliability, or service availability of IT equipment. Given a comprehensive analysis of the current IT/data rooms (temperature monitoring and simulation models), inefficiencies such as poor airflow and hot spots can be identified, and the increased temperature set points can be dynamically controlled.

References:

  1. Data Center Efficiency and IT Equipment Reliability at Wider Operating Temperature and Humidity Ranges: http://www.thegreengrid.org/~/media/WhitePapers/WP50-Data%20Center%20Efficiency%20and%20IT%20Equipment%20Reliability%20at%20Wider%20Operating%20Temperature%20and%20Humidity%20Ranges.pdf?lang=en
  2. Google Server Room Case Study: https://www.energystar.gov/sites/default/files/asset/document/Google_Server_Room_Case_Study_0.pdf
  3. High-Temperature Hosting Data Center: https://www-ssl.intel.com/content/dam/www/public/us/en/documents/white-papers/high-temp-hosting-data-center-paper.pdf
  4. Cisco Lab Setpoint Increase: http://svlg.org/wp-content/uploads/2012/12/Cisco_cs.pdf

Achieving Projected Automation: Implementing Misfit Energy Model Recommendations

 

Energy performance modeling is a highly sought-after service in the facility design and construction industry, and is arguably becoming even more so in the existing-building renovation and upgrade industry. Whether in an energy savings performance contract (ESPC) or a large facility controls upgrade, savvy industry participants are turning to sophisticated energy modeling to create a business case for bankable energy savings. This article explores the interdependencies among modelers, integrators, and operators in existing buildings, where they can be misconstrued and, in some instances, become a case of the blind leading the blind.

While energy modelers are not required to perform detailed walkthroughs of existing buildings, it is difficult to truly understand systems and control sequences without the in-field guidance of a building’s controls integrator. Modeling teams are scouring existing buildings throughout the world and modeling savings from a known quantity of energy conservation measures (ECMs), often without really knowing what the existing controllers are doing. If this is the case on one of your projects, it is very likely that the modelers know even less about what the existing controllers are incapable of. It is also true that energy modelers are not completely aware of the capabilities of their project’s controls integration contractor, and vice versa. This is often the beginning of a poorly executed ESPC or controls upgrade, where bankable savings are a key component of financing or payback. As a result, modelers are forced to assign a contingency factor to their analysis, to which energy service companies (ESCOs) then add their own layer of contingency. So what is really bankable in that scenario?

Energy modelers also rely on simulated control strategies during energy model calibration, matching models to actual weather and billing data, work that is often performed in a vacuum-like digital environment. This is especially true when controls and integrator limitations are unknown, and more so when expectations of the energy model are vague. If these two attributes are not understood and ECMs are simulated in a vacuum, the proposed measures become unrealistic, compounding their eventual lack of bankability. Limitations and unstated expectations are not new to engineering or building controls, but they skew the fundamentals associated with ECMs (several of which are discussed herein).

As an energy modeler and analyst, I often come across new communication techniques to facilitate team-building, and modeling tools that account for the imperfect nature of facility automation. Whether we create our own tools or use a great new development from our peers, one thing is clear: sophisticated modeling and modeled savings rely heavily on equally sophisticated building automation system (BAS) controls. Taking that interdependency one step further, these two digital sectors have something even more critical in common, aside from experienced software operators: their reliance on motivated and trained facility managers (FMs). While FMs are not the focus of this article, their role is much like the base of a team pyramid; they can either create a reliable foundation for continued performance or exacerbate systems degradation.

Because modelers and integrators rely so heavily on FMs, it is important to also integrate the expectations and limitations of a project’s FM. Project delivery and bankability become much more reliable once the entire team is able to admit what they, and the controls, cannot do. FMs do not necessarily need to know what a points list is, but it is important for them to know which systems can talk to one another and which can only be read. It seems commonsensical, but an informed FM will not only be willing to step out of their comfort zone when they are part of the conversation; they will also be less likely to raise issues about an immovable weakness in a BAS.

Realistically Forecasting Controls Upgrades

When creating a business case for a controls upgrade, financial performance metrics like payback and cost/benefit ratio are highly dependent upon simulated BAS conditions, which in turn rely upon communication with FMs and controls integrators. Because of these interrelations, it is the energy modeler’s responsibility to seek out conversation with other team members. If modelers stay in their silos there is no communication loop, which leads to improper assumptions in the simulated environment and unreasonable assumptions when programming ECMs into a BAS.

Forecasting future building performance requires the input of all facility stakeholders before, during, and after a controls project’s implementation. The most common breakdowns occur between performance modeling and controller programming during implementation. As financial performance metrics for ECMs are provided to ESCOs and FMs, these forecasts often come with built-in contingencies and disclaimers specifically regarding building operation. Such contingencies are becoming good practice in the modeling industry, as more becomes known about the role of other team members when bankable savings are on the line.

Financial performance metrics like simple and discounted payback, return on investment, life cycle cost, opportunity cost, cost/benefit ratio, and annual energy cost savings rely heavily on good energy-modeling assumptions. Unfortunately, many of these assumptions are products of research projects and databases, especially as related to maintenance cost impacts, when more accurate figures could be obtained from FMs and experienced contractors. Many facility managers, controls contractors, and building owners tend to forget that their roles in these assumptions are equally (or more) important than the analysis predictions. If detrimental assumptions make their way into a BAS, annual energy savings benchmarks could easily be missed by 200%.

Maintenance costs, overhaul costs, and useful lives of equipment are key components of life cycle performance predictions, often driving a particular ECM ahead of the pack among a series of ECMs identified for a given facility. Regardless of the assumptions an energy modeler makes during an analysis, a breakdown has occurred somewhere if Chiller 1 is predicted on paper to be the better option over Chiller 2, yet requires twice as much maintenance as predicted for Chiller 2. While controls integrators will receive the brunt of an FM’s displeasure over an underperforming system, the cause can sometimes be traced back to performance models built without knowledge of BAS communication inefficiencies.

If you do not operate in the ESPC market or on bankable energy cost savings projects, you may not yet see how deep energy savings often rely on superior performance in energy models. In most cases there are two primary factors that lead to underperformance relative to predicted consumption. First, facility managers are largely responsible for keeping to building management schedules, set points, setback temperatures, building warm-up times, and other systems performance factors in the BAS. If an energy performance model is unaware that existing controls prevent the dedicated outside air system (DOAS) from reacting to signals from the heat-pump condenser water loop, overheating or overcooling will occur. Similarly, if boiler pumps cannot respond to signals from differential pressure sensors, excessive reheat may occur in VAV systems. In both cases the model assumes everything is responsive and communicating, and an ECM to dynamically reset boiler output for varying VAV reheat loads is unknowingly unbankable during financial performance tests. The unfortunate breakdown in these scenarios lies between the modeler assuming open-protocol communication among controllers and the contractor assuming the modeler somehow knows all of the details supporting the implementation of ECMs affected by the BAS.

The second factor related to underperformance is scope creep. Because modelers in an ESPC or a bankable savings project often rely on contractors for access to trend logs and points lists, some modeling professionals may be restricted to an energy modeling scope alone. More often than not, trend analysis is crucial to making informed decisions about simulated building performance, including sub-meter data where available. These datasets are important when deciding whether to include safety factors in models that keep equipment running just a bit longer than a basis of design or shop drawings would suggest. Regardless of the industry sector or modeling purpose, parasitic nighttime and weekend loads obtained from trend analysis can be the difference between a useful energy model and a rule of thumb. Without the ability to run meaningful trends in an existing building, and the scope to evaluate them, an energy model can be more misleading than useful.

In many cases modeling is more than changing control temperatures and features; the whole picture needs to be evaluated. This is especially true as trend analysis begins to relate to inter-zone dynamics (airflow, reheat, etc.) when modeling with powerful industry-leading software.

Pre-Communicating Findings

In existing buildings, from audit to analysis to installation, the following best practices and considerations are precursors to communicating findings to an FM or another owner’s representative.

  • Ensure that the actual controls engineer is in the ECM discussion to verify plausibility and contingency factors, introducing actual effectiveness into the conversation (i.e. actual air distribution and limitations to comfort and control).
  • Energy models assume that the BAS can flawlessly control building systems; they need to be told that older existing buildings do not recover as quickly as new construction, that more infiltration is likely, and that ECMs often change a building’s stability through effects on its mass balance.
  • Bringing to light the fact that some analysis tools are flawed, especially with VFDs, is important when diagnosing complicated systems in hourly output reports (in lieu of default monthly or annual reports). Teams should be asking questions like: “Can a pump or fan realistically turn down that low? Is building pressure going to become an issue with a new sequence? Is a terminal unit’s control damper really able to be that dynamic?”
  • In a digital-pneumatic hybrid facility or a partial upgrade project, before the qualitative implications of ECMs are discussed, specific and fundamental questions should be asked, such as: “How can our controller go from analog to digital, on a legacy version of a marginally open protocol? Are there enough points in the controller module to accomplish this new sequence?”
  • Performance curves for major pieces of existing equipment should be modified to reflect existing power draws, efficiencies, and outputs; simulation programs should not be allowed to use default performance – which assumes that equipment is functioning like new. Systems degradation and fouling/scaling factors are important in areas with poor water quality or poor preventative maintenance practices.
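The turndown question raised above can be sketched numerically. Per the fan/pump affinity laws, power falls roughly with the cube of speed, but a real drive and motor have a minimum practical turndown; clamping to it caps the modeled savings. The 30% floor below is an assumed illustrative value, not a universal limit:

```python
def vfd_power_fraction(speed_fraction: float,
                       min_turndown: float = 0.3) -> float:
    """Fraction of full-load fan/pump power at a commanded speed,
    per the cube-law affinity relation, with a minimum-turndown floor."""
    # The drive cannot actually run below its practical minimum speed.
    speed = max(speed_fraction, min_turndown)
    return speed ** 3

print(vfd_power_fraction(0.5))            # -> 0.125 (12.5% of full power)
print(round(vfd_power_fraction(0.1), 3))  # clamped to 0.3 speed -> 0.027
```

A model that lets the fan "turn down" to 10% speed would claim near-zero power there; the clamped version shows the savings actually bottom out at the drive's minimum, which is exactly the kind of discrepancy hourly output reports can expose.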

Prior to project implementation, a successful team should take the time to review the assumptions and limitations of the proposed controls system, as well as the means used to calculate energy savings. If an energy modeler is using a dynamic load reset schedule to achieve cooling-plant savings, each of the inputs should be clearly stated and either accepted or adjusted by the controls integrator. Many times a single measure in an energy model involves more than a dozen field implications, some of which may not be possible. In this regard, it is helpful to create a how-to-guide framework for communicating project hand-off.

A successful handoff document should include the following five parts: (1) a brief explanation of the proposed measure, used to open conversation with the building owner and obtain authorization to proceed; (2) a list of temperatures, schedule implications, and power-demand savings; (3) a list of all equipment affected by the measure; (4) known limitations or approximations of the energy modeling software or calculation; and (5) a written review by the controls integration engineer of how to implement the measure. While it is ideal for the individual responsible for implementing the savings measure to provide the review in step five, many ingrained workflows do not allow for this type of interaction. Automation contractors often rely on project managers to interface with a project team and act as liaison to the building owner and installation team.
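The five parts above lend themselves to a simple structured record. This is a minimal sketch; the field names and the chilled-water ECM used as the example are hypothetical, not an industry standard format.

```python
# One handoff record per ECM, mirroring the five parts described above.
handoff = {
    # (1) brief explanation, for the owner-authorization conversation
    "measure_summary": "Reset chilled-water supply temperature based on load",
    # (2) temperatures, schedule implications, power-demand savings
    "setpoints_and_schedules": {"chw_supply_f": "44-50 reset",
                                "schedule": "occupied hours only"},
    # (3) all equipment affected by the measure
    "affected_equipment": ["Chiller 1", "CHW pumps P-1/P-2", "AHU-1 coil"],
    # (4) known limitations of the modeling software or calculation
    "model_limitations": "Hourly simulation; assumes ideal valve response",
    # (5) written implementation review by the controls integration engineer
    "integrator_review": "Reset achievable; 46 F floor for dehumidification",
}

for part, content in handoff.items():
    print(f"{part}: {content}")
```

Keeping one such record per measure gives every team member, including the field integrator, the same basis of design to program against.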

When communication is perfect, traditional workflows are acceptable; but perfect communication is rarely the case, and controls project managers often juggle more than one project at a time. For this reason, a written five- or six-step guide is a tangible piece of documentation to refer to and, more importantly, a basis of design. A recommended sixth step would be a post-implementation trend analysis by the energy modeler to confirm successful implementation. When written plans are not presented and agreed upon, the field integrator responsible for programming new or existing controls may be forced to assume a set point or schedule in order to finish his or her work on time.

While it does not necessarily make sense for modelers to follow controls integrators into the field to observe the installation of new controls, a project close-out procedure should take place. When scope creep or tight budgets are a concern, it may be difficult for the energy modeler to generate post-installation trend logs, and communication breakdowns occur at the end of an ESPC or bankable controls upgrade. When a modeler is able to observe post-installation trends, systems adjustments are typically found to be necessary, which is especially helpful to know before the controls contractor receives a punch list or final payment. A post-install facility walkthrough will also help confirm installed conditions against the basis of design, closing the loop on bankability.

Matthew Higgins, CEM, ASHRAE-HBDP, LEED-AP (BD&C), MBA

Founder & Chief Analyst

 

Vibrantcy: Collaborative Engineering for Mechatecture

Mr. Higgins founded Vibrantcy after working in both a specialty sustainability consulting firm and a large commercial and government M/E/P engineering firm. He has worked on over 300 new and existing building energy modeling projects, over 100 of which had an associated LEED certification goal. His expertise also includes extensive energy measurement and verification studies, Energy Star building certifications, life cycle cost analysis, and the creation of specialized analysis tools, along with a breadth of public speaking experience throughout the Southwest.