Innovation in Evaluation: Cracking the Code

Gita Subramony for Zondits, March 19, 2014

The American Recovery and Reinvestment Act (ARRA), also known as the stimulus bill, allocated substantial funding for energy efficiency and clean energy projects. According to the Alliance to Save Energy, $3.08 billion was appropriated for the State Energy Program and used to fund traditional energy efficiency initiatives, such as boiler replacements and weatherization. One aspect of the program differed from typical energy efficiency initiatives: it targeted improved energy code adoption, compliance, and enforcement. These initiatives funded training and provided material support to a range of energy code stakeholders, and they also came with a requirement to adopt the 2009 IECC model building code, which in many states represented a significant advancement.

Traditionally, efforts to adopt the latest energy codes have lacked support and funding, and compliance with existing codes has remained low (less than 50% in many jurisdictions). In addition, code enforcement procedures are inconsistent and at the mercy of the limited resources available at the jurisdictional level. However, a task force led by the Institute for Market Transformation (IMT) has demonstrated that funding for code update and compliance efforts yields an excellent return on investment; it estimates that each dollar spent on code support activities results in $6 in energy savings. Additionally, a recent report by Pacific Northwest National Laboratory (PNNL) shows impressive returns from dollars spent on code update initiatives. The report examines federal funding for code advocacy and support from 1992 to 2012 and finds that each dollar spent resulted in $400 in energy cost savings.

The ARRA-funded program assisted State Energy Offices across the USA in advocating for the adoption of recent IECC or ASHRAE codes at the state level, educating builders and architects on the code requirements, and training building code officials in enforcement best practices. By accelerating the adoption of more stringent energy codes and by providing training for builders, architects, engineers, and code officials, the ARRA-funded initiatives have led to significant energy savings.

Although it is clear that these types of programs do result in energy savings, quantifying these savings and attributing them to the initiatives is complicated. For the ARRA-funded State Energy Program, Oak Ridge National Laboratory was tasked with overseeing the evaluation process, with ERS as a subcontractor leading the energy code impact evaluation. The evaluation team’s primary goal was to quantify the energy-saving impacts of these code-update and -support initiatives.

This type of impact evaluation has a number of unique challenges. Since the program affected different states and regions across the USA, evaluators had to grapple with a diversity of climates, building stocks, codes, enforcement regimes, and politics. The culture of code update processes varied across the country. Some states did not even have a building code in place at the start of the program, let alone an energy code, while others diligently updated their codes at regular intervals.

Given these challenges and a limited budget, the team developed a uniform method that was sufficiently adaptable to be useful in even the most exceptional circumstances. The evaluation team conducted a series of in-depth interviews with program administrators and code experts in both the public and private sectors. The interviews addressed a range of political and technical topics, including compliant and noncompliant building practices at the measure level. The results were then refined using Delphi process techniques to improve the precision of the quantitative aspects of interviewee responses. In a Delphi process, answers to interview questions are shared anonymously with the entire group of interviewees, who can then revise their original responses. In addition, evaluators collected data on building stock growth by building type and on differences in energy use intensities broken down by code version. The interview responses, combined with this additional data, allowed the evaluators to develop savings values on a state-by-state basis.
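To make the shape of that calculation concrete, here is a minimal sketch of how building stock growth, code-to-code differences in energy use intensity, and a Delphi-derived compliance rate might combine into a state-level savings value. All function names, field names, and figures below are illustrative assumptions, not values or methods from the actual evaluation.

```python
# Hypothetical sketch of a state-level code savings calculation.
# Every number and parameter name here is illustrative, not from the study.

def state_code_savings(new_floor_area_sqft, eui_old, eui_new, compliance_rate):
    """Annual savings (kBtu/yr) from construction built under the updated code.

    new_floor_area_sqft : floor area added while the new code was in effect
    eui_old, eui_new    : energy use intensity (kBtu/sqft-yr) under the
                          old and new code versions, respectively
    compliance_rate     : fraction of new buildings actually meeting the new
                          code (the kind of value a Delphi panel of experts
                          might converge on after seeing each other's answers)
    """
    return new_floor_area_sqft * (eui_old - eui_new) * compliance_rate

# Example: 2 million sqft of new offices, EUI falling from 90 to 75
# kBtu/sqft-yr, with 60% of buildings complying.
savings = state_code_savings(2_000_000, 90.0, 75.0, 0.60)
print(savings)  # 18000000.0 kBtu per year
```

In practice this would be computed per building type and per code version and then summed, but the structure stays the same: stock growth times an EUI delta, discounted by realistic compliance.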

The team was also tasked with figuring out what portion of these overall energy savings was a direct result of the ARRA-funded program. Essentially, the evaluators were trying to determine how vital the program lobbying efforts were to the eventual adoption of updated codes. The team also had to examine and parse out other market influences on energy efficiency such as consumer demand and the growing trend of green building.
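One common way to frame that attribution question is a net-to-gross adjustment: credit the program only for the adoption it likely caused, relative to a no-program baseline that already reflects market forces. The sketch below is a hypothetical illustration of that framing, not the evaluators' actual model.

```python
# Hypothetical net-to-gross attribution sketch; not the evaluation's method.

def net_program_savings(gross_savings, adoption_prob_with_program,
                        adoption_prob_without_program):
    """Discount gross savings to the share attributable to the program.

    The attribution factor is the increase in the likelihood of code
    adoption that the program produced. The no-program baseline already
    captures other market influences (consumer demand, green building
    trends), so those savings are not credited to the program.
    """
    attribution = adoption_prob_with_program - adoption_prob_without_program
    return gross_savings * attribution

# Example: experts judge the updated code would have passed with 30%
# likelihood anyway, but 90% with program lobbying, so the program is
# credited with 60% of the gross savings.
print(net_program_savings(100_000, 0.90, 0.30))  # 60000.0
```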

Overall, this evaluation effort provided a compelling and cost-effective road map for developing savings and attribution methodologies for code-based programs. This approach could be used at the state or local level to assess the impact of policies and programs aimed at improving code enforcement or accelerating code adoption.

Though the study is still in progress, check back to read the final evaluation results.