Ask the Expert: Next Generation Utility Program Measurement and What it Means for ESG

GianGabriel Masoni Dobles, Zondits staff, 5/23/2023

Zondits interviewed Nathan Phillips, a senior analytics engineer on the analytics and data science team at DNV. His work spans data engineering, building software, and assembling the tooling that helps utilities and other customers achieve energy savings. To serve both utility clients and colleagues at DNV, he navigates a software stack that includes multiple analytical and development tools and a growing amount of front-end software development work. His passion for emissions reductions ignited three years ago through work with WattTime. In that work he tested emissions data as a proxy for energy savings and quickly discovered something else entirely: the very anatomy of utility programs was going to change in a lasting way down the road.


Can you tell us about your work with WattTime?

WattTime is one of several providers of emissions factors available in the United States and now globally. It’s a mission-driven nonprofit owned by the Rocky Mountain Institute, and it champions the use of a marginal emissions factor, which captures the pounds of CO2 associated with a change in load on a grid region at a certain time. This contrasts with an average value, which blends all electricity generation on a given grid region at one time. If your goal is to reduce emissions overall, then you want to use the factor associated with the generation that would have to come online to meet new demand at any given moment, adding load to that specific grid region. The value it conveys is typically a higher pounds-of-CO2 figure, because the margins are often met with peaker plants and fossil fuels. WattTime is just one of the data sources that DNV leverages most often.
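As a rough illustration of the distinction, consider a toy snapshot of one grid region for a single hour; the generation mix, numbers, and variable names below are invented for illustration and are not WattTime’s actual methodology.

```python
# Hypothetical snapshot of one grid region for a single hour.
# Each tuple is (resource, generation in MWh, emissions rate in lb CO2 per MWh).
generation_mix = [
    ("nuclear", 400, 0),
    ("wind", 200, 0),
    ("combined_cycle_gas", 300, 900),
    ("gas_peaker", 50, 1_500),  # the unit that would ramp up to meet new load
]

total_mwh = sum(mwh for _, mwh, _ in generation_mix)
total_lbs = sum(mwh * rate for _, mwh, rate in generation_mix)

# Average emissions factor: all generation on the region, blended together.
average_factor = total_lbs / total_mwh   # roughly 360 lb CO2/MWh here

# Marginal emissions factor: the rate of the unit that actually responds
# to a change in load at this moment (here, the gas peaker).
marginal_factor = 1_500                  # lb CO2/MWh

print(f"average:  {average_factor:.0f} lb CO2/MWh")
print(f"marginal: {marginal_factor:.0f} lb CO2/MWh")
```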

You’ve had a hand in the development of the EVOLVE tool. What can you tell us about that?

The EVOLVE suite was a huge accomplishment over several years of development by DNV, and the real fruit of that effort is yet to be fully realized. WattTime is currently used in parts of the EVOLVE suite, essentially as an accounting metric that converts compounding hourly savings in kWh or MWh into pounds of CO2. It’s complemented by our DERIVE tool, which captures DER modeling, and by our customer scorecard. It allows utilities to bounce back and forth between years of energy consumption and that same consumption translated into carefully calculated emissions.
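A minimal sketch of that kind of hourly accounting, assuming savings and marginal factors are already aligned on the same timestamps; the data and column names are illustrative placeholders, not the EVOLVE implementation.

```python
import pandas as pd

# Illustrative hourly data: kWh saved by a program and the marginal
# emissions factor (lb CO2 per kWh) for the same grid region and hour.
hours = pd.date_range("2023-01-01", periods=4, freq="H")
savings = pd.DataFrame({
    "kwh_saved": [120.0, 95.0, 80.0, 150.0],
    "marginal_lb_per_kwh": [0.9, 1.1, 1.4, 1.5],
}, index=hours)

# Hourly savings translate into avoided emissions hour by hour, then roll up.
savings["lb_co2_avoided"] = savings["kwh_saved"] * savings["marginal_lb_per_kwh"]

print(savings)
print(f"Total kWh saved:      {savings['kwh_saved'].sum():,.0f}")
print(f"Total lb CO2 avoided: {savings['lb_co2_avoided'].sum():,.0f}")
```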

The real promise is that it could help shift an entire energy efficiency portfolio from a sole energy savings target to a comprehensive emissions target, a direction that major utilities are already moving in. This gives a utility the chance to toggle between examining whether it’s meeting state-mandated energy targets and aligning assets to mitigate and manage the emissions in its service region. It can make a utility more nimble and helps it manage in near real time in concert with a nuanced energy mix and load.

How are the tools you work with used in accounting for GHG emissions? 

DNV is seeing an interest in how you might shift a portfolio to set an emissions goal and then actually reach it. Shareholders are increasingly demanding tactical plans for achieving emissions reductions in an industry where goals are mandated by regulatory bodies. Utilities in proactive states are eager to develop methodologies to show receipts.

In an atmosphere of enhanced transparency, as the rubber meets the road with ESG, it will be on utilities to validate claims of emissions reduction. From the DNV team’s perspective, this is likely done in more detail using a marginal emissions factor rather than an average emissions factor. For measures like a comprehensive lighting retrofit, we can go beyond the conventional energy efficiency analysis and factor in things like whether the lights would be running during business hours, when the sun is shining the most, and the resulting effect on load. The margins might actually be cleanest in the middle of the day when all those lights are on, while the grid may become almost entirely fossil fuel based in the middle of the night. Thus some change to your nighttime load might be the greatest way to reduce your emissions, even if it doesn’t cut the greatest number of kW from your overall load.
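A hedged, back-of-the-envelope example of why timing matters: assume a daytime lighting retrofit saves more kWh than a smaller overnight measure, but the overnight hours carry much dirtier marginal factors. All numbers here are invented for illustration.

```python
# Hypothetical hourly marginal factors (lb CO2 per kWh).
midday_factor = 0.4       # clean resources on the margin at midday
overnight_factor = 1.6    # fossil units on the margin overnight

# Measure A: lighting retrofit, saves 100 kWh during midday hours.
# Measure B: overnight load reduction, saves only 40 kWh.
lighting_avoided = 100 * midday_factor      # 40 lb CO2 avoided
overnight_avoided = 40 * overnight_factor   # 64 lb CO2 avoided

print(f"Lighting retrofit: {lighting_avoided:.0f} lb CO2 avoided")
print(f"Overnight measure: {overnight_avoided:.0f} lb CO2 avoided")
# The measure that saves fewer kWh avoids more emissions because of
# when it hits the grid.
```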

This examination might prompt a major paradigm shift in how pilot programs are used, or in how evaluations of existing programs factor in overall alignment. Is there a misalignment? As utilities start getting really aggressive about setting targets and collaborating with groups like DNV to develop plans to meet them, we’re looking at a process where you set a target off historical data, establish a benchmark, figure out what you’re going to use to measure, come up with the package of energy efficiency technologies you want to use, and then build a program around that.

How do you view the utility perspective on emissions accounting? 

Utilities are conservative entities. While it’s great that some are taking the initiative to be leading utilities in this space, it’s not clear what the carrots or the sticks are for them without top-down pressure. This is not a money-making endeavor, which leaves offsets and tariffs as the subject of great discussion. But in the meantime, DNV’s job is to best equip industry leaders to thrive in the near future and break from the pack.

What roles do machine learning & AI play? 

Machine learning has a huge place in being able to effectively answer some of these questions. A simple example: most people agree that at least hourly granularity is required to determine the real impact of a decision, whether that’s an energy efficiency technology, solar PV, or something beyond. Hourly data is the emerging standard for what a utility needs to make strategic decisions in near real time. That is a lot of data to comb through, but we can use various machine learning techniques to detect patterns. By conducting what amounts to a signal analysis of 60-80 profiles, a utility becomes less worried about the total magnitude and can focus on the pattern of the energy consumption, which makes it more responsive. Given a midday peak or two peaks, you can parse through multiple shifts and use machine learning (ML) to flag 18 or 20 distinct energy use patterns. This simplifies the recommendations that drive targeted marketing to customer segments or remedy operational concerns.
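A minimal sketch of that kind of pattern detection, assuming hourly interval data reduced to average 24-hour load shapes and clustered with k-means; the customer counts, cluster count, and random data are placeholders, not DNV’s production approach.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Placeholder data: 500 customers x 24 hourly readings for an average day.
profiles = rng.gamma(shape=2.0, scale=1.5, size=(500, 24))

# Normalize each profile so clustering keys on shape (the pattern),
# not total magnitude.
shapes = profiles / profiles.sum(axis=1, keepdims=True)

# Group customers into a handful of distinct usage patterns.
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(shapes)

for label in range(kmeans.n_clusters):
    members = int((kmeans.labels_ == label).sum())
    peak_hour = int(kmeans.cluster_centers_[label].argmax())
    print(f"pattern {label}: {members} customers, typical peak at hour {peak_hour}")
```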

Conversely, if there’s no peak, then there’s not much you’re going to be able to do to offset the emissions, but at least you know, and you can get creative. Given two peaks, it becomes a game of load shifting. With the careful application of machine learning, a utility can simplify the surface area of the problem it’s trying to solve.

What’s really exciting you right now given the policy climate? 

One thing that’s interesting to me is that some of the legislation coming down the pipeline is not super aggressive. It just uses annual emissions factors, seldom specific to the grid region. Take the much-speculated-upon SEC ruling on ESG-related disclosures: it doesn’t really prescribe that you would use hourly emissions factors. And it’s the same with the GHG Protocol, which is kind of the gold standard for how you would perform an emissions analysis and accounting for a company; it does not use hourly values, it uses annual averages. The use of annual averages presents a problem for how we will get to reporting that can take advantage of a more granular analysis. We’ll have to be there, prepared to assist companies when that happens.


There are so many different data sources that we can use. There’s WattTime, the comparable Electricity Map, and several open-source initiatives right now. That’s to say that no one truly agrees on what the basis for calculating an emissions factor should be, or whether it should be a black-box approach or not. A data analyst like Nate can fundamentally ingest any of them. Meanwhile, DNV’s posture has been to stay agnostic, acknowledge the pros and cons of these data sources, and proceed slowly as a consensus emerges.