How Co-ops Can Mitigate Their Risk from Data Center Growth Webinar Summary

Teri Viswanath

February 14, 2025


Mindful that many U.S. electric cooperatives have been approached to serve new data centers over the past six months, my colleague Jeff Johnston and I sat down with two of the industry’s leading experts to gauge how much electricity will ultimately be required and how to mitigate the operational and financial risks associated with that growth. Here’s an abbreviated version of our webinar interview with Dr. Arman Shehabi and Andy Satchwell of the U.S. Department of Energy’s Lawrence Berkeley National Laboratory.

What You Need to Know about the Electricity Demand from Data Centers, with Dr. Arman Shehabi

Teri Viswanath: Arman and Andy, what do you think is important for our electric co-ops to take away from today's discussion?

Arman Shehabi: There are two things I want the audience to take away today. The first is that, in a lot of ways, we've been here before with this type of surge. We saw a large increase in data center electricity growth in the early aughts. After a period of surge, demand flattened out as different efficiency measures came into play.

So one thing to highlight is that we've been here before. The other is that this time is a little different, in that this growth in electricity use is happening while demand for electricity is also rising in other parts of the economy: think about vehicles, think about heating in buildings, think about electrification of industry and on-shoring of manufacturing here in the U.S. All of that has to be taken into consideration when we think about where things are going in the future.

Andy Satchwell: First and foremost, a lot of tools exist today, and a lot of tariff and rate approaches can be deployed to address these operational and financial risks. Because the U.S. is a patchwork of energy policies, utilities and regulatory approaches, the ways in which utilities engage with data centers and mitigate these risks through rate designs and tariffs are also going to be a patchwork.

Jeff Johnston: These are just some data points here to put into perspective what we're dealing with, which is what I consider to be profound growth in the energy complex, thanks to AI and data centers. Correct?

Arman Shehabi: When we look at these different scenarios of what could happen, there are different combinations, and that allows us to establish a range of where we see electricity use going in the near future. That range shows data centers could increase from under 7% to about 12% of U.S. electricity use by 2028.

Jeff Johnston: Dr. Shehabi, on your starting point (that data centers currently account for 4.4% of total U.S. electricity demand), correct me if I'm wrong, but I don't think there's a specific data center component to energy usage that gets reported.

Arman Shehabi: It can be tricky to estimate data center electricity use because it's something that data centers don't report. There are different ways that people estimate it.

What we have to do is not look at it from a building point of view, but rather track all the equipment going into data centers demanding that electricity and essentially establish an installed base of this equipment. What is that equipment? We're talking about servers, storage equipment and network equipment. We then have to distribute that equipment across different types of theoretical data centers.
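To make the bottom-up, installed-base approach concrete, here is a toy sketch in Python. All equipment counts, wattages and the PUE value are hypothetical illustrations, not LBNL's actual model inputs; the point is only the structure of the estimate (sum equipment power, scale to annual energy, then add facility overhead).

```python
# Toy bottom-up estimate of data center electricity use.
# (equipment type, installed units, average power per unit in watts)
# All figures below are hypothetical, for illustration only.
installed_base = [
    ("dual-CPU servers",          10_000_000,    350),
    ("8-GPU accelerated servers",    500_000, 10_000),
    ("storage arrays",             2_000_000,    200),
    ("network equipment",          1_000_000,    150),
]

HOURS_PER_YEAR = 8760
PUE = 1.4  # power usage effectiveness: cooling and power-delivery overhead

def annual_twh(base, pue=PUE):
    """Sum IT equipment energy, then scale by PUE for facility overhead."""
    it_wh = sum(units * watts * HOURS_PER_YEAR for _, units, watts in base)
    return it_wh * pue / 1e12  # watt-hours -> terawatt-hours

print(f"Estimated annual use: {annual_twh(installed_base):.0f} TWh")  # 111 TWh
```

Note how a relatively small count of accelerated servers dominates the total: the 500,000 hypothetical 8-GPU servers draw more aggregate power than the 10 million dual-CPU servers.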

Jeff Johnston: You've got a pretty wide range, with data centers accounting for roughly 7% to 12% of total electricity demand by 2028. What is driving that range of possible outcomes? 

Arman Shehabi: It's a fascinating graph. In the 2010s, data centers were operated primarily with dual-processor servers; that's your typical server with two CPU chips in it, central processing unit chips. Around 2017, we start getting what are called accelerated servers. What that means is that they're accelerated with an additional type of chip. Those chips are GPUs, graphical processing units. Those are the ones you're hearing about a lot in the news, the ones that Nvidia makes. When those go into the servers, they increase the amount of power that is needed.

It's those types of servers, the ones with the GPUs, that are being used for machine learning and AI. It's interesting to see that in 2017, there was very little demand from this segment; by 2023, it accounted for half of data center electricity use. Going forward, it's where all the growth is coming from. How many of these GPU servers will be operating is a major driver of total data center electricity use.

Jeff Johnston: Just a clarification, Dr. Shehabi, the GPU count, 2, 4, and 8, can you explain what those different categories represent?

Arman Shehabi: It has to do with the size of the server. Imagine you have a server. It can have one or two CPUs; those are the chips we all have in our laptops right now. Then, depending on the overall size of the server, it can be accelerated with two additional GPUs, or four, or eight. The eight-GPU configuration is the one that has been growing the most recently (over the last two years or so).

These eight-GPU servers are the ones that data centers see as scalable. They take these eight-GPU servers and start connecting them, linking them together into large networks of AI computing servers that do a lot of the AI training we hear about. It's the growth of these eight-GPU servers, which use so much more power than the others, that's where the demand is coming from.
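A rough back-of-the-envelope comparison shows why the server mix matters so much for demand. The per-server wattages below are illustrative ballpark assumptions, not figures from LBNL's analysis:

```python
# Illustrative server power draws in kilowatts (hypothetical ballparks).
SERVER_POWER_KW = {
    "dual-CPU": 0.5,            # typical two-socket server
    "2-GPU accelerated": 1.5,
    "4-GPU accelerated": 3.0,
    "8-GPU accelerated": 10.0,  # modern AI training node
}

def cluster_mw(server_type, count):
    """Aggregate draw of a homogeneous cluster of servers, in MW."""
    return SERVER_POWER_KW[server_type] * count / 1000

# Linking 1,000 eight-GPU servers into a single training cluster draws
# roughly 20x the power of 1,000 conventional dual-CPU servers:
print(cluster_mw("8-GPU accelerated", 1000))  # 10.0 MW
print(cluster_mw("dual-CPU", 1000))           # 0.5 MW
```

Under these assumptions, a modest AI training cluster already lands in utility-scale territory, which is why a co-op's load forecast can shift dramatically on the arrival of a single facility.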

Jeff Johnston: To your point, things are changing rapidly. Every month it seems like there's more money being spent and there's new technologies and disruptors entering the market. Can you provide some perspective here?

Arman Shehabi: You can look at this figure, especially at the high end, going from 2024 to 2028, and see that increase. If things are trending up that way over five years, where will it be in another five years, and another five after that? You could imagine this growing exponentially large. I don't expect that to happen.

Things will smooth out, even though I think demand will keep increasing. The reason is that different efficiency measures will come into play, because there is a strong market driver for that type of efficiency. It's expensive to run these chips, so there's going to be a desire in the future for things to get more efficient. Historically, that is what we have seen.

Teri Viswanath: Maybe it would be helpful to understand the major players in the marketplace.

Arman Shehabi: I mentioned that those eight-GPU servers are where a lot of the electricity demand is coming from because they're the fastest and the most scalable. Those things are expensive; we're talking about $500,000 for each one. You've got to have a lot of capital to be able to buy a truckload of them. So what we're seeing is: where are those types of servers going? Where is this AI training happening? It's happening with the big players.

Hyperscale data centers are really large data centers that are owned and operated by large tech companies, like a large Google data center. A large-scale colocation data center is one that is built, owned and operated by a third party, but built and contracted out for one of those large players.

 

Teri Viswanath: And the location of these data centers? And, any other concluding thoughts?

Arman Shehabi: We have some idea. We know where about 3,500 data centers are in the US. That's a lot of data centers, but we don't know exactly how much power each one uses because they don't tell us.

There's a lot of uncertainty about where the power for these data centers will come from. There's so much growth right now that the top-tier markets that have been most popular in the past are crowded out, and data centers, as probably a lot of the folks here are aware, are going out into different locations, shopping around and trying to find where they could build new facilities.

We can look a few years out, but beyond that, there are going to be large changes. As this initial arms race plays out and the dust starts settling, there's going to be a desire to make things more efficient because that will make them more profitable. Finally, I do want to come back to one of my main points, which is that this growth is happening now, but it's happening while there's growth in other industries as well.

This growth in data centers is the tip of the spear. It's the beginning of what we're going to expect. It's an opportunity for us to figure out how to handle this growth so that we can be ready for it when the next wave comes.

Rate Design Strategies to Mitigate Financial and Operational Risks, with Andy Satchwell

Andy Satchwell: How can these tariffs be designed in a way to also attract these customers but in a way that doesn't induce or introduce a large amount of risk?

At a high level, fair allocation of costs is one of the key principles and objectives for many regulators and utilities. Some tariffs employ marginal pricing to ensure that incremental load is paying for the full incremental costs it imposes on the system. There's a pending settlement for Indiana Michigan Power in which the data centers are providing economic development payments: investing in the community and providing money as a way to say, we're willing to stand behind and benefit the local community to ensure that it isn't unfairly bearing the full costs of the data center.

 

Teri Viswanath: We are living through a period where large C&I customers are being much more vocal on the tariff elements that they are willing to live with, right? But what are the common elements you’re seeing?

Andy Satchwell: Industrial customers used to just be put on special tariffs (often with a discounted energy rate), partly as an economic development objective.

The key elements we're starting to see really are around mitigating utility and ratepayer financial risks. Many tariffs have minimum load requirements and minimum demand charges (these have ranged from 80% to 90% of contracted demand, which customers are required to pay regardless of their actual demand level). There are upfront payments, sometimes requiring the data center customers to pay for studies to ensure they can be added in a reliable way with sufficient resource adequacy. And on the flip side, there are fees if they exit their contract early, with those fees contributing to the investments and infrastructure that were made.

 

I imagine most of the folks here work with utilities, so this is a top-of-mind thing. Make sure those lights stay on, right? A number of tariffs include elements specifically for this. Again, on this minimum load factor, we're seeing 85% as a pretty typical amount, which ensures these aren't particularly peaky operations.
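To make the minimum demand charge mechanics concrete, here is a hypothetical billing sketch in Python. The contract size, the 85% floor and the demand charge rate are illustrative assumptions, not terms from any actual tariff:

```python
# Illustrative billing sketch for a minimum demand charge: the customer
# pays demand charges on at least 85% of contracted capacity, regardless
# of its actual metered peak. All figures below are hypothetical.
CONTRACT_MW = 100          # contracted capacity
MIN_BILLING_FACTOR = 0.85  # minimum demand charge floor
DEMAND_CHARGE = 15_000     # demand charge in $/MW-month

def monthly_demand_bill(metered_peak_mw):
    """Bill on the greater of metered peak or the contractual floor."""
    billed_mw = max(metered_peak_mw, MIN_BILLING_FACTOR * CONTRACT_MW)
    return billed_mw * DEMAND_CHARGE

# A data center that idles still pays on 85 MW of its 100 MW contract:
print(monthly_demand_bill(40))  # $1,275,000 (billed on the 85 MW floor)
print(monthly_demand_bill(95))  # $1,425,000 (billed on the actual 95 MW peak)
```

The floor is what protects other ratepayers: infrastructure sized for the contract keeps getting paid for even if the facility ramps down or never fully materializes.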

Do these data center customers have behind-the-meter resources? If so, can they be used as backup so that they become a resource for the utility in an outage? The utility can actually isolate that customer, and that customer can continue to consume its own supply of energy, which frees up the system; in some cases, that backup generation can even be used on the grid as supplemental power.

If that backup generation is not being used by the data center customer, can it actually become a grid resource to benefit the system?

 

One closing thought or point I'd like to make is that I wouldn't view these elements in isolation. Are there ways to combine something like minimum demand charges, so that the data center customer mitigates those ratepayer risks and ensures a large amount of those infrastructure costs are paid for, while still having some flexibility to use its backup generation to mitigate its own peak demand? In isolation, these things may not be sufficient, but taken together, they create some compromise and some ability to meet the needs and objectives of the customer, the regulator and the utility.

 
 

Disclaimer: The information provided in this report is not intended to be investment, tax, or legal advice and should not be relied upon by recipients for such purposes. The information contained in this report has been compiled from what CoBank regards as reliable sources. However, CoBank does not make any representation or warranty regarding the content, and disclaims any responsibility for the information, materials, third-party opinions, and data included in this report. In no event will CoBank be liable for any decision made or actions taken by any person or persons relying on the information contained in this report.

