
Is AI worth the environmental costs? | The Excerpt


On a special episode (first released on November 21, 2024) of The Excerpt podcast: In the US, demand for power from AI data centers is skyrocketing, driven by the intensive computational requirements of AI models, which often require vast amounts of energy for both training and operation. Then there are AI's carbon emissions: in many cases, the electricity used to power AI data centers today relies on nonrenewable energy sources such as coal or gas. Can we afford AI's huge environmental costs? Landon Marston, an associate professor in Virginia Tech's Environmental and Water Resources Engineering program, joins The Excerpt to discuss how engineers and policymakers are approaching AI's energy and environmental challenges in the long term.

Hit play on the player below to hear the podcast and follow along with the transcript beneath it.  This transcript was automatically generated, and then edited for clarity in its current form. There may be some differences between the audio and the text.


Dana Taylor:

Hello and welcome to The Excerpt. I'm Dana Taylor. Today is Thursday, November 21st, 2024, and this is a special episode of The Excerpt. As artificial intelligence technology continues to advance at a rapid pace, its environmental footprint is becoming an increasing concern for policymakers and environmentalists.

In the US, demand for power from AI data centers is skyrocketing, driven by the intensive computational requirements of AI models, which often require vast amounts of energy for both training and operation. Then there are also AI's carbon emissions. In many cases, the electricity used to power AI data centers today relies on non-renewable energy sources such as coal or gas.

AI also demands significant cooling, which can be provided by air, water, or both, neither of which comes without downsides. Can we afford AI's huge environmental costs? Joining us to discuss all of this is Landon Marston, an associate professor at Virginia Tech's Environmental and Water Resources Engineering Program. Thanks for joining me, Landon.

Landon Marston:

Thanks for having me.

Dana Taylor:

Help us understand the quantity of energy needed by AI to do different things. There's powering these data centers, there's training large AI models, and then there's asking AI to create a video, for example. Give us a sense of the scale here.

Landon Marston:

Sure, and I think it's important to put this in a broader context of not just artificial intelligence, but really the computational demands behind everything that we use. Even this very conversation that we're having online requires data, and that data requires storage, and that also requires compute. And so AI is one piece of that larger puzzle. As far as how much energy these systems consume, that varies depending on what step of the process you're looking at.

You mentioned earlier that it takes a significant amount of energy to train these models, and that's basically building the models before you even see them. These companies are investing lots and lots of money and resources in order to be able to develop these models.

Dana Taylor:

As more individuals and businesses start making use of AI, what strain does that put on our power grids?

Landon Marston:

When we did our study, using data from 2018, we estimated that the energy requirements for data centers, so this is really before AI took off, were about 1.8% of all energy demand within the United States. Now around 4% to 5% of US energy demand comes from these data centers, cryptocurrency, and now AI. And that's expected to increase.

In fact, the International Energy Agency estimates that this will roughly double, from about 450 terawatt hours to almost 1,000 terawatt hours. And so to put that in perspective, a terawatt hour is the amount of energy that would power about 100,000 households in the US for a year.

It's about one-fourth of the amount of energy that Hoover Dam produces each and every year. A couple of years ago when we did this study, that 1.8% of electricity consumption was roughly equivalent to the electricity consumption of New Jersey.
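
Those comparisons are easy to sanity-check. A minimal back-of-envelope sketch, assuming roughly 10,000 kWh of electricity per US household per year and roughly 4 TWh of annual generation from Hoover Dam (both ballpark assumptions, not figures from the interview):

```python
# Sanity check of the energy comparisons above. The per-household and
# Hoover Dam numbers are rough assumptions, not figures from the interview.
TWH_IN_KWH = 1_000_000_000          # 1 terawatt hour = 1 billion kilowatt hours
HOUSEHOLD_KWH_PER_YEAR = 10_000     # rough US average annual household use

households_powered = TWH_IN_KWH / HOUSEHOLD_KWH_PER_YEAR
print(f"1 TWh powers about {households_powered:,.0f} households for a year")

HOOVER_TWH_PER_YEAR = 4             # approximate annual Hoover Dam generation
print(f"1 TWh is about {1 / HOOVER_TWH_PER_YEAR:.0%} of Hoover Dam's yearly output")
```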

Dana Taylor:

What alternative sources of energy can we turn to in order to help accommodate the energy demand for AI? Are renewables like wind and solar able to help? What about nuclear power?

Landon Marston:

That's a great question, and we're seeing some of that more recently. In fact, in the last month, we've seen major agreements in which data center operators, Amazon, Google, Microsoft, are making deals with power companies to secure power in the immediate term, and that usually entails the expansion of current energy producers.

And so that might be coal or natural gas. In the case of New Jersey, they brought a nuclear power plant back online, or plan to. And so in the short term, to meet those energy demands, the focus has been on keeping existing operations going or expanding them. But in the longer term, they're going to need to develop new sources of energy.

And that's going to require a lot of different types of energy sources. Obviously, renewables are going to play a part in that, but these data centers and these AI operations are going to need firm power, meaning power that's available at all times. And that's not always the case for things like solar and wind.


Dana Taylor:

Landon, the other environmental aspect of these big data centers is keeping them cool. These semiconductor chips run really hot after computing trillions of bits of information. This is where either water or air comes in. Can you tell us how these two very different methods work and what the advantages and disadvantages are to each?

Landon Marston:

As you noted, there are trade-offs involved between these different cooling technologies. If you use the air methods you're referring to, typically those require much less water, but they're often much more energy intensive. And so while you reduce the amount of water needed at the facility, you significantly increase the energy requirements. And by doing that, you also increase, depending on the energy source, the amount of greenhouse gases associated with training these models.

We're talking about AI, or more generally about data centers and the storage and compute associated with them. And that can have implications not only for greenhouse gas emissions, but also for water use, because energy, depending on its source, can require significant amounts of water. Things like coal and natural gas are used by thermoelectric power plants, and these require a tremendous amount of water in order to operate.

So when we're making this trade-off between using these air systems to cool versus liquid cooling, which requires more water on site, we might be using less water on site with the air systems, but we're going to be using more energy. When we use more energy, that often means we're using more water somewhere else. Sometimes it's difficult to trace where that water is actually being used.

And it can often have implications for local watersheds and ecosystem health, because not only is water being used by these thermoelectric power plants, but oftentimes the water they return is at an elevated temperature, and you have some water quality issues that go along with that. And that can have broader implications for ecosystems as well.
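
To make that on-site versus off-site trade-off concrete, here is a minimal sketch. Every number in it is an illustrative assumption (a hypothetical 1 GWh compute load, rough cooling overheads, and a grid water intensity of about 1.5 liters per kWh), not a value from the interview; which option consumes less water overall depends on the local grid mix and climate:

```python
# Illustrative water/energy trade-off between evaporative (liquid) and air
# cooling. Every number here is an assumed ballpark, not from the interview.
GRID_WATER_L_PER_KWH = 1.5  # assumed water consumed upstream per kWh generated

def footprint(it_kwh: float, cooling_overhead: float, onsite_l_per_kwh: float):
    """Return (total energy, on-site water, off-site water) for a cooling choice."""
    energy_kwh = it_kwh * (1 + cooling_overhead)       # compute load plus cooling
    onsite_water = it_kwh * onsite_l_per_kwh           # water evaporated on site
    offsite_water = energy_kwh * GRID_WATER_L_PER_KWH  # water used at power plants
    return energy_kwh, onsite_water, offsite_water

# Assumed overheads: evaporative cooling adds ~10% energy but evaporates water
# on site; air cooling evaporates none on site but adds ~30% energy.
for name, overhead, onsite in [("evaporative", 0.10, 1.8), ("air", 0.30, 0.0)]:
    e, on, off = footprint(1_000_000, overhead, onsite)
    print(f"{name:11s}: {e:,.0f} kWh, {on:,.0f} L on-site, {off:,.0f} L off-site")
```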

Dana Taylor:

One of the less expensive methods of keeping data centers cool, and this may be what you're referring to, is evaporative water cooling. Can you walk us through that technology specifically?

Landon Marston:

Evaporative cooling, and this is the same underlying technology used by power plants as by data centers, basically dissipates the excess heat that builds up within the system and transfers it into the water. That water ends up evaporating into the atmosphere. And so that water is effectively consumed and isn't going to be available for other users within the local watershed.

And so by doing that, these data centers, or the power plants they end up depending on, use a significant amount of water. One thing I try to put in context when I talk about this is that these data centers and AI use a significant amount of water compared to other types of commercial industry. However, compared to something like agriculture, it's a relatively small amount of water in the grand scheme of things.

Having said that, agriculture is pretty widespread across the country and across the world. These data centers are hyper-localized. And so while they might not have a tremendous impact on every watershed, as we see for much of the agricultural sector, they do have really pronounced impacts on local watersheds and water availability. And this can have implications not only for ecosystems, as I alluded to earlier, but also for local infrastructure.

Oftentimes data centers need a certain quality of water, and that water is often treated. The same water that comes to our houses and businesses will often be delivered to these data centers. And that can mean trade-offs in terms of access to water and in terms of infrastructure capacity and potential expansion, which might lead to additional costs, perhaps picked up by the data center operator, but also potentially spread across all of the customers of that water provider.

Dana Taylor:

These are obviously huge costs for running a data center. Can AI be used to mitigate its own environmental impact?

Landon Marston:

That's a great question, and one I've seen posed elsewhere. I think the verdict's still out. You see almost two camps, really. On one hand, we have a group that professes that AI is going to be able to solve all our problems, and on the other, a camp that, I won't say ignores, but discounts some of the benefits of AI and looks strictly at its environmental consequences.

I think, like most things, the truth is somewhere in the middle: AI is going to be able to solve many problems, including helping us become more energy efficient. But at the same time, we can't ignore the vast amount of energy AI is responsible for and the environmental consequences of that energy use.

And so I think, at least in the short term, the way AI will be most helpful is not by providing some silver bullet that solves all our energy woes, including the ones it creates, but by getting AI tools into the hands of local or domain experts who really know their systems well. An example might be a warehouse or other facility using AI to figure out how to reduce energy consumption within its own operations.

Dana Taylor:

How can policymakers support the development of environmentally friendly AI technologies?

Landon Marston:

I think there are a couple of things that can be done. One is that AI, while very energy intensive, can also serve as an opportunity. What I mean by that is that we're going to need to greatly expand our energy grid, not only to sustain the AI and data centers coming online over the next several years, but also, more broadly, for things like EVs and the electrification of many everyday energy demands that might not currently come from the electricity grid.

These are going to put a strain on our electricity grid, and we're going to need additional energy sources. AI, and the vast amount of resources being put behind it, can afford an opportunity to more quickly clean up our electricity grid. Through greater investments in renewables, and in nuclear power, which we talked about earlier, AI can help accelerate this with an infusion of not only demand but also resources behind that demand for new energy.

Dana Taylor:

What steps, if any, are being taken right now to make AI development more sustainable?

Landon Marston:

I think there are a lot of things going on right now, particularly efforts to make the processes behind both the training and the deployment of these AI tools more energy efficient. That includes everything from more advanced optimization algorithms that work more efficiently, to better computer chips and servers, to the facilities themselves.

We're seeing a shift from smaller data centers to hyperscale data centers that operate much more efficiently. There are economies of scale at play: as data centers grow, they spend fewer and fewer resources on things that aren't directly contributing to compute, and in doing so they significantly reduce their energy requirements per unit of compute.
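
The standard metric for that overhead is power usage effectiveness (PUE): total facility energy divided by the energy delivered to the computing equipment itself. A minimal sketch, using illustrative PUE values (roughly 1.6 for an older small facility versus closer to 1.1 for a modern hyperscale one, ballpark figures not from the interview):

```python
# Illustrate economies of scale via PUE (power usage effectiveness):
# total facility energy / IT equipment energy. Values are industry ballparks.
def total_facility_mwh(it_load_mwh: float, pue: float) -> float:
    """Total energy a facility draws to deliver a given compute (IT) load."""
    return it_load_mwh * pue

IT_LOAD_MWH = 1_000  # hypothetical compute load served by each facility

for name, pue in [("small data center", 1.6), ("hyperscale", 1.1)]:
    total = total_facility_mwh(IT_LOAD_MWH, pue)
    overhead = total - IT_LOAD_MWH  # cooling, power conversion, lighting, etc.
    print(f"{name:17s}: {total:,.0f} MWh total, {overhead:,.0f} MWh overhead")
```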

Dana Taylor:

AI is, of course, evolving so quickly. Landon, do you have any concerns that these data centers are growing at such a rapid pace that they're going to be too big to fail?

Landon Marston:

Oh, that's an interesting question. I mean, to some degree they already are. We are so dependent on these data centers in our everyday lives, from the shows that we watch to our daily work to talking to friends and family on FaceTime; all of these require data centers. I think our dependency on these systems is only going to grow in the coming years. As new technologies and new tools become available to us, we are going to rely on them more and more.

And that's why it's important that we preemptively think not only about how we might use these tools in our daily lives, but also about the implications of these data centers, and more specifically AI, for the environment, for the ecosystems that depend on that water, and, with climate change always at the forefront of our minds, for greenhouse gas emissions and how they contribute to climate change as we move forward.

Dana Taylor:

Let's end with the question we started with. Can we afford this environmental cost? Is it worth it?

Landon Marston:

Can we afford the environmental cost? I think that's going to depend on how things progress in the coming years. As I noted previously, while there's a lot of concern, including in some of our own research, it's important to put these values into context.

As I noted earlier, a lot of these energy uses and their associated environmental footprints, while large for this particular industry, are still relatively small compared to things like agriculture when you talk about water use, or the energy sector more broadly when you talk about greenhouse gas emissions. And so I think we have to evaluate it like we would any other technology.

Is the value that it brings to society worth the trade-offs that come along with it? That's the first question. And then the second is: what can we do to help mitigate and reduce those trade-offs? Are there methods we can lean into to reduce the environmental footprint associated with these new and emerging technologies?

It's up to all of us, including the engineers who work at these companies, to be proactive about that and to think critically about not only the bottom line, but how we can improve society economically and environmentally at the same time.

Dana Taylor:

Landon, thank you so much for walking us through this.

Landon Marston:

Thank you for having me.

Dana Taylor:

Thanks to our senior producers Shannon Rae Green and Kaely Monahan for their production assistance. Our executive producer is Laura Beatty. Let us know what you think of this episode by sending a note to podcast@usatoday.com. Thanks for listening. I'm Dana Taylor. Taylor Wilson will be back tomorrow morning with another episode of The Excerpt.