
Data centers: The race to power AI


Companies are moving fast to deploy AI at scale. But scaling AI means more data centers—and data centers consume vast quantities of energy. On this episode of The McKinsey Podcast, McKinsey’s Jesse Noffsinger and Pankaj Sachdeva speak with Global Editorial Director Lucia Rahilly about what needs to happen to build new, bigger data centers quickly, as well as to sate these data centers’ growing hunger for power.

The McKinsey Podcast is cohosted by Lucia Rahilly and Roberta Fusaro.

The following transcript has been edited for clarity and length.

Hey, Roberta.

Hey, Lucia. It’s great to see you. Coming up, we’ll hear a conversation you had about data centers. They’re critical for scaling AI, but they require so much power and investment, so there are a lot of challenges. But before we get into all that, let’s talk about what’s new on McKinsey.com.

We have an article about the future of foreign aid and the kind of impact a decrease of many billions of dollars will have on recipient countries and organizations.1 And we have another article about five geopolitical-risk questions that all chief information officers [CIOs] should consider.2 Both articles can be found on McKinsey.com and in our show notes.

Gen AI has exploded over the past couple of years. McKinsey’s research shows that the generative AI economy is expected to create $4 trillion of value by 2030.3 But to grow, AI needs data centers—or the physical space to house and run the necessary technological equipment. And data centers need lots of power, says McKinsey Partner Jesse Noffsinger.

There’s a need for this expansion of capacity, but we also have to think through the sustainability implications and the innovations needed—things like cooling improvements or what happens with chip architecture. Players investing in the AI economy are aware of and working on investments in the data center space as well.

Investing now is key since generative AI feeds its own rapid growth. The speed of improvement has to do with AI’s ability to infer and analyze information based on how the AI model has been trained. Again, an enormous amount of computing power and data storage is needed to meet the demand of that kind of expansion. McKinsey Senior Partner Pankaj Sachdeva says that globally, data centers currently consume about 70 gigawatts of power. But in five years or so, they’ll need about 220 gigawatts, and that means we’ll need to find more land to build more data centers.
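For a rough sense of the pace that implies, here is a minimal Python sketch (our illustration, not a calculation from the episode; the 70 and 220 gigawatt figures and the roughly five-year horizon come from the discussion above):

```python
# Back-of-envelope: implied compound annual growth rate (CAGR) for
# global data center power demand, using the figures cited above.

current_gw = 70   # approximate global data center demand today, in gigawatts
future_gw = 220   # projected demand in roughly five years
years = 5

cagr = (future_gw / current_gw) ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.1%} per year")  # -> ~25.7% per year
```

A sustained growth rate in the mid-20-percent range per year is far faster than electricity demand has been growing in markets like the US, which is the tension the rest of the conversation explores.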

One of the big implications is that data centers traditionally used to be concentrated in what we’ll call tier-one markets—think Ashburn, Virginia; Chicago; Phoenix; and places like that. We’re now moving to an era where data centers are going where power is available, because most of these traditional data center cities are reaching the max capacity of power that can be made available in a short period of time. You’re now seeing data centers moving into places like Wisconsin, Oklahoma, and Wyoming, where the local demand might not be as high, but there are plenty of other supply factors, especially power, that are available.

And Pankaj says that 75 percent of this growth is driven by AI, and that AI, structurally and architecturally, is much more compute intensive.

These are power-hungry data centers, and that’s massively changing their architecture. One of the big shifts is that traditional cooling methods can’t handle the kind of heat that’s being produced. We’re seeing a shift to liquid cooling to be able to achieve that. There are architectural changes up and down the ecosystem being prompted by this increasing demand for power.
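To see why air cooling runs out of headroom, consider a back-of-envelope calculation (ours, not the speakers’; the 100 kW rack load, temperature rises, and fluid properties below are assumptions):

```python
# Coolant flow needed to remove the heat from a single high-density rack.
# Heat balance: P = m_dot * c_p * delta_T  =>  m_dot = P / (c_p * delta_T)

rack_heat_w = 100_000  # assumed AI rack load of 100 kW, all rejected as heat

# Air: c_p ~1005 J/(kg*K), density ~1.2 kg/m^3, assumed ~15 K temperature rise
air_kg_per_s = rack_heat_w / (1005 * 15)
air_m3_per_s = air_kg_per_s / 1.2
print(f"Air:   ~{air_m3_per_s:.1f} m^3/s per rack (~{air_m3_per_s * 2119:,.0f} CFM)")

# Water: c_p ~4186 J/(kg*K), density ~998 kg/m^3, assumed ~10 K temperature rise
water_kg_per_s = rack_heat_w / (4186 * 10)
water_l_per_s = water_kg_per_s / 0.998
print(f"Water: ~{water_l_per_s:.1f} L/s per rack")
```

Moving roughly 5.5 cubic meters of air per second through one rack is impractical, while about 2.4 liters of water per second is a modest plumbing problem. That difference is the intuition behind the shift to liquid cooling.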

The third shift is training and inferencing. Right now, we’re in a very training-intensive environment, where a lot of training capacities are needed and being built. Over the next five years, we’ll see the shift toward inferencing. Training AI tends to require very large data centers—with the capacity of hundreds of megawatts. That takes a lot of compute capacity and a lot of data.

And then you need a whole host of edge data centers. So when you think of both the opportunities and the challenges, the opportunity gamut has increased both in size and in the types of opportunities where you can participate in the data center economy. But at the same time, there’s real value in addressing the supply chain and technological challenges.


Some of the challenges of meeting this escalating need for power include sustainability issues and limitations on reliable power sources. Jesse says the infrastructure requirements are twofold.

One is on the energy side, and the other is the actual building and commissioning of the data centers themselves. On energy, I like to focus on two things. First is increasing rack densities. This leads to a point where we’re building facilities or campuses that have much higher demand for power at a single site. If you’re looking to add 100, 200, 500 megawatts—or even a gigawatt—of power at a single campus, there aren’t that many places where you can just attach that to the grid. Think of it as the headroom or the open space being limited. So we’re going to need to build a substantial amount of grid infrastructure to support the rising AI economy. That tends to take time.
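As a hypothetical sizing sketch (the PUE and rack-density figures below are assumptions, not numbers from the episode), here is how a campus-scale power figure like the ones Jesse cites maps to rack counts as densities rise:

```python
# How rising rack densities turn campus power into rack counts.
# IT load = total campus power / PUE; racks = IT load / per-rack density.

campus_mw = 500  # example from the 100 MW-to-1 GW range Jesse describes
pue = 1.25       # assumed power usage effectiveness (cooling and other overhead)

it_load_kw = campus_mw * 1_000 / pue

for rack_kw in (10, 40, 100):  # assumed legacy, dense, and AI-era rack densities
    racks = it_load_kw / rack_kw
    print(f"{rack_kw:>3} kW/rack -> ~{racks:,.0f} racks")
```

The rack count itself matters less than the takeaway that a single campus can now draw as much power as a small city, which is why grid headroom becomes the binding constraint.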

Second, it’s not just about delivering the power; as we go upstream, it’s about generating enough electricity. Even as the world’s overall consumption increases, we expect demand from data centers to grow much faster. And in places like the US, where we haven’t seen a lot of net load growth in a while, this power demand is driving almost the entirety of the upside we see over the next few years.

Practically speaking, building data centers takes time.

Data centers take around 18 to 24 months to build. But with the delays we’re seeing, those timelines are getting longer and longer.

The challenges are significant. How should the various players think about better calibrating supply to demand?

We’re thinking in the range of a trillion dollars of capital that needs to be invested. The folks who are sitting on the energy side—the transmission developers and power generation companies—need to think about how to build and develop projects that are clean, have a high ROI, and are located in places that can support this demand.

Five to ten years is the horizon where you can really think through projects. And we’re showing a forecast out to 2030. That’s very soon when it comes to trying to build grid infrastructure and large-scale generating projects.

And to build quickly, innovation is needed to squeeze everything possible out of existing infrastructure.

This could be things like battery storage to help manage the flow on transmission lines and get more utilization out of them. We’re seeing deployment of small and modular assets that can be rapidly put out in the field—things like fuel cells, generator sets, small turbines—to power data centers on-site.

That’s in the short term. And to meet the energy demands over the next decade, midterm and long-term horizons include ideas like—

—a large-scale gas plant. But getting started on building infrastructure that can support central generation—and that includes large-scale wind and solar farms—needs to happen now. We’ve already started to see players begin to invest in what we’ll call the next generation of large-scale cleaner technologies: things like geothermal, carbon capture, and next-generation nuclear reactors.

There is real momentum, and we’d love to see it happen over the next decade—sort of that midterm period. But we think it’s more likely in the stage of demonstrating and commercializing those technologies so that in the longer term—call it 15 years or so from now—they start to play a major role in the delivery of electricity for both the AI economy and economies writ large.

And, historically, clean-energy investors have helped solar and wind development to scale.

And none of them have backed off their sustainability commitments. So while there’s pretty substantial debate about how exactly to do the accounting on clean energy—given we’re all on a shared grid, and it’s hard to tag an electron or a megawatt hour once it’s produced—we are seeing significant investment from these companies, which may help us unlock that clean-energy future.

These dynamic technologies could disrupt the AI market. For example, innovations like DeepSeek and others have led to incredibly sharp efficiencies. But they’re just the beginning of this ever-burgeoning technology. How should leaders think about investing in a space that is still so new and prone to change?

If you are an enterprise or an enterprise CTO [chief technology officer] or CIO, it’s less about adoption. Adoption is going to scale very quickly, and many enterprises are seeing a lot of adoption already. How are you going to address the structural barriers to scaling within a large, complex enterprise? Having that value orientation when you’re thinking about AI is incredibly important.

If you are a supplier or a developer or an LLM [large language model] provider that’s supplying into this ecosystem, you have to think about a very customer-backed view. Who are you building your products or services for? What is going to drive the demand for your products or services? And in the end, are these for an enterprise, a CIO, a CTO? Are these for a customer or a big-tech hyperscaler? And how would they use your products?

And then if you’re an investor, you have to think about where is there a durable business case? Is the business model you’re investing in durable? Is it adjusted for some of the risks and ebbs and flows that you will see? And is it flexible enough to allow you to scale your investments up or down as demand and supply patterns shift?

And then finally, are you accounting for how geopolitical risks will impact this ecosystem in the longer term?

Given the scale and complexity of some of the challenges, Pankaj says no single company or organization will be able to solve them.

It will require an ecosystem of partnerships. We are seeing many of those emerge, but many more are needed. Thinking through how you build those partnerships or participate in that ecosystem is incredibly important.

Also important is to remember that in most parts of the world, energy and power are on a synchronized grid. Or rather, they’re a shared resource.

The same frequency that you see today in Miami looks like what you’ve got in New York. We’re sort of connected by the speed of light. So it’s important to remember that you’re playing into a system. It’s not “AI power”; for the most part, it’s power on the grid.

On the power side, the questions are: What’s the durable business model? And what’s the value you’re bringing—either to the data center and AI economy that will unlock new development, or that supports the broader growth of the power sector and its ability to meet the challenges of the energy transition?

All of this, anchored in a durable business model and viewed through this integrated energy system lens, can carry a lot more weight over the next decade or so.

Jesse Noffsinger is a partner at McKinsey, and Pankaj Sachdeva is a senior partner. Lucia Rahilly is the global editorial director of McKinsey Global Publishing and is based in the New York office, and Roberta Fusaro is an editorial director in the Boston office.
