Thanks to the artificial intelligence boom, new data centers are springing up as quickly as companies can build them. That has translated into huge demand for power to run and cool the servers inside. Now concerns are mounting about whether the U.S. can generate enough electricity for the widespread adoption of AI, and whether our aging grid will be able to handle the load.
"If we don't start thinking about this power problem differently now, we're never going to see this dream we have," said Dipti Vachani, head of automotive at Arm. The chip company's low-power processors have become increasingly popular with hyperscalers like Google, Microsoft, Oracle and Amazon, precisely because they can reduce power use by up to 15% in data centers.
Nvidia's latest AI chip, Grace Blackwell, incorporates Arm-based CPUs it says can run generative AI models on 25 times less power than the previous generation.
"Saving every last bit of power is going to be a fundamentally different design than when you're trying to maximize performance," Vachani said.
This approach of reducing power use by improving compute efficiency, often described as "more work per watt," is one answer to the AI energy crisis. But it's not nearly enough.
One ChatGPT query uses nearly 10 times as much energy as a typical Google search, according to a report by Goldman Sachs. Generating an AI image can use as much power as charging your smartphone.
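For a rough sense of the absolute numbers behind that 10x ratio, here is a back-of-envelope sketch. The per-query figures are assumptions based on commonly cited estimates (roughly 0.3 watt-hours for a search and 2.9 watt-hours for a ChatGPT query), not numbers reported in this article.

```python
# Back-of-envelope energy comparison. The per-query figures are
# assumptions from commonly cited estimates, not from this article.
GOOGLE_SEARCH_WH = 0.3   # assumed watt-hours per Google search
CHATGPT_QUERY_WH = 2.9   # assumed watt-hours per ChatGPT query

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"One ChatGPT query ~= {ratio:.1f}x a Google search")  # ~9.7x

# Scaled to a hypothetical 1 billion queries per day:
daily_queries = 1e9
extra_mwh = daily_queries * (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH) / 1e6
print(f"Extra energy: ~{extra_mwh:,.0f} MWh per day")  # ~2,600 MWh
```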
This problem isn't new. Estimates in 2019 found that training one large language model produced as much CO2 as the entire lifetime of five gas-powered cars.
The hyperscalers building data centers to accommodate this massive power draw are also seeing emissions soar. Google's latest environmental report showed greenhouse gas emissions rose nearly 50% from 2019 to 2023, partly due to data center energy consumption, though it also said its data centers are 1.8 times as energy efficient as a typical data center. Microsoft's emissions rose nearly 30% from 2020 to 2024, also due in part to data centers.
And in Kansas City, where Meta is building an AI-focused data center, power needs are so high that plans to shut down a coal-fired power plant are being put on hold.
Hundreds of ethernet cables connect server racks at a Vantage data center in Santa Clara, California, on July 8, 2024.
Katie Tarasov
Chasing power
There are more than 8,000 data centers globally, with the highest concentration in the U.S. And, thanks to AI, there will be far more by the end of the decade. Boston Consulting Group estimates demand for data centers will rise 15%-20% per year through 2030, when they're expected to account for 16% of total U.S. power consumption. That's up from just 2.5% before OpenAI's ChatGPT was released in 2022, and it's equivalent to the power used by about two-thirds of the homes in the U.S.
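To put that growth rate in perspective, here is a minimal sketch of the compounding involved. The seven-year horizon is an assumption for illustration, since the article does not give the projection's exact start year.

```python
# Compounding check on the BCG projection of 15%-20% annual
# growth in data center demand through 2030. The seven-year
# horizon (roughly 2023 -> 2030) is an assumption.
for rate in (0.15, 0.20):
    multiple = (1 + rate) ** 7
    print(f"{rate:.0%}/yr over 7 years -> {multiple:.1f}x today's demand")
# ~2.7x at 15%/yr and ~3.6x at 20%/yr.
```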
CNBC visited a data center in Silicon Valley to find out how the industry can handle this rapid growth, and where it will find enough power to make it possible.
"We suspect that the amount of demand that we'll see from AI-specific applications will be as much or more than we've seen historically from cloud computing," said Jeff Tench, Vantage Data Centers' executive vice president of North America and APAC.
Many big tech companies contract with firms like Vantage to house their servers. Tench said Vantage's data centers typically have the capacity to use upward of 64 megawatts of power, or as much power as tens of thousands of homes.
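As a quick sanity check on that homes comparison, here is a sketch. The average household draw is an assumption derived from typical U.S. residential usage of roughly 10,500 kWh per year, not a figure from the article.

```python
# Converting data center capacity to home equivalents.
# Assumption: an average U.S. home uses ~10,500 kWh/year,
# i.e. ~1.2 kW of continuous draw (10,500 kWh / 8,760 hours).
avg_home_kw = 10_500 / 8_760
campus_mw = 64
homes = campus_mw * 1_000 / avg_home_kw
print(f"{campus_mw} MW ~= {homes:,.0f} homes")  # ~53,000 homes
```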
"Many of those are being taken up by single customers, where they'll have the entirety of the space leased to them. And as we think about AI applications, those numbers can grow quite significantly beyond that into hundreds of megawatts," Tench said.
Santa Clara, California, where CNBC visited Vantage, has long been one of the nation's hot spots for clusters of data centers near data-hungry clients. Nvidia's headquarters was visible from the roof. Tench said there's a "slowdown" in Northern California due to a "lack of availability of power from the utilities here in this area."
Vantage is building new campuses in Ohio, Texas and Georgia.
"The industry itself is looking for places where there is either proximate access to renewables, either wind or solar, and other infrastructure that can be leveraged, whether it be part of an incentive program to convert what would have been a coal-fired plant into natural gas, or increasingly ways in which to offtake power from nuclear facilities," Tench said.
Vantage Data Centers is expanding a campus outside Phoenix, Arizona, to provide 176 megawatts of capacity.
Vantage Data Centers
Hardening the grid
The aging grid is often ill-equipped to handle the load even where enough power can be generated. The bottleneck occurs in getting power from the generation site to where it's consumed. One solution is to add hundreds or thousands of miles of transmission lines.
"That's very costly and very time-consuming, and sometimes the cost is just passed down to residents in a utility bill increase," said Shaolei Ren, associate professor of electrical and computer engineering at the University of California, Riverside.
One $5.2 billion effort to expand lines to an area of Virginia known as "data center alley" was met with opposition from local ratepayers who don't want to see their bills increase to fund the project.
Another solution is to use predictive software to reduce failures at one of the grid's weakest points: the transformer.
"All electricity generated has to go through a transformer," said VIE Technologies CEO Rahul Chaturvedi, adding that there are 60 million to 80 million of them in the U.S.
The average transformer is also 38 years old, so they're a common cause of power outages. Replacing them is expensive and slow. VIE makes a small sensor that attaches to transformers to predict failures and determine which ones can handle more load so it can be shifted away from those at risk of failure.
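To make the load-shifting idea concrete, here is a hypothetical sketch. This is not VIE's actual software, and every name, score and threshold below is invented for illustration: rank transformers by a failure-risk score and move load from risky units onto healthy ones with spare capacity.

```python
# Hypothetical illustration of the load-shifting idea described
# above. NOT VIE Technologies' method; all names, scores and
# thresholds are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Transformer:
    name: str
    load_kw: float        # current load
    capacity_kw: float    # rated capacity
    risk_score: float     # 0 (healthy) .. 1 (likely to fail)

def shift_load(fleet: list[Transformer], risk_cutoff: float = 0.7) -> None:
    """Move load off high-risk units onto healthy ones with headroom."""
    risky = [t for t in fleet if t.risk_score >= risk_cutoff]
    healthy = [t for t in fleet if t.risk_score < risk_cutoff]
    for bad in risky:
        # Prefer the healthiest units first.
        for good in sorted(healthy, key=lambda t: t.risk_score):
            move = min(bad.load_kw, good.capacity_kw - good.load_kw)
            bad.load_kw -= move
            good.load_kw += move
            if bad.load_kw == 0:
                break

fleet = [
    Transformer("T1", load_kw=800, capacity_kw=1000, risk_score=0.9),
    Transformer("T2", load_kw=300, capacity_kw=1000, risk_score=0.1),
    Transformer("T3", load_kw=400, capacity_kw=1000, risk_score=0.2),
]
shift_load(fleet)
for t in fleet:
    print(t.name, f"{t.load_kw:.0f} kW", f"risk={t.risk_score}")
# T1's 800 kW ends up split across T2 and T3.
```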
Chaturvedi said business has tripled since ChatGPT was released in 2022, and is poised to double or triple again next year.
VIE Technologies CEO Rahul Chaturvedi holds up a sensor on June 25, 2024, in San Diego. VIE installs these on aging transformers to help predict and reduce grid failures.
VIE Technologies
Cooling servers down
Generative AI data centers will also require 4.2 billion to 6.6 billion cubic meters of water withdrawal by 2027 to stay cool, according to Ren's research. That's more than the total annual water withdrawal of half of the U.K.
"Everyone is worried about AI being energy intensive. We can solve that when we get off our ass and stop being such idiots about nuclear, right? That's solvable. Water is the fundamental limiting factor to what is coming in terms of AI," said Tom Ferguson, managing partner at Burnt Island Ventures.
Ren's research team found that every 10 to 50 ChatGPT prompts can burn through about what you'd find in a standard 16-ounce water bottle.
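Working backward from those figures gives a per-prompt water cost, sketched below. The only assumption is the unit conversion from 16 U.S. fluid ounces to milliliters.

```python
# Per-prompt water cost implied by Ren's figures:
# 10-50 prompts per 16-ounce bottle.
bottle_ml = 16 * 29.5735  # 16 US fluid ounces ~= 473 ml
for prompts in (10, 50):
    print(f"{prompts} prompts/bottle -> ~{bottle_ml / prompts:.0f} ml per prompt")
# Roughly 47 ml per prompt at the high end, ~9 ml at the low end.
```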
Much of that water is used for evaporative cooling, but Vantage's Santa Clara data center has large air conditioning units that cool the building without any water withdrawal.
Another solution is using liquid for direct-to-chip cooling.
"For a lot of data centers, that requires an enormous amount of retrofit. In our case at Vantage, about six years ago, we deployed a design that would allow for us to tap into that cold water loop here on the data hall floor," Vantage's Tench said.
Companies like Apple, Samsung and Qualcomm have touted the benefits of on-device AI, keeping power-hungry queries off the cloud and out of power-strapped data centers.
"We will have as much AI as those data centers will support. And it may be less than what people aspire to. But ultimately, there are a lot of people working on finding ways to un-throttle some of those supply constraints," Tench said.