
Sustainability, grid demands, AI workloads will challenge data center growth in 2025

Credit: Network World

Data centers this year will face several challenges as the demand for artificial intelligence drives an evolution in AI hardware, in on-premises and cloud-based strategies for training and inference, and in power distribution, all while opposition to new data center developments continues to grow.

Uptime Institute details the major challenges and opportunities the data center industry will encounter this year in its report, Five Data Center Predictions for 2025. The industry is up against significant challenges due in part to the rapid expansion of large data centers, according to Andy Lawrence, executive director of research at Uptime. The research firm reports that it has tracked new builds of more than 100 megawatts planned around the world, totaling $388 billion worth of new data center projects and more than 63,000 megawatts of capacity.

“Data centers are going to face intense scrutiny as they consume more energy and more water. In a way, the battle between sustainability objectives and AI and development objectives inside government and across society hasn’t really begun,” Lawrence explained on a recent webinar sharing the research firm’s predictions. “One of the outcomes from this is that we think developers and operators are going to have to be much more transparent, more collaborative, and be seen to contribute much more. It’s going to happen a lot more where there’s going to be a requirement for the data centers and the data center industry to make its case or suffer.”

The Uptime Institute’s five predictions involve:

Public scrutiny and opposition

Uptime analysts kicked off their 2025 predictions with the idea that data centers will face intense scrutiny and public opposition due to their growing resource use and environmental impact. Data centers are growing dramatically in size and number, using 20% to 30% of power resources in some regions, Uptime reports. As stated above, this will drive developers and operators to be more transparent and collaborative to address the challenges of public scrutiny.

Uptime predicts that data center projects might find themselves caught between government support for initiatives such as AI and other parts of the government that are pushing for sustainability regulations and restrictions. Advocacy groups are expected to increasingly target data centers to challenge their environmental impact at the same time as the data center industry is poised for massive expansion due to the growing AI demand.

Cloud training for AI models

Uptime believes that most AI models will be trained in the cloud rather than on dedicated enterprise infrastructure, as cloud services provide a more cost-effective way to fine-tune foundation models for specific use cases. The incremental training required to fine-tune a foundation model can be done cost-effectively on cloud platforms without the need for a large, expensive on-premises cluster. Enterprises can leverage on-demand cloud resources to customize the foundation model as needed, without investing the capital and operational costs of dedicated hardware.

“Because fine-tuning requires only a relatively small amount of training, for many it just wouldn’t make sense to buy a huge, expensive dedicated AI cluster for this purpose. The foundation model, which has already been trained by someone else, has taken the burden of most of the training away from us,” said Dr. Owen Rogers, research director for cloud computing at Uptime. “Instead, we could just use on-demand cloud services to tweak the foundation model for our needs, only paying for the resources we need for as long as we need them.”
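
To make the economics Rogers describes concrete, the sketch below shows what such incremental fine-tuning might look like on a single rented cloud GPU, using the open-source Hugging Face transformers, peft, and datasets libraries with a LoRA adapter. The model name, training file, and hyperparameters are illustrative assumptions, not anything specified in Uptime's report.

```python
# Minimal LoRA fine-tuning sketch: adapt an already-trained foundation model on
# a rented cloud GPU instead of buying a dedicated on-premises AI cluster.
# Model name, data file, and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "meta-llama/Llama-2-7b-hf"   # any pre-trained foundation model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL, device_map="auto")

# LoRA trains only a small set of adapter weights, so the incremental compute
# is modest; the bulk of the training cost was already paid by the model vendor.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32,
                                         task_type="CAUSAL_LM",
                                         target_modules=["q_proj", "v_proj"]))

# A small domain-specific corpus; in practice this would be enterprise data.
dataset = load_dataset("text", data_files={"train": "company_docs.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned-adapter",
                           per_device_train_batch_size=4,
                           num_train_epochs=1,
                           learning_rate=2e-4),
    train_dataset=dataset,
    # For causal-LM fine-tuning, the labels are simply the input tokens.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()                              # hours on one rented GPU, not weeks on a cluster
model.save_pretrained("finetuned-adapter")   # the adapter weights are only a few megabytes
```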

Data center collaboration with utilities

Uptime expects that developers of new and expanded data centers will be asked to provide or store power to support grids. That means data centers will need to collaborate actively with utilities to manage grid demand and stability, potentially shedding load or using local power sources during peak times. Uptime forecasts that data center operators “running non-latency-sensitive workloads, such as specific AI training tasks, could be financially incentivized or mandated to reduce power use when required.”

“The context for all of this is that the [power] grid, even if there were no data centers, would have a problem meeting demand over time. They’re having to invest at a rate that is historically off the charts. It’s not just data centers. It’s electric vehicles. It’s air conditioning. It’s decarbonization. But obviously, they are also retiring coal plants and replacing them with renewable plants,” Uptime’s Lawrence explained. “These are much less stable, more intermittent. So, the grid has particular challenges.”
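
What such grid collaboration could look like in software is sketched below: a hypothetical control loop that pauses deferrable training jobs when a utility requests curtailment or real-time prices spike. The utility interface, price threshold, and scheduler hooks are assumptions made for illustration; neither Uptime nor any utility prescribes this particular mechanism.

```python
# Hypothetical demand-response sketch for non-latency-sensitive workloads:
# when the utility signals a curtailment event (or prices spike), pause
# deferrable AI training jobs; resume them when conditions normalize.
# The utility API, threshold, and scheduler hooks are all assumptions.
import time
from dataclasses import dataclass


@dataclass
class GridSignal:
    curtailment_requested: bool   # utility asks the site to shed load
    price_per_mwh: float          # real-time energy price


def read_grid_signal() -> GridSignal:
    """Placeholder for whatever interface the utility actually exposes
    (an OpenADR feed, a price API, or a simple webhook)."""
    return GridSignal(curtailment_requested=False, price_per_mwh=45.0)


PRICE_CEILING = 200.0  # $/MWh above which deferrable training is not worth running


def control_loop(scheduler, poll_seconds: int = 300) -> None:
    """Pause or resume deferrable training jobs based on grid conditions.

    `scheduler` is assumed to expose pause_deferrable_jobs() and
    resume_deferrable_jobs(), both of which checkpoint work safely.
    """
    while True:
        signal = read_grid_signal()
        if signal.curtailment_requested or signal.price_per_mwh > PRICE_CEILING:
            scheduler.pause_deferrable_jobs()    # shed load during the peak
        else:
            scheduler.resume_deferrable_jobs()   # stable, cheap power: run
        time.sleep(poll_seconds)
```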

Radical data center electrification

Data centers in 2025 will need to undergo radical electrification, moving toward medium-voltage systems to handle the increasing power demands of AI workloads. According to Uptime, the infrastructure requirements of next-generation AI will force operators to explore new power architectures, driving innovation in data center power delivery. As facilities take on much higher power densities, the way electrical infrastructure is designed and laid out will have to change fundamentally. AI systems are already reaching power levels of 100 kW to 120 kW per rack, far exceeding typical data center densities, Uptime reports.

“We think that this is the time when the industry will have another hard look and invest more money in overall electrification,” said Daniel Bizo, research director at Uptime Institute. “We are looking at the possibility that a growing number of facilities will be expected to handle draws that were only around in supercomputing before.”
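
A back-of-the-envelope calculation, not taken from the Uptime report, helps explain why rack densities in that range push operators toward medium-voltage distribution: for the same power, line current falls in proportion to voltage, and conductor sizes, breaker ratings, and resistive losses fall with it.

```python
# Illustrative arithmetic (an assumption-laden sketch, not from the Uptime
# report) of why 100-120 kW racks favor medium-voltage power distribution:
# for the same power, line current scales inversely with voltage.
import math


def three_phase_current_amps(power_watts: float, line_voltage: float,
                             power_factor: float = 1.0) -> float:
    """Line current of a balanced three-phase load: I = P / (sqrt(3) * V * pf)."""
    return power_watts / (math.sqrt(3) * line_voltage * power_factor)


rack_kw = 120                # one high-density AI rack, per Uptime's figure
row_kw = 10 * rack_kw        # an illustrative row of ten such racks

for volts in (415, 11_000):  # typical low voltage vs. a common medium-voltage level
    amps = three_phase_current_amps(row_kw * 1_000, volts)
    print(f"{row_kw} kW at {volts:,} V -> {amps:,.0f} A")

# Approximate output:
#   1200 kW at 415 V -> 1,669 A
#   1200 kW at 11,000 V -> 63 A
```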

Nvidia alternatives will emerge

The final prediction Uptime shared for data centers in 2025 directly addressed Nvidia’s dominance in the enterprise GPU market. Uptime expects organizations to seek alternatives to these “power-hungry GPUs, especially for inference tasks that require fewer computing resources.” Uptime reports that there are signs that AI hardware will become more diverse in 2025, and there will be alternatives to Nvidia’s high-density, liquid-cooled AI systems, including distributed GPU computing, AMD GPUs, and AI chips from hyperscalers and startups targeting inference workloads.

While Nvidia’s high-density, high-power AI systems have set the standard for what data centers will need to support generative AI workloads, Uptime says hosting such specialized Nvidia systems requires major changes to data center design and operations. Because Nvidia’s approach might not be practical or cost-effective for many enterprises, Uptime expects them to pursue other options.

“We think most enterprises will most likely use a good, general-purpose pre-trained foundation model as is, or they will develop a better model by fine-tuning a foundation model using cloud services,” said Max Smolaks, research analyst at Uptime. “As a result, most AI training will take place in cloud or hyperscaler infrastructure, either by vendors training their own foundation models to sell to others or by enterprises fine-tuning their own models. Most organizations will settle for better rather than pursuing the costly best.”
