Bain’s 2030 forecast projects that global data center demand will double to 163 GW, but power access and construction delays threaten AI’s growth curve.
The early scramble to meet generative AI–driven data center demand is giving way to a more disciplined, selective, and execution-focused phase of growth, with hyperscalers concentrating on service delivery. However, the industry still faces challenges around power availability and construction delays, according to Bain & Company’s latest 2030 global data center forecast.
While talk of an AI bubble and frontier projects such as Stargate fills the headlines, Bain’s latest baseline forecast, a scenario of continued strong AI demand and prioritized development amid some scaling delays and a gradual easing of power and component constraints, projects that global data center capacity demand will reach 163 gigawatts (GW) by 2030, twice today’s level.
By 2030, US data center electricity demand could double to 409 terawatt-hours (TWh), with AI expected to drive most of the increase, Bain finds.
“We expect there will be sufficient energy supply to meet demand,” said Aaron Denman, leader of Bain’s Americas Utilities and Renewables practice. “However, power access is now the critical gatekeeper of growth. Even as GPU and construction constraints ease, more flexible and independent sources of power will be needed. As such, behind-the-meter (BTM) power generation has become the go-to source, shifting timelines and decision-making.”
Bain projects that by 2030, US data centers could consume about nine percent of the country’s total electricity, more than double today’s share and roughly 150 TWh above the US Energy Information Administration’s baseline outlook.
Meeting this demand will require close coordination between utilities, regulators, and data center operators. In addition to ongoing efforts to build new traditional power generation and transmission infrastructure, a diverse set of coordinated actions will be necessary.
Near-term solutions include flexible demand programs that shift consumption to off-peak periods, battery storage to manage load volatility, and BTM generation such as natural gas, rooftop solar, or even restarted nuclear units. Long-term relief will depend on grid modernization, the integration of renewable energy sources, and transmission expansion.
Flexible BTM power generation can effectively support smaller, more distributed data center networks, which suit the modest requirements of inference workloads, expected to account for the majority of AI compute by the end of the decade. Mega data centers with power capacities of at least one gigawatt, however, will become standard for frontier model training.
“The general prediction that hyperscalers would scale back investments didn’t materialize in 2025. However, we are seeing more deliberate investments by hyperscalers as they scale capacity, focusing more on capital efficiency and getting more selective on locations for new deployments, particularly for AI,” said Padraic Brick, co-leader of Bain’s data center perspectives.
By 2030, Bain expects North America to still account for the largest concentration (about half) of data center capacity, fueled by hyperscalers’ capital expenditures. Meanwhile, sovereign AI mandates and enterprise adoption are driving capacity investments in other regions such as Europe and Asia-Pacific. Companies are now seeking geographic flexibility as they align compute infrastructure with latency, data sovereignty, and energy sourcing considerations, Bain finds.
Alongside power constraints, the physical build-out of data centers has become another critical challenge, with developers encountering mounting execution hurdles. Projects are slowed by lengthy permitting processes and equipment lead times of eight to 24 months. Skilled-labor shortages add further strain, but the biggest bottleneck is the electric utility connection, where delays can stretch to five years.
Analysis by Bain finds that four proven actions can cut construction timelines by up to a year: (1) identify the right markets and build a portfolio of sites, (2) opt for modular designs and prefabricated equipment, (3) use cross-functional experts to optimize designs and develop supply chains, and (4) collaborate with suppliers and pre-purchase key equipment in bulk.
“The AI data center race is no longer just about scale. Winners are taking deliberate and careful approaches to capacity investments, while at the same time, actively securing fit-for-purpose power generation and mitigating build delays,” said Peter Hanbury, leader of Bain’s global work on operations for Technology clients.


