Key takeaways
Artificial intelligence (AI) is transforming industries, but its rapid expansion is already causing a significant increase in electricity demand. Data centers that support AI model training and inference require immense computational power, putting pressure on the electric grid and raising concerns about sustainability, energy costs, and reliability. Recent projections suggest that AI-driven data centers could consume up to 9% of U.S. electricity by 2030 (equivalent to the electricity needed to power 20–40% of today’s vehicles if they were EVs), highlighting the need for policies that ensure energy-efficient, socially responsible, and environmentally sustainable development.
The emergence of DeepSeek, a highly efficient AI model, highlights a new path forward: prioritizing software and system-level optimizations to reduce energy consumption. Unlike traditional AI models that rely on massive hardware scaling, DeepSeek achieves competitive performance with a fraction of the energy use. This underscores the need for AI infrastructure planning focused on efficiency.
Traditional metrics for data center efficiency, like power usage effectiveness (PUE), are insufficient for measuring AI workloads, as they do not account for energy efficiency at the intersection of the software, hardware, and system levels. Traditional green building certification systems also cannot effectively mitigate hyperscale data centers' significant energy and water consumption. New AI-specific metrics, such as energy per AI task and grid-aware computing, must be developed to ensure that AI data centers optimize energy consumption across all levels of operation.
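To illustrate why PUE alone is insufficient, consider its definition: total facility energy divided by IT equipment energy. The sketch below (function names and figures are illustrative, not drawn from the white paper) contrasts PUE with an AI-specific "energy per task" metric of the kind the paper calls for; two facilities can post an identical PUE while differing sharply in how much AI work they deliver per kilowatt-hour.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT equipment energy.
    A PUE of 1.0 means all energy reaches IT equipment; higher is worse."""
    return total_facility_kwh / it_equipment_kwh

def energy_per_task(total_facility_kwh: float, tasks_completed: int) -> float:
    """Illustrative AI-specific metric: facility energy per completed AI task
    (e.g., per training run or per batch of inference requests)."""
    return total_facility_kwh / tasks_completed

# Two hypothetical facilities with identical infrastructure efficiency...
site_a_pue = pue(1200.0, 1000.0)  # PUE = 1.2
site_b_pue = pue(1200.0, 1000.0)  # PUE = 1.2

# ...but very different software/system-level efficiency:
site_a_ept = energy_per_task(1200.0, 600)  # 2.0 kWh per task
site_b_ept = energy_per_task(1200.0, 200)  # 6.0 kWh per task, same PUE
```

Because PUE measures only the overhead around the IT equipment, it cannot distinguish site A from site B; a task-level metric captures the software and system optimizations (of the kind DeepSeek exemplifies) that PUE ignores.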
AI workloads vary significantly in their energy use. AI training is highly energy intensive, consuming vast amounts of power in a short time, while AI inference has a lower but growing energy demand due to widespread deployment. AI infrastructure can become more energy efficient and adaptive by aligning different AI workloads with electricity generation and consumption through strategies such as demand response and advanced scheduling.
To prevent overbuilding—and overinvesting in—inefficient infrastructure, policymakers must act now to establish a framework for resource-efficient AI data centers. Key policy actions include grid integration requirements, efficiency targets, and transparency regulations supported by well-structured incentives. These measures will ensure that AI's growth aligns with national energy goals, grid stability, and economic prosperity—key factors in maintaining a competitive edge in the global AI race.
This white paper is intended for state legislators and regulators who want to establish prudent policies to guide the growing AI data center industry. If AI data centers are overbuilt and become stranded assets, they threaten to raise energy costs for other electricity customers and strain the electric grid. Energy-efficient data centers can avoid these outcomes, but only with specialized efficiency metrics based on up-to-date data from the GenAI industry.
Download the white paper
Suggested Citation
Esram, Nora, and Camron Assadi. 2025. Future-Proof AI Data Centers, Grid Reliability, and Affordable Energy: Recommendations for States. Washington, DC: ACEEE. https://www.aceee.org/white-paper/2025/04/future-proof-ai-data-centers-grid-reliability-and-affordable-energy/.