Artificial intelligence (AI) data centers are the backbone of technological innovation, powering AI applications, cloud services, and internet connectivity. However, as their energy consumption grows exponentially, policymakers face a pressing challenge: enabling AI growth while addressing grid strain, regulatory barriers, and workforce needs.
I moderated a panel of experts discussing the future of AI data centers during the 46th annual meeting of the Association for Public Policy Analysis and Management (APPAM). The panelists, who included experts from academia and industry, addressed the need to streamline regulatory barriers, the role of private industry in driving innovative solutions, federal preemption of state laws, and the importance of building a strong workforce. In this commentary, we draw on their insights, using direct quotes to illustrate the key themes discussed during the panel.
AI data centers serve as computational hubs for machine learning models, and their importance will only grow as AI adoption accelerates across industries. Current projections suggest that the computing power used to train leading AI models doubles roughly every six months, a pace that implies a sixteenfold increase every two years and demands commensurate growth in processing, storage, and networking capacity. This exponential growth drives the need for massive energy resources: training advanced AI models and keeping data centers running require significant electricity to power servers and cooling systems.
For example, training a single large AI model can consume as much energy as roughly a hundred households use in a year. AI data centers are not just high-energy consumers; they also create a ripple effect on the broader energy grid. As operations scale, these centers exert pressure on local and regional grids, necessitating infrastructure upgrades and reliable energy sources. Without careful planning, this demand risks creating bottlenecks in energy supply, particularly in regions already grappling with grid vulnerabilities.
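To put that comparison in rough numbers, consider a back-of-envelope calculation. The figures below are illustrative assumptions drawn from commonly cited public estimates, not from the panel: roughly 1,300 megawatt-hours (MWh) of electricity to train a GPT-3-class model, and roughly 10.5 MWh of electricity consumed by the average U.S. household per year.

$$\frac{1{,}300\ \text{MWh per training run}}{10.5\ \text{MWh per household per year}} \approx 124\ \text{household-years of electricity}$$

That result is consistent with the “roughly a hundred households” comparison above, and it counts only the training run itself, not the ongoing electricity used to serve the model once deployed.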
The need for streamlining regulatory barriers
Environmental reviews, zoning restrictions, and permitting delays can stretch projects over years, posing significant challenges to AI data center development. Patrick Hedger, director of policy at NetChoice, a trade association representing many major internet companies, cited Meta’s attempt to build a nuclear-powered data center in California; the effort was blocked by environmental regulators after a rare species of bee was discovered on the land. Similarly, he noted the delays faced by the Taiwan Semiconductor Manufacturing Company (TSMC) plant in Arizona, where regulatory hurdles initially stalled construction, though the plant was eventually completed.
At the federal level, outdated environmental review processes, such as those required under the National Environmental Policy Act (NEPA), are a major concern. Since its passage in 1969, NEPA has remained largely unchanged through significant transitions in America’s economy, society, politics, and environment, even as many other federal, state, and local regulations have been layered on top of it. The statute has become outdated, making compliance more complicated, inviting judicial intervention, politicizing rulemaking, and creating inconsistencies in how environmental impacts are assessed and addressed.
“A NEPA review will hold up a wind farm, a solar facility, a nuclear plant,” Hedger said, emphasizing the need to streamline federal permitting for all energy sources, renewable or otherwise. A more efficient process could reduce unnecessary delays by improving coordination between agencies, balancing environmental protection with the need for timely infrastructure development, and ensuring that bureaucratic procedures do not stall energy projects. A streamlined process can still incorporate safeguards for endangered species, such as the rare bees Hedger mentioned, which remains a valid concern for many stakeholders.
The problem isn’t limited to federal oversight. Local and state regulations, including restrictive zoning laws, further complicate the siting of data centers. “Something as boring as setback regulations [zoning laws that establish minimum distances a building must be from a property line]… can force data centers into places where the grid is not as best situated or prepared,” Hedger noted. This pushes projects into suboptimal locations, creating inefficiencies and driving up costs.
Private industry: Innovating for efficiency
Private industry continues to innovate, developing solutions to reduce energy consumption and environmental impact. James Czerniawski, a senior policy analyst at Americans for Prosperity, explained that companies are not waiting for government solutions.
“They’re looking to build microgrids [localized energy systems that can operate independently from the main grid] alongside their data centers,” he explained, citing Microsoft’s plans to restart a reactor at Three Mile Island to power its services and Amazon’s collaborations with electricity providers to secure sustainable energy. These partnerships give companies new ways to secure energy while relying on existing infrastructure.
Technological advancements are also improving data center efficiency. Czerniawski described how innovations like liquid cooling systems have cut the energy needed for cooling by more than 90%. Similarly, companies like Nvidia are redesigning chips to perform computations with less energy, while AI systems increasingly optimize their own power usage.
This relentless focus on efficiency isn’t just environmentally driven—it’s a business imperative. The panel emphasized that reducing energy use translates directly to lower costs and higher profitability, making it a win-win for companies and sustainability goals alike.
The role of federal preemption
The panel raised concerns about the possibility of states creating a patchwork of laws that hinder innovation and increase compliance costs.
Some state laws, particularly California’s, exert outsized influence on the regulation of emerging technology. The California Consumer Privacy Act (CCPA), for example, often ends up setting de facto national policy because companies default to complying with the most restrictive rules.
Divya Ramjee, an assistant professor in the Department of Public Policy and Department of Criminal Justice at the Rochester Institute of Technology, warned that a similar approach to AI could stifle innovation. “Having a federal bill that says, ‘This is what is allowed going in [to the AI model], this is what is allowed [to be used],’ … gives clarity to everybody across the board,” Ramjee said with regard to data privacy and use.
In discussing the need for federal preemption as a baseline national standard for AI regulation, Hedger explained, “The main reason we’re having these conversations is because AI is driving the demand for data centers. If we don’t address that with smart, innovation-friendly regulation, we’re going to fall behind.”
Conclusion: Powering the future
AI data centers represent not only a challenge but also an opportunity to modernize energy infrastructure and streamline regulation. Addressing these challenges requires a comprehensive, coordinated approach.
As we have previously discussed in a series of commentaries explaining data center electricity use, both federal and state-level regulations will shape the successful growth of data centers. Federal leadership must provide clear direction on AI regulation while enabling innovation. States and local governments must remove barriers to development, from zoning laws to permitting delays.
Meanwhile, private industry will continue to push the boundaries of efficiency, driving the technological advancements that make AI a tool for progress. By ensuring that the benefits of AI infrastructure are shared broadly, the United States can advance both its technological leadership and economic prosperity.