This commentary is the fifth in a series explaining data center electricity use and the nuances in regulating it. You can read the earlier commentaries here, here, here, and here.
The recent earnings announcement from Nvidia brings my data center electricity use series full circle:
> Its now-dominant data center segment increased revenue to $26.3 billion—more than 2½ times what that business generated a year earlier. Adjusted operating income for the quarter more than doubled year over year to $19.9 billion. Nvidia’s overall top and bottom lines beat Wall Street’s targets, as did the company’s forecast for the current period ending in October.
I have some closing thoughts (but no predictions!) about how this scenario will evolve, based on my reading and the previous posts in this series.
Data centers are at the nexus of significant technological and economic shifts. Their growth is intertwined with the evolving landscape of electricity generation, distribution, and regulation. Future investments will have to navigate an environment in which electricity costs, carbon intensity, regulatory pressures, and the underlying technologies are all in flux.
The biggest uncertainty facing the data center industry is not the price and availability of GPU chips for parallel computing. It’s power.
In this series I applied lots of economics, from how the elasticities of supply and demand change over time to the effects and feasibility of greenhouse gas reduction targets. But I think the most relevant economic insight comes from institutional economics and the make-or-buy decision. That insight leads us to microgrids.
BYOG: Microgrids
Applying the make-or-buy logic, one of the most significant questions for future data center investment is whether data centers will increasingly produce their own electricity through microgrids: bring your own generation. Their energy-intensive operations pose significant challenges for both commercial viability and environmental impact. Microgrids, localized grids that can operate independently of the broader electricity grid, offer a potential solution: a more resilient, efficient, and sustainable energy supply, and a pathway to enhanced energy security, cost control, and sustainability.
A microgrid is a localized energy system capable of operating independently or in conjunction with the broader power system. It typically integrates various distributed energy resources (DERs) such as solar panels, wind turbines, battery storage, small natural gas generation, and combined heat and power (CHP) systems. The microgrid can function autonomously in “island mode” or connect to the larger grid to provide or receive power as needed. This flexibility allows microgrids to be more resilient, to manage their energy costs, and to support the integration of low-carbon energy sources.
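To make the island-mode versus grid-connected distinction concrete, here is a minimal dispatch sketch in Python. It is purely illustrative: the resource names, costs, and the simple cheapest-first rule are my own assumptions, not a description of any actual microgrid controller.

```python
from dataclasses import dataclass

@dataclass
class MicrogridState:
    load_kw: float          # data center load to serve this interval
    solar_kw: float         # available on-site solar output
    battery_kw: float       # maximum battery discharge power
    gas_gen_kw: float       # maximum on-site gas generation
    grid_available: bool    # False during an outage -> island mode
    grid_price: float       # $/kWh to import from the grid
    onsite_cost: float      # $/kWh marginal cost of gas generation

def dispatch(s: MicrogridState) -> dict:
    """Toy dispatch: serve load from the cheapest available resources.

    In island mode the grid is unavailable, so load must be met entirely
    from on-site DERs; when grid-connected, the controller imports only
    if the grid price beats the marginal cost of on-site generation.
    """
    remaining = s.load_kw
    plan = {"solar": 0.0, "battery": 0.0, "gas": 0.0, "grid": 0.0}

    # Zero-marginal-cost solar is always used first.
    plan["solar"] = min(s.solar_kw, remaining)
    remaining -= plan["solar"]

    if not s.grid_available:
        # Island mode: battery, then gas, must cover the rest.
        plan["battery"] = min(s.battery_kw, remaining)
        remaining -= plan["battery"]
        plan["gas"] = min(s.gas_gen_kw, remaining)
        remaining -= plan["gas"]
    elif s.grid_price <= s.onsite_cost:
        # Grid-connected and grid power is cheaper: import the rest.
        plan["grid"] = remaining
        remaining = 0.0
    else:
        # Grid-connected but self-generation is cheaper: run the gas
        # units and import only any shortfall.
        plan["gas"] = min(s.gas_gen_kw, remaining)
        remaining -= plan["gas"]
        plan["grid"] = remaining
        remaining = 0.0

    plan["unserved"] = max(remaining, 0.0)
    return plan

# Example: a grid outage forces island mode.
print(dispatch(MicrogridState(load_kw=800, solar_kw=200, battery_kw=300,
                              gas_gen_kw=500, grid_available=False,
                              grid_price=0.12, onsite_cost=0.09)))
```

The point of the sketch is the decision structure: when the grid is down, on-site DERs must cover the load; when the grid is up, the controller compares the grid price against the marginal cost of self-generation.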
Microgrids are particularly valuable in critical infrastructure applications, such as hospitals, military bases, and, increasingly, data centers. For data centers, the ability to control their energy supply is not just a matter of cost but also a strategic imperative, given their need for continuous, reliable power and their growing role in efforts to reduce greenhouse gas emissions.
The economic rationale for microgrids is compelling. By producing their own electricity, data centers can insulate themselves from grid instability, rising energy costs, and regulatory risks. Microgrids powered by low-carbon energy sources also align with the decarbonization goals that many data center operators have set. Even though it seems an eternity ago in data center evolution, in 2020 Schneider Electric was arguing that microgrids would increase uptime while managing both costs and carbon.
Building Data Centers as Microgrids
Microgrids, especially microgrids that include storage, can serve the dual commercial and climate objectives in several ways:
- Enhanced Resilience: Data centers require an uninterrupted power supply. Microgrids can provide a higher level of energy resilience by allowing data centers to operate independently of the main grid during outages or disruptions. By incorporating backup generators, battery storage, and renewable energy sources, a microgrid can keep a data center operational even in the face of grid instability or natural disasters. Hurricane Beryl in Texas in July provided a recent example: the HEB grocery store chain was able to keep its stores powered thanks to the Enchanted Rock microgrid systems it deploys widely. Enchanted Rock uses natural gas generators to supply power when the distribution grid is down, and this multi-modal fuel approach creates a more resilient power system for its customers, including data centers.
- Energy Cost Optimization: Energy costs are a significant operational expense for data centers. By operating as a microgrid, a data center can manage its energy use and reduce costs through demand response, energy arbitrage, and sales of excess power into wholesale or local retail energy markets (as those emerge) during peak demand periods; a simple arbitrage sketch follows this list. This capability allows data centers to manage their energy expenditures and potentially generate additional revenue. Selling into external markets requires more extensive and time-consuming interconnection procedures, and given the bottleneck that interconnection currently presents, it is likely to matter less for revenue than demand response and arbitrage.
- Decarbonization: Many data center operators have committed to ambitious climate goals, including achieving carbon neutrality or running entirely on renewable energy. Microgrids enable data centers to incorporate on-site renewable generation, such as solar or wind power, coupled with battery storage to reduce their carbon footprint. Natural gas microgrid systems also provide resilience while contributing to decarbonization, and will play an important role in the resource portfolio. If you balk at the thought of natural gas as decarbonizing, consider the opportunity cost of the backup-fuel choice: diesel generators or a natural gas microgrid?
- Grid Support and Flexibility: When connected to the main grid, microgrids can provide ancillary services, such as frequency regulation and voltage support, which help maintain grid stability. Data centers can thus play an active role in supporting the broader energy system, contributing to the grid’s flexibility and reliability.
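To illustrate the arbitrage logic in the cost-optimization bullet above, here is a toy threshold-based battery schedule. The prices, battery parameters, and thresholds are hypothetical; a real operator would co-optimize arbitrage with demand charges, reliability reserves, and market rules.

```python
def battery_arbitrage(prices, capacity_kwh=1000.0, power_kw=250.0,
                      charge_below=0.05, discharge_above=0.15,
                      efficiency=0.9):
    """Toy threshold-based arbitrage over hourly prices ($/kWh).

    Charge when the price is below `charge_below`, discharge when it is
    above `discharge_above`, and sit idle in between. Returns the net
    value and the hour-by-hour schedule.
    """
    soc = 0.0            # battery state of charge in kWh
    value = 0.0
    schedule = []
    for price in prices:
        if price <= charge_below and soc < capacity_kwh:
            energy = min(power_kw, capacity_kwh - soc)
            soc += energy
            value -= price * energy                 # pay to charge
            schedule.append(("charge", energy))
        elif price >= discharge_above and soc > 0:
            energy = min(power_kw, soc)
            soc -= energy
            value += price * energy * efficiency    # discharge at a round-trip loss
            schedule.append(("discharge", energy))
        else:
            schedule.append(("idle", 0.0))
    return value, schedule

# Illustrative day: cheap overnight hours, an expensive evening peak.
hourly_prices = [0.03] * 6 + [0.08] * 12 + [0.22] * 4 + [0.06] * 2
net, plan = battery_arbitrage(hourly_prices)
print(f"net arbitrage value: ${net:,.2f} over {len(plan)} hours")
```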
The decision to adopt microgrids will depend on several factors, including the cost of generation technologies, the regulatory environment, and the pace of innovation in energy storage solutions.
Regulatory and Utility Barriers to Data Center Microgrids
While the potential benefits of microgrids for data centers are clear, existing utility regulations and the incumbent position of monopoly utilities present significant barriers to their widespread adoption.
In many regions, regulations governing electricity generation, distribution, and sales are designed around a conventional vertically-integrated utility model. These regulations often limit the ability of non-utility entities, such as data centers, to generate and distribute electricity. For example, some jurisdictions may have rules that restrict the sale of excess power generated by microgrids to third parties or the utility itself, reducing the economic viability of microgrid investments.
There’s also a political economy of microgrids, and of decentralization in general. Monopoly utilities, which own and operate the transmission and distribution wires networks, may see microgrids as a threat to their business model. As a result, they may lobby state legislators and/or regulators for rules that hinder the development of microgrids or impose high fees on microgrid operators for grid access. Utilities also typically have exclusive rights to serve their geographic service territory, making it difficult for data centers to establish microgrids that operate independently of the utility unless the microgrid stays on a contiguous private site and does not cross any public rights of way.
As I alluded to above, grid interconnection can also be a time-consuming and costly challenge. Connecting a microgrid to the broader utility grid involves navigating a complex set of technical and regulatory hurdles. Utilities often impose stringent interconnection standards that can be costly and time-consuming to meet, and the process for obtaining the necessary approvals to operate a microgrid in parallel with the utility grid can be opaque and cumbersome, discouraging data centers from pursuing microgrid projects.
At Powering Spaceship Earth, Josh Smith has an excellent analysis of, among other things, the “connect and manage” low-permission-threshold approach to interconnection in ERCOT, the Electric Reliability Council of Texas. It’s one of several dimensions along which the Texas model performs better than the approaches used elsewhere.
Finally, existing regulated rate structures often do not favor microgrid deployment. Demand charges and fixed costs embedded in electricity tariffs can diminish the financial benefits of generating power on-site. Utilities may also not offer incentives for microgrids to provide grid services, such as demand response or load shifting, which further limits both the economic attractiveness of microgrid investments and the data center’s ability to serve as a grid resource and be compensated for it.
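A stylized bill calculation shows why demand charges blunt the savings from on-site generation. The tariff numbers below are hypothetical; the point is that generation which trims energy purchases but not the monthly coincident peak leaves the demand charge untouched.

```python
def monthly_bill(energy_kwh, peak_kw, energy_rate=0.08, demand_charge=18.0,
                 fixed_charge=500.0):
    """Stylized commercial bill: energy + demand + fixed charges.

    energy_rate in $/kWh, demand_charge in $/kW of monthly peak.
    """
    return energy_kwh * energy_rate + peak_kw * demand_charge + fixed_charge

# Baseline: 10 MW average load over 730 hours, 12 MW monthly peak.
baseline = monthly_bill(energy_kwh=10_000 * 730, peak_kw=12_000)

# On-site solar cuts grid energy purchases by 30%, but the peak still
# occurs after sunset, so the billed demand is unchanged.
with_solar = monthly_bill(energy_kwh=10_000 * 730 * 0.7, peak_kw=12_000)

print(f"baseline bill: ${baseline:,.0f}")
print(f"with solar:    ${with_solar:,.0f}")
print(f"savings:       ${baseline - with_solar:,.0f} "
      f"({(baseline - with_solar) / baseline:.0%})")
```

In this illustration a 30 percent cut in grid energy purchases reduces the bill by only about 22 percent, because the demand and fixed charges are unaffected.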
Navigating the Path
Despite these challenges, the growth of data center microgrids is likely to continue as technology advances and the pressure grows to meet demand growth and decarbonization goals simultaneously. To overcome regulatory and utility barriers, data center operators can pursue policy strategies, business model innovation, and continued technological change.
Engaging with regulators and policymakers to promote regulatory reforms that support microgrid development is crucial. This engagement includes advocating for changes to interconnection standards, rate structures, and rules governing electricity sales and distribution, making sure that regulators and policymakers are aware of the frictions and the missed opportunities that the status quo represents.
Exploring new business models, such as energy-as-a-service (EaaS), where a third party owns and operates the microgrid on behalf of the data center, can also help mitigate some of the regulatory and financial risks associated with microgrid development. This idea is itself another application of institutional and organizational economics, using a contractual relationship to achieve a mutually beneficial risk allocation. EaaS saves the data center owner from having to develop specialized energy operations expertise and keeps the generation assets off its balance sheet, which can help with its financial risk profile.
As microgrid technologies continue to evolve, costs are likely to decrease, and the integration of advanced energy management systems will become more seamless. New data centers will also face lower costs of incorporating liquid cooling and other design changes that increase energy efficiency. These new energy systems and data center designs are conducive to microgrid architecture.
The existing regulatory landscape and the dominant position of monopoly utilities pose significant challenges to the widespread adoption of microgrids. Overcoming these barriers will require concerted efforts from data center operators, policymakers, and utilities to create an environment that fosters innovation and supports the transition to decentralized and decarbonized power systems.
AI and Decarbonization: Accelerating the Transition
Overhyped or not, artificial intelligence (AI) is already revolutionizing data center operations, offering opportunities to optimize energy use, reduce costs, and accelerate decarbonization. A lot of people, including Nvidia’s Jensen Huang, argue that AI will influence energy in many ways, not just by increasing the demand for electricity.
AI can enhance the energy efficiency of data centers by predicting energy demand, optimizing cooling systems, and integrating renewable energy sources more effectively. These capabilities will become increasingly valuable as data centers strive to meet growing demand while adhering to stricter environmental standards.
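As a toy illustration of prediction-driven efficiency, the sketch below forecasts next-hour IT load with a simple moving average and provisions cooling to the forecast plus a margin rather than to the worst case. Production systems use far richer models and controls; every number and function here is a hypothetical placeholder.

```python
from collections import deque

def forecast_next_hour(history, window=6):
    """Naive forecast: average of the last `window` hourly IT loads (kW)."""
    recent = list(history)[-window:]
    return sum(recent) / len(recent)

def cooling_setpoint_kw(forecast_kw, margin=0.15, min_cooling_kw=200.0):
    """Provision cooling to the forecast load plus a safety margin,
    rather than to the worst case every hour."""
    return max(forecast_kw * (1 + margin), min_cooling_kw)

history = deque(maxlen=24)
for load in [820, 790, 805, 880, 940, 1010]:   # illustrative hourly loads (kW)
    history.append(load)

predicted = forecast_next_hour(history)
print(f"forecast IT load: {predicted:.0f} kW, "
      f"cooling provisioned: {cooling_setpoint_kw(predicted):.0f} kW")
```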
But the relationship between AI and decarbonization is complex. On the one hand, AI can significantly reduce the carbon footprint of data centers by improving operational efficiency. On the other hand, the computational intensity of AI workloads can increase electricity demand, potentially offsetting some of these gains. The timing of decarbonization efforts will therefore be influenced by how quickly AI can be harnessed to manage and reduce energy consumption.
AI-driven automation could also facilitate more dynamic interactions between data centers and the grid, enabling real-time adjustments to energy use in response to grid conditions and supporting the integration of more resilient DERs.
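Here is a hypothetical sketch of that kind of grid-responsive behavior: deferrable batch compute is curtailed when a normalized grid price or carbon-intensity signal is high and runs fully when the signal is low. The signal, thresholds, and load split are illustrative assumptions, not any operator’s actual scheme.

```python
def schedule_flexible_load(grid_signal, firm_load_kw, flexible_load_kw,
                           curtail_above=0.7, boost_below=0.3):
    """Toy grid-responsive scheduler.

    `grid_signal` is a normalized stress/price/carbon signal in [0, 1].
    Firm (latency-sensitive) load always runs; deferrable batch load is
    curtailed when the signal is high and runs fully when it is low.
    """
    if grid_signal >= curtail_above:
        flexible = 0.0                      # defer batch work, shed load
    elif grid_signal <= boost_below:
        flexible = flexible_load_kw         # cheap/clean power: run everything
    else:
        # Scale batch work linearly between the two thresholds.
        span = curtail_above - boost_below
        flexible = flexible_load_kw * (curtail_above - grid_signal) / span
    return firm_load_kw + flexible

# Example: evening grid stress (signal 0.85) vs. a midday solar glut (0.20).
print(schedule_flexible_load(0.85, firm_load_kw=6000, flexible_load_kw=2000))
print(schedule_flexible_load(0.20, firm_load_kw=6000, flexible_load_kw=2000))
```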
The future of data center electricity use will be shaped by the confluence of technological innovation, regulatory developments, and market dynamics. Investments in energy efficiency, renewable energy, and microgrids will be critical as data centers strive to balance growth with decarbonization. AI will play an important role in this transition, offering both opportunities and challenges for decarbonization, and microgrids are likely to be a central part of that story.
A version of this commentary was first published at Substack in the newsletter Knowledge Problem.