
Bridging the Gap: With Local Grids Stretched to the Limit, Data Centers Need a Plan for Reliable Supplemental Power

Grid2Power • August 23, 2024

Andrew Batson, Head of U.S. Data Center Research for JLL

Data center energy demands, fueled by the relentless growth of Artificial Intelligence (AI) and cloud computing, are surging across the U.S., threatening to outpace the capacity of local power grids. This is prompting data center owners and developers, as well as utilities, to scramble for solutions.

Companies searching for their next data center location need clarity on whether a facility can realistically support their power density and cooling requirements, both today and into the future. In all likelihood, some facilities will need to supplement grid power with non-traditional sources to handle these growing loads, especially during times of peak usage.

Data center owners, operators and users need to be aware of what viable supplemental energy sources exist, both immediately and into the future, to bridge the expected power gap. Below is a rundown of the various power sources that can help “top off” the grid during peak capacity times, with the pros and cons of each.

Natural gas

Supplemental natural gas turbines are the most accessible and robust bridge power solution. They are reliable and can be rapidly incorporated into a power plan, helping deliver necessary power within three to six months. They are also versatile, supporting base load or peak demand, and can transition to backup power after the bridging period. Of course, any form of natural gas is dependent on a firm gas supply, whether via pipeline or virtual delivery, but in the near term it remains a scalable and economically viable option while other, perhaps cleaner, power technologies mature in the market.

Battery storage

Battery storage acts best as a peak power solution, charging during low grid demand (at a lower cost) and discharging during peak utility constraints. This storage is particularly helpful for data centers' slow and steady flat load profiles, which typically don’t experience sudden power spikes. Batteries also provide grid support by reducing the power drawn from the grid during peak periods. This flexibility helps provide overall grid stability and reduces the chances of sudden disruptions to data centers.
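
To make the charge-low, discharge-high behavior concrete, below is a minimal sketch of a threshold-based peak-shaving dispatch loop. The load profile, battery size, rate limits, and thresholds are illustrative assumptions, not figures from any specific facility or product.

```python
# Minimal peak-shaving sketch: charge the battery when grid demand (and price)
# is low, discharge it when the facility would otherwise draw peak power.
# All constants are illustrative assumptions. Timestep is one hour, so MW and
# MWh values exchange one-for-one in this toy model.

BATTERY_CAPACITY_MWH = 20.0   # assumed usable energy
MAX_RATE_MW = 5.0             # assumed charge/discharge limit per hour
PEAK_THRESHOLD_MW = 40.0      # grid draw above this triggers discharge

def dispatch(hourly_load_mw, off_peak_hours):
    """Return (grid_draw, state_of_charge) per hour for a flat data center load."""
    soc = BATTERY_CAPACITY_MWH / 2   # start half full
    schedule = []
    for hour, load in enumerate(hourly_load_mw):
        if hour in off_peak_hours and soc < BATTERY_CAPACITY_MWH:
            charge = min(MAX_RATE_MW, BATTERY_CAPACITY_MWH - soc)
            soc += charge
            grid_draw = load + charge          # buy extra energy while it is cheap
        elif load > PEAK_THRESHOLD_MW and soc > 0:
            discharge = min(MAX_RATE_MW, load - PEAK_THRESHOLD_MW, soc)
            soc -= discharge
            grid_draw = load - discharge       # shave the peak seen by the utility
        else:
            grid_draw = load
        schedule.append((grid_draw, soc))
    return schedule

# Example: a mostly flat 42 MW load with cheap overnight hours 0-5.
profile = [42.0] * 24
print(dispatch(profile, off_peak_hours=set(range(6)))[:3])
```

The design choice here is deliberately simple: because data center loads are close to flat, a fixed threshold is often enough to decide when to shave, whereas spikier industrial loads would need forecasting or price signals.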

Fuel cells

Fuel cells are electrochemical devices that convert the chemical energy of a fuel source, such as natural gas, directly into electricity without combustion. Several high-profile projects have showcased the efficiency and reliability of natural gas fuel cells as a low-emissions alternative to combustion-based power. The technology is financially competitive with other power-bridging options, particularly in high-priced natural gas markets. Implementation rates are quickly increasing, making this a technology to watch. Fuel cells have been installed at projects exceeding 100 MW; however, they are most commonly found at data centers under 50 MW.

Solar/wind power

Solar and wind's inherently intermittent output means they can't be relied on as a bridge power source for data centers. However, these clean energy sources can support grid resilience and offset peak power demands by serving as complementary power sources for existing substations.

Small modular nuclear reactors

Though promising as a future power source, small modular nuclear reactors (SMRs) are not entirely competitive for bridge power today because of their long development and government approval timelines, not to mention their high price tag. However, many industry experts expect SMRs to reach commercial viability in five to 10 years, making them potential candidates for both bridge and grid power over the long term.

Diesel generators

Diesel generators, like natural gas turbines, deliver fast, consistent power for data centers. Despite being less environmentally friendly than other solutions, their established supply chains and operational reliability make them a dependable choice for short-term power needs.

Factoring power into your data center location decision

Choosing the right data center location is critical for ensuring smooth operations and minimizing business disruptions. Beyond factors like connectivity and security, a crucial consideration is the site's power infrastructure. For data center occupiers and colocation providers, a reliable and robust power supply is paramount.

It’s important to evaluate all options before shortlisting your top choices. During the site selection process, here are a few of the key considerations to watch for regarding power:

  • Grid reliability:
      ◦ What redundancy protections are in place to ensure minimal downtime and disruptions?
      ◦ How easily can the colocation or other facility access the power grid? Is the grid reliable and capable enough to support the facility’s power needs?
      ◦ How frequent and lengthy are power outages in the area, if any?
  • Access to renewable energy:
      ◦ Does the facility have access to renewable energy sources (e.g., wind, solar) that can help lessen its environmental impact?
  • Infrastructure capacity:
      ◦ Can the existing power grid handle the data center's projected power demands?
      ◦ Can the facility accommodate future data center expansion?
  • Regulatory environment:
      ◦ What local regulations govern power grid access and backup power systems (e.g., permits, technical specifications)?
      ◦ Are there local regulations impacting data center power sourcing options (e.g., energy efficiency standards, power purchase agreements, or PPAs)?

With so much at stake in terms of both short- and long-term data storage, the success of any location hinges on the right power strategy: one that sees the macro trends but also understands what’s happening regionally and accounts for unique latency needs and other site-specific concerns.


By Grid2Power August 20, 2024
As technology demands exceed the U.S. grid’s capacity and upgrades will take decades, the need for reliable off-grid energy has surged. Off-grid micro data centers provide rapid, scalable alternatives to outdated traditional data center facilities. In “Powering the Future of AI: Addressing the Looming Energy Challenge” (Forbes, July 16, 2024), Scott Wassmer states: “By 2030, the United States alone is projected to reach 35 gigawatts (GW), nearly double the 17 GW consumed in 2022. This surge is driven by the substantial computational power and cooling capabilities needed for AI and machine learning.”
By Grid2Power August 20, 2024
Benj Edwards’ article “Elon Musk claims he is training ‘the world’s most powerful AI by every metric’” (2024), retrieved from News Channel 3 WREG Memphis, outlines Musk’s unveiling of what he claims is “the world’s most powerful AI training cluster” at xAI's new supercomputer facility in Memphis, Tennessee. This facility, dubbed the “Memphis Supercluster,” boasts a staggering 100,000 liquid-cooled H100 GPUs connected via a single RDMA fabric, promising to give xAI a significant edge in AI training (Edwards, 2024). However, as with many of Musk's ambitious projects, there are notable hurdles that could impact its success.

Power Issues and Local Concerns

According to Edwards’ reporting, the facility's enormous power requirements, up to 150 megawatts at peak times, have raised concerns about its impact on the local power grid and infrastructure (a rough back-of-the-envelope check of these figures appears below). Doug McGowen, president of Memphis Light, Gas and Water (MLGW), has highlighted the uncertainty surrounding the project’s full power needs and its potential effects on the local utility system (Edwards, 2024). In response to these concerns, xAI has temporarily deployed 14 VoltaGrid natural gas generators to supplement the facility’s power needs while negotiating a formal power agreement with the Tennessee Valley Authority (TVA) (Edwards, 2024). Despite these interim measures, there is considerable anxiety among Memphis residents and officials about the long-term implications for the city’s power supply.

A Sustainable Solution: Harnessing Flare Gas

Given the power challenges, an innovative and sustainable solution could be to utilize flare gas from oil fields as a power source for the Memphis Supercluster. Gas flaring, the process of burning off excess natural gas at oil extraction sites, represents both an environmental issue and a waste of valuable energy resources. Integrating flare gas capture and conversion technology into the supercomputer’s power strategy could offer several benefits:

1. Environmental Impact Reduction: Using flare gas for electricity generation would decrease the environmental harm caused by gas flaring. This approach would lower greenhouse gas emissions and reduce other pollutants associated with gas flaring, contributing to global efforts to combat climate change.
2. Cost Efficiency: Converting flare gas into power could lead to significant cost savings for the Memphis Supercluster. By using waste gas as a power source, xAI could reduce operational expenses, making the project more financially sustainable in the long run.
3. Energy Independence: By generating its own power from flare gas, xAI could mitigate its reliance on the local power grid. This would ensure a more reliable and continuous operation of its AI training facility, even if local power supply issues arise.
4. Community Relations: Demonstrating a commitment to eco-friendly and innovative solutions could improve xAI’s standing with the Memphis community. Addressing local power concerns with sustainable practices could foster positive relationships and alleviate some of the community’s apprehensions.

The Future of AI and Energy

As xAI pushes forward with its goal of developing the world’s most powerful AI, integrating flare gas utilization into the facility’s power strategy could provide both a practical and ethical solution to its energy challenges. This approach would not only support the supercomputer’s operational needs but also align with broader sustainability goals.
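
As a rough, back-of-the-envelope check on the power figures Edwards reports, the sketch below estimates facility draw from the GPU count. The per-GPU wattage, server overhead, cooling overhead (PUE), and per-generator rating are assumptions chosen for illustration, not figures from xAI, NVIDIA, or VoltaGrid; only the 100,000-GPU count, the ~150 MW peak, and the 14 generators come from the article.

```python
# Back-of-the-envelope check on the reported ~150 MW peak figure.
# Every constant marked "assumed" is an illustrative guess, not a sourced number.

GPU_COUNT = 100_000           # reported H100 count (Edwards, 2024)
WATTS_PER_GPU = 700           # assumed board power per GPU
SERVER_OVERHEAD = 1.5         # assumed multiplier for CPUs, memory, networking
PUE = 1.3                     # assumed cooling/power-delivery overhead

it_load_mw = GPU_COUNT * WATTS_PER_GPU * SERVER_OVERHEAD / 1e6
facility_mw = it_load_mw * PUE
print(f"Estimated IT load:        {it_load_mw:.0f} MW")   # ~105 MW
print(f"Estimated facility draw:  {facility_mw:.0f} MW")  # ~137 MW, in the ballpark of 150 MW

# If each temporary natural gas generator supplies ~2.5 MW (assumed rating),
# 14 units cover only a fraction of that draw.
GENERATORS = 14
MW_PER_GENERATOR = 2.5
print(f"Temporary generation:     {GENERATORS * MW_PER_GENERATOR:.0f} MW")  # 35 MW
```

Under these assumptions the arithmetic lands in the same range as the reported peak, which is the point of the exercise: the interim generators plausibly bridge part of the load, not all of it.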
In a competitive AI landscape dominated by tech giants like OpenAI, Microsoft, Amazon, and Google, leveraging sustainable energy solutions could offer xAI a distinct advantage. The success of such an integration could set a precedent for future projects, showcasing how cutting-edge technology and environmental stewardship can go hand in hand.

References:
Edwards, B. (2024, July 24). Elon Musk claims he is training “the world’s most powerful AI by every metric”; one snag: xAI might not have the electrical power contracts to do it. News Channel 3 WREG Memphis.
By Grid2Power August 20, 2024
In recent years, organizations within the energy industry have faced intense pressure to cut costs, deal with unpredictable shifts in demand, and enhance their competitiveness by making decisions based on up-to-the-minute information. This has led to an increasing share of computing resources being moved out to where the data is collected. The rapid increase of IoT devices within the energy sector sets new, high expectations for companies active in the industry, spanning both enterprise operators and solution vendors targeting the unique challenges of energy. At the heart of it is building a versatile infrastructure capable of managing large amounts of data. This provides valuable insights, enables predictive maintenance and helps address some of the most crucial challenges of staying competitive as an oil and gas enterprise or service provider. In this article, we’ll look at which characteristics of the energy industry drive the need for digital change and how edge computing can address some of the challenges of achieving operational excellence in distant, numerous locations.

Downtime within energy is not like any downtime

All industries aim to minimize downtime as much as possible, but the energy sector stands out when looking at the consequences downtime causes. Within the energy sector, non-stop operations are crucial, and proactive or instant mitigation of incidents is key to minimizing damage and cost. According to an MIT Sloan study, a single day of downtime for a liquefied natural gas (LNG) facility can cost $25 million, and a typical midsize LNG facility goes down about five times a year. Meanwhile, an average offshore oil rig generates 1-2 TB of data every day, and without the proper infrastructure to support that, it can take up to 12 days before the data can be transmitted to a central cloud for computing (see the rough transfer-time calculation at the end of this article). That is both costly and far too slow for digital strategies that include any AI or machine learning. The problem statement couldn’t be clearer, and the need for minimized downtime and reduced costs go hand in hand at the top of the priorities for actors within the energy sector.

Unlocking operational autonomy on-site

Since the operational locations of an energy sector enterprise can be offshore, distant, and numerous, the idea of moving infrastructure and application workloads out to the operational sites, a technology typically referred to as edge computing, is increasingly being adopted throughout the industry. Edge computing has become a vital means of addressing the challenge of accessing crucial data, particularly in isolated or offshore areas such as oil platforms, drilling rigs, wind parks, solar plants, or even oil tankers, where internet connections are unreliable and operations need to run autonomously. With operational autonomy and edge computing, energy sector operators become resilient to the downtime and connectivity outages that inevitably occur at on-site operational locations. This not only contributes to competitiveness and operational excellence but is key to reducing safety risks and costs.

The benefits of edge computing within energy

Pursuing an edge computing strategy within the energy sector brings several strong benefits, including:

  • Application autonomy, ensuring non-stop operational excellence at offshore or isolated locations.
  • Higher efficiency and productivity with local compute resources.
  • Increased safety with automated and instant incident detection and mitigation.
  • Decreased costs from not having to move data back and forth to the cloud more than needed.

Who benefits from using edge computing in energy?

The above-mentioned benefits can be reaped by energy sector operators and software providers within energy alike. The distributed and distant nature of the operational locations within the energy sector makes it a match made in heaven with edge computing technology.
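
To put the data-transfer problem in perspective, here is a small calculation of how long it could take to move an offshore rig's daily 1-2 TB of data over a constrained uplink. The link speeds are illustrative assumptions, not measurements from any specific rig or satellite service.

```python
# How long does it take to move ~2 TB of rig data over a slow uplink?
# Link speeds below are illustrative assumptions, not measurements.

def days_to_upload(data_tb, link_mbps):
    """Days required to move data_tb terabytes over a link of link_mbps megabits/s."""
    bits = data_tb * 8e12                # 1 TB (decimal) = 8e12 bits
    seconds = bits / (link_mbps * 1e6)
    return seconds / 86_400              # seconds per day

for mbps in (2, 10, 50):                 # assumed satellite/VSAT uplink speeds
    print(f"{mbps:>3} Mbps: {days_to_upload(2, mbps):5.1f} days per 2 TB batch")

# At a few Mbps, moving a single day's data can itself take weeks, which is
# why processing the data on-site (edge computing) is attractive.
```

The exact numbers depend on the link, but the shape of the result is the point: at bandwidths typical of remote sites, shipping raw data to a central cloud cannot keep pace with the rate at which it is produced.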