
AI’s Thirst for Power Is Dragging ‘Zombie’ Peaker Plants Out of Retirement
AI’s insatiable appetite for electricity and water is forcing grid operators to resurrect dirty ‘peaker’ power plants, clouding tech giants’ climate promises.
The Cloud’s Hidden Carbon Bill
On the outskirts of Phoenix, where saguaros give way to server racks, a relic of the 1970s is sputtering back to life. Unit 3 at the Mesquite Peaker Station—once mothballed, its turbines wrapped in tarps—was spinning again last month, burning natural gas to keep an Amazon Web Services campus cool.
The reason? A new neighbor: a 500,000-square-foot AI training facility that gulps electricity like a steel mill and drinks water like a dairy farm. Across the United States, similar scenes are repeating themselves. Utilities that spent a decade retiring fossil plants are quietly reviving them, because the cloud’s newest workload—large-language-model training—won’t wait for solar panels to catch the morning sun.
From Cat Videos to Carbon Peaks
“Five years ago we worried about streaming video; today a single AI model can use more power than the entire city of Cleveland does in a day,” says Maya Patel, a grid analyst at the Electric Power Research Institute (EPRI).
The numbers are stark. A conventional data center draws about 8 MW—enough to power 5,000 homes. A hyperscale AI campus can spike to 120 MW, and water-cooled chips add another twist: every megawatt of load can evaporate roughly 1.7 million gallons of water a year, according to Department of Energy figures.
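Taken together, those figures imply a striking scale-up. A rough sketch of the arithmetic, using only the numbers above (the homes-per-megawatt ratio is derived from the 8 MW ≈ 5,000 homes comparison; this is an illustration, not a sourced estimate):

```python
# Back-of-envelope math using the figures cited in this article.
CONVENTIONAL_MW = 8          # typical data center draw
AI_CAMPUS_MW = 120           # hyperscale AI campus at peak
HOMES_PER_MW = 5_000 / CONVENTIONAL_MW   # 625 homes per megawatt
GALLONS_PER_MW_YEAR = 1.7e6  # evaporative cooling water per MW, per DOE figure

homes_equivalent = AI_CAMPUS_MW * HOMES_PER_MW
water_per_year = AI_CAMPUS_MW * GALLONS_PER_MW_YEAR

print(f"120 MW campus ≈ {homes_equivalent:,.0f} homes")                    # 75,000 homes
print(f"Cooling water ≈ {water_per_year / 1e6:,.0f} million gallons/year")  # 204 million
```

By this arithmetic, a single AI campus draws as much power as a mid-size city and evaporates more water each year than many small municipal utilities deliver.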
Peaker Plants: The Grid’s Emergency Exit
Peaker plants were built to run only a few hundred hours a year, when air-conditioners max out on the hottest afternoons. Now grid operators are calling them up at 2 a.m. to meet the rhythmic hunger of GPUs that never sleep. In Texas, ERCOT has re-listed 3.4 GW of retired gas turbines since January; California’s CAISO has extended the life of four coastal peakers that were scheduled to close in 2025.
The revival undercuts climate pledges. A single peaker emits as much CO₂ in one hour as 560 gasoline cars do in a week. Yet without them, voltage collapses and the AI boom stalls.
Water, Fire, and Algorithms
Power is only half the story. AI chips run hot; cooling them requires either evaporative towers—cheap but thirsty—or energy-hungry chillers. Google’s data-center report shows a 20% rise in water use since 2021, almost all traced to AI workloads. In drought-stricken Arizona, that has turned tech giants into unlikely water buyers, bidding against lettuce farmers in Yuma.
The Search for a Cooler Circuit
Silicon Valley is racing for fixes. Microsoft is piloting immersion cooling: servers bathed in non-conductive fluid that cuts both energy and water use by 90%. Nvidia’s H100 GPU trains models 30% faster per watt. Start-ups such as Crusoe Cloud are placing data centers at shale-flaring sites, turning waste gas into compute cycles.
But the stop-gap is still fossil. “We can’t build renewables fast enough to match AI demand curves,” admits one utility executive who asked not to be named. “So the peakers stay.”
What Happens Next
- The Federal Energy Regulatory Commission will hold hearings next month on fast-tracking grid connections for “flexible loads,” a category that now includes AI farms.
- California regulators may delay the retirement of six coastal peakers until 2029, citing “load growth beyond forecast.”
- Google, Amazon, and Microsoft have jointly pledged $12 billion in green bonds to fund off-site solar and wind, but most projects will not come online before 2027.
Until then, every prompt you type into ChatGPT has a small but real chance of lighting up a turbine that was supposed to fade into history.