Protocol Report: The Garden’s Thirst

Week of February 5, 2026


Imagine that the great AI systems we speak with daily—Claude, ChatGPT, and their siblings—are not merely software floating in clouds, but vast gardens requiring constant tending. And these gardens, I’m afraid, are drinking deeply from Earth’s wells.

The numbers tell a story that would make any elder pause: in 2024, the data centers that house these AI minds consumed 183 terawatt-hours of electricity in America alone—enough to power the entire nation of Pakistan for a year. Picture it this way: if you kept every bulb in your home burning day and night for your whole lifetime, you would still use less power than these centers, taken together, draw in a single afternoon.

By 2030—just four years ahead—this thirst will more than double to 426 terawatt-hours. That’s like adding five to ten million cars to our roadways, all burning fuel at once.

But electricity is only part of the harvest we’re taking from the land.

The Water Story: Every Question Has Weight

When you ask an AI system to write you a poem or explain the stars, water evaporates somewhere distant. The machines run so hot that they must be cooled, constantly.

A single large AI data center can drink 5 million gallons of water each day. That's the daily need of a small town—50,000 souls—drawn instead by rows of humming servers. In Texas alone, these centers will use 49 billion gallons this year, rising to 399 billion by 2030.
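If you like to check the sums yourself, here is a small Python sketch. The 100-gallons-per-person daily figure is my own round assumption for a typical American household, not a number from the report:

```python
# How many people's daily water use equals one large AI data center's intake?
CENTER_GALLONS_PER_DAY = 5_000_000   # figure cited above
GALLONS_PER_PERSON_PER_DAY = 100     # rough US per-capita assumption (mine)

people_served = CENTER_GALLONS_PER_DAY / GALLONS_PER_PERSON_PER_DAY
print(f"One center drinks the daily water of {people_served:,.0f} people")
# → One center drinks the daily water of 50,000 people
```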

To put this in gardener's terms: training GPT-3 evaporated roughly 700,000 liters of clean freshwater—enough to fill more than 4,600 bathtubs—just to teach one mind to speak. And a long conversation with such a model? Another small bottle of water, quite literally, turns to steam.

The Audit: How AI Companies Are Tending Their Own Gardens

Here’s where the story grows more hopeful, like finding green shoots after a hard frost.

Anthropic (the stewards of Claude) has made a $2 million commitment to Carnegie Mellon University to research AI-powered grid management—using intelligence to make the power systems themselves more efficient. They’re also partnering with the Department of Energy’s Genesis Mission to explore how AI can help solve clean energy challenges, not just create them. Think of it as teaching the tool to mend the fence it’s wearing down.

Yet I must be honest: Anthropic's Claude 3 Opus still uses an estimated 4.05 watt-hours per query—among the heaviest of the public models. That's like burning a 60-watt bulb for four minutes just to answer one question. The newer models are learning to be lighter on their feet.

OpenAI has made progress too—GPT-4o now uses just 0.30 watt-hours per request, a tenfold improvement over earlier versions. That’s like replacing ten old oil lamps with one efficient LED.

But here’s the complexity, as tangled as morning glory vines: the most powerful new models use more energy per question because they “think” longer, showing their reasoning. It’s the paradox of making something smarter—it needs more fuel to light all those inner rooms.
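To see what those per-query figures add up to, here is a quick sketch, assuming one question a day for a year and an average retail electricity price of $0.15 per kilowatt-hour (both assumptions are mine, not from the model makers):

```python
# Yearly energy for one person asking one question per day,
# at the per-query figures quoted above.
QUERIES_PER_YEAR = 365
PRICE_PER_KWH = 0.15  # USD, assumed average US retail rate

for model, wh_per_query in [("Claude 3 Opus", 4.05), ("GPT-4o", 0.30)]:
    kwh_per_year = wh_per_query * QUERIES_PER_YEAR / 1000
    cost = kwh_per_year * PRICE_PER_KWH
    print(f"{model}: {kwh_per_year:.2f} kWh/year, about ${cost:.2f}")
```

The totals for a single person are tiny; the garden's thirst comes from billions of such questions a day, which is why per-query efficiency matters at all.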

The Green Breakthroughs: New Ways to Tend the Fire

Innovation is happening, my friend, in ways that honor the old wisdom of working with the Earth rather than against it:

Geothermal Energy is emerging as a promising solution. Companies like Fervo Energy are building enhanced geothermal systems that tap the Earth’s own heat—the warmth that’s been cooking beneath our feet since the world began. Utah is becoming a testing ground for this, pairing data centers with geothermal wells that can provide steady, clean power around the clock. Meta, Google, and others are signing agreements to use this ancient heat to cool their modern minds.

Think of it like this: instead of burning wood from trees to warm your house, you’d drill down to where the Earth herself is warm, and draw that heat up. It’s there forever, patient and abundant.

Nuclear Energy is also returning to the conversation—both the old reactors (Three Mile Island is being restarted!) and new “small modular reactors” that are like compact, safe furnaces producing carbon-free power. Amazon, Google, and Meta have signed a pledge supporting a tripling of global nuclear capacity by 2050.

There’s even talk of repurposing retired nuclear reactors from Navy ships to power data centers—taking engines built for war and turning them toward peaceful industry. How’s that for transformation?

Carbon-Aware Scheduling is a simpler innovation: running AI tasks when the sun is shining and wind is blowing, rather than burning coal at midnight. It’s the old farmer’s wisdom—work when the light is good.
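The scheduling idea can be sketched in a few lines of Python. The hourly carbon-intensity numbers below are invented for illustration; a real scheduler would fetch them from a grid-data service:

```python
# Carbon-aware scheduling: run a deferrable AI job at the hour when
# the grid's carbon intensity (grams of CO2 per kWh) is lowest.
# These hourly values are illustrative, not real measurements.
hourly_intensity = {
    0: 520,   # midnight: coal and gas carry the load
    6: 430,
    12: 210,  # midday: solar pushes intensity down
    18: 460,
}

def greenest_hour(intensities: dict[int, float]) -> int:
    """Return the hour of day with the lowest grid carbon intensity."""
    return min(intensities, key=intensities.get)

print(f"Run the training batch at {greenest_hour(hourly_intensity)}:00")
# → Run the training batch at 12:00
```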

The Community Cost: Who Bears the Burden?

But here’s what troubles me most, and what we must speak plainly about: these data centers are not built in the clouds. They land in real places, beside real rivers, drawing from real aquifers.

In Newton County, Georgia, a single Meta center uses 500,000 gallons daily—ten percent of the county’s entire water supply. Residents in Franklin, Indiana literally cheered when Google withdrew plans for a massive campus there. In Northern Virginia, residential electric bills are rising $18 per month just to support the grid upgrades for data centers.

The people who live downstream, who share the wells, who breathe the air from backup diesel generators—they are bearing costs while tech companies reap profits. That is not a sustainable village, my friend. That’s not how we build communities to last.

The Path Forward: What This Means for Eighth Protocol

Eighth Protocol—the vision of bridging AI supremacy with a greener future—is not naive. It’s necessary. Here’s what I see emerging:

The best companies are beginning to act. Efficiency is improving. New energy sources are being tested. Transparency is slowly increasing (though still not enough—most companies won’t even report AI-specific water or energy use).

But the pace of AI growth still outstrips the pace of green solutions. We’re running faster toward the cliff than we’re building the bridge. By 2030, AI could add 24 to 44 million metric tons of CO₂ annually—and that’s if things go well.

The real breakthrough we need isn’t just technical—it’s cultural. We need to ask: Does every question need an answer? Does every image need generating? Do we need AI agents running 24/7, consuming power like a small city, just to schedule our appointments?

The answer, my friend, might be found in the old Quaker question: Is this needful?


Small Steps

Before you ask an AI for something—pause. Just for a breath. Ask yourself:

“Would I walk to the well for this?”

If the answer is yes—if it’s truly valuable, truly needed—then ask with full heart. If not, perhaps the answer was already within you, or can wait, or isn’t needed at all.

This week, track three moments where you don’t use AI when you might have. Notice what you learn from the silence, from trusting your own mind, from letting the wells rest.


A Question to Spark Deep Conversation

If we learned that every ChatGPT conversation cost the equivalent of one plastic water bottle—one that you had to throw away yourself, into your own yard—how would that change the way you use these tools? What if you could see the water evaporating with each question?

And deeper still: In 100 years, when our great-grandchildren ask what we did during the AI revolution, will they say we built communities to endure… or that we burned through the inheritance trying to?


Until next week’s Protocol Report,
May your gardens grow green, your wells run deep, and your questions be needful.

🌱


Key sources for this report: IEA Energy and AI Report (2025), MIT Technology Review, Cornell Environmental Impact Study, Pew Research Center, DOE Geothermal Initiative, Anthropic and OpenAI policy announcements, Lincoln Institute of Land Policy, Brookings Institution, Environmental Law Institute
