Sam Altman Says Training a Human Costs More Energy Than ChatGPT — He’s Serious
The OpenAI CEO just compared your entire childhood to a GPU cluster. At an AI summit in India. With a straight face.
AI data centers are projected to consume 90 TWh of electricity by 2026 — roughly the annual power consumption of Belgium. Altman’s defense? “It takes 20 years of life and all of the food you eat before you get smart.”
Between you and me, when a CEO starts comparing server farms to human evolution, that’s the tell. That’s the moment you know the energy bills got ugly enough to need a PR strategy.

🧩 Dumb Mode Dictionary
| Term | What It Actually Means |
|---|---|
| “Evaporative cooling” | Old-school data center cooling that dumps heat by evaporating water (not boiling it). They stopped because the optics got ugly. |
| “Energy per query” | How much electricity one ChatGPT question burns. Altman says don’t measure this way. |
| “Total energy consumption” | The real number — all AI queries combined. This is the one that keeps growing. |
| “Training a human” | Altman’s metaphor for 20 years of food, school, and not getting eaten by predators. Yes, really. |
| “Nuclear or wind and solar” | Altman’s proposed fix → just build more power plants. Easy, right? |
| “1.5 iPhone charges per query” | A stat Bill Gates reportedly floated. Altman says “no way it’s anything close to that.” |
📰 What Altman Actually Said
Speaking at an event hosted by The Indian Express during his India visit, Altman laid out his whole defense:
- Water usage concerns? “Totally fake.” He claims they stopped evaporative cooling in data centers, so the 17-gallons-per-query stat floating around is dead.
- The iPhone battery comparison? Bill Gates apparently said one ChatGPT query = 1.5 iPhone charges. Altman’s response: “There’s no way it’s anything close to that much.”
- His big move: comparing AI training costs to human training costs. His exact words → “It takes like 20 years of life and all of the food you eat during that time before you get smart.”
- He went further: “It took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators… to produce you.”
So in Altman’s math, ChatGPT’s energy bill is a bargain compared to the caloric cost of raising a child. That’s the argument.
📊 The Numbers That Actually Matter
| Metric | Number |
|---|---|
| Projected AI data center power (2026) | 90 TWh/year |
| Total global data center power (2026) | 650–1,050 TWh/year |
| US data center share of electricity | ~6% of total demand |
| Global data center capacity growth | Nearly 2x from 2023 to 2026 (to roughly 96 GW) |
| Companies required to disclose energy use | Zero (no legal mandate) |
Here’s the thing nobody’s saying out loud: there’s literally no legal requirement for these companies to tell us how much energy or water they use. Scientists are running their own estimates because OpenAI won’t publish the real numbers.
🗣️ The Internet Had Opinions

The backlash was immediate and savage:
- The “humans still exist” argument → Even if AI replaces workers, those humans don’t vanish. They still eat. They still use electricity. You just added a second energy consumer on top.
- The training data problem → Critics pointed out that AI’s training data was created by billions of humans over centuries. If you’re counting human energy costs, count the energy that produced the data the model ate.
- The transparency problem → Hard to argue your energy costs are “fake” when you won’t publish the actual numbers.
- Someone on Lemmy filed this under “Not The Onion.” Which, fair.
🔍 Why He Said This in India, Specifically
This wasn’t random. Altman was at an AI summit in India — a country where:
- Electricity costs are a top-of-mind issue for both government and voters
- Data center buildouts are accelerating fast
- Local pushback against power-hungry tech infrastructure is growing
- OpenAI is actively expanding its user base there (hundreds of millions of potential users)
The play → get ahead of the energy narrative before Indian regulators or media turn it into a problem. Same playbook he’s run in every market. Show up, charm the room, reframe the question before anyone pins you down on the actual watts.
Cool. The world’s most famous AI CEO just told you your childhood cost more energy than his data centers. Now what the hell do we do? ( ͡ಠ ʖ̯ ͡ಠ)

💰 Hustle 1: Build an AI Energy Cost Calculator That Goes Viral
Every time Altman opens his mouth about energy, thousands of people Google “how much energy does ChatGPT use.” Here’s what you do: build a simple web tool that lets people calculate the energy cost of their AI usage → compare it to household appliances, driving miles, whatever.
Monetize with affiliate links to green energy providers, carbon offsets, or just good old display ads on viral traffic.
Example: A developer in Lisbon built a “carbon footprint calculator for streaming” back when Netflix got heat for emissions. Simple Next.js app, took a weekend. It hit the front page of Reddit three times and pulls in ~$900/month from AdSense and energy company referrals. Same playbook, new target.
Timeline: Build in a weekend → post to r/technology and HN → iterate based on viral spikes → add API/embed for journalists
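For the appliance-comparison math, here’s a minimal Python sketch of the calculator’s core logic. The per-query watt-hour figures are placeholder assumptions (independent estimates range from a few tenths of a watt-hour to a few watt-hours per query), not published OpenAI numbers, and the appliance reference points are rough household-scale values; swap in whatever sources you trust.

```python
# Core math for an AI energy cost calculator (illustrative sketch).
# All constants below are assumptions, not published OpenAI figures.

WH_PER_QUERY_LOW = 0.3    # assumed low-end estimate, watt-hours per query
WH_PER_QUERY_HIGH = 3.0   # assumed high-end estimate, watt-hours per query

# Rough household reference points for comparison
WH_PHONE_CHARGE = 15.0    # approx. one full smartphone charge
WH_LED_BULB_HOUR = 10.0   # a 10 W LED bulb running for one hour

def usage_report(queries_per_day: int) -> str:
    """Turn a daily query count into human-readable comparisons."""
    low = queries_per_day * WH_PER_QUERY_LOW
    high = queries_per_day * WH_PER_QUERY_HIGH
    return (
        f"{queries_per_day} queries/day ≈ {low:.0f}–{high:.0f} Wh/day, "
        f"or {low / WH_PHONE_CHARGE:.1f}–{high / WH_PHONE_CHARGE:.1f} phone charges, "
        f"or {low / WH_LED_BULB_HOUR:.1f}–{high / WH_LED_BULB_HOUR:.1f} LED-bulb-hours."
    )

if __name__ == "__main__":
    print(usage_report(50))
```

The front end (the Next.js page, the share buttons) is what makes it viral; the math itself is this trivial.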
💡 Hustle 2: Sell 'AI Sustainability Audits' to Startups Trying to Look Green
Every AI startup needs to say they care about the environment now. ESG investors are asking. Customers are asking. But none of them actually know their numbers (because, again, nobody’s required to track them).
Here’s what you do: package a consulting offer around “AI sustainability assessment.” You don’t need a PhD — you need a spreadsheet, the IEA’s published estimates, and the ability to write a professional report.
Example: A freelance consultant in Berlin started offering “carbon auditing” for SaaS companies in 2024. She charges €2,500 per audit. Most of her work is Googling public data and formatting it nicely. She now has 6 recurring clients and a waitlist. AI startups are the same market, just newer.
Timeline: Create a template audit report → cold email 50 AI startups from Crunchbase → close 2-3 at $2K+ each → build referral pipeline
🔧 Hustle 3: Build a Power Monitoring Dashboard for Self-Hosters
With all this noise about AI energy costs, the self-hosting and local AI crowd wants to know: how much power is MY setup actually pulling? Here’s the angle — build an open-source power monitoring tool specifically for home GPU rigs and local LLM setups.
Track watts per inference, cost per query, monthly energy spend. Make it pretty. Then sell a premium hosted version or a hardware integration kit.
Example: A hardware tinkerer in Shenzhen built an ESP32-based power meter that clips onto GPU power cables and streams data to a dashboard. He sells the kit for $35 on Taobao and does about 200 units/month. The local AI community in China eats this up because electricity is metered by the government and people actually care about the bill.
Timeline: Fork an existing power monitoring tool → add GPU-specific features → share on r/LocalLLaMA and HN → launch premium tier
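Before you touch hardware, you can prototype the measurement loop in software. Here’s a minimal Python sketch, assuming an NVIDIA card with nvidia-smi on PATH: it samples board power in a background thread while a local inference call runs, then converts the average draw into watt-hours and cost. The run_inference callback and the electricity rate are placeholders you’d wire up to your own stack (llama.cpp, Ollama, whatever).

```python
# Sketch: estimate energy and cost per local inference by sampling
# nvidia-smi power readings while the inference runs. Assumes an NVIDIA
# GPU with `nvidia-smi` available; constants below are placeholders.

import subprocess
import threading
import time

PRICE_PER_KWH = 0.20  # assumption: your local electricity rate

def gpu_power_watts() -> float:
    """Read instantaneous board power draw (watts) from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"]
    )
    return float(out.decode().strip().splitlines()[0])

def measure(run_inference, interval: float = 0.25):
    """Run `run_inference()` while sampling power; return (result, Wh, cost)."""
    samples, done = [], threading.Event()

    def sampler():
        while not done.is_set():
            samples.append(gpu_power_watts())
            time.sleep(interval)

    thread = threading.Thread(target=sampler, daemon=True)
    start = time.time()
    thread.start()
    result = run_inference()   # plug in your local LLM call here
    done.set()
    thread.join()
    elapsed = time.time() - start

    avg_watts = sum(samples) / len(samples) if samples else 0.0
    watt_hours = avg_watts * elapsed / 3600
    return result, watt_hours, watt_hours / 1000 * PRICE_PER_KWH
```

A hardware clamp meter (like the ESP32 kit above) measures the whole rig at the wall, which is more honest than GPU-only numbers; treat this as the software-only starting point.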
📱 Hustle 4: Create a Newsletter or YouTube Channel on 'AI vs. Climate'
This debate isn’t going away. Every quarter there’ll be a new energy scandal, a new Altman quote, a new study. That’s recurring content on a plate.
Here’s what you do: start a niche newsletter (Substack or Beehiiv) or YouTube channel tracking AI’s real environmental costs. The audience is climate-conscious tech workers — and that demographic has money and will pay for premium content.
Example: A journalist in Amsterdam started a Substack called “The Compute Report” tracking cloud computing costs and energy usage. 4,200 subscribers in 8 months, 380 of them paying €7/month. That’s €2,660/month from a weekly email. She sources most of her data from IEA reports and company earnings calls — all public.
Timeline: Launch Substack with 5 seed posts → cross-post to Twitter/Threads → hit 1K free subs in 2 months → flip paid tier at month 3
🛠️ Follow-Up Actions
| Want To… | Do This |
|---|---|
| Track real AI energy data | Bookmark IEA’s AI energy demand report — it’s the gold standard |
| Monitor the PR spin | Follow Altman’s public appearances — every India/EU trip produces quotable material |
| Build something fast | Use the energy calculator angle — lowest effort, highest viral potential |
| Go long-term | Newsletter play — this topic has years of runway as AI scales |
Quick Hits
- Read the IEA and MIT Technology Review analyses; ignore CEO talking points
- Use HWiNFO plus a kill-a-watt meter on your local rig
- Build calculator tools, sell audits, or start a niche newsletter
- Follow researchers like Sasha Luccioni who publish independent estimates
- Remember: if the number was good, they’d publish it themselves
When the guy burning the most electricity in the room tells you the fire is fine — maybe check the meter yourself.