AI Raises Energy Demand, Reshapes Climate Impact

AI is no longer a negligible electricity consumer. Global data centers used about 415 TWh in 2024, and AI-specific servers consumed roughly 93 TWh in 2025, or about 0.3% of global electricity. Simple per-query estimates, like 0.3 Wh for a short GPT-4o call, are accurate today but misleading for the near future. Reasoning-tier models and agentic workflows are becoming the default and consume 10-100x more energy per query; measured benchmarks put o3 at 33 Wh, GPT-4.5 at 30 Wh, and Claude 3.7 Sonnet with extended thinking at 17 Wh. The climate question to focus on is aggregate demand growth, its drivers, and how infrastructure and model design choices will determine emissions trajectories.
What happened
The piece reframes the AI-climate debate from whether a single ChatGPT call matters to how aggregate AI demand grows and what powers it. Global data centers consumed 415 TWh in 2024, and AI-specific servers used roughly 93 TWh in 2025, or about 0.3% of global electricity. A short, single-turn query is roughly 0.3 Wh, but reasoning models and agentic workflows raise that by 10-100x, with o3 at 33 Wh, GPT-4.5 at 30 Wh, and Claude 3.7 Sonnet at 17 Wh in measured benchmarks.
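The per-query vs. aggregate framing can be made concrete with a back-of-envelope calculation. The query volume below is a hypothetical round number chosen only to illustrate scaling; the per-query figures are the measured ones quoted above.

```python
# Scale measured per-query energy (Wh) to annual fleet totals (TWh).
# Query volume (1 billion/day) is a hypothetical assumption, not a
# reported figure; per-query costs come from the cited benchmarks.
WH_PER_TWH = 1e12

def annual_twh(wh_per_query: float, queries_per_day: float) -> float:
    """Annual energy in TWh for a given per-query cost and daily volume."""
    return wh_per_query * queries_per_day * 365 / WH_PER_TWH

short_turn = annual_twh(0.3, 1e9)   # short GPT-4o-style call
reasoning = annual_twh(33.0, 1e9)   # o3-style reasoning call

print(f"short-turn: ~{short_turn:.2f} TWh/yr")   # roughly 0.11 TWh/yr
print(f"reasoning:  ~{reasoning:.1f} TWh/yr")    # roughly 12 TWh/yr
```

At the same hypothetical volume, the reasoning-tier default is a two-order-of-magnitude difference in aggregate demand, which is why defaults matter more than any single query.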
Technical details
The shift driving higher per-query cost is movement from lightweight inference to sustained, multi-step reasoning and agent orchestration. Key technical points practitioners should note:
- o3, GPT-4.5, and Claude 3.7 Sonnet show order-of-magnitude higher energy per query than short-turn models.
- Agentic workflows multiply inference steps, I/O, and state management overhead, raising energy and tail-latency concerns.
- Aggregate impact depends on user behavior, model defaults, and how providers meter and route workloads.
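The multiplication effect in the bullets above can be sketched with hypothetical numbers: an agent that runs several inference steps, each carrying tool-I/O and state-management overhead, quickly lands in the 10-100x band. Step count, per-step cost, and overhead here are illustrative assumptions, not measurements.

```python
# Illustrative sketch: agentic energy is roughly steps x (per-step
# inference + per-step overhead for tool I/O and state handling).
# All parameters below are hypothetical; only the 0.3 Wh short-turn
# baseline is a figure from the piece.
def agent_energy_wh(steps: int, wh_per_step: float, overhead_wh: float = 0.1) -> float:
    """Approximate energy for one multi-step agentic task, in Wh."""
    return steps * (wh_per_step + overhead_wh)

single_call = 0.3                                        # short-turn baseline (Wh)
agent_task = agent_energy_wh(steps=8, wh_per_step=3.0)   # hypothetical 8-step agent
print(f"agent task: {agent_task:.1f} Wh, "
      f"~{agent_task / single_call:.0f}x a single short call")
```

With these assumed parameters the ratio falls inside the 10-100x range the article cites, and it grows linearly with step count, which is why orchestration depth is a first-order energy variable.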
Context and significance
Comparing sectors clarifies scale: residential air conditioning uses more than six times AI's footprint today, industrial motors about forty times, and global streaming sits around 100-120 TWh. The crucial contrast is growth rate. If reasoning models and agents become the default for broad classes of apps, AI could shift from a niche consumer to a major grid actor. That influences data center sizing, regional grid stress, renewable energy procurement, and corporate reporting on Scope 2 emissions.
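These comparisons can be sanity-checked arithmetically. The ~30,000 TWh figure for global electricity generation is a round-number assumption used only to recover the "about 0.3%" share; the rest follows from the multiples and ranges quoted above.

```python
# Sanity-check the sector comparisons against the article's figures.
GLOBAL_TWH = 30_000   # assumed round number for global generation
ai_servers = 93       # TWh, AI-specific servers in 2025 (from the piece)

sectors = {
    "residential AC (>6x AI)": 6 * ai_servers,      # lower bound
    "industrial motors (~40x AI)": 40 * ai_servers,
    "global streaming": (100, 120),                 # quoted range
}

print(f"AI share of global electricity: {ai_servers / GLOBAL_TWH:.1%}")  # ~0.3%
for name, twh in sectors.items():
    print(f"{name}: ~{twh} TWh")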
What to watch
Track per-query energy trends, deployment defaults (reasoning vs. short-turn), caching and routing optimizations, and hardware efficiency gains like quantization, sparsity, and accelerator improvements. Policy and corporate procurement (time-of-use, renewable contracts) will shape how that demand maps to emissions.
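One concrete lever among the items above is the deployment default itself: a fleet's blended per-query energy depends heavily on what fraction of traffic escalates to a reasoning tier. The escalation rates below are hypothetical; the per-query costs are the measured figures cited earlier.

```python
# Blended per-query energy under probabilistic routing between a
# short-turn model and a reasoning-tier model. Escalation fractions
# are hypothetical; per-query Wh figures are from the cited benchmarks.
def blended_wh(reasoning_fraction: float,
               reasoning_wh: float = 33.0,  # o3-tier measured cost
               short_wh: float = 0.3) -> float:
    """Expected energy per query for a given reasoning-tier share."""
    return reasoning_fraction * reasoning_wh + (1 - reasoning_fraction) * short_wh

for frac in (0.0, 0.1, 0.5, 1.0):
    print(f"{frac:.0%} reasoning -> ~{blended_wh(frac):.2f} Wh/query")
```

Even a 10% escalation rate raises blended cost by more than an order of magnitude over all-short-turn serving, which is why routing and caching policies, not just model efficiency, shape the emissions trajectory.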
Scoring Rationale
This analysis reframes the climate question toward aggregate demand and infrastructure, a notable issue for operators and engineers. It is important for planning but not a paradigm-shifting discovery, placing it in the 'Notable' range.