Small numbers, enormous impact. This is the defining paradox of AI economics. A single prompt consumes roughly 0.34 watt-hours, about what a household LED bulb draws in two minutes. Multiply by billions of daily queries, and you get an energy footprint that rivals that of small nations.
The Multiplication Effect framework reveals how microscopic per-query costs aggregate into macroeconomic forces:
ChatGPT: A Scale Study
- 2.5 billion queries per day
- 850 MWh daily electricity consumption
- 311 GWh annually (~30,000 US homes)
- 100,000+ tons CO2/year (equivalent to 20,000 cars)
- Electricity cost: $30-40M annually
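The Multiplication Effect arithmetic behind these figures can be sketched in a few lines of Python. The per-query energy, per-home consumption, and per-kWh rate below are approximations chosen to reproduce the article's numbers, not official OpenAI data:

```python
# Multiplication Effect: per-query energy scaled to planetary totals.
# Assumed inputs (approximate, for illustration only):
WH_PER_QUERY = 0.34        # watt-hours per prompt (implied by 850 MWh / 2.5B queries)
QUERIES_PER_DAY = 2.5e9    # daily query volume
KWH_PER_US_HOME = 10_500   # approx. annual consumption of a US home
USD_PER_KWH = 0.10         # approx. industrial electricity rate

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6    # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1e3                  # MWh -> GWh
homes = annual_gwh * 1e6 / KWH_PER_US_HOME          # GWh -> kWh, then per-home
cost_usd = annual_gwh * 1e6 * USD_PER_KWH           # GWh -> kWh, then cost

print(f"{daily_mwh:.0f} MWh/day, {annual_gwh:.0f} GWh/yr, "
      f"~{homes:,.0f} US homes, ~${cost_usd / 1e6:.0f}M/yr")
# -> 850 MWh/day, 310 GWh/yr, ~29,548 US homes, ~$31M/yr
```

The point is not the exact constants but the structure: any per-query cost, however tiny, becomes a nine-figure line item once multiplied by billions of daily queries.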
Water: The Hidden Cost
- Each query: ~0.26 mL of water for cooling
- Projected daily total by 2030: 450 million gallons
- Equivalent to 8 million people’s daily water use
- Half of US data centers sit in water-stressed regions
2030 Projection: All Generative AI
- 347 TWh annual consumption (Schneider Electric estimate)
- 4.5% of global electricity generation (IMF projection)
- 38 Google-class (1 GW) data centers required
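The data-center count follows from the projection by simple unit conversion. A minimal sketch, assuming continuous operation (8,760 hours per year); the article's figure of 38 implies near-full utilization of each 1 GW facility:

```python
# Converting an annual-energy projection (TWh/yr) into average power (GW)
# and into the number of 1 GW data centers needed to supply it.
ANNUAL_TWH = 347            # Schneider Electric 2030 estimate cited above
HOURS_PER_YEAR = 8760

gw_average = ANNUAL_TWH * 1e3 / HOURS_PER_YEAR   # TWh -> GWh, divided by hours
datacenters_1gw = gw_average / 1.0               # facilities at 1 GW each

print(f"~{gw_average:.1f} GW average load -> "
      f"~{datacenters_1gw:.0f} one-gigawatt data centers at full utilization")
# -> ~39.6 GW average load -> ~40 one-gigawatt data centers at full utilization
```

Note the contrast of scales: 347 TWh of annual energy corresponds to only about 0.04 TW of continuous power draw, which is why demand projections are quoted in terawatt-hours rather than terawatts.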
The insight for business leaders: at planetary scale, microscopic costs aggregate into billions of dollars of spend and hundreds of terawatt-hours of demand. Companies that ignore the multiplication effect will find their unit economics destroyed by infrastructure costs they never modeled.
The prompt is cheap. A billion prompts reshape the global energy grid.
This analysis applies The Business Engineer’s Scale Math framework to understand how AI’s unit economics translate to planetary impact. Read the full analysis: The Economics of an AI Prompt →