Cursor Is the Uber of the 2010s: Subsidizing Growth to Own a Category

Just as Uber subsidized rides to reshape transportation, Cursor is subsidizing AI-assisted coding to own developer workflows. The playbook is identical: accept massive losses to achieve a market position that eventually enables pricing power. Cursor offers capabilities that cost far more than users pay. The gap is an investment in market share—betting that developer habits formed now […]

Synthetic Diamonds Are Unlocking a Technology Revolution Beyond Jewelry

Forget engagement rings—synthetic diamonds are becoming critical technology infrastructure. The unique properties of lab-grown diamonds are enabling breakthroughs in semiconductors, quantum computing, and thermal management that natural diamonds could never supply. (Credit: Financial Times.) Carbon in a diamond crystalline structure has properties no other material matches: the highest thermal conductivity, extreme hardness, a wide bandgap, and quantum spin […]

Diamonds and Quantum Computing: Carbon’s Quantum Properties Could Define the Next Era

The race to practical quantum computing may ultimately be won by an unlikely material: diamond. Carbon’s quantum properties—specifically nitrogen-vacancy centers in diamond lattices—offer a path to room-temperature quantum computing that other approaches cannot match. (Credit: Financial Times.) Most quantum computing approaches require extreme cooling—near absolute zero—to maintain quantum coherence. This creates massive practical barriers: cost, […]

Copper Surging Toward Record Highs: The Metal That Powers Everything AI

Bloomberg reports copper surging toward all-time highs—and the driver is the same force reshaping everything else: AI infrastructure. Every data center, every power line, every electrical system enabling AI runs on copper. The AI boom is fundamentally a copper demand story. A single large data center requires thousands of tons of copper for electrical systems, […]

Daily Roundup: Big Five Hit $400B Quarter, Waymo’s 4M Rides, and Copper Powers the AI Revolution

The Big Picture: Today’s stories reveal AI’s transformation from software phenomenon to industrial complex. The Big Five tech giants now generate $400 billion per quarter—concentration accelerating, not stabilizing. AI infrastructure wars are measured in gigawatts. Copper surges as the physical material enabling everything digital. Waymo crosses 4 million rides, proving autonomous vehicles work commercially. And […]

Stargate Project: $500B for 5 GW Data Centers

The scale of AI infrastructure investment is unprecedented. The Stargate Project alone commits $500 billion, with each data center requiring 5 GW of power—more than New Hampshire’s entire electricity consumption. By 2030, AI data centers will consume 4.8% of global power. This is creating a genuine power crisis that’s reshaping energy policy, grid infrastructure, and […]

DeepSeek R1: The $0.55 Challenger Disrupting AI Pricing

The AI pricing landscape is being disrupted from unexpected quarters. DeepSeek R1 offers input tokens at just $0.55 per million—a fraction of GPT-4’s $1.25 and dramatically cheaper than older models. The December 2025 pricing landscape shows massive variance: from GPT-4 Nano at $0.03 to GPT-4 at $1.25 per million input tokens. DeepSeek’s aggressive pricing suggests […]
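The per-million prices quoted above translate directly into workload costs. A minimal sketch, assuming a hypothetical 10M-input-token monthly workload (the prices are the article's; the workload size and the `monthly_cost` helper are illustrative):

```python
# Per-million input-token prices quoted in the article (USD).
PRICES_PER_MILLION = {
    "DeepSeek R1": 0.55,
    "GPT-4": 1.25,
    "GPT-4 Nano": 0.03,
}

def monthly_cost(tokens: int, price_per_million: float) -> float:
    """Dollar cost of processing `tokens` input tokens at a given rate."""
    return tokens / 1_000_000 * price_per_million

workload = 10_000_000  # hypothetical: 10M input tokens per month
for model, price in PRICES_PER_MILLION.items():
    print(f"{model}: ${monthly_cost(workload, price):.2f}/month")

# Spread between the cheapest and most expensive listed model.
spread = max(PRICES_PER_MILLION.values()) / min(PRICES_PER_MILLION.values())
print(f"price spread: ~{spread:.0f}x")
```

At these rates the same workload costs $5.50 on DeepSeek R1 versus $12.50 on GPT-4, and the cheapest-to-priciest spread is roughly 40x.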

AI Inference Revenue Surpassed Training Revenue in 2025

2025 marks a historic inflection point: inference revenue has now surpassed training revenue. This shift fundamentally changes the economics of AI companies. Training is episodic, capital-intensive, and a one-time investment ($150M+ for frontier models). Inference is continuous, user-facing, and generates ongoing operational revenue ($2.3B annually for leaders—15x training costs). This shift favors purpose-built inference chips […]

AI Data Centers in Water-Stressed Regions: A Growing Crisis

Half of US data centers are located in water-stressed regions, creating a collision between AI’s growth and local water needs. By 2030, AI could consume 450 million gallons of water daily, equal to the consumption of 8 million people. The environmental impact of AI extends far beyond carbon—water is becoming a critical constraint that […]
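The water comparison above can be sanity-checked with a single division, using only the article's two figures:

```python
# Back-of-envelope check on the water comparison.
AI_DAILY_GALLONS = 450_000_000   # projected AI water use by 2030 (from the article)
PEOPLE_EQUIVALENT = 8_000_000    # population the article equates this to

implied_per_capita = AI_DAILY_GALLONS / PEOPLE_EQUIVALENT
print(f"implied per-capita use: {implied_per_capita:.2f} gallons/day")
# ~56 gallons/person/day, in line with typical US residential water use,
# so the two figures are internally consistent.
```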

US Data Centers Already Consume More Power Than Malaysia

The scale of American AI infrastructure is already enormous. US data centers consumed 183 TWh in 2024—more than Malaysia’s entire national electricity consumption. And this is just the beginning. Global AI spending is projected to reach $1.5 trillion by 2030. The infrastructure story is becoming inseparable from the AI story—companies controlling power, chips, and data […]

The True Cost of an AI Prompt: Why 0.24 Wh Changes Everything

Every AI prompt you send consumes 0.24 watt-hours of energy and 0.26 milliliters of water, and generates 0.035 grams of CO2. These numbers seem microscopic—until you understand the multiplication effect that defines AI economics. This is the fundamental insight of the Prompt as Economic Primitive framework: just as the HTTP request became the atomic unit […]
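The multiplication effect can be sketched directly. The per-prompt figures are the article's; the 1 billion prompts/day volume is a hypothetical assumption chosen for illustration, not a measured number:

```python
# Per-prompt footprints quoted in the article.
WH_PER_PROMPT = 0.24        # watt-hours of energy
ML_WATER_PER_PROMPT = 0.26  # milliliters of water
G_CO2_PER_PROMPT = 0.035    # grams of CO2

daily_prompts = 1_000_000_000  # assumed volume for illustration

energy_mwh = daily_prompts * WH_PER_PROMPT / 1e6      # Wh  -> MWh
water_m3 = daily_prompts * ML_WATER_PER_PROMPT / 1e6  # mL  -> cubic meters
co2_tonnes = daily_prompts * G_CO2_PER_PROMPT / 1e6   # g   -> metric tons

print(f"{energy_mwh:.0f} MWh, {water_m3:.0f} m^3 water, {co2_tonnes:.0f} t CO2 per day")
```

At a billion prompts a day, microscopic unit costs become 240 MWh of energy, 260 cubic meters of water, and 35 tonnes of CO2 daily.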

The 280x Token Price Collapse: How AI Became 280 Times Cheaper in Two Years

Between November 2022 and October 2024, AI token prices collapsed by 280x. This isn’t incremental improvement—it’s the kind of deflationary shock that reshapes entire industries. The Token Economics framework reveals tokens as the new currency of AI. One token equals approximately 0.75 words (or ~4 characters). “The quick brown fox jumps” costs you 5 tokens.
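The token arithmetic above can be sketched in a few lines. The ~4-characters-per-token heuristic is the article's; real tokenizers vary by model (the article's count of 5 tokens for the sample phrase reflects an actual tokenizer, while the character heuristic lands slightly higher), and the pre-collapse price below is a purely hypothetical placeholder:

```python
# Rough token-and-cost arithmetic. CHARS_PER_TOKEN is the heuristic quoted
# in the article; real tokenizers differ model by model.
CHARS_PER_TOKEN = 4

def estimate_tokens(text: str) -> int:
    """Crude token estimate from character count."""
    return max(1, round(len(text) / CHARS_PER_TOKEN))

def token_cost(tokens: int, price_per_million_usd: float) -> float:
    """Dollar cost of a token count at a per-million-token price."""
    return tokens / 1_000_000 * price_per_million_usd

print(estimate_tokens("The quick brown fox jumps"), "tokens (heuristic)")

# A 280x collapse means the same workload costs 1/280th of what it did.
# The $280/M starting price is hypothetical, chosen to make the ratio visible.
old = token_cost(1_000_000, 280.0)
new = old / 280
print(f"${old:.2f} -> ${new:.2f} per million tokens")
```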

The Great Shift: Why 2025 Is the Year Inference Surpassed Training

2025 marks a historic inflection point in AI economics: inference revenue has officially surpassed training revenue. This isn’t just an accounting change—it represents a fundamental restructuring of how AI companies make money and where competitive advantages lie. The Two Different Economic Models framework explains why this shift matters: Training Economics: Capital-intensive ($150M+ for frontier models) […]
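The 15x relationship in the excerpt above is just the ratio of the two figures it quotes; a one-line check:

```python
# Both figures are from the article.
TRAINING_COST = 150_000_000       # $150M+ one-time outlay for a frontier model
ANNUAL_INFERENCE = 2_300_000_000  # $2.3B annual inference revenue for leaders

ratio = ANNUAL_INFERENCE / TRAINING_COST
print(f"inference revenue runs at ~{ratio:.0f}x the training outlay per year")
```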

AI Infrastructure Economics: Why $1.5 Trillion Is Just the Beginning

The subtitle of AI’s infrastructure story could be: “Beyond the Prompt: Where the Trillions Actually Go.” While the industry obsesses over model capabilities, the real constraints are emerging from power, water, and physical infrastructure. The Infrastructure Economics framework identifies three converging challenges that will define AI’s trajectory: 1. The Power Crisis The Stargate Project commits […]

The Scale Math: How 0.24 Wh Becomes 347 TWh

Small numbers, enormous impact. This is the defining paradox of AI economics. A single prompt consumes just 0.24 watt-hours—barely enough to light an LED for a minute. Multiply by billions of daily queries, and you get an energy footprint that rivals nation-states. The Multiplication Effect framework reveals how microscopic per-query costs aggregate into macroeconomic forces: […]
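A back-of-envelope way to see the aggregation: solve for the query volume at which 0.24 Wh per prompt would reach the 347 TWh annual figure. Both numbers are from the article; the real 2030 total also covers training and other workloads, so this is a thought experiment, not a forecast:

```python
# Back-solving the scale math from the two article figures.
WH_PER_PROMPT = 0.24  # watt-hours per prompt
ANNUAL_TWH = 347      # projected annual AI energy use

annual_wh = ANNUAL_TWH * 1e12  # TWh -> Wh
implied_prompts_per_year = annual_wh / WH_PER_PROMPT
implied_prompts_per_day = implied_prompts_per_year / 365

print(f"~{implied_prompts_per_year:.2e} prompts/year")
print(f"~{implied_prompts_per_day:.2e} prompts/day")
# On prompts alone, hitting 347 TWh would take trillions of queries per day,
# which is why the aggregate figure must also include training and other loads.
```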

Why Midjourney’s 65% Cost Cut Reveals AI’s Hardware Future

When Midjourney switched from GPUs to TPUs, they cut inference costs by 65%. This single case study encapsulates the most important hardware trend in AI: the shift from general-purpose to purpose-built silicon. The Competitive Implication framework explains why this matters: as inference revenue surpasses training revenue, the optimal hardware changes fundamentally. Why GPUs Dominated Training: […]

The Hidden Subsidy: How Consumers Pay $37.50/Month for AI They May Not Use

Virginia residents have seen their electricity bills rise by $37.50 per month. The cause? Data center expansion driven largely by AI workloads. This is the hidden subsidy of the AI revolution—grid infrastructure costs socialized onto consumers regardless of whether they use AI. The Consumer Cost dimension of infrastructure economics reveals an uncomfortable truth: AI’s growth […]

DeepSeek R1 at $0.55: Is China Commoditizing AI Faster Than Expected?

DeepSeek R1 offers input tokens at $0.55 per million—less than half of GPT-4’s $1.25. This aggressive pricing from a Chinese competitor suggests the commoditization of AI inference is accelerating faster than Western incumbents anticipated. The Token Pricing framework shows a market in dramatic flux. December 2025 pricing reveals a 40x spread between cheapest and most […]

38 Data Centers by 2030: The Physical Constraint on AI’s Growth

By 2030, generative AI will require 38 Google-class data centers—each consuming 1 gigawatt of power. This isn’t a projection buried in an appendix; it’s the binding constraint that determines how fast AI can actually scale. The 2030 Projection framework quantifies what “AI scaling” actually requires in physical terms: The Numbers: 347 TWh: Annual AI energy […]
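The 38 x 1 GW figure can be tied back to the 347 TWh annual-energy number with simple unit arithmetic, assuming round-the-clock full-power operation (a simplifying assumption; real utilization is lower):

```python
# Converting the data-center count into annual energy (both figures from the article).
CENTERS = 38
GW_EACH = 1
HOURS_PER_YEAR = 8760  # 24 * 365

annual_twh = CENTERS * GW_EACH * HOURS_PER_YEAR / 1000  # GWh -> TWh
print(f"{annual_twh:.0f} TWh/year at full utilization")
# ~333 TWh, roughly consistent with the 347 TWh annual figure quoted above.
```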

FourWeekMBA