Layer 2: Why the Model Is Not the Product — The Model Embedded in the Stack Is

This analysis is part of Google: The Gravitational Center of the AI Market, a deep dive by The Business Engineer.

Layer 2: Foundation Models (Source: The Business Engineer)

OpenAI and Anthropic sell models. Google sells models that are co-optimized with proprietary silicon, integrated into the world’s largest distribution surfaces, and continuously improved by data signals from 2B+ Search users, 750M Gemini users, and $60B+ in YouTube content interactions.

Sell Models vs. Sell the Stack

Sell Models: OpenAI (GPT-4.5, o3, ~300M ChatGPT MAUs) has no silicon, no cloud, and no distribution. Anthropic (Claude, safety-focused) has committed to 1M+ Google TPUs, which validates the quality of Google's silicon. Meta (Llama, open source) earns developer goodwill but no direct model revenue. xAI, Mistral, and Cohere remain niche, regional, or trailing the frontier.

Sell the Stack: Google's model is embedded in proprietary silicon (L1), integrated into distribution surfaces (L4-L6), and fed by data flywheel signals from 2B+ Search users and 750M Gemini users (L5-L7), while Gemini 3.0 is frontier-competitive enough that Apple chose it for next-generation foundation model development.

The Key Numbers

10B+ tokens per minute served via the direct API. 350 customers each processed 100B+ tokens in a single month. 400% YoY revenue growth from generative AI products. 750M Gemini MAUs. 2B+ Search users feeding the data flywheel.

Read the full analysis on The Business Engineer →
