
Autonomous vehicles and robotics were “five years away” for two decades. 2025 was the year that finally changed — after eight years of NVIDIA R&D.
The constraint, Jensen explained, was never hardware or sensors. It was understanding how the physical world works.
Three Breakthroughs Converged
1. World Foundation Models
NVIDIA’s Cosmos understands physics — how objects move, interact, and respond to force. This enables reasoning about novel scenarios rather than pattern-matching against training data. Jensen called it “the world’s leading world foundation model, downloaded millions of times.”
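Conceptually, a world foundation model acts as a learned dynamics function: given the current state of a scene and a candidate action, it predicts what happens next, which is what lets a planner reason about situations it has never seen. The sketch below is a minimal illustration of that interface under assumed dimensions; the `TinyWorldModel` class is hypothetical and is not the Cosmos API.

```python
# Illustrative sketch of a world-model interface (not the Cosmos API).
# A world foundation model maps (current observation, action) -> predicted next observation,
# letting a planner "imagine" outcomes instead of pattern-matching past data.
import torch
import torch.nn as nn

class TinyWorldModel(nn.Module):
    """Hypothetical stand-in: predicts the next state from (state, action)."""
    def __init__(self, state_dim: int = 16, action_dim: int = 4, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, state_dim),  # predicted change in state (residual)
        )

    def forward(self, state: torch.Tensor, action: torch.Tensor) -> torch.Tensor:
        # Residual prediction: next_state = state + learned delta
        return state + self.net(torch.cat([state, action], dim=-1))

# "Imagining" a novel scenario: roll the model forward under a candidate action plan.
model = TinyWorldModel()
state = torch.zeros(1, 16)           # placeholder scene encoding
for action in torch.randn(5, 1, 4):  # candidate 5-step action sequence
    state = model(state, action)     # predicted world response at each step
```

Rolling the model forward under a candidate plan is the "imagination" step that replaces replaying logged data with predicting consequences.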
2. Closed-Loop Simulation
Cosmos generates the world’s response to AI actions in real time. Train an autonomous vehicle on a billion simulated miles without it ever touching a road. Jensen’s framing: “Compute becomes data.”
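A minimal sketch of what “compute becomes data” means in practice, using hypothetical stand-ins (`world_model_step`, `policy_act`, `Transition`) rather than any NVIDIA API: because the learned simulator answers the policy’s own actions, every rollout produces driving experience that never required a physical vehicle.

```python
# Minimal sketch of closed-loop synthetic data generation ("compute becomes data").
# All names here are illustrative stand-ins, not NVIDIA APIs.
import random
from dataclasses import dataclass

@dataclass
class Transition:
    state: list
    action: float
    next_state: list

def world_model_step(state: list, action: float) -> list:
    """Stand-in for a learned simulator: returns the world's response to the action."""
    return [s + 0.1 * action + random.gauss(0, 0.01) for s in state]

def policy_act(state: list) -> float:
    """Stand-in for the driving policy being trained."""
    return -sum(state) / max(len(state), 1)

def closed_loop_rollout(horizon: int = 1000) -> list[Transition]:
    """Generate synthetic experience: the policy's own actions shape the data it sees."""
    state = [0.0] * 8
    buffer = []
    for _ in range(horizon):
        action = policy_act(state)
        next_state = world_model_step(state, action)  # the simulator responds inside the loop
        buffer.append(Transition(state, action, next_state))
        state = next_state
    return buffer

experience = closed_loop_rollout()
print(f"{len(experience)} simulated transitions generated without touching a road")
```

The key property is the loop itself: the policy’s choices determine the states it sees next, something open-loop replay of logged drives cannot capture.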
3. Reasoning Transparency
NVIDIA’s Alpamayo — “the world’s first thinking, reasoning autonomous vehicle AI” — doesn’t just act. It explains why. Novel scenarios can be decomposed into familiar situations. This solved the “long tail” problem that killed earlier AV approaches.
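Alpamayo’s actual output format is not described here, so the following is only a toy sketch of the idea: the planner returns an action together with the familiar sub-situations it recognized in a novel scene and a human-readable rationale. The `ReasonedDecision` structure and `plan_with_reasoning` function are hypothetical.

```python
# Toy sketch of a "reasoning" planner output: alongside the action, the model emits a
# decomposition of a novel scene into familiar sub-situations plus a rationale.
# The structure is hypothetical; it is not Alpamayo's actual output format.
from dataclasses import dataclass, field

@dataclass
class ReasonedDecision:
    action: str                                               # e.g. "slow and yield"
    sub_situations: list[str] = field(default_factory=list)   # familiar components found in the scene
    rationale: str = ""                                        # human-readable explanation

def plan_with_reasoning(scene_description: str) -> ReasonedDecision:
    """Toy decomposition: map a novel scene onto known primitives, then decide."""
    known_primitives = {
        "cyclist": "vulnerable road user ahead",
        "double-parked": "static obstruction in lane",
        "crosswalk": "pedestrian right-of-way zone",
    }
    found = [desc for key, desc in known_primitives.items() if key in scene_description]
    action = "slow and yield" if found else "proceed at speed limit"
    return ReasonedDecision(
        action=action,
        sub_situations=found,
        rationale=f"Scene decomposed into {len(found)} familiar situations; chose '{action}'.",
    )

decision = plan_with_reasoning("cyclist weaving around a double-parked van near a crosswalk")
print(decision.action)
print(decision.rationale)
```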
Production Partner Ecosystem
Physical AI entered production across industries:
- Zoox / Uber — Robotaxis
- John Deere — Agriculture
- Waabi / Volvo — Trucking
- Agility Robotics — Humanoids
- Serve Robotics — Delivery
- Mercedes-Benz — CLA with dual-stack architecture (AI + rule-based; see the sketch after this list)
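A hedged sketch of the dual-stack pattern mentioned above, with assumed limits and function names (`ai_planner`, `rule_based_guard`) that are illustrative rather than Mercedes-Benz’s actual architecture: the learned stack proposes a trajectory, and a deterministic rule layer validates or clamps it.

```python
# Illustrative dual-stack arbitration: a learned planner proposes, a rule-based layer
# checks hard constraints and falls back to a conservative trajectory if they are violated.
# All limits and names are assumptions, not Mercedes-Benz's actual system.
from dataclasses import dataclass

@dataclass
class Trajectory:
    target_speed_mps: float
    lateral_offset_m: float

SPEED_LIMIT_MPS = 13.9   # ~50 km/h, assumed urban limit
MAX_LATERAL_M = 1.5      # assumed lane-keeping bound

def ai_planner(scene: dict) -> Trajectory:
    """Stand-in for the learned (end-to-end AI) planning stack."""
    return Trajectory(target_speed_mps=scene.get("desired_speed", 12.0),
                      lateral_offset_m=scene.get("suggested_offset", 0.3))

def rule_based_guard(proposal: Trajectory) -> Trajectory:
    """Deterministic checks: clamp or replace the AI proposal against hard rules."""
    if proposal.target_speed_mps > SPEED_LIMIT_MPS or abs(proposal.lateral_offset_m) > MAX_LATERAL_M:
        # Fall back to a conservative, rule-generated trajectory.
        return Trajectory(target_speed_mps=min(proposal.target_speed_mps, SPEED_LIMIT_MPS),
                          lateral_offset_m=0.0)
    return proposal

scene = {"desired_speed": 16.0, "suggested_offset": 0.2}
final = rule_based_guard(ai_planner(scene))
print(final)  # AI proposal clamped by the rule-based stack
```

The design intent of such a split is that the rule-based stack bounds worst-case behavior while the AI stack handles nuanced, everyday driving.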
Strategic Implication
Autonomous vehicles became the first large-scale mainstream physical AI market. Physical AI is where AI meets the $100T real economy. NVIDIA is building the horizontal platform; partners are capturing the verticals.
This is part of a comprehensive analysis; read the full version on The Business Engineer.