3 items

The pattern across all three pieces is the same: the layer everyone is funding is one step below where control is actually migrating. AWS is committing $200B to infrastructure at the moment customers bypass Bedrock for direct model APIs; the Karpathy Loop works, but the competitive advantage belongs to whoever designs the metric and stopping criteria, not whoever runs the agent; world models suggest synthetic environments will outcompete real-world demonstration data as the bottleneck worth owning. Infrastructure, execution, and training data are each being abstracted away simultaneously, and the capital is still chasing the layer beneath.

Not Boring 2026-03-23-1

World Models: Computing the Uncomputable

The definitional move matters more than the technology survey: action-conditioned prediction, P(s_{t+1} | s_t, a_t), is presented as the line separating world models from video slop. If that definition holds, the $4B+ deployed into World Labs, AMI, GI, and Decart is a bet that spatial-temporal reasoning trained on games and driving footage transfers to general embodied control. The strongest signal is Ai2's MolmoBot result: a sim-only-trained policy outperforming VLAs trained on thousands of hours of real data. If sim-to-real transfer keeps improving, the entire robotics data flywheel thesis inverts: synthetic environments become the bottleneck worth owning, not real-world demonstrations.
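To make the definitional line concrete: a world model must take the action a_t as an input to its next-state prediction, which plain video prediction does not. A minimal sketch, with toy linear dynamics standing in for a learned network (all names and the dynamics here are hypothetical, not any lab's actual model):

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyWorldModel:
    """Hypothetical action-conditioned predictor: the next state depends on
    both the current state s_t and the chosen action a_t."""

    def __init__(self, state_dim: int, action_dim: int):
        # Random linear dynamics stand in for a learned transition network.
        self.A = rng.normal(size=(state_dim, state_dim)) * 0.1
        self.B = rng.normal(size=(state_dim, action_dim)) * 0.1

    def predict(self, s_t: np.ndarray, a_t: np.ndarray) -> np.ndarray:
        # Deterministic mean of P(s_{t+1} | s_t, a_t): conditioning on a_t
        # is what separates a world model from action-free video prediction.
        return s_t + self.A @ s_t + self.B @ a_t

model = ToyWorldModel(state_dim=4, action_dim=2)
s = np.zeros(4)
s_left = model.predict(s, np.array([1.0, 0.0]))
s_right = model.predict(s, np.array([0.0, 1.0]))
# Different actions from the same state yield different predicted futures.
assert not np.allclose(s_left, s_right)
```

The point of the sketch is the signature, not the dynamics: `predict(s_t, a_t)` is the contract that makes counterfactual rollouts, and hence planning, possible.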

Fortune 2026-03-23-2

The Karpathy Loop: Autonomous Agent Optimization as Research Pattern

Karpathy's autoresearch ran 700 experiments in two days on a 630-line codebase: the result matters less than the pattern. The Karpathy Loop (agent + single file + testable metric + time limit) is the atomic unit of constrained autonomous optimization, and it generalizes to any problem with a measurable output and a modifiable code surface. The real competitive shift isn't building better agents; it's designing better constraints, metrics, and stopping criteria: taste becomes the bottleneck, not compute.
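The four ingredients (agent + single file + testable metric + time limit) can be sketched as a loop. This is an illustrative reduction, not Karpathy's actual code: the "agent" here is a random mutator where the real loop would call an LLM to propose edits to the code surface, and the metric and parameters are made up:

```python
import random
import time

def metric(params: list[float]) -> float:
    # Hypothetical testable objective the loop optimizes
    # (higher is better; here, proximity of each param to 3.0).
    return -sum((p - 3.0) ** 2 for p in params)

def propose_edit(params: list[float]) -> list[float]:
    # Stand-in for the agent modifying the single modifiable surface.
    new = list(params)
    i = random.randrange(len(new))
    new[i] += random.gauss(0, 0.5)
    return new

def constrained_loop(params: list[float], time_limit_s: float):
    # Agent + single surface + testable metric + time limit.
    best, best_score = params, metric(params)
    deadline = time.monotonic() + time_limit_s  # stopping criterion
    while time.monotonic() < deadline:
        candidate = propose_edit(best)
        score = metric(candidate)
        if score > best_score:  # keep only measurable improvements
            best, best_score = candidate, score
    return best, best_score

best, score = constrained_loop([0.0, 0.0], time_limit_s=0.5)
```

The shape of the loop makes the article's claim legible: the agent and the mutation operator are commodities; the leverage sits in `metric` and `deadline`, i.e. in whoever decides what counts as better and when to stop.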

GeekWire 2026-03-23-3

AWS at 20: Inside the rise of Amazon's cloud empire, and what's at stake in the AI era

GeekWire's oral history buries the competitive signal inside the nostalgia: AWS customers are bypassing Bedrock to call Anthropic directly, which means the fastest-growing AWS service ever may be growing on committed-spend burn-down, not organic AI workload choice. The $200B capex bet and Jassy's $600B revenue target are Amazon paying to stay relevant at a stack layer it used to own; the structural question is whether AWS becomes a platform or a utility as models become the new developer interface. Azure at $75B (34% growth), Google Cloud at $50B, and the OpenAI deal at 16x Microsoft's per-point cost all point the same direction: the cloud market AWS created is converging, and custom silicon is the last defensible layer.