Published 2026-01-10 06:25

Summary

We misjudge AI’s trajectory: overhyping LLMs while missing the world models, experiential learning systems, and neuromorphic chips quietly brewing the next real shift.

The story

Our brains glitch on exponentials, plotting straight lines through curvy futures.
We anchor to today’s LLM buzz, blind to the stall.
Hype screams “AGI tomorrow,” then whispers “winter blues.”
But real shifts lurk in shadows, ready to sprawl.

I get that exponential itch wrong every time. It’s like leveling up in a game where the boss mechanic flips mid-run. We over-hype the short-term dazzle and undercook the long-haul doublings, Amara’s law in action. Classic bias buffet.
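Here’s the trap in toy numbers, a quick Python sketch. The figures are illustrative, not real capability data:

    # Fit a straight line to the early part of an exponential curve
    # and watch how badly it undershoots later.

    # Capability doubling every period: value = 2 ** t
    exponential = [2 ** t for t in range(11)]        # t = 0..10

    # "Linear gut": extend the average slope of the first 3 periods
    early_slope = (exponential[3] - exponential[0]) / 3
    linear_guess = [exponential[0] + early_slope * t for t in range(11)]

    for t in (3, 6, 10):
        print(f"t={t:2d}  actual={exponential[t]:5d}  linear guess={linear_guess[t]:7.1f}")
    # t= 3  actual=    8  linear guess=    8.0
    # t= 6  actual=   64  linear guess=   15.0
    # t=10  actual= 1024  linear guess=   24.3

The line looks perfect right up until it’s off by 40x.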

What if LLMs hit a wall at the limits of text prediction? I wonder about world models that run internal simulations of how the world unfolds, not just next-token guesses.
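For the shape of that idea, a deliberately hypothetical sketch. Every name in it (policy, transition, reward) is a placeholder I’m making up, not any real system’s API:

    # World-model idea: learn a transition function over an internal
    # state, then "imagine" futures by rolling the model forward,
    # instead of predicting the next token of text.

    def imagine_rollout(state, policy, transition, reward, horizon=10):
        """Score a plan by simulating it inside the learned model."""
        total = 0.0
        for _ in range(horizon):
            action = policy(state)
            state = transition(state, action)  # learned dynamics, not the real world
            total += reward(state)             # learned reward estimate
        return total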

Or experiential learning systems that evolve continually from real-world loops, ditching static datasets for endless adaptation.
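Equally hypothetical, here’s that loop in miniature; agent and env are stand-ins, not a real framework. The point is that the dataset is the agent’s life, not a file:

    # Experiential learner: no frozen training set, just an endless
    # act / observe / update loop against a live environment.

    def live(agent, env):
        obs = env.reset()
        while True:
            action = agent.act(obs)
            obs, feedback = env.step(action)
            agent.update(obs, action, feedback)  # learning never stops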

New hardware substrates too: neuromorphic chips mimicking the brain’s spiking neurons instead of the GPU grind. Quietly compounding now, poised to put a knee in the curve.

My linear gut says “more transformers.” But history laughs. What threads are you eyeing that could flip the board?

For more on “Humans suck at extrapolation and exponential thinking,” visit
https://linkedin.com/in/scottermonkey.

[This post was generated by Creative Robot, designed and built by Scott Howard Swain.]

Keywords: AGI, world models, experiential learning, neuromorphic chips