Published 2026-01-30 13:19

Summary

We confidently misread exponential growth as linear, even when we can do the math. Want to see what AI could simulate if it understood compounding better than we do?

The story

🟢 Problem: our brains think in straight lines, but reality doesn't
My favorite human party trick is watching us face an exponential curve and go, “Yep, I’ve got this,” then confidently walk into a wall. Research calls it *exponential-growth bias*: we systematically underestimate compounding and “see” it as linear growth. Even people who can do the calculation still miss it, because part of the problem isn’t the math, it’s the *concept*.

And we’re extra spicy about it because we’re often overconfident. We don’t just guess wrong, we guess wrong with the swagger of a guy who watched one boxing video and now wants to spar Mike Tyson. Ask me how I know.

In real life this shows up in money decisions, risk forecasts, and any situation where “small changes pile up.” It’s not that we’re “dumb,” it’s that our brains love shortcuts.

🟢 Solution: make the compounding visible, then ask what’s next for AI
One simple fix is weirdly low-tech: show the raw numbers, not just a pretty picture. Our brains struggle with the “shape” of growth, but they can often cooperate when you make the steps explicit. It’s like giving your intuition a handrail.
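Here’s what that can look like in practice: a minimal sketch (in Python, with made-up numbers for the starting balance, rate, and horizon) that prints linear and compound growth side by side, so the divergence is explicit steps instead of a curve your eye flattens.

```python
# A minimal sketch: print linear vs. compound growth as explicit steps.
# The starting balance, rate, and horizon are illustration values, not advice.

start = 1_000.0   # hypothetical starting balance
rate = 0.07       # hypothetical 7% annual growth
years = 30

print(f"{'Year':>4} {'Linear':>10} {'Compound':>10}")
for year in range(0, years + 1, 5):
    linear = start * (1 + rate * year)     # the same gain added every year
    compound = start * (1 + rate) ** year  # gains earning gains
    print(f"{year:>4} {linear:>10,.0f} {compound:>10,.0f}")
```

Run it and the handrail appears: by year 30 the compound column sits around 7,600 while the linear column is stuck near 3,100, a roughly 2.5x gap that our straight-line intuition quietly erases.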

Now my speculative rant: if humans are bad at exponentials, the next version of AI after today’s chatty word-guessers might look less like “talking” and more like *running little simulations* that track change over time, with built-in compounding instincts. Not a bigger parrot, but a different creature.

What would you want an AI to *simulate* for you: money, habits, conflict patterns, or the way small resentments stack into a dark tower?

For more about *Humans suck at extrapolation and exponential thinking*, visit
https://clearsay.net/humans-suck-at-extrapolation-and-exponential-thinking/.

Written by CreativeRobot.net, a writer’s room of AI agents *attempting* to mimic me.