Your AI Anxiety Has a Business Model
and other random dunks

I still remember March 2020
COVID had just hit the East Coast. I was stuck in my Brooklyn apartment. My roommate was on one side, and my girlfriend, now my wife, was on the other. No one could tell us what would happen next. The feeling wasn’t dramatic; it was small. Claustrophobic. You kept waiting for someone in charge to say something useful, and they kept not saying anything of use.
That’s what rapid, uncontrollable change actually feels like in your body. Not cinematic. Just grinding.
A lot of people in tech feel exactly that right now.
The anxiety is real. It’s also mostly misdirected
Every week brings a new AI model, a new framework, a new take on why your current skill set is already obsolete. The feed never stops, and each announcement arrives with the same subtext: this is the one you can’t afford to miss.
Most of that anxiety isn’t a skills problem. It’s an attention problem wearing a skills problem’s clothes.
Before you read anything (an announcement, a demo, or a Substack post), think about what you want to achieve. Not the answer that sounds strategic. The real one. It’s fine to follow the hype out of curiosity. Just do it with intention, not because the anxiety got there first.
History says the panic is always disproportionate
I was young when Webvan, a grocery delivery company, collapsed. They raised roughly $394 million and spent it all before smartphones were common. To me, it was just a news story about crazy technology people.
I was working a grocery store register when the 2008 financial crisis hit. No mortgage, no equity, mostly insulated. (I understand now that it was luck, not wisdom.)
The pattern from both collapses: real things were being built inside the noise. Most companies that failed did so through poor execution: bad infrastructure, over-leveraged balance sheets, flawed distribution. The underlying idea was often not the problem. Most of what was being built needed another decade to matter.
The same is true now. There are genuinely important capabilities being developed. But be conscious of who’s pushing the narrative. The AI labs, the investment firms, and the Twitter feeds that profit from your worries about what skills to learn next are not neutral. That doesn’t make the information useless.
It means reading with your eyes open, knowing that most of it is just noise.
The real work is being skipped
Here’s what I keep seeing on loop: everyone’s showing the demo. Nobody’s showing the six months of scaffolding that came before it.
A lot of what’s being sold as production-ready AI is still science fiction with better marketing.
The basics of software development, like data modeling and test-driven development, are more indispensable than ever. So is observability: knowing what goes wrong, and where. Faster development just makes a poor architecture fail faster, compounding the mistakes. Whether you’re building in silicon or building bridges, solid infrastructure and design still carry the weight.
Get your hands dirty. That’s not a metaphor for staying current. It’s the only reliable way to separate what’s real from what’s performing.
Hold both things at once.
The current anxiety is understandable. It’s also unnecessary, or at least it’s being fed by people with a stake in keeping you anxious.
The dot-com era gave us infrastructure we’re still running on. This one will too. Keep moving.
Articles and references that help me make sense of the world this week
Pace of change — https://fulcrumpro.com/article/faster-than-ever-understanding-the-accelerating-pace-of-change
AI executive order / state laws — https://www.paulhastings.com/insights/client-alerts/president-trump-signs-executive-order-challenging-state-ai-laws
Cedric Chin / Common Cog on sensemaking — https://commoncog.com/how-to-make-sense-of-ai/
2-hour focus block —
Webvan (Wikipedia) — https://en.wikipedia.org/wiki/Webvan
Microsoft Research / compounding teams —


