Discussion about this post

Rainbow Roxy

This article comes at the perfect time. You nail the GPU obsolescence point. Applications win, thankfully.

DeReK WaTSoN

A thought on Nvidia vs. Michael Burry's chip depreciation argument, and why Nvidia is wrong.

Older chips appear to have long useful lives right now not because they're economically efficient, but because the supply chain is so constrained that people will run anything they can get their hands on.

That’s not a strategy. It’s a shortage, and right now operators can take advantage of it by depreciating old chips over a longer period.
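To make the accounting stakes concrete, here is a minimal sketch of straight-line depreciation with purely illustrative figures (the fleet size, dollar amounts, and life assumptions are mine, not from the article or the comment): stretching the assumed useful life of a GPU fleet directly shrinks the annual depreciation charge that hits reported earnings.

```python
# Illustrative only: straight-line depreciation of a hypothetical GPU fleet.
# The figures are made up; the point is the mechanics, not the numbers.

def annual_depreciation(capex: float, useful_life_years: int) -> float:
    """Spread the purchase cost evenly over the assumed useful life."""
    return capex / useful_life_years

fleet_capex = 10_000_000_000  # $10B of accelerators (hypothetical)

for life_years in (3, 5, 6):
    charge = annual_depreciation(fleet_capex, life_years)
    print(f"{life_years}-year life: ${charge / 1e9:.2f}B depreciation per year")

# 3-year life: $3.33B per year
# 6-year life: $1.67B per year
# Doubling the assumed life roughly halves the annual expense,
# which is why the useful-life assumption matters so much to the debate.
```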

As soon as supply catches up — and especially as AI moves into real-world robotics — the myth collapses. Robotics, automation, AV systems, edge devices… they all need much faster, low-latency inference than 7nm or 14nm silicon can deliver.

The reality is simple:

Older nodes persist today because we have a bottleneck, not because they’re good enough.

Once AI leaves the cloud and enters physical systems, inference becomes the new frontier, and efficiency wins.

The obsolescence cycle accelerates, not slows.

The industry hasn’t priced this in yet. But robotics will force it.


