The Great Software Quality Collapse: How We Normalized Catastrophe
October 17, 2025

We’re living through the greatest software quality crisis in computing history. A Calculator leaks 32GB of RAM. AI assistants delete production databases. Companies spend $364 billion to avoid fixing fundamental problems.
This isn’t sustainable. Physics doesn’t negotiate. Energy is finite. Hardware has limits.
The companies that survive won’t be those who can outspend the crisis.
They’ll be those who remember how to engineer.
Source: The Great Software Quality Collapse: How We Normalized Catastrophe
Denis Stetskov thinks we’re in the middle of what you might consider a new software crisis: not created by large language models, but perhaps amplified by them.
He observes how wasteful of memory and computing resources modern applications tend to be, and, particularly with sandwich-placement models in mind, he’s also concerned about how we bring on new developers when the roles we traditionally gave juniors are increasingly being done by AI code-generation tools.
Definitely worth a read and some consideration of the points he is making.
Having been around the block a few times, I don’t think there’s anything particularly new about wasting computing resources. We all know about Moore’s law, but there is also the ironic Wirth’s law (also known as Gates’s law), which observes that “software is getting slower more rapidly than hardware is becoming faster.”
But I think his points are well made and, as mentioned, well worth thinking about.
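As an aside on the 32GB Calculator leak: leaks of that scale are rarely exotic; they are usually just unbounded growth that nobody measures. The sketch below is purely illustrative, in Python rather than whatever the real app is written in, with invented class and field names; it contrasts a history list that grows on every operation with a bounded one that doesn’t.

```python
from collections import deque


# Illustrative only: a hypothetical calculator that records every operation
# and never discards anything. Each entry is tiny, but an app left running
# for weeks accumulates them without bound.
class LeakyCalculator:
    def __init__(self):
        self._history = []  # appended to on every call, never trimmed

    def add(self, a: float, b: float) -> float:
        result = a + b
        self._history.append({"expr": f"{a} + {b}", "result": result})
        return result


# The boring fix: cap the history so memory use stays flat no matter how
# long the process runs. Old entries fall off the front automatically.
class BoundedCalculator:
    def __init__(self, max_history: int = 1000):
        self._history = deque(maxlen=max_history)

    def add(self, a: float, b: float) -> float:
        result = a + b
        self._history.append({"expr": f"{a} + {b}", "result": result})
        return result
```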