The recent disaster at CrowdStrike is just the latest symptom of a much larger illness plaguing the software industry: continuous delivery is killing software quality. Agility, when applied correctly, is a powerful force in software development. However, the relentless pursuit of rapid and frequent releases has led to a situation where quality is sacrificed in the name of speed. It's as if the whole software industry has turned into a fast-food factory, where delivering software fast is more important than making software that is good.
Remember the golden age of gaming? When a game was released, it was printed on a cartridge or burned onto a DVD. There was no turning back, no patches, no hotfixes. If your game was full of bugs or incomplete, it was a financial disaster. The pressure to deliver the best possible product was immense. The result? Games were usually impeccable.
Fast forward to today, and we find ourselves in the era of "early access" releases. Games, and increasingly other software, are launched unfinished with the promise of "fixing it later." Continuous delivery has become the perfect excuse for shoddy work. We are literally telling our customers, "Don't worry that your product isn't ready now; we'll finish it eventually."
This mentality is spreading like wildfire. It's not just games anymore. Critical infrastructure, financial systems, even medical devices and transportation networks are being subjected to this reckless approach. The CrowdStrike incident is a harsh reminder that when software fails, the consequences can be catastrophic.
We need to return to the principle of formal releases. Software should only be launched after rigorous processes that validate real-world usage in controlled environments. The idea of pushing software into production the moment it passes some pipeline of tests is dangerously reckless. Automated tests are crucial for ensuring code quality, but they say very little about the correctness and reliability of the overall system.
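To make that distinction concrete, here is a minimal sketch in Python of what a release gate that goes beyond "the tests passed" might look like. Every name in it (CanaryReport, may_promote, the thresholds) is a hypothetical illustration, not any real deployment tool's API; the point is only the shape of the gate: the final decision consumes evidence from a soak period in a controlled environment, not just a test report.

```python
# A minimal sketch of a release gate that treats "all tests passed" as
# necessary but not sufficient. All names here are hypothetical; this is
# not any real deployment tool's API.
from dataclasses import dataclass

@dataclass
class CanaryReport:
    """Observations from running the new build in a controlled environment."""
    tests_passed: bool   # the usual CI pipeline verdict
    soak_hours: float    # how long the canary has run under real traffic
    error_rate: float    # fraction of failed requests during the soak
    crash_count: int     # hard failures observed during the soak

def may_promote(report: CanaryReport,
                min_soak_hours: float = 48.0,
                max_error_rate: float = 0.001) -> bool:
    """Gate a full rollout on real-world behavior, not just test results."""
    if not report.tests_passed:
        return False  # tests remain necessary...
    # ...but not sufficient: the build must also survive a soak period
    # in production-like conditions without crashes or elevated errors.
    return (report.soak_hours >= min_soak_hours
            and report.error_rate <= max_error_rate
            and report.crash_count == 0)

if __name__ == "__main__":
    # A build that passes every automated test but has only run for an
    # hour in the canary environment is not promoted.
    fresh = CanaryReport(tests_passed=True, soak_hours=1.0,
                         error_rate=0.0, crash_count=0)
    print(may_promote(fresh))  # False
```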
We can learn a lot from other engineering fields. Take civil engineering, for example. When a bridge is built, no one would open it to public traffic before it is fully completed and has undergone rigorous safety tests. In Brazil, the ART (Technical Responsibility Annotation) requires engineers to sign off on their projects, assuming full responsibility for their execution. A mistake, whether due to lack of skill or to negligence, can have severe criminal implications. Why should we expect any less from software, which is increasingly becoming a vital component of our lives?
We know it is possible to create safety-critical software that works correctly. This happens when the responsible organizations are motivated not by economic forces but by significant regulatory pressure. The problem is that as software continues to "eat the world," more and more critical systems are being developed by industries that face no such pressure.
We can therefore expect things to go very wrong, and a major disaster involving the loss of many lives is a question of "when," not "if." Perhaps then developers will stop treating their work as a perpetual beta and start treating it as the critical component of our lives that it truly is.