When the OceanGate Titan submersible set out to push the boundaries of undersea exploration, it did so at breakneck speed—with a disregard for some tried-and-tested engineering best practices. The parallels between that fateful journey and coding with generative AI tools like GitHub Copilot or Cursor are hard to ignore. Both represent a fascinating, yet potentially perilous, obsession with speed and innovation over thoughtful diligence.
Why we use Generative AI
Generative AI in coding promises the moon—and then some. These tools can spit out boilerplate code, intricate algorithms, and even full-fledged modules in seconds, eliminating repetitive tasks and unlocking time for developers to focus on "more important" matters. I use these tools myself, often leaning on Copilot or Cursor to generate large chunks of code, particularly when faced with looming deadlines or the ever-present pressure to ship faster.
Management loves this. I'm frequently asked by leadership how we can leverage generative AI to deliver projects faster, cut costs, and maximize efficiency. It's an enticing prospect—one where coding moves from hours of painstaking manual effort to rapid, AI-assisted deployment. But beneath this glittering promise lies a deeper, more insidious risk.
The OceanGate Comparison
The OceanGate Titan disaster wasn’t just a tragedy—it was a cautionary tale about what happens when speed and innovation are prioritized over safety and rigor. In software development, generative AI mirrors this mentality. Like the submersible, AI-generated code often skips over best practices, cutting corners to save time but risking catastrophic failures.
Consider this: the Titan’s builders chose lightweight materials, unconventional designs, and limited testing to "move fast." Similarly, generative AI tools produce code that might look functional but lacks the careful thought and consideration of a human developer. The result? Code that’s more prone to obscure, insidious bugs—bugs that could lie dormant until they cause significant damage.
Bugs, But Worse
Generative AI doesn’t think. It synthesizes patterns from oceans of training data—and sometimes, it synthesizes bugs right into your code. Not the trivial, easy-to-spot bugs that humans create, but ones so convoluted, subtle, and deeply buried that a human would never have written them.
It’s the kind of code that looks fine in a quick review but blows up spectacularly under edge cases. And the more you rely on it, the greater the risk of introducing flaws that cascade through your software—like a faulty weld on a deep-sea submersible. The horrifying part? These AI tools don’t understand the context, let alone the consequences, of the code they generate.
Most Websites Aren’t Important
Let’s be honest—most websites and apps aren’t that important. They’re not controlling life-critical systems, operating medical devices, or managing infrastructure. For many projects, the "move fast and break things" mentality is fine because the stakes are low. If your e-commerce site or blog has a bug, it’s an inconvenience, not a catastrophe.
But this doesn’t hold true for high-stakes software—the kind of code that powers airplanes, runs hospitals, or secures sensitive data. For these critical systems, using generative AI isn’t just risky; it’s reckless. In such cases, there’s no room for cutting corners, no justification for delegating critical thinking to a machine.
The Ethical Dilemma
I’m torn. I use generative AI to save time—a lot of it. But I also worry about the long-term cost of this tradeoff. The more management pushes for speed, the more likely we are to sacrifice robustness. And the more AI-generated code we ship without thorough review, the more we’re gambling with the safety and reliability of the systems we build.
A Sobering Conclusion
At the end of the day, generative AI in software development is not inherently bad. It’s a tool—one that can be used responsibly or recklessly. But we must be honest about its limitations and potential consequences. Using it as a crutch to move fast is fine—if you’re building something unimportant. But when lives, livelihoods, or security are on the line, "fix later" might come too late.