Gusthavo Lake

[rant] Rust: The Safety Language That Still Isn’t Safe Enough

They told us Rust would save us. That it would end our nightmares and code plagues. It was pitched as the long-overdue successor to C. So why is C still launching rockets while Rust is still trying to stabilize its ecosystem?

In theory, Rust is everything embedded ever wanted and needed. An improvement on C without the footguns. Compile-time guarantees. Ownership checking. In practice, for safety-critical systems? It feels like a beta test disguised as a revolution. A half-finished promise, one that can’t even meet the requirements of the very systems it claims to save. In the embedded world, Rust’s revolution is still stuck in committee.

Every day I watch brilliant engineers leave Rust behind for something established, and it breaks my heart. Not because the alternative is better, but because it’s known. Battle-tested. Functional. Certifiable.

Great, we have a qualified compiler. Now certify the entire toolchain to DO-178C Level A. Oh, core isn’t qualified? That critical driver crate relies on nightly features? Suddenly, all that soundness feels theoretical. Half your system depends on non-qualified parts. Imagine pitching Ada without a formal spec for a decade. It wouldn’t fly. Rust gets a pass, but “progress” doesn’t ship satellites. We have Ferrocene, and that’s great, I love it, but it’s still a work in progress.

And unsafe? It’s not some rare escape hatch in embedded Rust; it’s the goddamn loading dock. Every interaction with hardware, every C FFI boundary, and BOOM, you’re back in the trenches with manual memory management and null checks. “Safety” by… hoping the wrapper was perfect?
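To make that concrete, here’s a minimal sketch of what a typical memory-mapped register write looks like in embedded Rust. The register, its address, and the ENABLE bit are hypothetical stand-ins (here simulated with a local variable so the example actually runs); on real hardware the pointer would be a fixed MMIO address the borrow checker knows nothing about.

```rust
use core::ptr;

fn main() {
    // Stand-in for a memory-mapped control register. On real hardware this
    // would be a hard-coded address like 0x4002_1000 cast to *mut u32,
    // which the compiler cannot verify at all.
    let mut fake_register: u32 = 0;
    let reg: *mut u32 = &mut fake_register;

    // Every MMIO access lives inside an unsafe block: the ownership model
    // has no visibility into what the hardware does behind this address.
    unsafe {
        ptr::write_volatile(reg, 0b1); // set the hypothetical ENABLE bit
        let status = ptr::read_volatile(reg);
        assert_eq!(status & 0b1, 0b1); // we hope the wrapper got this right
    }
}
```

The compiler checks everything around that block, but inside it you’re on your own, exactly like in C. The safety story for the whole driver rests on the human who wrote the wrapper.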

Yes, Rust HALs are improving. But try sourcing a qualified, mature, fully stable Rust driver for your radiation-hardened microcontroller. You’ll either write it yourself or inherit a half-baked crate abandoned pre-1.0. Is that the foundation for a critical system?

C is ancient. C is dangerous. C is full of footguns. But: It works. Its tools are certified. Its behaviour is predictable (even if brittle). Its failures are understood. When failure means mission loss or worse, “known demons” beat “theoretical angels” every time.

Good luck explaining to a certification body that your critical firmware is safe because “the Rust book says so”.

Rust today? It’s the brilliant intern. We let it handle the telemetry logger, not the thruster control. And if it does touch the flight computer? Mission Control isn’t excited; it’s dripping with sweat and reaching for the antacids.

Can Rust become the future? Absolutely. But it needs less hype and more heavy lifting. Until then, for the code where failure isn’t an option, Rust remains a promising experiment, not the proven solution. Time is passing. Improvements are real. But trust in critical systems is earned in decades, not RFCs.
