Industries Where Your C Code Saves Lives (And They're Hiring)

Between 1985 and 1987, a radiation therapy machine called the Therac-25 killed three people and seriously injured three others. The machine was supposed to deliver controlled doses of radiation to cancer patients. Instead, it delivered doses estimated at roughly a hundred times what was intended.

The cause? A race condition in the machine's control software. When an operator edited the treatment parameters too quickly, the software could leave the beam at the high-intensity current used for X-ray mode while the hardware was positioned for direct electron treatment, so nothing was in place to spread or attenuate the beam. Patients received massive radiation overdoses.

One victim, a 33-year-old woman, felt an intense burning sensation during treatment. She died from radiation poisoning. Another patient, a man receiving treatment for a tumor, received such a massive dose that he developed radiation burns throughout his body. He died five months later.

The Therac-25 disaster wasn't caused by sophisticated hackers or nation-state attackers. It was caused by a simple bug, the kind of concurrency error that CERT C's concurrency rules are specifically designed to prevent. (The Therac-25's software was hand-written assembly, not C, but C code is exposed to exactly the same class of race condition.)

This is what happens when the code controlling life-critical systems has bugs. People die. And the industries that learned this lesson the hard way are now desperate for developers who understand how to write secure C code.

Let me show you which industries need CERT C expertise—and more importantly, why their software can't afford to have the vulnerabilities that secure coding practices eliminate.

Aerospace: Where Every Line of Code Gets Audited

Flight control software runs on embedded systems with no room for error. You're at 35,000 feet when the autopilot makes a calculation. That calculation depends on sensor data stored in memory. If a buffer overflow corrupts the memory location next to that sensor data, the autopilot is now making decisions based on garbage values.

Those of you educated in secure coding in C will immediately recognize this scenario. The gets() function copies characters into a buffer with no regard for the buffer's size. Read input with gets(), or index into an array without validating the bounds, and you can overwrite adjacent memory. That adjacent memory might be storing your altitude. Or your airspeed. Or your pitch angle.
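Here's a minimal sketch of that failure mode. The struct layout and field names are invented for this post; real avionics code is nothing this simple, but the mechanics are identical:

#include <stddef.h>

struct sensor_state {
    char raw_frame[16];   /* incoming sensor frame */
    double altitude_ft;   /* flight-critical state sitting next to it in memory */
};

/* Noncompliant: the index comes from outside and is never checked. An index of
   16 or more silently scribbles over altitude_ft. */
void store_byte(struct sensor_state *s, size_t index, char value) {
    s->raw_frame[index] = value;
}

/* Compliant: reject out-of-range indices before writing. */
int store_byte_checked(struct sensor_state *s, size_t index, char value) {
    if (index >= sizeof s->raw_frame) {
        return -1;   /* refuse the write instead of corrupting adjacent state */
    }
    s->raw_frame[index] = value;
    return 0;
}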

Aerospace companies mandate CERT C compliance because they've debugged these exact bugs in code that was controlling actual aircraft. This isn't theoretical—it's the accumulated knowledge from decades of incidents where software bugs caused crashes.

When aerospace engineers read CERT C Rule ARR30-C—"Do not form or use out-of-bounds pointers or array subscripts"—they're not reading a style guideline. They're reading a rule written in blood.

Automotive: Your Car is a Computer on Wheels

I want you to imagine something. You're driving on the highway. Your adaptive cruise control maintains distance from the car ahead. The distance sensor returns a measurement. The code multiplies this measurement by a constant to calculate braking force.

What happens when that multiplication overflows?

In C, when you multiply two integers and the result exceeds the maximum value the type can hold, you don't get an error. With unsigned types the value silently wraps around; with signed types the behavior is undefined, and on most compilers it wraps in practice. A large positive number becomes small. Or negative, if you're unlucky. Then that negative value gets converted to unsigned, and suddenly you have a massive number that makes no physical sense.

Your car's computer thinks the vehicle ahead is either impossibly close or impossibly far away. It calculates braking force based on corrupted data. Either it slams the brakes when it shouldn't, causing the car behind you to rear-end you at highway speed—or it doesn't brake at all, and you slam into the car ahead.
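Here's roughly what that looks like, and what CERT C rule INT32-C ("Ensure that operations on signed integers do not result in overflow") asks for instead. The function names, units, and the idea of computing braking force as distance times a gain are all invented for illustration:

#include <stdint.h>

/* Noncompliant: if the product doesn't fit in 32 bits, signed overflow is
   undefined behavior, and in practice the value silently wraps. */
int32_t braking_force_bad(int32_t distance_mm, int32_t gain) {
    return distance_mm * gain;
}

/* Compliant in spirit with INT32-C: prove the multiplication cannot overflow
   before performing it. Assumes both operands are non-negative, as physical
   readings should be. */
int braking_force_checked(int32_t distance_mm, int32_t gain, int32_t *out) {
    if (distance_mm < 0 || gain < 0) {
        return -1;   /* negative distance or gain already means corrupted data */
    }
    if (gain != 0 && distance_mm > INT32_MAX / gain) {
        return -1;   /* the product would overflow; report an error instead */
    }
    *out = distance_mm * gain;
    return 0;
}

The division-based check works because, for non-negative operands, distance_mm * gain exceeds INT32_MAX exactly when distance_mm is greater than INT32_MAX / gain.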

Modern vehicles run millions of lines of C code controlling anti-lock brakes, airbag deployment, electronic stability control, and autonomous driving features. The automotive industry mandates secure coding practices because unchecked arithmetic on sensor data is exactly the kind of defect that ends in a recall, or worse. Plenty of automotive engineers have lost weeks to failures that traced back to a multiplication nobody checked for overflow.

Medical Devices: Code Running Inside Your Body

If automotive software is dangerous when it fails, medical device software is worse. Because when medical device code has bugs, there's no quick patch. Updating firmware on an implanted device can mean a hospital visit, sometimes surgery. The code is running inside someone's body.

A pacemaker is a small computer implanted in your chest. It monitors your heart rhythm and delivers electrical impulses when needed. The code running on that device is written in C because the hardware is resource-constrained and the firmware needs direct, deterministic control over memory and timing.

Now imagine the code has a null pointer dereference. In a desktop application, you'd get a segmentation fault, the program would crash, and you'd restart it. On a bare-metal pacemaker there's no operating system to catch the fault: the behavior is undefined, and the device may lock up or keep running with corrupted state, no longer monitoring heart rhythm either way. If the patient has a life-threatening arrhythmia in those seconds, there's no one watching. No one delivering the electrical impulse that keeps them alive.
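CERT C rule EXP34-C ("Do not dereference null pointers") targets exactly this. A minimal sketch of the defensive pattern, with invented names and a deliberately simplified fault policy; real pacemaker firmware is far more involved:

#include <stddef.h>

struct rhythm_sample {
    int beats_per_minute;
};

/* Noncompliant: if the sensing routine handed us NULL, this dereference is
   undefined behavior, and on bare metal there is no OS to catch it. */
int needs_pacing_bad(const struct rhythm_sample *s) {
    return s->beats_per_minute < 40;
}

/* Compliant: a missing reading is handled explicitly so the caller can drop
   into its defined safe state instead of crashing. */
int needs_pacing(const struct rhythm_sample *s) {
    if (s == NULL) {
        return -1;   /* no data: signal a fault, don't dereference */
    }
    return s->beats_per_minute < 40;
}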

Or consider an insulin pump. It calculates insulin delivery based on blood glucose readings. The glucose meter sends data to the pump. If the input parsing code has a buffer overflow, an attacker could inject malicious data—but even without an attacker, a simple bounds checking error could cause the pump to misread glucose values. Too much insulin delivered means hypoglycemic shock, seizure, coma, death.
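The defense is unglamorous: parse strictly and reject anything implausible before it ever reaches the dosing logic. A sketch, where the text format and the plausible range are assumptions made for the example:

#include <errno.h>
#include <stdlib.h>

/* Parse a glucose reading (mg/dL) sent as a decimal text field. The value is
   stored only if it parses cleanly and falls in a physically plausible range;
   anything else is rejected instead of being acted on. */
int parse_glucose(const char *field, long *mg_dl) {
    char *end = NULL;
    errno = 0;
    long value = strtol(field, &end, 10);
    if (errno != 0 || end == field || *end != '\0') {
        return -1;   /* not a clean, complete integer */
    }
    if (value < 20 || value > 600) {
        return -1;   /* outside any plausible reading: treat as sensor/link failure */
    }
    *mg_dl = value;
    return 0;
}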

Medical device manufacturers face strict regulation precisely because these bugs kill people. The Therac-25 disaster taught the industry that software safety has to be engineered, not assumed, and the rigorous practices modern manufacturers follow grew out of that tragedy.

CERT C exists to eliminate undefined behavior before code gets implanted in someone's body. Before it's trusted to keep them alive.

Industrial Control Systems: When SCADA Fails, Cities Suffer

Beyond individual safety, there's another domain where C coding bugs threaten lives at scale: the infrastructure systems that keep civilization running.

Industrial Control Systems weren't designed for security. They were designed in the 1970s and 80s when "security" meant a chain-link fence and a guard at the gate. The control software was written in C with zero consideration for security because these systems weren't networked.

That's changed. SCADA systems now connect to corporate networks, which connect to the internet. A water treatment plant's SCADA system is potentially accessible to anyone who can compromise the network perimeter.

Here's what a format string vulnerability can do to a water treatment plant:

The SCADA software logs sensor readings from the chlorine monitors. The logging code uses printf() with user-controlled input directly in the format string—a classic mistake. An attacker sends a crafted sensor reading that includes format specifiers like %x to leak memory or %n to write to arbitrary memory addresses.

The %x specifiers dump memory contents, potentially revealing cryptographic keys or the state of safety systems. The %n specifier writes values to memory. An attacker can now modify the SCADA system's state. They disable the chlorine monitors that detect when water isn't properly disinfected. Contaminated water flows to thousands of homes. People get sick. Some die.
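The fix is almost embarrassingly small, which is part of why the bug keeps appearing. A sketch of the noncompliant and compliant logging calls; the function is a stand-in, since real SCADA packages log through their own libraries, but CERT C rule FIO30-C ("Exclude user input from format strings") applies either way:

#include <stdio.h>

/* Noncompliant: the reading itself becomes the format string, so any %x or %n
   it contains is interpreted. */
void log_reading_bad(const char *reading) {
    printf(reading);
}

/* Compliant: the input is printed strictly as data, never as a format string. */
void log_reading(const char *reading) {
    printf("chlorine sensor: %s\n", reading);
}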

In 2021, an attacker accessed a water treatment facility's SCADA system and attempted to poison the water supply by remotely changing chemical levels. The attack was detected and stopped—but it demonstrated that these systems are accessible and vulnerable.

Format string vulnerabilities aren't coding style issues when you're writing SCADA software. They're potential weapons that attackers can use to kill people.

IoT Devices: Billions of Targets

Smart locks on apartment doors. Baby monitors in bedrooms. Industrial sensors in manufacturing plants. Medical devices in hospitals. These devices are projected to reach 39.6 billion by 2033.

Most run embedded C code on processors with no memory protection, no operating system security features, no runtime bounds checking. When there's a bug in the code, there's no safety net. A buffer overflow in a smart lock's firmware means attackers can remotely unlock doors. A use-after-free bug in an industrial sensor means attackers can manipulate the readings that automated systems depend on.

IoT devices experience 5,400 attacks per month on average. These attacks succeed because developers used strcpy() instead of strncpy(). Because they didn't validate array indices. Because they assumed input would be well-formed.

Take a look at a simple example. You're writing firmware for a smart thermostat. The device receives temperature setpoint commands over the network. Your code looks like this:

char buffer[64];
gets(buffer);  // Read the setpoint command

Those of you educated in secure coding in C see the problem immediately. The gets() function, the same function that enabled the Morris worm back in 1988, copies input into the buffer regardless of size; it is so dangerous it was removed from the language entirely in C11. An attacker sends 200 bytes. Your 64-byte buffer overflows. Adjacent memory gets corrupted. That adjacent memory might be the authentication token. Or the network credentials. Or function pointers that the attacker can redirect to malicious code.

The device is now compromised. Multiply this by millions of IoT devices all making the same mistake, and you have a massive botnet capable of taking down internet infrastructure.
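The fix costs two lines. A sketch of the same read done the way CERT C's string rules (STR31-C among them) require, using stdin as a stand-in for whatever interface the firmware actually reads commands from:

#include <stdio.h>
#include <string.h>

/* Read a setpoint command of at most size - 1 characters. A 200-byte attack
   payload gets truncated instead of overflowing into adjacent memory. */
int read_setpoint(char *buffer, size_t size) {
    if (size == 0 || fgets(buffer, (int)size, stdin) == NULL) {
        return -1;   /* read failed: don't act on whatever is in the buffer */
    }
    buffer[strcspn(buffer, "\n")] = '\0';   /* strip the trailing newline, if any */
    return 0;
}

If the command doesn't fit, it is truncated rather than smeared across adjacent memory, and the caller can decide whether a truncated command should simply be rejected.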

CERT C rules force you to write code that's secure even when attackers are actively trying to exploit it. They eliminate the assumptions that lead to vulnerabilities.

Why These Industries Need You

82% of software vulnerabilities come from coding errors. Not zero-days. Not sophisticated exploits. Just bugs in the code.

The CERT C Coding Standard was developed with contributions from more than 1,900 security experts who analyzed real-world vulnerabilities and documented the coding practices that prevent them. Each rule addresses a specific class of bug that has caused actual security incidents or safety failures.

When aerospace companies require CERT C compliance, they're not checking a regulatory box. They're requiring proof that you won't repeat the mistakes that brought down aircraft. When automotive manufacturers require it, they're requiring proof that your code won't be the reason for a recall—or a fatality. When medical device companies require it, they're requiring proof that your code won't fail inside someone's body.

The Therac-25 killed people because of a race condition that proper coding practices would have prevented. Modern industries learned from that disaster. They know that undefined behavior in C isn't an academic curiosity—it's the difference between a working safety system and a death trap.

If you're learning CERT C, you're not just learning rules. You're learning the accumulated wisdom from decades of debugging failures that killed people. You're learning how to write code that saves lives.


Thanks for reading this blog!
