The White House released a report this week encouraging the use of memory-safe languages over non-memory-safe languages, specifically calling out C/C++. This has obviously caused a stir in the programming community, especially considering most people's low opinion of the government's ability to produce quality software. But if you've used a computer in the last year, you've probably been subjected to poor-quality software made by the brightest minds in Silicon Valley, so it's a moot point.
Missing Points
What I find surprising in all of this discourse is that people have missed the point, intentionally or not. Memory-unsafe languages have footguns that lead to unsafe software, and that leads to CVEs that cause harm to real people. The context for the report is not your hobby project, nor your single-player 2D platformer. It's about software that powers governments and institutions, places that store large amounts of people's data. It's about companies that provide live services to millions of people and store their credit card numbers, phone numbers, addresses, names, and dates of birth. This is also not about it becoming illegal to program in memory-unsafe languages. To sum it up:
Programmers writing lines of code do not do so without consequence; the way they do their work is of critical importance to the national interest.
The Enduring Legacy of C
A lot of the discourse revolves around C, with the main point being that you will never get rid of it. That the world runs on C, your operating system runs on C, your programming language runs on C, and so on. I know how important C is; I've written multiple articles that feature C prominently, and I'll probably write some more:
- Tracing the Lines: From the Telephone to Unix
- C Strings and my slow descent to madness
- Understanding the Compilation Process in C: A Step by Step Guide
- SDL Tutorial Part 1: Opening A Window
There are a lot of reasons for C's dominance in the modern era, but the one that sticks out to me is that it is the lowest common denominator for just about everything. If you write something in C, it is straightforward in most languages to bind to it through an FFI. That means if I, for instance, come across a cool C library that does exactly what I need, I can just bind the functions in my Python code without knowing all the nitty-gritty details of how it's implemented. I can do it again in Go, and Java, and C#, and on and on. Just look at the list of programming languages that have Raylib bindings to see how powerful a concept this can be.
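To make that concrete, here is a minimal sketch of what the C side of such a binding can look like (the library name and function are invented for illustration):

```c
/* mylib.c: a tiny, hypothetical library we might want to call from
 * other languages. Build it as a shared library, for example:
 *   cc -shared -fPIC -o libmylib.so mylib.c
 */

/* A plain C function with a simple signature is easy for almost any
 * language's FFI to describe: an int in, an int out. */
int mylib_add(int a, int b) {
    return a + b;
}

/* From Python, no wrapper code is needed at all:
 *   import ctypes
 *   lib = ctypes.CDLL("./libmylib.so")
 *   print(lib.mylib_add(2, 3))   # prints 5
 */
```

The same shared object can be loaded from Go, Java, C#, and most other languages through their own FFI mechanisms, with no changes needed on the C side.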
C's Simplicity and Its Challenges
But C's greatest strength is also its greatest weakness: it's simple. My 2nd edition of K&R is just shy of 300 pages, but a good chunk of that is a reference manual and a chapter on lexical conventions that can be skipped on a first read-through. You could skim through the book in a day. But while the syntax may be simple, there are a lot of concepts in C that are tricky to get right, especially in large programs (a small example follows the list below):
- Pointer Arithmetic
- Memory Management
- Headers/Preprocessor
- Macros
- Undefined Behavior
- Strings and Cross Platform string behavior/support
- Actually building large projects
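Here is a small, contrived sketch of how quietly two of these footguns go off; both bugs compile without complaint:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    char buf[8];
    /* Writes far past the end of buf: a classic buffer overflow,
     * and undefined behavior. The compiler accepts it silently. */
    strcpy(buf, "this string is much too long");

    char *p = malloc(16);
    if (p == NULL) return 1;
    free(p);
    /* Use-after-free: the program may crash, appear to work,
     * or quietly corrupt memory. */
    p[0] = 'x';

    printf("%c\n", p[0]);
    return 0;
}
```

Tools like AddressSanitizer (the -fsanitize=address flag in GCC and Clang) will catch both of these at runtime, but only if you think to reach for them.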
And you don't have to take my word for it: here is C++ being brought up again as an alternative to C in the Linux Kernel just this year (2024), by someone who has programmed more C than I ever will in my lifetime. And here is a page that details GCC's transition to C++ (the irony is not lost on me). If these people are the best C programmers in the world, what chance do I have? Even John Carmack, on the Lex Fridman podcast, talks about being surprised by how many mistakes he found when he attacked his codebase with sanitizers and fuzzing.
You might ask: well, can't we just make C a better language? The answer is yes, but that takes time. I'm really looking forward to many of the changes in C23, but it will be many years before they are widely available across MSVC, Clang, and GCC. And getting any feature into C is an exercise in frustration. You only have to read JeanHeyd Meneide's article titled Finally, Embed in C23 to understand how thankless a task this is. I'll give you one excerpt, but I've seen this same sentiment time and time again when it comes to improving the language:
It’s deeply depressing and ultimately a great source of burnout being at the grindstone for 4 years for things people were casually discussing about in September of 1995 (and earlier). It’s almost as depressing as putting typeof in the C Standard and then realizing this was something they’d been discussing doing since after C89 (1989). Am I destined to just keep closing the loop on old, unrealized dreams because a bunch of people were too tired/scared/busy/combative to standardize what has literally been decades of existing practice?
It was hard to realize how tired I was of this kind of grind until the day the feature was put into the C Standard, this past Tuesday. I quite literally could not even muster a “yes!” after the vote finished. I just felt empty and numb, because quite literally dragging an entire community of implementers through several hurdles, to finally get them to acknowledge the existence of a problem and its solution, is just… soul-crushing. It is a level of effort that I recommend to nobody in any sphere of life, for any task. At least when you are on the ground and organizing and helping people, you’re providing some kind of material value. Water. Food. Improving something. Here? I’m talking about things that are quite literally older than I am. Just trying to bring ideas from the last 3 decades - things you would think were Table Stakes for foundational languages like C and C++ languages - would not be seen as a radical new paradigm or shift in the language. Nevertheless, I spent (burned?) that energy, and finally. It's in.
That doesn't sound like a fun process to me...
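To be fair, the features themselves are worth having. Here is a rough sketch of two C23 additions; "icon.png" is just a placeholder file, and you'll need a recent C23-capable compiler:

```c
#include <stdio.h>

/* C23's #embed includes a file's bytes in the program at compile time,
 * replacing the old "generate a C array with a script" workflow.
 * "icon.png" stands in for any file next to this source file. */
static const unsigned char icon_png[] = {
#embed "icon.png"
};

/* typeof, standardized in C23 after decades as a compiler extension. */
#define SWAP(a, b) do { typeof(a) tmp_ = (a); (a) = (b); (b) = tmp_; } while (0)

int main(void) {
    printf("embedded %zu bytes\n", sizeof icon_png);

    int x = 1, y = 2;
    SWAP(x, y);
    printf("x=%d, y=%d\n", x, y);
    return 0;
}
```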
Now, if you're still here, I want to make it clear that there are qualities of C that I admire. It's an important language in the annals of programming history, and it is well worth learning. I love C's Spartan nature when I'm practicing algorithms and data structures, and I appreciate that it produces software that is small and fast. Today's modern programming languages have done a shit job of filling C's shoes, which is why it's still around so prominently. Of all the programming languages out there, Zig is the one I'm most hopeful about as a replacement for C in what C does best. And it does so while being safer and easier to cross-compile and build, without adding a ton of complexity to the syntax the way languages like Rust, D, and C++ do. Now to my main point...
Envisioning a Post-C World
It is worth noting that C didn't appear on the scene until 1972. That means C has been around longer than roughly 60% of the population of the United States, and there are plenty of programmers who remember a world where C didn't exist. C was also largely absent in the early 8-bit era compared to languages like BASIC, Pascal, and assembly, because it was difficult to compile on that hardware. So really, it's only been about 45 years that C has been a major player in the programming space. But people keep repeating the same mantra over and over again, "C will never die", which feels like a shocking amount of small-mindedness coming from the programming community. I thought we were supposed to be innovative thinkers, problem solvers, and engineers. Must we forever be shackled to a programming language we created 50 years ago, in a field that is only 75 years old? Is it really not possible to envision a world that isn't powered by C or C++, even if it takes another 50-100 years?
Two years ago, no one was talking about Large Language Models; it felt like we were still 30 years away from anything remotely resembling a useful AI assistant. Now my church is using AI-generated images from ChatGPT in its services, and Nvidia has doubled its stock price in a year. Eight years ago, Visual Studio Code was released, and now it is the most popular text editor by far. The programming space can move quickly if the value is there, and I believe that moving away from C is one of those things. If our best years of programming are still to come (and I hope they are), and we expect more code to be written in the next few years than at any other point in history, then the percentage of C code should dwindle every year. If the financial incentives are there, and governments start favoring memory-safe languages, then certain companies will have a strong incentive to comply.
Replacing C does not mean erasing its legacy. I will continue to write about C and tell its part in the ever-evolving history of programming, and I hope you will continue to enjoy the story as it is written.
Call To Action
Hi 👋 my name is Diego Crespo, and I like to talk about technology, niche programming languages, and AI. I have a Twitter and a Mastodon, if you'd like to follow me on other social media platforms. If you liked the article, consider checking out my Substack. And if you haven't already, why not check out another article of mine listed below? Thank you for reading and giving me a little of your valuable time. A.M.D.G
Top comments (4)
Your post could do with a bit of editing to trim it down; it seems a bit repetitive. Also, consider splitting it into sections, with each section making a clear point rather than just running on.
One thing you didn't mention is that modern C++ (which is C++ ≥ 11) can, with some discipline, be used in a more memory-safe way via things like never using either new or delete explicitly and instead relying on std::unique_ptr and related classes.
I touched a little bit on this with the Linux Kernel talk and GCC's move to C++, but I could have expanded on it. I definitely think that C++ is safer than C and that C++11 and later is a great way to achieve that. But from my observations, the C++ community is not in agreement that modern, safe C++ is the way forward. And I have recently been swayed to believe the same.
For the first link, it's not clear which of the 4 posts it links to is the one I'm supposed to read. For the second link, the fault isn't C++'s, since the smart pointers are being turned back into raw pointers and handed off. That's what I meant by "with some discipline", which those programmers didn't exhibit.
Unless your entire software stack is written in a memory-safe language, at some point you're going to end up calling something written in C. If you do that wrong, the memory-safe language won't save you.
For projects written in "old" C++ — or even C — it's a lot less work to modernize them by using "modern" C++ than it is to rewrite everything in a memory-safe language.
Sure, if you're going to start a brand new project from scratch, strongly consider a memory-safe language; but you can still do better with the legacy code.
Here's a pragmatic-ish approach to the topic. The time when I wrote C code most intensively was during university, more than two decades ago, and I still love the language to this day. Since then I've written C here and there, always to solve very specific use cases, but I've mostly used higher-level languages (C#, Go, Dart), with which I've successfully implemented lots of business applications. I have never made money writing an entire application in C or C++, and I don't see any good reason to use those languages as the main language for business application development.
However, I've used FFI to reuse C code, and I've also used FFI to call Go code from Dart, with very minimal or no C code in the middle (tech.craveiro.pt/go-for-dart-with-...). So you do have a great point about FFI, even if neither language is actually C. That said, I hope WASI will take over that role in the future.
That said, I can't imagine the Linux kernel being rewritten in a memory-safe language, or XNU, or the Windows NT kernel... you get the idea. But Linus Torvalds seems to be OK with Linux drivers being written in Rust.
So, in conclusion, I would say that C isn't going away as a systems programming language, but it should go away as the glue between other languages, and, except for maintaining existing applications, it shouldn't be used for application development at all. (I am not including embedded systems, IoT, and so on in the discussion, as I have no experience with those and therefore no valid opinion.)
I still think it should be the first language every developer learns, though, as it creates great foundations.