More than 20 years ago now (wow, I'm old), I was in the middle of a computer science program at my alma mater, Carnegie Mellon University.
And, ...
As someone who learnt C as a first language only 2 years ago, I think it can be good. Although it depends. If you learn it at university, like I did, then it's great, because they teach you in a structured way and you get the help you need. Self-learning, not so much. It can be hard to know what to learn in what order and to find the right resources to explain concepts. You may just end up more confused than when you started.
So my conclusion is that if you can find someone to teach you C then it's good. Otherwise you may be better off sticking to some easier language.
I'm honestly just kinda tickled that, all these years later, they're still teaching C at colleges.
The share of students who actually use that knowledge outside of uni is probably a very small sliver, however. There's not a lot of need for C/++ for most things people are building nowadays (see: CRUD business apps, SaaS web apps, etc.).
I wish Rust had existed when I started learning development, instead of starting with C, because I appreciate its memory model way more and its compiler messages are amazing.
"How many students will actually use X after university" can be applied to any course. The point isn't to teach a useful technology, but rather to teach the fundamentals of how programming languages and computers work. I know there are several more parts needed to cover it all, but learning C is a very good place to start. Even though I hate C and will hopefully never use it professionally, the things it has taught me will always be there in the back of my head when programming.
I can't comment on Rust as I've never used it.
This. So much this.
"Learn C the Hard Way" by Zed Shaw is an excellent path if you're not in university.
This is the best take I've seen on the subject, well said.
New programmers should not learn C/C++ unless they want to. I recently wrote about this very topic here. There's nothing inherently wrong with learning C/C++, and you should 1000% learn it if you want to; the problem is with the community of developers telling newbies that they must learn C/C++ or they're not real programmers, because of made-up reasons.
For example, I saw an answer on Quora once saying you're not a "real" programmer unless you understand some snippet of code, and the code was a very long pi calculator formatted to look like the pi symbol. Who does that help? Furthermore, just about every sentence in a book like "Learn C The Hard Way" is condescending about C in the context of other programming languages, with the background noise of "you're not a real programmer until you've mastered this language."
Telling people they must learn C/C++ as their first or second language has become a form of gatekeeping, or at least cookie-cutter go-to advice that people give without thinking. In fact, I think most of the people telling newbies that "learning C will make you a better overall programmer" don't even know the language themselves, or have not been programming for very long.
I agree with this. I had written a few words about gatekeeping specifically, but eventually discarded most of my post because I didn't want to derail the thread. Such attitudes turn people away from the community. It is not wise, even if they are CS students.
I think one should definitely start with C/C++.
Doing things like memory allocation, garbage collection, and keeping track of stray pointers manually teaches you how these things work, and thus how to improve performance; a new programmer could pay no attention to them at all when starting off with some modern language.
Same goes for data types, and the fact that you have to manually write many functions too.
This was essentially CMU's reasoning for introducing us, as students, to C/C++. We re-implemented malloc(), wrote a garbage collector, implemented RSA encryption, and did several other 'basic' things.
I could definitely chalk my undergrad experience up to "those educators knew how to get us jobs." But, if I'm being perfectly honest, more than 60% of our initial undergrad cohort failed out and more than 90% of us either failed out, or re-majored.
So it wasn't "here's how CS kiddies will do well." :( It was... something else for those of us who came out :)
I can't say learning C++ is the right choice, but it is beneficial to know. I started with ZZT-OOP, maybe some Perl. But one thing I had been doing was reading "Teach Yourself C++ in 21 Days". I never finished it and didn't really write any C++, but that was a helpful book overall.
If you started with "maybe some Perl," you are immediately and unquestionably an OG on this site and forever.
Don't let the young-uns faze you. There's nothing that can be expressed in any client side language that you haven't seen, grokked, and dismissed as a Perl regex or some shit. :D
Helpful in what ways, regarding the book?
This was the 2001 edition. I like some of the analogies used to explain things, like speed dial for memory addresses (pointers). It did a good job of building on the previous examples. It has been a long time, but The D Programming Language by Andrei is the only other book I feel drives home the principles of programming as part of the content.
There is merit in both approaches. Or at least I understand what most opinions are trying to convey. But I believe there must be a balance between learning depth and actual progress or productivity.
In the field of biological sciences, we do not start by teaching the students the intricacies of molecular genetics and metabolic regulation. That would be incredibly overwhelming. Instead, we start from the basics, build a solid knowledge foundation that we know they will continue to use in their future careers (and thus won't easily discard or forget), whatever these may be.
Having worked as an instructor for new programmers, I have a strong opinion about this: definitely not. C++ is incredibly daunting for anyone who hasn't already been working with computers in some way. Memory pointers are where every single one of them comes to a screeching halt and likely fails the class; I've seen a lot of really smart kids change majors because my stupid university teaches C in comp sci 1, to mostly engineering majors. From what I've seen, beginners without familiarity with computers or programming hate the syntax, and it's very cryptic to people who don't already know how languages work. That's not to say I think they shouldn't learn it: it teaches great fundamentals of programming and techniques that can carry over to Java, Python, and JS (🤢) if they don't actually want to go into a systems design job.
The problem I have with C/C++ is not that they are difficult, but that teaching them does not lead to reliable software. There is not enough time in any undergraduate curriculum to go into the undefined behaviors in C/C++, and those come back to bite the whole industry in the form of CVEs.
For every CS undergraduate today I hope they can finish Software Foundations from University of Pennsylvania. Hopefully every engineer, too. The course teaches you Coq, a formal verification system, with a functional formal specification language and a procedural proof language. This is truly how a computer works. Assembly is after this, because the circuit logic of a working CPU architecture needs to be formally verified in a similar system before the chips can go into production.
Coq teaches you how to reason about software. This is the biggest struggle I have witnessed. A student with no prior knowledge cannot tell a keyword from a function, a variable from a type, or a function from a variable. It is a whole new language. They need to learn by trial and error. We all did. But in C/C++, simple mistakes can lead to cryptic error messages, sometimes referring to internals unrelated to the topic at hand. The best students Google their way out, while the rest are left behind. With Coq, the interactive reasoning environment guides students properly through their trials and errors, while the logic system is permissive enough to model the topic in a non-distracting way.
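For readers who have never seen an interactive prover, this is roughly what a machine-checked statement looks like (shown here in Lean rather than Coq; Gallina's syntax differs, but the idea is the same — the system rejects the file unless the proof really establishes the claim):

```lean
-- The prover checks every step: if the term after := did not actually
-- prove a + b = b + a, Lean would report the unsolved goal immediately.
theorem plus_comm (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

That immediate, precise feedback on each attempted step is the "proper guidance through trial and error" described above, in contrast to a C compiler's template-instantiation walls of text.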
Software Foundations starts from basic logic, then goes on to teach a theory of programming languages. A programming language is the tool we use as developers; not knowing it thoroughly is like a craftsperson not knowing their tools. A lighter, informal alternative to this is Structure and Interpretation of Computer Programs. After that, it introduces reasoning about software systems built on top of the language developed in the earlier chapters.
Although a managed language helps a lot, it does not cover every use case of an unmanaged language. That's where Rust comes in. Rust has apparently been enough for many companies to replace C/C++. Rust teaches you how to reason about memory lifetimes and concurrent memory access. University professors are creating courses for Operating Systems written in Rust, such as UW CS451 and Stanford CS140e. The error messages from the Rust compiler are not only comprehensible, but also helpful.
For fields not requiring the stability of a long running server or infrastructure, such as computational art, data analysis and visualization, and business process mining, C/C++ is just a mismatch. They are much better off learning Python or JavaScript to boot.
Starting with C/C++ is like starting in the middle and is not an optimum approach to a curriculum IMO.
If you want to build someone up from the basics, the curriculum should probably start with computer architecture (which was a 3rd or 4th year class when I was in college), then assembly, then C, C++, and so on. This kind of approach shows you the progression of abstraction and how we got to where we are now.
But if you want to start with problem-solving, the curriculum should start at a higher level, like python. Then drill down into the details. Like an assignment which purposely leads the student to cause a stack overflow exception. That paves the way for future topics on hardware constraints and language design.
Starting with C/C++ attacks the student from both spheres at once. I believe an incremental approach (from either direction) would better accomplish the goal of educating students.
My own college curriculum started with C/C++, but my experience supports my hypothesis above. I had the benefit of taking Pascal in high school, which was a higher-level approach to structured programming. In my first college programming class (which was C), the professor stated it would be a weeding-out class where 50%+ would fail or drop. I showed up exactly 3 times to lecture - first day, midterm, and final - and was an unofficial lab assistant because people asked me to help when I finished early. I made an A, but not because it came naturally to me. It was because I already had a foothold on the problem solving from a higher-level language.
I think C/C++ is too hard for beginners. The cognitive load will be too high, and many could drop programming because of that.
Learning with a higher-level language (Python, for example) can help a total beginner understand the basics without the language (including the syntax) getting in their way.
Once the developer becomes efficient with the tools they need to develop at a company, without a deeper understanding of what those tools really do, that's when they may become more interested in going down the abstraction stack.
Ooof. All that gatekeeping from a philosophy major.
I would pick a third option: learn C or C++, but don't limit yourself to one language. At uni we covered several languages (Haskell, C, Java, assembly) with some overlap; every time we started a new one, it felt like we were progressing faster.
I think being exposed to a low level language and having to work through some of the pain is worth the effort, but new programmers shouldn't be stuck with that until they've mastered the language. I'd say do enough with memory management and pointers so you can appreciate an "easier" language like Python.
I didn't start with C/C++, but with something simpler. In my time, that was Basic, Fortran, and Turbo Pascal. But I always wanted to develop with C/C++, and I did. I think it does not really matter which programming language you learn; what matters is which technologies you know. In the end, who cares how pointers and multiple inheritance work? But, for example, if you don't know anything about event-driven or asynchronous programming, knowledge of syntax may be worth nothing. And when we adopt new technologies, we all learn new programming and scripting languages.
And in the end, C++ is not so complex, but it is very flexible. Some people love C++, others hate it. It is better if newbies try C++ when they already have some experience, because they might discover another universe :)
So, my answer is No. Start with something easier :)
First, no.
Eventually, yes!
I had a class that used C++ and it was bad. I barely remember anything about what we were supposed to be learning because I had to spend all my time fighting the language.
Anything that gives you a new perspective has to be good. C does, as does Haskell. Maybe that’s a cliche but it’s true.
Stop treating C and C++ as one language. As of 2020, the concepts of the two are completely different.