Degrees are unquestionably hard work, and expensive. But they are just as unquestionably worthwhile if you are definitely going into the field you're studying and plan to work in it long-term. If your goal is just to write code, then the degree is overkill. If you want to build/plan/design/architect, it'll be a definite asset.
I have a Bachelor's and Master's degree in Computer Science, and I've been in the workforce for 30 years. I earned my Bachelor's before starting work, and then worked on my Master's while employed full-time.
I've worked with a number of folks with degrees in other fields, and a number with no college degree at all. For most workplace tasks, I don't think those folks were at any disadvantage compared to me.
At work, I've very rarely used the specific programming languages I learned during my Bachelor's. But the fundamentals I learned, the theory behind the programming, made it relatively easy to pick up new tools/languages quickly. I've frequently been the one on the team to grasp a new concept or tool and then teach the others how to apply it to the problems we were solving.
Where I've seen the biggest difference is in the design/architecture phase. Judging only from my personal experience, folks without training in CS do seem to struggle with the more theoretical tasks. In my career, I have frequently found myself taking the lead on system architecture & code design and letting others do the implementation. That may just be me, or that may be my education; I think it's probably both.
UX/UI Designer and Developer, intern at IBM. Loves studying, from Astrology to Computer Science to Art. Dog person, interested in sci-fi, writing enthusiast.
From what I've seen, most self-taught developers focus on making things work, while CS people will sit down and think about theory and trade-offs, although I've seen the opposite happen.
I think it depends on how much effort you put into understanding how the computer handles your source code (things like compilation, computer architecture) and how to improve it (algorithm optimization, data structures, heuristics, AI), which is what a CS course forces you to do throughout its duration. If you're self-taught, you have to cover these topics on your own, which leaves room for mistakes and misinterpretation if you're not careful.
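To make that data-structures point concrete (a toy sketch of my own, not from any particular course): the same one-line membership check "works" either way, but knowing what's underneath tells you why one version scales and the other doesn't.

```python
import timeit

# Same task either way: check whether a value is in a collection
# of 100,000 items. Only the data structure behind it differs.
items = list(range(100_000))
needle = 99_999  # worst case for the list scan: the last element

as_list = items       # list membership is a linear scan: O(n)
as_set = set(items)   # set membership is a hash lookup: O(1) on average

# Time 1,000 lookups against each structure.
list_time = timeit.timeit(lambda: needle in as_list, number=1_000)
set_time = timeit.timeit(lambda: needle in as_set, number=1_000)

print(f"list: {list_time:.3f}s   set: {set_time:.3f}s")
# On a typical machine the set lookup is orders of magnitude faster,
# even though both membership checks read almost identically.
```

Both versions return the same answer, which is the "making things work" instinct; the difference only shows up as the input grows, and that's exactly the kind of thing formal training teaches you to see coming.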
In general, it's just two different skill sets; in the end, it largely comes down to the individual to learn these things either way.
Valid point!