In 2013, I became demoralized in my career. I had left a fulfilling role as a web applications and services developer in the non-profit sector because I knew that my organization didn't follow good practices. After I joined a software company in the hope of learning how real software companies improve code quality, I instead found that things were not much better in the software industry than anywhere else.
A large majority of people working on software simply aren't aware of how to build large pieces of software in teams successfully. The Standish Group's 2015 CHAOS report shows that across a dataset of 50,000 projects worldwide, only 29% are considered successful. There are a lot of competing efforts to evangelize practices such as TDD, frequent commits, and continuous delivery, but even those interested in such practices don't often actually, well, practice them. I noticed that in one of Jez Humble's talks on continuous integration, he asks the audience how many of them are actually practicing it, and only three people put their hands up.
I was lucky enough that at this low point in my career, a colleague recommended the book Professional Software Development by Steve McConnell. (Thanks, Mark!) Among other things, McConnell uses the book to differentiate software engineering from computer science, and he makes a great argument for why professional software engineers should have a professional education. Here is one of the things he says, which I have permanently bookmarked in my copy:
When workers educated as computer scientists begin working on production systems, they often design and build software that is too frail for production use, or that's unsafe . . . The typical computer science graduate typically needs several years of on-the-job training to accumulate enough practical knowledge to build minimally satisfactory production software without supervision. Without appropriate formal education, some software developers work their entire careers without acquiring this knowledge.
This was hard to swallow at first. I had my bachelor's in computer science and six years' experience. Wasn't that good enough? But McConnell seemed to have put his finger on something I had witnessed time and time again in my own career. The experience of deploying software and seeing it melt down in front of users is something that is hard to learn in a computer science class. He had planted a seed that kept growing in my mind. Before long, the seed sprouted and I began looking to attend a software engineering program.
If you are a working software professional who is thinking about doing a master's degree, consider looking for a software engineering program instead of a computer science program. Computer science is great; don't get me wrong. You will learn deeply interesting theoretical aspects of computing. But if you want practical skills you can take back to your job, if you care about how to scale up software and software teams, if you want to learn what has worked in practice, in the actual industry, then a software engineering program is definitely something to check out. It was great for me, and I don't think you can go wrong with it.
This past June, I graduated with my Master of Software Engineering from Seattle University--exactly the same master's degree that Steve McConnell has, from the same program. It took a lot of work, sacrifice from my wife and children, and a not insignificant amount of money, but it was well worth the price.
The program exposed me to things I would have never encountered on the job, no matter how long I worked in software. I unexpectedly have experience with user interaction design, software finance, and software security risk management. I also have a deeper arsenal of principles, patterns, and practices for the activities associated with designing, implementing, testing, and delivering software.
Should everybody who works on a software team get a Master of Software Engineering? I won't go that far, but I can say that it was deeply beneficial for me, even though I already had more than five years' experience in the industry.
Would it be good for you?
Of course, only you can decide.
Top comments (18)
Instead of encouraging a graduate-level Software Engineering program, I would rather look at how we can promote learning the core topics of software engineering - requirements, design, testing, project management, process and methods, ethics, economics - in undergraduate education. As someone who has interviewed prospective interns and recent graduates for entry-level positions, I can say that there shouldn't be so many people completing undergraduate programs and entering industry with exposure to only a small fraction of the things they will end up encountering.
Another problem is the number of people from non-traditional educational backgrounds - graduate certificates, bootcamps, and the self-taught. I've worked with people who either don't have a college education or have an education (in some cases, a graduate degree) in an area other than computing. I would consider some of these people very good software engineers, but they still needed to learn a lot on the job. I'm unsure how to make these topics accessible to entirely self-taught developers. Perhaps it's simply a risk a company accepts when it employs someone from a non-traditional background: that person will be missing some topics taught in the classroom.
Those topics should be promoted - if you are getting a bachelor's in software engineering. If you are getting a bachelor's in computer science, then you should be focused on CS. A university is not a trade school and the whole point of it is not to get students ready for careers.
You're absolutely right that a university isn't a trade school. There's also value in computer science undergraduate education, for some people.
People who want a university education and to go into industry afterwards aren't typically served by a computer science degree, though. Completion of the program doesn't leave these people with the knowledge and skills needed to work in teams to build large and complex software systems.
There are far more undergrad CS programs than SE programs. Many students don't realize the difference and industry is worse off for it, since experienced engineers are now teaching university graduates things that they should have learned in the classroom.
I 100% agree with you. Part of the problem is trying to figure out a degree before you turn 18. It often does surprise me that more schools don't have SE degrees. I went to an engineering school for undergrad (did not do SE or CS) and my university didn't even have a SE degree, BS or MS.
Edit: Though part of me also thinks that maybe we shouldn't be expecting new grads to be ready for the job, in any field. Apprenticeships and formal mentoring should probably be bigger across the board. If you get a CS degree, you have a lot of knowledge that is very useful... but you still need more and, really, most of that is best learned on the job.
100% on apprenticeship, formal mentoring, and I'd add internship and co-op in university education.
Not every software engineer will have a university degree. But even those who have taught themselves or go to a boot camp can take advantage of apprenticeships and mentoring. In fact, I think that companies that offer students 3 to 6 month internships should be more open to people outside of degree programs participating.
Part of the problem is that industry, and the economy at large, doesn't really reward quality software development. I covered this in my article, "Are we forever cursed with buggy software?"
So long as ad-hoc solutions are capable of getting funding, being used by millions, and generating initial revenue, there is no incentive to build quality solutions. There is simply no competitive advantage to achieving more than the bare minimum for a product.
Granted, many companies will fail because they don't have a talent pool that achieves even that low threshold.
I wouldn't be surprised if the tolerance of the general public begins to drop, especially given some of the recent incidents. The sheer volume of data that these companies and applications have access to is staggering and can be rather personal. If data involving geolocation, purchase history, or personal files is leaked from Facebook, Amazon, or Google, I can see that being a huge incentive for companies to focus on quality.
But that requires an informed and engaged public...so...maybe not.
I couldn't agree more. A computer science degree alone is no guarantee that you can develop software effectively or efficiently.
Formal education in our profession has a terrible track record when it comes to producing graduates with the skills required to build software with desirable qualities.
With that said, there are definitely exceptions. But even then, it takes years of on-the-job training to turn a green developer into an experienced, productive developer with good judgment.
While I applaud Scottshipp's drive to get his master's, I pray that we can improve our skills and knowledge as a profession without going that far.
I'm a voracious reader of anything related to programming and software development and I can tell you that there is good stuff out there. It's just hard to find among all the tool-pushing, language-advocating, methodology-aggrandizing, opinion-filled internet.
There are a couple of problems as I see it:
Congratulations on your master's.
Part of the issue with research is that we don't know what we should be measuring. Metrics for quality and productivity are elusive in the software world. Test setup is also an issue, as all software projects tend to be unique in some way. There are a few domains with lots of inner similarities, like full-stack business pages, but that touches on only a fraction of programming.
I think this is closely related to your point about demonstrating skills, but I'm focusing more on the achievement of the team instead.
Is the challenge of finding acceptable quality and productivity metrics really that big of an issue for CS researchers?
When I think of all the crazy-hard questions scientists study (in all kinds of fields) and all the years they spend on one little area, I don't think CS research is particularly challenging or unique.
I just think that too few people and businesses really care about CS research.
I read about a hospital adopting a decision tree to triage patients with chest pain (seilevel.com/requirements/visual-m...). It saves lives, saves money, and increases quality of care. But the point I want to make is that someone took the time to study the subject and come up with a decision tree based on the evidence. It's not perfect but it's better than sending everyone with chest pain to the cardiac unit.
Why can't we have this kind of thing for TDD? It doesn't have to be 100% accurate, it just has to be better than what we have now, which is basically no scientifically validated guidance.
Imagine if we had a decision tree for TDD with 3-5 questions and it would spit out the probability that your project would benefit from TDD.
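Just to make the idea concrete, here is a minimal sketch in Python of what such a tool might look like. Every question and probability in it is a made-up placeholder, not an evidence-based value; building the real thing would require exactly the kind of empirical study data we don't currently have.

```python
# Hypothetical sketch of a "decision tree for TDD" - every question and
# probability below is a placeholder for illustration, NOT derived from data.

# Each node is either a leaf (a probability) or a dict with a question
# plus a branch for "yes" and a branch for "no".
TDD_TREE = {
    "question": "Will this codebase be maintained for more than a year?",
    "yes": {
        "question": "Is the cost of a production defect high?",
        "yes": 0.85,
        "no": {
            "question": "Are requirements stable enough to write tests first?",
            "yes": 0.65,
            "no": 0.45,
        },
    },
    "no": {
        "question": "Is this a throwaway prototype?",
        "yes": 0.15,
        "no": 0.40,
    },
}


def estimate_tdd_benefit(node):
    """Walk the tree by asking yes/no questions until a leaf probability is reached."""
    while isinstance(node, dict):
        answer = input(f'{node["question"]} [y/n] ').strip().lower()
        node = node["yes"] if answer.startswith("y") else node["no"]
    return node


if __name__ == "__main__":
    probability = estimate_tdd_benefit(TDD_TREE)
    print(f"Estimated probability this project would benefit from TDD: {probability:.0%}")
```

Three or four questions like that wouldn't be perfect, but even rough, validated numbers would beat the pure opinion we work from today.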
Our profession really is in the dark ages.
Medicine is surprisingly predictable and routine compared to software development. You can classify health issues, and outcomes, in a relatively straightforward manner. Hospitals across a region, a country, or a continent all generally deal with the same types of issues (with natural regional variance), since there are always humans in the equation with a similar culture.
In software we just don't know what indicates something is good or bad. Is it user satisfaction? Money made by the company? Data breaches, or security incidents? What exactly is the quality that we are measuring?
We don't have reliable ways of measuring the complexity of a feature or the significance of a defect. There are too many variables involved.
I would compare this to the field of art. Do they have good metrics on what constitutes a good artist?
I'm enjoying our little debate.
You wrote:
I disagree. For the most part we desire software that meets all its functional and non-functional requirements. When you turn on the auto-pilot in a plane, you expect it not to fly the plane into the ground. When you activate WPA-2 wireless encryption, you expect your wireless connection to your router to be encrypted.
That's very different from art.
All things being equal:
We have money. The only reason business people care about software quality is that low quality means loss of money and possibly risk and liability. It's non-trivial to convert defects, technical debt, and new features into money, but it can and should be done. The business people supervising your project are doing that all the time.
So, given two identical applications produced by ACME Software Inc., the better one is the one that has the greatest positive effect on the bottom line of the business over the life of the system.
Software developers go on and on about quality because they think low quality slows them down and makes their projects more costly than they need to be. But, when you zoom out, money is the only thing that matters for most commercial software projects.
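To illustrate what converting defects and technical debt into money might look like, here is a back-of-the-envelope sketch; every figure in it is an assumed placeholder, not real data.

```python
# Back-of-the-envelope sketch: turning quality problems into dollars.
# All figures are assumed placeholders for illustration only.

HOURLY_COST = 120        # assumed loaded cost of one developer hour
DEFECT_FIX_HOURS = 8     # assumed average hours to diagnose, fix, and ship a defect
DEBT_TAX = 0.20          # assumed fraction of feature work lost to technical debt


def annual_quality_cost(defects_per_year, feature_hours_per_year):
    """Estimate the yearly cost of escaped defects plus the drag of technical debt."""
    defect_cost = defects_per_year * DEFECT_FIX_HOURS * HOURLY_COST
    debt_cost = feature_hours_per_year * DEBT_TAX * HOURLY_COST
    return defect_cost + debt_cost


# Example: 150 escaped defects and 6,000 hours of feature work in a year.
print(f"Estimated annual cost of low quality: ${annual_quality_cost(150, 6000):,.0f}")
```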
There is no standard by which functional and non-functional requirements are defined for software. When we say user data should be "secure" we don't have a clear definition of what that means. We also don't have any true way of testing that, other than just waiting for incident reports.
Waiting for issues is also not a good measure, since popularity plays a major role here: some products have fewer users to discover issues, and some services have less interesting data, thus attracting fewer attackers.
I agree on the money argument, it's what I said in my article about doomed software quality. Major breaches, failures, and otherwise seem to have no impact on the money flow at the moment. With money as our only metric there is simply no pressure to have what most people would qualify as good quality software.
Every year somebody programs, they should know more about software than they did the year before. Any year where this is not the case is a year that person has slipped behind in their career path.
It's not necessary to go to school to learn the concepts you need, but it is necessary to find a place and time to learn them. Many people will simply not be in jobs where that experience will come naturally. Many people will not take the initiative to learn more.
That is, whether you go to school or learn on the job doesn't matter. What does matter is that you are constantly engaged in the field of programming. Even if you go to school there is a constant need to update your information, it's a rapidly changing field. The day that somebody stops reading about new methods, new languages, new tools, or otherwise is the day they start failing.
I'm actually considering taking a master's in Software Engineering because I want to understand, really well, how software is made and managed. I don't have a CS degree, but I program in Java and also do JavaScript, HTML, and CSS, plus Android. What do you guys think? Is it possible for me to do the master's? Would any school accept me?
Hi Nonso, sorry for the delay in replying. You would have to look into the specific degree program's entry requirements and see. Usually they are listed on the university's web site. I know that when I applied to the program at Seattle University, it was specifically geared toward people who had already been working a couple of years in the software field. Many people who are already working in software don't have CS degrees, though. They instead have business, math, physics, philosophy, or other degrees. This was acceptable at Seattle University, but the degree requirements then changed slightly for those people to incorporate an object-oriented programming class and an algorithms class in the first year. Those who already had CS degrees were allowed electives instead of these classes. (Note that Seattle University may have changed their entry requirements since then; I don't know.)
OK, thank you. I will definitely check out the school's website.
I would hit the like button, but then I would be encouraging master's degrees and, implicitly, an archaic education system that needs a huge redesign and refactor.