In 2013, I became demoralized in my career. I had left a fulfilling role as a web applications and services developer in the non-profit sector beca...
Instead of encouraging a graduate-level Software Engineering program, I would rather look at how we can promote learning the core topics of software engineering - requirements, design, testing, project management, process and methods, ethics, economics - in undergraduate education. As someone who has interviewed prospective interns and recent graduates for entry-level positions, I can say that there shouldn't be so many people completing undergraduate programs and entering industry with exposure to only a small fraction of the things they will end up encountering.
Another problem is the number of people from non-traditional educational backgrounds - graduate certificates, bootcamps, and self-taught. I've worked with people who either don't have a college education or have an education (in some cases, a graduate degree) in an area other than computing. I would consider some of these people to be very good software engineers, but they still needed to learn a lot on the job. I'm unsure how to make these topics accessible to entirely self-taught developers. Perhaps it's simply a risk a company accepts when it employs someone from a non-traditional background: that person will be missing some topics taught in the classroom.
Those topics should be promoted - if you are getting a bachelor's in software engineering. If you are getting a bachelor's in computer science, then you should be focused on CS. A university is not a trade school and the whole point of it is not to get students ready for careers.
You're absolutely right that a university isn't a trade school. There's also value in computer science undergraduate education, for some people.
People who want a university education and to go into industry afterwards aren't typically served by a computer science degree, though. Completion of the program doesn't leave these people with the knowledge and skills needed to work in teams to build large and complex software systems.
There are far more undergrad CS programs than SE programs. Many students don't realize the difference and industry is worse off for it, since experienced engineers are now teaching university graduates things that they should have learned in the classroom.
I 100% agree with you. Part of the problem is trying to figure out a degree before you turn 18. It often does surprise me that more schools don't have SE degrees. I went to an engineering school for undergrad (did not do SE or CS) and my university didn't even have a SE degree, BS or MS.
Edit: Though part of me also thinks that maybe we shouldn't be expecting new grads to be ready for the job, in any field. Apprenticeships and formal mentoring should probably be bigger across the board. If you get a CS degree, you have a lot of knowledge that is very useful... but you still need more and, really, most of that is best learned on the job.
100% on apprenticeships and formal mentoring, and I'd add internships and co-ops in university education.
Not every software engineer will have a university degree. But even those who have taught themselves or gone to a boot camp can take advantage of apprenticeships and mentoring. In fact, I think that companies that offer students 3-to-6-month internships should be more open to participation by people outside of degree programs.
Part of the problem is that industry, and the economy at large, doesn't really reward quality software development. I covered this in my article, "Are we forever cursed with buggy software?"
So long as ad-hoc solutions are capable of getting funding, being used by millions, and generating initial revenue, there is no incentive to build quality solutions. There is simply no competitive advantage to achieving more than the bare minimum for a product.
Granted, many companies will fail because they don't have a talent pool that achieves even that low threshold.
I wouldn't be surprised if the tolerance of the general public begins to drop, especially given some of the recent incidents. The sheer volume of data that these companies and applications have access to is staggering and can be rather personal. If data is leaked from Facebook, Amazon, or Google that involves geolocation data, purchase history, or personal files, I can see that being a huge incentive for companies to focus on quality.
But that requires an informed and engaged public...so...maybe not.
I couldn't agree more. A computer science degree alone is no guarantee that you can develop software effectively or efficiently.
Formal education in our profession has a terrible track record when it comes to producing graduates with the skills required to build software with desirable qualities.
With that said, there are definitely exceptions. But even then, it takes years of on-the-job training to turn a green developer into an experienced, productive developer with good judgment.
While I applaud Scottshipp's drive to get his master's, I pray that we can improve our skills and knowledge as a profession without going that far.
I'm a voracious reader of anything related to programming and software development and I can tell you that there is good stuff out there. It's just hard to find among all the tool-pushing, language-advocating, methodology-aggrandizing, opinion-filled internet.
There are a couple of problems as I see it:
Congratulations on your master's.
Part of the issue with research is that we don't know what we should be measuring. Metrics for quality and productivity are elusive in the software world. Test setup is also an issue, as all software projects tend to be unique in some way. There are a few domains with lots of internal similarities, like full-stack business pages, but these touch on only a fraction of programming.
I think this is closely related to your point about demonstrating skills, but I'm focusing more on the achievement of the team instead.
Is the challenge of finding acceptable quality and productivity metrics really that big of an issue for CS researchers?
When I think of all the crazy-hard questions scientists study (in all kinds of fields) and all the years they spend on one little area, I don't think CS research is particularly challenging or unique.
I just think that too few people and businesses really care about CS research.
I read about a hospital adopting a decision tree to triage patients with chest pain (seilevel.com/requirements/visual-m...). It saves lives, saves money, and increases quality of care. But the point I want to make is that someone took the time to study the subject and come up with a decision tree based on the evidence. It's not perfect but it's better than sending everyone with chest pain to the cardiac unit.
Why can't we have this kind of thing for TDD? It doesn't have to be 100% accurate, it just has to be better than what we have now, which is basically no scientifically validated guidance.
Imagine if we had a decision tree for TDD with 3-5 questions and it would spit out the probability that your project would benefit from TDD.
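Just to make the idea concrete, here's a toy sketch of what such a tool might look like. Every question and probability below is invented for illustration; supplying real, evidence-based values is exactly the research we're missing:

```python
# Toy sketch of a TDD decision tree. All questions and probabilities
# are made-up placeholders, not empirically derived values.

def ask(question: str) -> bool:
    """Ask a yes/no question on the console."""
    return input(question + " [y/n] ").strip().lower().startswith("y")

def tdd_benefit_probability() -> float:
    """Walk a small decision tree and return a rough P(TDD helps)."""
    if ask("Is this a throwaway prototype or spike?"):
        return 0.15  # tests rarely outlive the code they cover
    if not ask("Will the codebase live more than two years?"):
        return 0.35
    if ask("Is the domain logic complex or regression-prone?"):
        return 0.85
    if ask("Will several developers modify the same modules?"):
        return 0.70
    return 0.50

if __name__ == "__main__":
    print(f"Estimated chance TDD pays off: {tdd_benefit_probability():.0%}")
```

Even a crude tool like this, backed by actual studies instead of invented numbers, would beat the folklore we rely on today.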
Our profession really is in the dark ages.
Medicine is surprisingly predictable and routine compared to software development. You can classify health issues, and outcomes, in a relatively straightforward manner. Hospitals across a region, a country, or a continent all generally deal with the same types of issues (with natural regional variance), as there are always humans in the equation with a similar culture.
In software we just don't know what indicates something is good or bad. Is it user satisfaction? Money made by the company? Data breaches, or security incidents? What exactly is the quality that we are measuring?
We don't have reliable ways of measuring the complexity of a feature or the significance of a defect. There are too many variables involved.
I would compare this to the field of art. Do they have good metrics on what constitutes a good artist?
I'm enjoying our little debate.
You wrote: "I would compare this to the field of art. Do they have good metrics on what constitutes a good artist?"
I disagree. For the most part we desire software that meets all its functional and non-functional requirements. When you turn on the auto-pilot in a plane, you expect it not to fly the plane into the ground. When you activate WPA-2 wireless encryption, you expect your wireless connection to your router to be encrypted.
That's very different from art.
All things being equal:
We have money. The only reason business people care about software quality is that low quality means loss of money, and possibly risk and liability. It's non-trivial to convert defects, technical debt, and new features into money, but it can and should be done. The business people supervising your project are doing it all the time.
So, given two identical applications produced by ACME Software Inc., the better one is the one that has the greatest positive effect on the bottom line of the business over the life of the system.
Software developers go on and on about quality because they think low quality slows them down and makes their projects more costly than they need to be. But, when you zoom out, money is the only thing that matters for most commercial software projects.
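To show the kind of defects-to-money conversion I mean, here's a back-of-the-envelope sketch. Every figure in it is an invented assumption; a real estimate would come from your own incident data and your finance people:

```python
# Back-of-the-envelope cost of low quality. All figures below are
# invented assumptions for illustration only.

escaped_defects_per_year = 40      # defects that reach production
hours_to_fix_in_prod = 12          # vs. roughly 1 hour if caught early
loaded_hourly_cost = 120           # fully loaded developer cost, USD
churn_cost_per_defect = 500        # estimated lost revenue per incident

extra_fix_cost = escaped_defects_per_year * (hours_to_fix_in_prod - 1) * loaded_hourly_cost
churn_cost = escaped_defects_per_year * churn_cost_per_defect

print(f"Extra fix cost:  ${extra_fix_cost:,}")               # $52,800
print(f"Churn cost:      ${churn_cost:,}")                   # $20,000
print(f"Total per year:  ${extra_fix_cost + churn_cost:,}")  # $72,800
```

If an investment in quality (reviews, tests, tooling) costs less than that total and meaningfully cuts the escape rate, it pays for itself in money, which is the only unit the business tracks.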
There is no standard by which functional and non-functional requirements are defined for software. When we say user data should be "secure", we don't have a clear definition of what that means. We also don't have any true way of testing it, other than waiting for incident reports.
Waiting for issues is also not a good measure, since popularity plays a major role here: some products have fewer users to discover issues, and some services have less interesting data, thus attracting fewer attackers.
I agree on the money argument; it's what I said in my article about doomed software quality. Major breaches, failures, and the like seem to have no impact on the money flow at the moment. With money as our only metric, there is simply no pressure to produce what most people would call good-quality software.
Every year somebody programs, they should know more about software than the year before. Any year where this is not the case is a year that person has slipped behind in their career path.
It's not necessary to go to school to learn the concepts you need, but it is necessary to find a place and time to learn them. Many people will simply not be in jobs where that experience will come naturally. Many people will not take the initiative to learn more.
That is, whether you go to school or learn on the job doesn't matter. What does matter is that you are constantly engaged in the field of programming. Even if you go to school there is a constant need to update your information, it's a rapidly changing field. The day that somebody stops reading about new methods, new languages, new tools, or otherwise is the day they start failing.
I'm actually considering taking a master's in Software Engineering because I want to understand, really well, how software is made and managed. I don't have a CS degree, but I program in Java and also do JavaScript, HTML, and CSS, Android included. What do you guys think? Is it possible to do the master's? Would any school accept me?
Hi Nonso, sorry for the delay in replying. You would have to look into the specific degree program's entry requirements and see; usually they are listed on the university's web site. I know that when I applied to the program at Seattle University, it was specifically geared toward people who had already been working a couple of years in the software field. Many people who are already working in software don't have CS degrees, though. They instead have business, math, physics, philosophy, or other degrees. This was acceptable at Seattle University, but the degree requirements then changed slightly for those people to incorporate an object-oriented programming class and an algorithms class in the first year. Those who already had CS degrees were allowed electives instead of these classes. (Note that Seattle University may have changed their entry requirements since then; I don't know.)
OK, thank you. I'll definitely check out the schools' websites.
I would hit the like button, but then I would be encouraging master's degrees and, implicitly, an archaic education system that needs a huge redesign and refactor.