Peter Harrison

Is a Four Year Degree The Way?

My path toward a career in software development began at age 11 when I got my first computer. At the time, working as a computer programmer was a little like being a nuclear physicist: such people existed, but they were rare. By the time I reached higher education there were technical colleges offering information systems certificates.

During my high school years I developed my programming skills by writing messaging software for the school's network of BBC computers. Later I developed bulletin board software. A friend of mine wired up my 300 baud modem so that it could answer calls automatically. This was all before I attended a technical college.

In my view the brains of teenagers are very plastic. The ideal time to start teaching programming is early, at about ten. Make the tools available and don't patronize kids with simplistic tools that assume they are unable to understand 'adult' languages. At eleven years old I was perfectly capable of understanding languages such as BASIC, C and even assembler.

Academically we have put software development in a box, treating it the same as other professions. But software development is a broad skill that is used in many fields. It is a force multiplier. Thus I think we need to give all kids access to learning materials and a path to becoming professional developers. I see this occurring in three phases.

The first phase, from about ten until leaving school, should involve learning how to program. Schools should support a voluntary curriculum to give students the opportunity to learn. Ideally we should also have software clubs, similar to the code clubs that already exist outside schools.

The second phase, for those wishing to become professionals, would be a technical institute where students are introduced to aspects of software development they would not be familiar with from their school days. The main difference here is learning how to work in a team towards a single project. It would involve training in the disciplines of software development; for me this would be Agile methodology and Scrum. This phase would last no more than a year.

The third and final phase would be a paid apprenticeship of one or two years, where the new developer would work on real projects under the mentorship of experienced senior developers. At the end of this you will have a competent intermediate developer who can be trusted to cut code and work within a team.

In my view the four year University degree does not in fact prepare you to be a professional software developer. At best it gives you a ticket into your first job, where you will begin learning the ropes. It will take a couple of further years to become an intermediate developer.

The University track will therefore take between five and six years, while a technical college track would take two or three years to reach intermediate level. Also, you will be earning after only a year on the technical college track, whereas with University you will only start earning after four years. Then there are the significant savings on educational costs.

There is of course more to University than getting qualified, so if you can afford a broader and deeper course in computing then this might be of benefit. But if your aim is to make a living and enter the workforce as quickly as possible it is a questionable use of time.

As an employer, formal qualifications now mean very little to me. They might get you through the door, but so too will practical examples and demonstrations of genuine competence. Am I the typical employer? Perhaps not. Large corporations may be less flexible with their job requirements. But smaller outfits tend to be more energetic and care more about competence than qualifications. Just ask Bill Gates.

So what is your view? Am I totally off base? If you are an employer what is your view on four year degrees?

Top comments (3)

Frank Carr

Speaking from a US perspective, there are a few reasons to have a degree.

  1. H1B immigration status. This is a double edged sword. If you're a foreigner planning on working in the US under this program, you will need a degree in CS or a related discipline to even get your foot in the door. If you're a US citizen, not having a degree makes it easy and legal for a company to pass you over, or even replace you with a lower cost H1B worker/contractor who does have one.

  2. HR Requirements. A lot of companies, especially outside of the tech realm, still require a 4 year degree. It doesn't always have to be a technical degree though. This is usually more of an HR requirement than a hiring manager requirement and can be waived if the hiring manager is persistent. But you have to get to the hiring manager, and HR reps will often block the way.

  3. Connections. This is more of an elite college thing. A shared or very similar technical educational background helps get your foot in the door at many of the big tech firms as well as for engineering jobs. This is the main reason tech has become very monocultured in the US in recent years. Earlier, when more people were self-taught, especially on the PC side of programming, things were more diverse.

Mike Oram

My views may be a little biased, but I totally agree. The cost benefit analysis of university just isn't worth it, for employers or employees. You mention bootcamps in the comments as a good fast track, but you're right that they alone cannot produce a well rounded developer. Ultimately, though, I think that is the responsibility of the industry. If a bootcamp can provide coding skills, teamwork experience and real software methodologies such as agile, and the graduates can then go on to work for a supportive company with mentoring senior devs, this is surely the best solution for everyone.

This is precisely the model I have created at Mayden Academy in England. We train small teams of 8 students in software development, teaching them coding, agile and teamwork through project-based learning. They then go on to work for one of our industry partners as junior developers, where they receive continued support and training from us and their dev team. We are very selective about the companies we work with to ensure our students have the right environment to flourish as developers. It also means we have a 100% success rate.

Peter Harrison

Society will need to review how we do education. The cost of a University education can be a significant barrier, especially in the US, but even in New Zealand. Student loans are terrible, encumbering students with debt early in life, before their earning potential has developed.

In the past University has not only been a qualification system for certain professions, but has also provided a more general education in topics like philosophy. Interestingly, the Internet has made this kind of material more available and less elitist.

Vocational organisations like coding boot camps have sprung up to cater for those who want a fast track, much like I suggested. However, the promises they make are unrealistic. They may arm you with the foundations, but you need a year or two to mature and gain self confidence. This is true regardless of whether you come from University or a boot camp.

I don't want to beat up on Universities; they have their place after all, and we need them for basic science and technology development. But the value equation will drive people away, towards more efficient options.

My son was sat down in front of a computer at six months - but only for a photo op :-) However, by three he was able to use a mouse and play basic games. Now... well, let's just say the apple doesn't fall far from the tree.