Is front-end development having an identity crisis?
Vernon Joyce · Sep 9 · Updated on Nov 02, 2018
Courtesy Adobe Stock
Does front-end development as we know it still exist, or has the role evolved into something we no longer recognise? As with evolution in nature, the evolution of "front-end" has resulted in several distinct flavours and, in my opinion, an identity crisis.
What is a front-end developer anyway?
Traditionally speaking, the front-end could be defined as the UI of an application, i.e. what is client-facing. This, however, seems to have shifted in recent years as employers expect you to have more experience, know more languages, deploy to more platforms and often hold a 'relevant computer science or engineering degree'.
This places an unfair amount of pressure on developers. They often quit, or feel that there is no value in knowing only CSS and HTML. Yes, technology has evolved, and perhaps knowing CSS and HTML is no longer enough; but we have to stop and ask ourselves what it really means to be a front-end developer.
Having started out as a designer, I often feel that my technical knowledge just isn't sufficient. 'It secures HTTP requests and responses' wasn't deemed a sufficient answer when I was asked what an SSL certificate was in a technical interview for a front-end role. Don't get me wrong, these topics are important, but are such deeply technical details relevant to the role?
I will occasionally refer to front-end development as FED from here on.
This identity crisis is perpetuated by all parties: organisations, recruiters and developers. The role has become ambiguous with various levels of responsibility, fluctuating pay scales and the lack of a standardised job specification within the industry.
Looking at the job market, you might find that organisations expect employees to be unicorns and fill multiple shoes. Recruiters may also have unrealistic expectations of the role, often working from a specification supplied by a Human Resources department with little understanding of what they are hiring for. Lastly, we developers compound the problem ourselves: we accept technical interviews as they are and, should we get the job, place ourselves under unnecessary pressure to learn the missing skills, instead of challenging recruiters and organisations on what it actually means to be a front-end developer.
Compare the two job posts below from LinkedIn, both titled 'Front-end Developer'. The roles are vastly different: one expects the developer to know Flux architecture and unit testing, while the other expects Java and MongoDB.
Comparing two roles on LinkedIn, both labelled "Front-end developer"
Both of these roles clearly lack a definitive scope.
Why it is important to standardise the role
- Evening out the pay scale: front-end engineers won't be paid what FEDs should be paid, and vice versa.
- Alleviates pressure, allowing developers to focus either on engineering products or on creating rich, interactive web experiences
- Makes job hunting less stressful when it comes to technical interviews and job specifications
Separation of concerns
In order to define the role, we have to strip out all the roles that could be considered above and beyond the scope of a FED. The web developer role, for example, should not be confused with the FED role, as the former builds applications while the latter builds experiences. Other examples include front-end designer, web engineer, back-end web developer, etc.
To distinguish these roles, we could look at four criteria:
The developer's canvas
The lines between back-end and front-end became blurry somewhere between jQuery and Node, and ever since, front-end developers have often been expected to know Node and accompanying packages like Express. These are clearly back-end technologies, so why are we adding them to a FED's job specification?
Before we can standardise the role, we have to agree on what the front-end developer's canvas is. In my view, it is confined to the UI of an application and primarily runs in a browser: the role should not be concerned with building any server-side functionality.
The chosen language
A second criterion to consider might be a developer's chosen programming language. It is possible to build website infrastructure in languages like Python and C#, which raises the same question as before: could Python, PHP or C# be considered a front-end language?
One of the examples below asks for PHP as a required skill, while the other expects the developer to know TypeScript.
Comparing the skills required for two roles on LinkedIn, both labelled "Front-end developer"
Which frameworks or libraries, then, should be part of the role's scope if we are excluding PHP, C#, Java, etc.? jQuery, for example, is a perfectly good tool for building interactivity on the web, yet most front-end developers might argue that it's better to learn Vue.
When does a front-end developer transition into a full-stack developer or a web developer?
Distinguishing these becomes much easier when considering the canvas as well as the chosen language. A full-stack developer is one who understands both front-end and back-end (i.e. works with more than one canvas). A web developer is one who can work across multiple frameworks, libraries and languages to build rich, data-driven applications. Most FEDs will then likely move from an intermediate FED role into a senior full-stack or engineering role.
Baggage
The last factor to consider is the additional requirements that come with front-end roles. I consider these 'baggage', mostly because they often get thrown into the mix in an ad-hoc fashion.
A good example of this is MongoDB (which was a requirement in the listing mentioned earlier). Previously, database administration or architecture was a role in itself, so why are we now expecting FEDs to have this skill set on top of everything else?
Another example from an earlier screenshot is the listed graphic design requirement. Personally, I am a big advocate of developers understanding design, but expecting them to have it as a skill on top of their other FED skills changes the role into something else (perhaps a front-end designer or full-stack designer).
When considering the added responsibility that comes with all this knowledge, we have to ask ourselves whether adding these skills into the mix only complicates the landscape. If I decided today to bring React into my organisation, the developer chosen to replace me would have to know React as well. If the new developer then decides to add Redux to the mix... well, you understand where this is going. To make matters worse, the organisation will keep on hiring 'front-end developers' regardless of the technology used, because that is the role required by the department.
So with great power does come great responsibility, and it is ultimately up to us as developers to use technology responsibly. Consider the operational impact of a technology stack change, and understand that you might be perpetuating an existing problem.
Defining the role
Now that we've unpacked what it means to be a front-end developer, we could write the following job description:
Let's keep things simple: a FED should not need to understand functional programming or how SSL works at a micro-technical level. This is not to say that they shouldn't learn these concepts, but at the very least it shouldn't be an expectation.
I feel it is important that we collectively address the confusion surrounding these roles in the development community by helping the next generation of front-end developers understand what it means to be a FED.
This article is purely based on my personal experiences and biased opinions --- I would love to hear your views down in the comments.
Vernon Joyce, a full-stack unicorn.