
Amanda Sopkin

Are Technical Interviews a good measure of software engineering ability?

Technical interviews have come a long way for such a young industry. Ten or so years ago, candidates at companies like Microsoft were famously given brain teasers such as "Why are manholes round?" When these questions were shown to reveal little about the caliber of candidates, the practice was discontinued. Later on, companies peppered interviewees with trivia, including pages of JavaScript questions and random factoids about various languages. For the most part, companies today are focusing more and more on interviews that are platform and language agnostic -- intended to show only the problem-solving ability of a particular person. However, this process still has its flaws.

Most engineers will admit that a traditional technical interview can miss many good candidates. A candidate who is out of practice, having an off day, or (god forbid!) never really understood graph theory might do poorly in an interview or two, and the company that rejects them misses out on a great hire. However, many people believe that losing some of these "good" candidates is OK as long as "bad" engineers aren't accidentally hired along the way.

In the words of Stack Overflow cofounder Joel Spolsky: "if you reject a good candidate, I mean, I guess in some existential sense an injustice has been done, but, hey, if they're so smart, don't worry, they'll get lots of good job offers."

But is that mindset enough to uphold the interview status quo? According to this piece in TechCrunch: "Historically, a false positive has been perceived as the disaster scenario; hiring one bad engineer was viewed as worse than failing to hire two good ones. But good engineers are so scarce these days, that no longer applies." No one wants to hire a bad candidate. But companies pour millions of dollars into recruiting in the process of avoiding one. Let's look at the price tag for one candidate. There are obvious costs, like travel expenses for onsite candidates, but recruiting processes carry plenty of additional costs as well. According to an estimate from Recruiter Box:

  • Posting on job boards can cost between $40 and $500, depending on the method
  • Reviewing applicants takes 10-24 hours, costing companies $500+
  • Prescreening processes take 2-4 hours, which is easily $100-200
  • Interview preparations by recruiters take 1-2 hours, costing $40
  • Onsite interviews typically take 4-5 hours of developer time, $200-400
  • The wrap-up process, including making offers, talking to candidates and checking references can take around 8 hours, ~$200

The grand total for this estimate is $1080-1840, not counting travel costs.

And very rarely will the first person to get through the process accept an offer. According to an analysis by Glassdoor, companies spend an average of ~35 days interviewing 120 candidates for a single engineering position. Of those 120, 23 make it to a screen, 5.8 are brought onsite, and 1.7 are given offers. So you're looking at $40-500 + $500 + $100-200 + $920 ($40*23) + $1160-2320 ($200-400*5.8) + $340 ($200*1.7) = $3060-4780.
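If you want to check or tweak the arithmetic yourself, here is a rough back-of-the-envelope sketch in Python. The per-step costs are the Recruiter Box estimates listed above and the funnel numbers come from the Glassdoor analysis; the structure of the script itself is just illustrative bookkeeping, not anything from either source:

```python
# Rough per-hire cost estimate using the figures quoted above:
# fixed costs are paid once per opening, per-candidate costs are
# multiplied by the Glassdoor funnel (23 screens, 5.8 onsites, 1.7 offers).

fixed_costs = {
    "job_posting": (40, 500),   # posting on job boards
    "review": (500, 500),       # reviewing the applicant pool
    "prescreen": (100, 200),    # prescreening processes
}

per_candidate_costs = {
    "recruiter_prep": ((40, 40), 23),    # 1-2 hours per screened candidate
    "onsite": ((200, 400), 5.8),         # developer time per onsite
    "wrap_up": ((200, 200), 1.7),        # offers, references, follow-up
}

low = sum(lo for lo, hi in fixed_costs.values()) + \
      sum(lo * n for (lo, hi), n in per_candidate_costs.values())
high = sum(hi for lo, hi in fixed_costs.values()) + \
       sum(hi * n for (lo, hi), n in per_candidate_costs.values())

print(f"Estimated cost per hire: ${low:,.0f}-${high:,.0f} (before travel)")
# -> Estimated cost per hire: $3,060-$4,780 (before travel)
```

Plugging in your own funnel numbers makes it easy to see how quickly the per-hire cost climbs once extra onsites or travel are added.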

This estimate is on the conservative side, but it pales in comparison to the cost of making a bad hire. A bad call can cost teams tens of thousands of dollars in mistakes and correction time. So what are companies to do?

Does interviewing even work?

Studies show that the best possible way to evaluate candidates is through a "work sample test" meant to reflect the kind of work the candidate will be doing. This is why technical interviews rose to popularity over conventional "behavioral" interviews. A psychology study by the University of Toledo in 2000 found that judgments made within the first 10 seconds of an interview consistently predicted the overall outcome. By doing whiteboarding or coding interviews, we seek to eliminate that bias by judging performance against a (more) objective problem.

However, according to a study by interviewing.io, a platform for practicing technical interviews, performance in these interviews does not necessarily correlate strongly with job performance.

Technical interviewing performance, from a study by interviewing.io (source)

The site found that performance varied widely, with only about 25% of candidates performing consistently at the same level. Even the "strong" performers did poorly 22% of the time. The more concerning part of these revelations is the inconsistency. Some "good" candidates are bound to slip up now and again, and that's OK as long as "poor" performers are not making it through the process. But if only a quarter of interviewees are consistent, what does that say about the process? It certainly has room for improvement.

Part of the problem with technical interviews is the focus on one particular skill set. Most interview processes take it for granted that the goal is to hire the most qualified person--which most of us take to mean "smartest." The problem is that this isn't necessarily the best predictor of success. A New York Times piece from a few years ago documented a series of studies showing that teams dominated by a few very "intelligent" engineers, by the standard of IQ tests, did consistently worse on all metrics than teams that were A. more collaborative, B. ranked higher on the ability to read others' emotions, and C. had more diversity. None of these factors is screened for in a traditional coding test. Although we ask questions to gauge cultural fit and collaboration, these factors are often an afterthought compared to performance on the technical portion of the interview.

Alternatives to whiteboard interviews

There are other technical methods of assessing candidates, but none are being adopted at scale in the technology industry. Some companies give candidates a technical project to work on, often in lieu of a tech screen. The problem is that it can be difficult to come up with a project that is small enough to be feasible to complete and complex enough to reflect a normal day as a software engineer. Many companies also fear that these tests are plagiarized by aspiring job seekers. In addition, many candidates feel they should be paid for their effort in completing these projects, some of which have even been used by the company post-interview!

Some companies, like Stripe, use a different flavor of the typical technical interview. Stripe allows candidates to use their own laptops and look up syntax on sites like Stack Overflow in an effort to more closely mimic the conditions of the job they are seeking. In addition to technical questions, they incorporate interviews focused on a candidate's ability to explain a technical project they worked on and answer basic engineering design questions.

Amazon has experimented with group interviews, where candidates are allowed to select a technical problem (from a given set) and work on it over the course of several hours, before being given the chance to explain their approach.

At Helpful, they conduct simple one-hour interviews and give offers to all candidates deemed worthy. Then they put those candidates on a probationary period (30-60 days) to see how they do on the job. This approach presents its own challenges, since many engineers may find such a process demeaning, but it is an interesting spin on traditional interview processes.

These companies have written about their new processes and the results they are getting, and it looks promising! If the technology industry values innovation so highly, what better place to practice this principle than within our recruiting systems?

Where do we go from here?

While many of the alternative interviewing strategies mentioned above are promising, there are deeper-rooted problems with technical interviews. Egos run amok as software engineers with great interviewing confidence are harsh on less experienced candidates, who may be great engineers who simply lack practice or are prone to nervousness.

For some reason, the idea that it is necessary to hire "the best of the best" has spread quickly throughout the technology industry. When I was in college, I met with recruiting managers to discuss their hiring strategies as part of my role in a student group. Most of these recruiters echoed this need to find only the absolute best candidates. While this attitude has a place at the executive level, when I was in school I was confident that at least 70% of the engineering students would be able to fulfill the job requirements these companies were hiring for. Even if you require above-average engineering skill, conservatively 35% of these students would be able to perform well at the jobs on offer. Yet they still talked about wanting to hire only the top 1-5%.

Elitist penguins meme (source)

The elitist need to hire the "top 5%" of engineers, and the accompanying belief that those who have made it through the process are part of that "top 5%," do not account for the random elements of interviewing. (This mindset is correlated with the idea that tech workers are at the top of the food chain because of their technical skills. This "skills gap" theory has its own critics.) The problem with looking for the top is that, as shown by interviewing.io, even top performers are bound to mess up 22% of the time! While we would like to believe that interviewing is objective, other studies suggest that elements of this process are out of our control--like our ability to form a connection with the interviewer within the first 10 seconds.

The problem isn’t just the interviewing stage of the pipeline. Many recruiters filter out good candidates early on based on dubious criteria. According to Triplebyte, which specializes in filtering for candidates that would be a good match for Y Combinator startups, the candidates that do well in the hiring process are those that reflect the "backgrounds of the founders." At most of these companies, even non-technical recruiters will reject 50% of applicants by pattern matching against these criteria. And because founders tend to lack diversity to a severe degree (just 3% of VC funding goes to female founders and only 1% goes to Black founders), this is a big roadblock to improving diversity in technology.

Where do we go from here? It’s impossible to arrive at a better solution without iterating on new ideas. Start by experimenting with traditional technical interview formats and run the numbers--how do candidates hired through one process compare to candidates hired through another? Companies like Amazon are known to rely heavily on this kind of data to refine their interviewing process from year to year, and the growing prevalence of Amazon’s group interview format suggests that the new method has proven successful.

Aside from systemic changes, as engineers who interview it is important to try to make candidates comfortable, to be aware of our personal biases, and to let go of the belief that we can only work with the smartest 1% of engineers. These beliefs are limiting and fundamentally wrong. And as we know, there are more than enough software engineering jobs to go around.

Cats typing: plenty of coding to go around (source)

Share what has worked for your company in the comments!

Top comments (21)

Victor Bordo

This is an awesome and insightful breakdown, Amanda. I really enjoyed reading it. My experience has been that candidates tend to perform better in the following interview settings:

  1. Discussing side projects they've worked on in the past at a technical level.
  2. Discussing open-source projects they've contributed to.
  3. Participating in a pair programming session with a senior dev on the team.
  4. Completing a take-home programming assignment (with compensation for their time) followed by a code review with a senior dev.

I recently launched Whiteboardfree, a job board to help devs seek out companies that do not include whiteboarding/riddles/games in their interview process. Very curious to hear your thoughts about this idea and whether it's a useful way to incentivize companies to adopt more mutually beneficial interview practices.

Dan Benge

For me, whiteboard interviews are the equivalent of asking a musician to show you how to play "Piano Man" without a piano.

Alex Miasoiedov

Made my day

edA‑qa mort‑ora‑y

When I do interviews, I get the impression that we aren't looking for great candidates; we're really just trying to root out the bad ones. So many of the people I interview simply aren't ready for a programming position. They lack either the skills, the mindset, or something else.

Against this flood of under-qualified candidates, anybody who looks even acceptable suddenly flourishes in any process we set up. I'm positive I could set up a test that only the top qualified candidates would pass; however, it has these problems:

  • We'd have to deal with passing near 0% of applicants; perhaps just 1-2% would make it to an on-site interview.
  • A candidate worthy of passing would rightfully expect some form of compensation for their time, which wouldn't be an issue, except you couldn't easily exclude the other 98% from this.
  • A rigid process can have the effect of scaring away good candidates who couldn't be bothered to deal with nonsense. It's a communication problem to indicate that you have a strong process, not just needless hurdles.
  • The company behind the process must be willing to pay good compensation to those who make it through it. Part of the reason I suspect some companies don't attract the best is that they just aren't offering a good enough return.
 
Paulo Jesus

Developers sharing their interests is a communication skill. All the people I've met with gold badges on Stack Overflow do it during business hours, while being paid to deliver another type of business value. Also, I've worked in strongly technical environments, and it is not as fun as working in a mixed one where you can cheer on juniors and so forth.

Brian Rinaldi

Great post! Love the various resources you brought together. This is definitely a tricky problem. Just thought I'd share some thoughts that came to me as I read it.

I was fortunate that when I started getting into web development, there was no whiteboard or test. Places were so desperate that showing any real ability was enough. All I brought was an app I'd built in my spare time. I had no formal schooling or formal experience. The dot-com boom days were unique and I was extremely fortunate. Had I been starting out today, my experience would have been very different and would certainly have required much more of me just to break into the industry.

I think you touch on some important points in that there is often too much focus on specific skill sets rather than overall fit. There seems to be too much emphasis on "hit the ground running" and very little effort to train and develop people. When I've had the opportunity to conduct interviews, I was often less interested in what specific language you knew, or in testing how deep your language-specific or computer science knowledge was, than in finding out about your willingness and eagerness to learn. Language, framework, and other specific skills can be taught, but you need someone who wants to learn. Companies need to learn to be patient and develop talent - I believe it pays off in the long run.

I'd also say that I have worked with all types. For example, some people were not the most brilliant coders, but they were incredibly productive and didn't mind doing the mundane jobs that needed to be done, jobs the "top 5%" coder would frown on because they felt beneath them. They played a critical role in the team regardless. My point here is, I also wonder if we focus so intently on the center pieces of the puzzle that we forget the pieces around the edge that make the picture complete.

Robert Newton

The whole whiteboard thing sucks for people, like me, who have insane amounts of anxiety. The whole time you feel like you are doing terribly, and when you don't get that call, or you do get the email saying you have been rejected, it makes you feel like you're terrible. I get to use an IDE or vim with a ton of plugins and Google for my job, so why am I not allotted the same resources in an interview?

Paulo Jesus

I would still prefer to hire someone with communication and social skills over someone who is a "ninja" and only knows how to output code. We need humans, not machines.

Alex Miasoiedov

Whiteboarding is a less-than-ideal and very narrow test of engineering culture, and it is very prone to interviewer-interviewee personality mismatch. It also fails to cover all the dimensions of an autonomous and productive software engineer that are worth measuring.

Gabriel Guzman

Yeah, this hits home. I interview a lot of technical candidates and it's pretty hard to determine who is going to be a good contributor based on technical interviews. The other problem, which technical interviews don't address, is getting someone who's good at churning out code, but steps on everyone else in the process...I've definitely hired a few of those in the past, and it is a huge detriment to the team they end up on. Thanks for this great article, it helps. I still don't know what the right solution is (currently doing a take-home mini project with a team interview that's more focused on fit and some technical questions) but I'm pretty sure it doesn't involve a whiteboard.

Judith

The interview process for technical roles is broken on every level. Consider that if your interview is a test, and not a replication of a real-world situation, then the candidates who will do well are students (or recent grads). Why? Because they have been taking tests for the last 4 years.

If you assign a project as the employment test, candidates will submit projects produced in their own coding environment. You won't be judging them on the right parameters, which would be the stack the company is using, not their stack of choice.

I appreciate all the research that went into your post, but did you know that highly intelligent people generally do not test well? Most tend to overthink information and situations and develop anxiety when the outcome depends on quantifying performance against standardized answers (rather than giving answers that reflect what they think, or would do, to solve the problem).

The truth is that this kind of testing was born out of the fact that most recruiters/employers don't understand programming/engineering. These ridiculous tests are a crutch for the interviewer at best. At worst, the testing process has become an industry: a way of monetizing recruitment ("Cracking the Coding Interview", LeetCode, etc.).

You want to know if an engineer can do the job? Check the work they have done for the companies they have worked for. Ask their references about them. Have her/him meet the team and hang out. See what kind of rapport they develop with everyone on the team. This interviewing process is called: THE OLD FASHIONED WAY. And one more important step: take it offline. Make it face to face. There is NO online way to get a gut feeling about a candidate.

Ben Halpern

Super nice report Amanda. 👌

rhymes

Thank you Amanda, this is so important. If companies get it right, or at least better than the status quo, it could reshape the environment. More emotional intelligence and diversity and less rock stars :-D

ps. manholes are often square here so I would have definitely failed that question :D

 
sanidz

You are both right: first, you mustn't discriminate based on a person's lack of social skills, but you also shouldn't ignore a person's inability to emotionally understand others and work together as part of a team!

John Daniel

Those two tweets being referenced were from 3 years ago. Nothing has changed much in the past 3, or even 23 years. Every company claims to only hire "A Players". Yet we all have ample evidence to the contrary.

I think you hit the nail on the head with the notion of "random elements". Those claims of good engineers being scarce are nonsense. That is just HR legal cover to allow companies to discriminate on every basis except ability. There is no magic secret to finding good people. Advertise according to the work being done, not a "purple squirrel" set of requirements that no one could ever meet. Identify candidates who have the education and/or experience to do the job. Hire at random to achieve a diverse workforce.

Sergei Munovarov

Great article!

But I still think that recruiting processes adopted by big companies are not necessarily ideal for smaller ones. I mean, what if a startup with a team of 5 decides to hire a new member? I'm not sure whether there will be enough resources to do a group interview.

Or should you ask candidates about advanced computer science topics when you are looking for a person who will spend most of their time parsing and producing JSON?

Basically, companies should be looking for people who can manage to get the job done, and not "A-players", "ninjas", "gurus", whatever.