
The Engineering Interview is Broken

Gearloose Jones on June 28, 2017

Yes. Another one of these posts. I've lost count of how many have crossed this site, and I'm sure you have too! Meanwhile, I've counted several s...
Carter Wickstrom

I think the problem is more with the arbitrary nature of interviewing and reviewing than with the review itself (whiteboard and live code-pairing exercises on random problems introduce additional problems of their own).

We use the take-home code project approach, with a standardized scoring sheet and review process. If the code project passes, we then follow a standardized interview process centered on the project itself: can you defend your code, can you add a small feature to it, can you refactor part of it? We allow the candidate to use Google/StackOverflow/whatever during these code pairings, or to 'think out loud' on a whiteboard or notepad.

We follow this exact process every time. We also provide mentoring and training to staff on what we look for in both the code reviews and the interview. It isn't perfect, but most candidates who've given us reviews on Glassdoor have liked the process, and most of the candidates that we've hired have been great additions to our team.

Kim Arnett 

I like that you allow the candidate to use external resources to find a solution - that's definitely a real-world example.

I would advise caution around take-home coding. You may be excluding people and not even know it.
As a mom, and for a while a single mom, this was not an option for me. I couldn't set aside an extra 4 hours in my day to sit down and fully concentrate on a coding project outside of my regular (baby-sitter-available) hours. Just a friendly heads up.

Carter Wickstrom

Yeah, it's definitely a weakness that we're aware of.

Ben Halpern

Hmmmm, seems like a situation where it might be possible to offer two different styles of interview, both intended to evaluate the same basic skills, and give the candidate the choice.

We have a light take-home portion to our interview. It's a questionnaire that can be finished pretty quickly (30 mins). I wonder where the line is at which this becomes a burden for some candidates.

Carter Wickstrom

I'll have to see if we have metrics on candidates who refuse outright to do the code project (for whatever reason). I know that a fair number agree to do it, and then never submit it, which can be interpreted any number of ways.

Gearloose Jones

Guilty as charged on never submitting, on some rare occasions. In my case, it was always one of the following:

1) An offer from the HR rep/dev interviewing you to answer questions isn't honored.
2) Time conflicts with other take-home assessments. Both parties should keep in mind that a candidate may have several on their plate and one may have to be cut.
3) Related to #1: incomplete or conflicting requirements, where questions about them are either answered vaguely or not at all.
4) Sometimes things come up in life and you can't just dedicate the time when you otherwise thought you could.

In all cases where I have to stop, I strive to send an email. I'm only human, though, and sometimes that doesn't happen; when I realize it has, I apologize profusely because, well, good manners. :)

I think there's a definite blind spot here, and I'm on board with giving people options for how to demonstrate their skills. Take-home or live coding? Code samples?

I realize there's, well, let's call it "sensitivity" to making sure a candidate isn't misrepresenting themselves, and code samples are a terrific way for people like that to cheat. But a good way to filter those people out is to have them simply explain their code. In detail. Ask how they'd improve it. Ask them to potentially refactor it in ES6 if it's written a little more classically (a quick sketch of what I mean is below).
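To make that last bit concrete, here's a rough, made-up sketch (the Stack example is hypothetical, not from anyone's actual submission): the "classical" constructor-and-prototype version first, then the same thing refactored with ES6 class syntax.

```javascript
// "Classical" ES5-style JavaScript: constructor function plus prototype methods.
function Stack(items) {
  this.items = items || [];
}
Stack.prototype.push = function (item) {
  this.items.push(item);
  return this;
};
Stack.prototype.peek = function () {
  return this.items[this.items.length - 1];
};

// The same structure refactored to ES6: class syntax and a default parameter.
class ModernStack {
  constructor(items = []) {
    this.items = items;
  }
  push(item) {
    this.items.push(item);
    return this;
  }
  peek() {
    return this.items[this.items.length - 1];
  }
}
```

If the candidate can walk through a change like that and explain the trade-offs, you learn a lot more than you would from eyeballing the sample alone.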

There are options, and I think it's only fair to present a few so a candidate can pick the one that really lets them show off. :)

Carter Wickstrom

"[H]ave them simply explain their code. In detail. Ask how they'd improve it? Ask them to [...] re-factor[.]"

You've just described our interview process. :-)

We've definitely had candidates ask if they could submit an existing code sample instead of taking the test. In the handful of instances where we've agreed, those candidates have all crashed and burned. Hard. It's always a pet project that they've been working on for a while, it's never as good as the candidate thinks it is (even when it's good!), and they have a hard time looking critically at their baby.

Ben Halpern

Very similar to us. Key to our process behind the scenes is that everyone fills out evaluation forms independently and we're not allowed to talk to one another about anything until everyone has submitted. We then discuss things based on the various feedback surveys (which are done at each possible stage, so a few times per candidate).

This, we think, provides maximum unadulterated wisdom of the crowds, less group-think, and more containerization of possible biases.

Gearloose Jones

Not being able to look critically at your own work is a huge red flag, just because that's a person who's going to get super defensive in even the most casual code review.

I used to be that person, but I learned to accept that every baby, even from the most seasoned programmers on the team, has warts. And it's even OK to admit that one or two projects in any given gig are Rosemary's Baby from head to toe. It happens. The test is simply to see if someone can admit it to a peer.

Kim Arnett 

Yaaas.

I once had a personality test & coding challenge I had to submit together before I went on to the next round. They passed. Was it my personality? Or was it my coding? We may never know.

Dev interviews are sooo broken. I hope the industry pivots to a healthier interview process for both candidates and managers.

My favorite code-interview was one where I built a sample app, on my own machine, on the projector, in front of a group of people.

  • I was familiar with my dev environment.
  • I was allowed to use whatever tools I needed to get the job done. Hello, real-world example!
  • I was also allowed to ask people in the room questions.
  • Managers could see where I needed assistance and where I didn't.

It also helped that I got the job - but honestly I felt very comfortable doing this. They told me what I would be building before I got there, so there were no surprises. The interviews leading up to the in-person interview were similar to what you talked about... just a general discussion of my personality and questions around technology. No whiteboards, no homework, no GitHub required.

Gearloose Jones

I really dig the idea of BYOE (bring your own environment). My only concern would be whether there's some kind of projector set up so people aren't huddling around you for an insane amount of time. :) But details like that are definitely something that can be discussed prior to the on-site.

Shane Milton ☁️

I like the idea of the interviewee providing their own environment. I may use that in the future (with a me-provided environment ready as a backup). Thanks!

Kim Arnett 

For sure - just be sure to tell them ahead of time so they can be prepared. :)
It's the best case, really. I have keyboard shortcuts I'm used to that aren't set up on someone else's machine, so there it looks like I'm stumbling when I'm actually not. :(

Shane Milton ☁️

Yeah, I think quite a bit of cooperative planning would have to go into it (which probably has "interview value" in itself) and would have to be custom-tailored on a per-candidate basis, which is fine for me. I could see this being prohibitively difficult for some orgs but this would work well for us. :)

Shane Milton ☁️

When I interview candidates, I definitely use live coding tests as well as whiteboard tests. However, I probably look for different things than most.

With the live coding test, I put them in front of some tools that they've told me they're very good with. I give them a trivial coding test from the perspective of a business user with vague/conflicting requirements. I don't really care about the code they write (face it, it's probably crap, as would be anything I'd write right in front of them, rushed and under that kind of pressure). What I want to take away from it is an answer to these questions:
1) Are they really able to use the tool as they said or do they stumble around in it because they've only kinda used it? (This gives me an indication of whether they're misrepresenting themselves or not, as well as how much I can trust their self-assessment of skills.)
2) When I give them intentionally-conflicting requirements, do they push back and demand more info where it's necessary or do they fall into obvious pitfalls that a simple question could avoid? (There is a right and wrong here for some teams, but other teams can easily manage either of these people with the proper support system.)
3) Are they able to fill in the blanks with common-sense assumptions when fleshing out what business requirements mean? (There isn't a "right" or "wrong" result here; it's more for gauging their ability to quickly analyze things with critical thinking while under pressure. Some people are great at this and some aren't, even though they're great developers. It simply depends on what the role is for and whether they'll be somebody dealing with a structured process or putting out live fires.)

For the whiteboard portion, I usually give them a problem that I don't necessarily expect them to whip up a solution to on their own. That's fine; it's also fine if it becomes a collaborative process (with me). What I'm looking for are answers to these questions:
1) Can they communicate well enough to explain their thought process? I don't need a whiz who can whip out an answer to every technical problem. I need a teammate who can work through a problem and get to a solution.
2) Can they communicate well enough not just to explain but also to educate? I often ask them to elaborate on "why," not to challenge them but to educate me. I ask things like, "Pretend I'm a new developer learning ABC. Can you explain to me why XYZ?" For some positions, I like to know if the person may be able to mentor others on the team, and this is where I can get a good indication of that.
3) Alternatively, are they somebody who would prefer to work through the problem themselves without leaning on others? This is okay, too; I definitely want to know it before hiring, though. For some positions this is preferable, and for others the collaboration is. Both have their strengths and weaknesses (even on the same team)!

The point is, I'm not looking for "the right code" or "the right solution" when I give these tests. The test isn't the end result. The test is the journey. And that journey tells you more than you could ever deduce from a code analysis of a fabricated problem solved with insufficient input.

BTW: Google, StackOverflow, and even texting friends are allowed during these parts of the interview. I actually encourage candidates to go ahead and look things up or grab snippets to use if they're struggling with anything they think they could find online. There's value in seeing how they can use resources to solve already-solved problems rather than hammering through everything.

Jason C. McDonald

At my company, we give applicants a week to complete a coding challenge before the final interview. All the interviewers (2-3 at our company) review and score these before the final interview. Yet, as long as the applicant submits code that (a) accomplishes the stated goal and (b) isn't obviously copy/pasted from somewhere else, we will proceed with the final interview.

However, we'll ask for explanations about that code during the final interview. Generally this involves us asking questions about design choices we noted in the review.

We also often have them fix a bug in it (there's always one or two) in front of us. This allows us to see them working on their code, instead of something entirely unfamiliar.

Andy Zhao (he/him)

Thanks for the article! Definitely a good read :)

I came across an awesome blog by Aline Lerner about interviews, and I think it's totally relevant. I love her data-driven approach to figuring out interviews.

They're long blog posts, but worth every word. The one that hooked me in is this one: Technical interview performance is kind of arbitrary. Here's the data.