Discussion on: Let Your Code Do the Talking

Josh (sohjsolwin)

Many very good points.

I would suggest timeboxing it to 4 hours or less, providing a boilerplate solution so candidates can spend time actually solving a problem instead of setting up Webpack, and accepting open-source contributions in lieu of an assignment.

Funny you should mention these, because they're all items we either have currently or are discussing integrating into our process. Our current coding tests are typically expected to take 2-4 hours to complete, depending on your skill level. You're welcome to spend more time if you want (there's no time limit to turn it in), but what you can accomplish in about 4 hours is generally enough of a gauge for what we're looking for.

Open-source contributions are a little harder to gauge, since they vary far more widely than the relatively well-defined box a coding test fits into, but they can often be a more accurate portrayal of an applicant's skill set and coding style.

You'd better make sure your requirements are damn clear if you're going to take divergence from requirements as a strike against the candidate. What if you give terrible requirements?

Absolutely true. This is one of the main reasons all of our current coding tests (we have 3 the applicant can select from) include sample input and expected output, along with text explanations of the expected usage and requirements. "Divergence from requirements" here means that an explicit requirement was ignored, or that the provided test cases don't produce the expected output. We'll also run a few extra cases and some edge cases to see how you interpreted the assignment, but the base cases cover the central logic you're expected to implement; if that works properly, everything else is bonus. You can document your assumptions, and you can reach out to our team (not only the recruiter) if you have questions about or misunderstandings of the assignment.

At the end of the day, we're looking for people who think and ask questions, not just folks to come in and blindly hammer away at a keyboard for hours. If a ticket or feature request comes through that doesn't make sense (granted, it would likely be caught and marked invalid long before reaching one of our engineers, but everyone here is human, so it's not impossible for a ticket to slip through the filters), we want that engineer to ask questions, get clarification, verify, and push back on the ticket when necessary. If something doesn't seem right, you're empowered to raise the question.

All in all, your assessment of code tests was entirely valid and eerily spot on (we even have a volunteer EMT on our team).

From your profile, I see that you're currently looking for new opportunities. If that's still the case, then based on your profile and LinkedIn, I think your skill set and experience align pretty well with the tools and technologies we use here at DealerOn. Apparently you even have a recommendation from a former classmate of yours who is a current coworker of mine here at DealerOn.

We have a few full-time remote folks now, and I believe we're open to expanding that for candidates with prior fully-remote experience (I see your location and most of your prior experience are in Georgia, but you worked for a company in Maryland for a time, though there's no indication whether that was remote or on-site). If you're interested, I'd encourage you to apply. I think your skill set, experience, and penchant for asking questions and calling out ways to improve things would fit in well here.

Scott Simontis (ssimontis)

I'm going to go ahead and apply! I came very close to applying right before I left Maryland/DC and decided to go back to Atlanta instead. If I can get through the process quickly, it's definitely a position I'd love to consider.