
Code review checklist

Udayakumar Rayala ・ Originally published at blog.mavenhive.in ・ 4 min read


In my last post, I wrote about how we do code reviews and compared them with pair programming.

One thing I have noticed is that every team, and every team member, has a different list of things to look for while doing code review. In this post, I will list the things I look for. This is by no means a complete checklist. If there are any checks you think are missing, let us know by adding them in the comments.

Design

Depending on what programming language or platform you use, check whether the new code follows the right design.

What counts as the right design is for the team to agree upon. It is always best to have upfront discussions among developers when new abstractions are introduced, so that the new design is followed from the start. Code reviews can then verify it is implemented as discussed, which avoids a lot of rework. The exception is when you want to implement something new and get feedback on it; in that case, write just enough code to convey the idea.

Following are some of the things you should check for (Thanks to Bhavin Javia for contributing to most of this list):

  • Verify that the coding conventions in your team's designated style guide are being followed, to maintain consistency and best practices.
  • Check for duplication inadvertently introduced by developers, e.g. due to lack of awareness of existing reusable components, misplaced logic or responsibilities, or generic code that was copy-pasted elsewhere instead of being extracted into something reusable.
  • Identify refactoring opportunities that were missed or deferred during earlier tech-debt work and code reviews. A developer who is already working in an area has a lot of context, which makes it easier to include some refactoring tasks in the story/PR than to find a dedicated block of time for refactoring later.
  • Spot potential dependencies or conflicts with other developers' code and warn about them, e.g. when multiple features touching the same areas of the codebase are under development or review.
  • If a non-standard feature or hack is being introduced, make sure it is annotated with a #TODO or #FIXME tag and an appropriate comment, e.g. a reference to the code snippet or patch being used, the reason for it, and what the ideal solution would be. For example: #FIXME Patch to work around bug # in <framework> <link>. Remove after upgrading to the fixed version.
  • Look out for newly introduced libraries: check whether they are actually required and on the latest version, and whether there are better alternatives.
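The #TODO/#FIXME convention above might look like this in Ruby; the gem name, method, and bug are all hypothetical, chosen only to show the shape of the annotation:

```ruby
# FIXME: Work around a (hypothetical) off-by-one bug in the acme_pager gem,
# which counts pages from 0 instead of 1. Remove this shim after upgrading
# to a release that fixes it.
def fetch_page(items, page, per_page)
  # TODO: drop the `page - 1` adjustment once the gem is upgraded.
  items.slice((page - 1) * per_page, per_page) || []
end
```

The comment records what the hack is, why it exists, and when it can be deleted, so the next reader (or reviewer) does not have to reconstruct that context.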

Tests

  • All the above checks on code also apply to test code.
  • Make sure all the code has tests at all levels: unit, integration, and functional.
  • Check that all tests pass.
  • Check whether the tests act as documentation, describing the intention of the code they test.
  • Check that tests give informative error messages when they fail. For example:

```ruby
expect(subject).to include(:customer)
```

is better than:

```ruby
expect(subject.key? :customer).to be_true
```
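To see why the matcher choice matters, here is a small sketch using Minitest's assertions directly (the hash is a hypothetical test subject). The structural assertion reports what the collection actually contained; a boolean check could only say that `false` was not `true`:

```ruby
require "minitest"

# Minimal harness so we can call Minitest assertions outside a test class.
class Check
  include Minitest::Assertions
  attr_accessor :assertions

  def initialize
    @assertions = 0
  end
end

subject = { order_id: 42 } # hypothetical subject, missing :customer

begin
  Check.new.assert_includes(subject.keys, :customer)
rescue Minitest::Assertion => e
  # The failure message names the collection and the missing key,
  # e.g. "Expected [:order_id] to include :customer".
  puts e.message
end
```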

Wear the QA hat

This is something most people miss, or perhaps consider unimportant. Whether the code works is as important as its quality. Even if you have a testing team, catching bugs at code review reduces the cost of fixing them.

  • Read the story description, ask the BA or product owner questions, and check that all the acceptance criteria are implemented as described. I feel most bugs can be caught with this check alone.
  • Think about scenarios beyond what is mentioned in the story. Check that the usual user flows related to the new changes still work as expected.

Cross-functional requirements (non-functional requirements)

Also focus on cross-functional requirements like performance, security, analytics, logging, and alerting.

  • Check whether the code can introduce performance issues like the N+1 query problem or loading an entire table into memory. Imagine this code running in production and anticipate what kinds of problems might arise. Of course you can't find all of them just by reading the code, but you can often spot common mistakes from past experience, such as calling Customer.all.
  • Security issues can often be identified early. Check that data is accessed only by authorised users. Read about common security precautions and share that understanding with the team.
  • If you have already integrated with an analytics system, check whether the new functionality should report to it.
  • Check that enough logging is in place to be able to debug the application.
  • Check that alerts are added for when things break.
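The N+1 query problem can be sketched without a database by counting simulated queries. The IDs and SQL strings below are purely illustrative; in ActiveRecord, the batched shape is what `includes(:orders)` produces:

```ruby
# Stand-in "database" that just records the queries it would run.
customer_ids = [1, 2, 3]

# N+1 shape: one query to load customers, then one more per customer.
query_log = []
query_log << "SELECT * FROM customers"
customer_ids.each do |id|
  query_log << "SELECT * FROM orders WHERE customer_id = #{id}"
end

# Batched shape: two queries total, no matter how many customers there are.
batched_log = []
batched_log << "SELECT * FROM customers"
batched_log << "SELECT * FROM orders WHERE customer_id IN (#{customer_ids.join(', ')})"

puts "N+1: #{query_log.size} queries, batched: #{batched_log.size} queries"
```

With three customers the difference is small, but the N+1 shape grows linearly with the data, which is exactly the kind of problem that only shows up in production.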

Continuous Delivery

When you practise continuous delivery, you should make sure new changes don't break production data or functionality.

  • Make sure data migration scripts are added and check that they execute as expected; run them on your machine if required. Don't make changes that will result in data loss. Also make sure features going to production are auto-deployed to a staging or pre-production environment holding a replica of (obfuscated) production data, to detect and fix potential issues.
  • Check if feature toggles are added for any functionality which you don’t intend to release.
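A feature toggle can be as simple as a guarded branch. The flag name and flows below are hypothetical; real setups usually read toggles from configuration, an environment variable, or a toggle service:

```ruby
# Hypothetical toggle registry; in production this would come from config.
FEATURES = { new_checkout: false }

def checkout_flow
  if FEATURES[:new_checkout]
    "new checkout"     # merged but not yet released
  else
    "legacy checkout"  # what users see until the toggle is flipped
  end
end

puts checkout_flow
```

This lets unfinished work be merged and deployed dark, then released by flipping the flag rather than by coordinating a code deploy.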

Other

  • Check if the code under review is merged with the latest master branch code.
  • Make sure instructions are added for any other teams to follow after these changes are integrated. For example: tell the QA team which areas to run regression on if there are major changes, inform downstream consumer systems of any changes to the API contract, or update the README to help other developers.

Overall, keep the changes to be reviewed small and raise pull requests early to get feedback sooner.

Please add your thoughts in the comments. And if you like the post, recommend it so others can find it and participate in the discussion.


Discussion

 

In my experience, code reviews provide a gloss-level review of the change or enhancement.

Change size also has a giant impact: if the change is small, the code review will be of higher value.

If the change is large, the code review will be of little value.

If the change is gigantic, the code review will be of infinitesimal value.

Unless the code reviewer is well-versed and up-to-date on that area of code, it's easy to review the picayune nits and completely miss the pink elephants in the code.

Whereas, anecdotally, in the few times that I participated in pair programming, which entails continuous code reviews, the code quality was far higher. The continuous code review provided significantly more value.

Here are a couple of interesting essays about code reviews from Microsoft:

 

Very nice list.

I think that lists such as this are always handy. Fortunately, I know that most devs are lazy (and this is not a drawback): I could add such rules, but they won't follow them if they don't have to. To avoid a situation where nobody follows the list, I add it to my pull request policy. I always set up policies such as:

  • the build based on the feature/bugfix branch has to succeed
  • all tests have to be "green"
  • static code analysis must not report any new code smells or code duplication
  • the author cannot accept their own changes
  • at least two developers are required to accept the changes

I think we should automate as much as possible and focus on the things that cannot be automated, like:

  • checking functional and non-functional requirements
  • checking whether the code could be refactored
  • wearing the tester hat (I like this term)

Last but not least, always encourage all team members to actively participate in code reviews, regardless of their experience. Sometimes a junior dev can find something interesting, and reviews are also a great place for sharing knowledge and experience.

To sum up, I think this list is a great starting point for a pull request policy in a project.

Cheers.

 

The idea of the post was to bring a common understanding to the team about what to look for in pull request reviews. Because everyone has their own idea of what to check for, it is good to write things down, both to share knowledge and to act as an agreement. Such a list should be owned by the team and improved over time.

I agree with automating as much as possible and such automated checks should be part of the build pipeline.