Discussion on: A code review checklist prevents stupid mistakes

Sterling Paramore (gnilrets)

My team of 3 used a checklist very similar to this for over a year. We ended up abandoning it because, most of the time, the team was just zombie-checking items off the list. How do you avoid checklist fatigue?

Blaine Osepchuk (bosepchuk)

We had the same problem (people zombie-checking items as done). We use two Google Forms and our "quality discussions" to solve it.

The first is mentioned in the post: the reviewer records the results of the code review checklist in a Google Form. During the sprint retrospective, you basically get teased (gently) if you missed a checklist item in your self-review. Stuff like, "Hey Bob, do you need a new checklist to remind you to pay attention when you do your self-reviews?" That has been surprisingly effective.

Second, we track defects we catch in production (and defects that are reported to us from production) in another Google Form. We look at each of these defects during our retrospective and do a root cause analysis: we want to know how the defect got past our process. Is it a problem with the process itself, or did someone fail to follow the process? Tracking production defects also lets us detect whether both the author and the code reviewer missed the same checklist item (it hasn't happened yet).

Third, we frequently talk about quality and how the cost of fixing a defect skyrockets with every development stage it passes through before it's detected. We can afford to spend an extra few minutes or hours making sure the code is good (or at least not obviously bad) because we aren't spending 50% of our time chasing defects in production. In other words, we're happy to invest an hour up front for the chance to save 10-20 hours downstream.