A checklist that works:
Basics
- Has the author finished making changes?
- Have you read the entire summary/description?
- Does the PR satisfy all the acceptance criteria?
- Have AI and/or junior developers reviewed it first?
- Are there any unnecessary changes?
- Are all the changes covered by automated or manual tests?
- Do the automated tests all pass? Check the logs afterwards: Do the messages look right?
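A quick way to act on that last check: even after a green run, sweep the output for messages that merely look off. A minimal shell sketch; the log file name and the simulated runner output are placeholders — in practice, pipe your real test runner into the log (e.g. `pytest 2>&1 | tee test-output.log`):

```shell
# Simulated test-runner output so this sketch runs standalone;
# replace with your actual runner's captured output.
printf 'ok 1 login\nWARNING: deprecated API used\nok 2 logout\n' > test-output.log

# The actual check: do any warnings/errors lurk in a "passing" run?
grep -inE 'warn|error|deprecat' test-output.log
```

Anything this surfaces is worth a second look before approving, even if the suite itself passed.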
Missing requirements
If a missing requirement only surfaces at this point, it's too late to avoid rework. Raise it anyway.
- Is the feature performant?
- Do the changes affect overall performance?
- Are there metrics?
- Does the feature need to be communicated to users?
- Is it secure?
- Does it scale?
- Is the UX good?
- Is it maintainable?
Testing
- First, run the latest release to observe baseline behaviour. Make detailed notes if necessary, including logs and screenshots where appropriate, to help communicate all this to the team later.
- Observe system behaviour, paying close attention to:
  - Any unexpected warnings or errors, or unexplained behaviour (no matter how harmless it seems).
  - All relevant application logs
  - Browser console
- You may need to account for differences between your dev environment and production, for example:
  - Mounted network storage
  - Browser ad-blockers/proxies
  - Amount of memory/storage/processing power available
- Sanity-test the changes and try to break them. Does everything work as expected? Does anything raise questions in your mind?
- Check the logs again. Do all the messages look right?
- Are there any new non-critical warnings? (Don't ignore them.)
- Are there any gaps in the automated or manual testing?
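The baseline step above pays off when you can compare it against the PR build mechanically rather than from memory. A minimal sketch with simulated logs standing in for real runs (the file names are hypothetical; capture the real logs from the latest release and from the PR build):

```shell
# Simulated logs so this sketch runs standalone; in practice these come
# from running the release build and the PR build with the same steps.
printf 'startup ok\nsync complete\n' > baseline.log
printf 'startup ok\nWARNING: retrying sync\nsync complete\n' > candidate.log

# Any behavioural drift shows up as added or missing lines.
# (|| true: diff exits non-zero when the files differ, which is expected here.)
diff baseline.log candidate.log || true
```

New lines in the candidate log that have no counterpart in the baseline are exactly the "unexpected warnings" the checklist asks about.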
More things
- Think through the flow with your human brain and apply Murphy's Law.
- Ask AI specifically "review this code/test thoroughly, considering everything that could possibly go wrong".
Gotchas
- Avoid gold-plating, nit-picking, and bikeshedding.
- Don't assume it will work. Assume it won't and prove otherwise.