I recently started at a new job with testers in the teams. Having never worked with testers, I am curious to see what you all think of the best ways to work with them.
I've had 1:1s with the testers, but I'm keen to get other thoughts and find out what works for your business and team.
Finally, what skill level of programming are your testers, and how does that affect the working relationship?
Top comments (10)
I used to work for a small company where the best developer was me (and this was a BIG problem because everything came to me), so my tester was also my coding padawan. She was a complete newbie when we started working together, and this was pretty nice because she could test everything like a real user.
I taught her how to use Chrome Developer Tools and how to report errors on GitLab, and everything ran OK. But it would be better if we had a real QA team, I guess.
Now she is a React ninja.
Good job man! Nice to see you training others as well as "fighting fires" :D
I too have never worked with testers, but I'd think a big part of any working relationship like this is about establishing things each finds interesting about the other.
As a developer it's easy to say you might want the testers to gain an appreciation for automation and workflow efficiency gains, but showing an interest in their expertise and struggles seems like it should be step zero.
Yeah, I started with 1:1s with the testers to see what they thought their role was, where they wanted to go, and how I could help. I'm starting to give their ideal approach a go. While it does seem a touch slower for me, I think the positivity of the relationship will pay dividends.
I'm currently in quality assurance (AKA a tester) at a large company with a lot of dev and QA teams. There are people with no programming skills doing tests, and others with programming skills who mostly work on test automation (which is my case).
From my experience, I just want to give you some advice:
0- Identify the best channel of communication to get in touch with the QA team, but especially with the person testing your items. Sometimes sending emails doesn't work and you need to find another way to get in sync with them.
1- Make sure that the one who will test your item has at least a high-level understanding of how it works; sometimes testers open bug reports for things that are working as expected.
2- Always send emails about the work done and let the team know the item is in the testing phase. This helps you track your item's activity, and if a manager starts complaining about something, you already have evidence of your work.
I hope this advice helps you :D
Totally agree, communication should always be clear :) And triaging the issues to make sure those which aren't actually bugs get closed again.
Start with respect and explore the value of having a good tester on your side.
Testers usually face the brunt of things if something broken makes it to production, so be wary of saying things like "it's just a simple change" or "don't worry about that test case".
It's always better to give more detail on your work than less, and to focus on the testable aspect. It is also a great idea to familiarize yourself with their test cases and to discuss them rather than lobbing work over the fence.
I like your approach, Harry! :D Thanks for contributing.
It's a bit awkward for us because our QA department is based halfway around the globe. We have our standups as early as possible and they've adjusted their working hours so there's a little overlap when we can talk to each other without a time lag.
We deploy every evening (our time) so they can start validating the latest resolved tickets in our issue tracker as soon as they clock in. If they find a problem, they'll create a new ticket or reopen an old one as appropriate. When we're gearing up to release, we'll deploy to a second environment that doesn't get automatic updates so they can run their automated regression suites against it without having to worry about timing.
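One practical detail in a setup like this is making sure the regression suite knows which environment it's pointed at: the nightly auto-deploy target by default, or the frozen release-candidate environment when a release is being validated. As a minimal sketch (all URLs and names here are hypothetical, not the commenter's actual setup), the suite could resolve its base URL from an environment variable:

```python
import os

# Hypothetical environment map: "qa" receives the nightly auto-deploys,
# "release" is the frozen release-candidate environment that only gets
# manual deploys, so regression runs there aren't disturbed mid-suite.
ENVIRONMENTS = {
    "qa": "https://qa.example.com",
    "release": "https://release.example.com",
}

def base_url(target=None):
    """Resolve the base URL the regression suite should run against.

    Falls back to the TEST_ENV environment variable, then to "qa",
    so the nightly run needs no extra configuration.
    """
    env = target or os.environ.get("TEST_ENV", "qa")
    try:
        return ENVIRONMENTS[env]
    except KeyError:
        raise ValueError("unknown test environment: %r" % env)

# Example: point the suite at the frozen release environment.
print(base_url("release"))
```

The point of the explicit map (rather than string interpolation) is that a typo in the target name fails loudly instead of silently running the whole regression suite against the wrong environment.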
Our testers have some facility with programming, but focus entirely on automation. They don't dig into our code or try to look "behind the curtain" as they're checking us. I think that's a good way to approach it, by and large. If they know the application internals well it can lead to blind spots, and besides, if they knew the application internals well I'd want to move them into development.
Interesting! The second environment is definitely key. And yeah, knowing what's in the "black box" could lead to bad tests too