Software Manager, Agile Aficionado, Tech Dabbler, and Public Speaker. A lifelong learner; the prospect of something new and coffee are what get me up in the morning.
I've had the "completely tested" argument with people before, and it depends on what you are testing. With small applications you can get close, but I don't realistically feel you can ever reach 100%; there is always something unexpected. I love this question, though, since usually you get the opposite: with the time crunch that often happens, you have to choose what to cover.
This is where I use risk mitigation. Which areas carry the lowest risk of issues? Which parts get less use, so that if there is a problem it will be small or affect the fewest users? Learn to assess what's not going to be a problem; it takes time and familiarity, but it's a great skill to work on.
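To show what I mean by that triage, here is a rough sketch of the kind of scoring I do in my head. The areas and numbers below are made up, just to illustrate the idea:

```python
# Rough risk-triage sketch: areas and numbers are hypothetical, just to show the idea.
# score = how likely it is to break * how bad a break would be * how many users touch it
areas = [
    {"name": "checkout",       "likelihood": 3, "impact": 5, "usage": 5},
    {"name": "login",          "likelihood": 2, "impact": 5, "usage": 5},
    {"name": "profile editor", "likelihood": 2, "impact": 2, "usage": 2},
    {"name": "legacy reports", "likelihood": 1, "impact": 2, "usage": 1},
]

for area in areas:
    area["score"] = area["likelihood"] * area["impact"] * area["usage"]

# Highest score gets tested first; the bottom of the list is what you skip under time pressure.
for area in sorted(areas, key=lambda a: a["score"], reverse=True):
    print(f'{area["name"]:15} {area["score"]}')
```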
Always know your "happy path" and cover that. If it's a web application, I have tried using web server logs to see which pages are hit most often. If you have Google Analytics, you can use its reports to see which areas of the application customers use more than others... and always touch those.
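Even without Google Analytics, raw access logs get you most of the way there. A minimal sketch, assuming a standard combined-format log at a hypothetical path:

```python
# Count which paths get hit most, from a web server access log.
# Assumes the common/combined log format where the request line is the first
# quoted field: "GET /some/path HTTP/1.1". The file name is hypothetical.
from collections import Counter

hits = Counter()
with open("access.log") as log:
    for line in log:
        try:
            request = line.split('"')[1]   # 'GET /some/path HTTP/1.1'
            path = request.split()[1]      # '/some/path'
        except IndexError:
            continue                       # skip malformed lines
        hits[path] += 1

# The top of this list is your de facto happy path -- always cover these.
for path, count in hits.most_common(10):
    print(f"{count:8}  {path}")
```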
I always try to automate something every sprint, or whenever I get time. That way I can quickly pass through many areas, let the automation handle the details, and focus on covering the new stuff. Older areas whose code hasn't been touched I may leave alone if there is no time, or quickly pass through if I have a moment.
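To make "automate something every sprint" concrete, this is the kind of small happy-path smoke check I mean; a sketch using pytest and requests, where the base URL and page list are hypothetical and would come from whatever your logs or analytics say customers actually hit:

```python
# Tiny happy-path smoke test: just confirm the most-used pages respond.
# BASE_URL and PAGES are hypothetical -- swap in your own environment and
# the pages your usage data points to.
import pytest
import requests

BASE_URL = "https://staging.example.com"
PAGES = ["/", "/login", "/search", "/checkout"]

@pytest.mark.parametrize("page", PAGES)
def test_page_responds(page):
    response = requests.get(BASE_URL + page, timeout=10)
    assert response.status_code == 200
```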
I like the data-driven approach of using GA to prioritize what to test.
That works well too; there is no single right way to do it. You have to match your environment, along with the business needs and pressures, to provide the best-quality product possible given your constraints. I've found that a mix of strategies gives me better confidence, which I can then back up when asked for detail by upper-level management, who don't want or need the specifics but do need context to understand a go/no-go decision.