The speaker delves into the nuanced world of software testing, beginning with an exploration of various testing terminologies and the importance of understanding them. He then elucidates different testing types, including unit testing, integration testing, end-to-end testing, and functional testing, each serving a distinct purpose in ensuring software reliability. The discussion expands to the considerations and challenges inherent in testing, such as achieving adequate test coverage within time and resource constraints, particularly at large companies like Facebook or Google. It also examines the cost-benefit analysis of testing, weighing the expense of writing tests against the potential savings from avoiding production issues. The transcript concludes with a curiosity-driven inquiry into testing practices at major companies and a note of gratitude to the audience for their attention.
Summary
Overview of Testing:
- Discussion about various testing terminologies used interchangeably.
- Emphasis on understanding terminologies but not getting too bogged down by them.
Explanation of Testing Types:
- Definition and explanation of unit testing.
- Explanation of integration testing.
- Discussion about end-to-end testing.
- Differentiation of functional testing from other types.
Considerations and Challenges in Testing:
- Importance of test coverage and challenges in achieving it.
- Consideration of time and resource constraints.
- Curiosity about testing practices in large companies like Facebook or Google.
- Speculation on who is responsible for writing tests and how it's managed.
Cost and Benefits of Testing:
- Discussion on the cost of writing tests versus fixing issues in production.
- Consideration of the size of the company in deciding testing strategies.
- Curiosity about the percentage of test coverage in large companies and associated costs.
Podcast
Check it out on Spotify.
Transcript
0:01
Hey there, hope you're doing well. In this video, let's quickly talk about testing. I did a video some time ago, but that's different from this one. If you're a developer or a tester, you may have heard these terminologies used, sometimes interchangeably: unit tests, functional tests, end-to-end tests, integration tests, and so on.
0:20
And sometimes I feel like teams tend to get bogged down by terminology. I mean, it's important so that we know what we're talking about; unless we have an agreement on the terminology, it can become confusing, agreed. But that aside, let's not get too bogged down by what each of these actually means, because they tend to mean different things in different systems and different teams.
0:40
So let me just quickly go through them as I understand them. Unit testing is essentially testing those individual pieces, literally those units of work. You don't worry about how the code works with other items in the system. It's just: I've written a service, an API, or created a page; does it work in a silo? That's a unit test.
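As a minimal sketch of that idea, here's what a unit test might look like in Python with pytest; `slugify` is a hypothetical unit of work, and the point is that nothing outside it is involved:

```python
# A unit test exercises one unit of work in isolation: no database,
# no network, no other pages or services.

def slugify(title: str) -> str:
    """Turn a page title into a URL-friendly slug (the unit under test)."""
    return "-".join(title.lower().split())

def test_slugify_lowercases_and_joins_words():
    assert slugify("Hello World") == "hello-world"

def test_slugify_collapses_extra_spaces():
    assert slugify("  Unit   Testing  ") == "unit-testing"
```

Run it with `pytest` and both tests pass without touching any other part of the system.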
0:57
Integration tests essentially mean that when I'm deploying my app, or when I'm completely building that particular page, I may have dependencies on other systems, either internal or third-party. Does that page work in conjunction with those systems? That's integration testing, just in the context of that page. Now, end-to-end tests: I see them as maybe synonymous with integration tests. I don't know how different they are; there could be some nuances. Integration testing could be specific to a particular service, while an end-to-end test may be a little broader than that. But again, this is where things get a little muddy for me.
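By way of contrast with the unit test above, here's a sketch of an integration test in the same style; the user-service URL is a hypothetical internal dependency, and the test only passes when that real service is up and answering:

```python
# An integration test checks that the page works in conjunction with a
# system it depends on, here a hypothetical internal user service.
import json
import urllib.request

USER_SERVICE_URL = "http://localhost:8080/users/42"  # hypothetical dependency

def test_profile_page_can_fetch_its_user():
    # Hit the real dependency rather than a stub.
    with urllib.request.urlopen(USER_SERVICE_URL) as resp:
        assert resp.status == 200
        user = json.loads(resp.read())
    # The profile page needs these fields to render.
    assert "id" in user and "name" in user
```

An end-to-end test, in the broader sense described here, would chain several such hops together: browser to page, page to service, service to database.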
1:31
And then functional tests are slightly different again. Terminologies, right? Different platforms name them differently and may attach different meanings to the very same terms. To me, a functional test is different from a unit test because it's not just testing your unit, but it's also different from an integration test because it doesn't necessarily worry about integration in that sense. It's dealing with that particular functionality from a user perspective. When your end user is using your application, or that particular page or service or API, what are they going to be subjected to? And does your page or API do everything it promises to do? So it's broader in that sense: it's not just the unit of work, but it's not integration testing either, because it's not meant to integrate with other systems.
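To make that concrete, here's a sketch of a functional test under those definitions; `create_order` and `FakePaymentGateway` are hypothetical names, and the external gateway is faked precisely so the test covers the user-facing behavior without integrating with another system:

```python
# A functional test drives a whole feature through its public contract,
# the way an end user or API client would, with the external
# integration stubbed out.

class FakePaymentGateway:
    """Stands in for the real third-party payment system."""
    def charge(self, amount_cents: int) -> bool:
        return amount_cents > 0

def create_order(item_prices: list[int], gateway) -> dict:
    """The feature under test: total the cart and charge the customer."""
    total = sum(item_prices)
    if not gateway.charge(total):
        return {"status": "failed"}
    return {"status": "confirmed", "total": total}

def test_user_can_place_an_order():
    result = create_order([500, 250], FakePaymentGateway())
    assert result == {"status": "confirmed", "total": 750}

def test_empty_cart_is_rejected():
    assert create_order([], FakePaymentGateway())["status"] == "failed"
```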
2:31
Now, those are the high-level explanations. There are obviously variations, and you can interpret them however works for your team. But what's important is that when you complete your unit of work, you make sure your test coverage is good enough to cover all the paths you're traversing. That's great ideally, but I've said this more than once: it's not always feasible. I've seen it not happen just because of the amount of time it generally takes, and deadlines tend to be more aggressive than you might like them to be. So I would like to know how it happens at really large product companies like Facebook or Google. I'm sure they have the money to invest in creating these tests, but I would like to know: maybe the code coverage isn't 100%, or maybe it is, given how big they are, but what is the effort it takes for them to get to that point?
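For what it's worth, coverage is usually measured mechanically. The common route is the coverage.py CLI (`coverage run -m pytest` followed by `coverage report`); here's a minimal sketch of the equivalent Python API, assuming coverage and pytest are installed, with an arbitrary 80% bar chosen purely as an example:

```python
# Measure line coverage while the test suite runs, and enforce a
# "good enough" threshold so the build fails when coverage slips.
import coverage
import pytest

cov = coverage.Coverage()
cov.start()

pytest.main(["-q"])  # run the whole test suite in-process

cov.stop()
cov.save()

percent = cov.report()  # prints a summary table, returns total coverage
assert percent >= 80.0, f"coverage {percent:.1f}% is below the 80% bar"
```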
3:34
Is it the same team that's doing it? Is the developer who writes the actual code of the feature or the API also responsible for writing the coverage? If that's not the case, and there's a team or a group of people dedicated to doing this, there are pros and cons, but I can see how that might work a little more effectively in terms of coverage. But if it's the same person who needs to do it, then I can see ten challenges off the bat. I'm curious to know how they actually do it and how much time or money they spend, proportionately. If you're spending a dollar on actually writing that code, that page or service or feature, how many dollars, or what percentage of that dollar, are you spending on writing those tests? Because tests are expensive, but they're cheaper. I'm saying two things that might sound contradictory. What I mean is that they're expensive because you have to spend more money to write them, but cheaper because once you do write them, it's probably going to cost you a whole lot less overall, since fixing an issue in production is a whole lot more expensive, right?
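That "expensive but cheaper" point can be put in toy numbers; every figure below is made up purely for illustration:

```python
# A back-of-envelope model of the testing cost-benefit trade-off.
# All figures are invented; plug in your own team's numbers.
feature_cost = 1.00          # one dollar of feature work
test_cost_ratio = 0.50       # say 50 cents of tests per feature dollar
prod_fix_cost = 10.00        # a production incident costs far more
incidents_prevented = 0.10   # tests catch ~1 incident per 10 features

test_cost = feature_cost * test_cost_ratio
expected_savings = prod_fix_cost * incidents_prevented

print(f"spend ${test_cost:.2f} on tests, expect to save ${expected_savings:.2f}")
# -> spend $0.50 on tests, expect to save $1.00: the tests pay for themselves
```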
4:57
But depending on the size of the company, you have to figure out how you want to go about it; there's no silver bullet. I would really like to know what kind of coverage these large product companies have, assuming it's not 100% (maybe it is), and how much money they spend writing these tests, because it's a no-brainer that you need to have them, that you need 100% coverage.
But I'm just curious to know: who's writing them? Is it the developer? Is it the tester? Do you have a different set of people doing this work? What are the checks and balances, and how do you split this work? It's just my own curiosity; I don't know how it's happening. But anyway, hope some of that helped you.
5:15
Thanks.