Discussion on: Swagger + Excel Sheets, a wonderful way of validating REST APIs

Thorsten Hirsch

This is indeed a very "enterprisey" way of testing. I'm pretty sure that this tool...

  • helps you execute hundreds or even thousands of well-prepared test cases
  • is kind of self-documenting thanks to the CSV files (which is great to show that you've done your tests thoroughly)
  • will lead to application tests with 100% API coverage and 0 reported errors

Managers will love it!

But then a perfectly tested API goes into integration testing and, boom, fails instantly, because your tool only does the most "boring" tests. What do I mean by that?

  • your test tool is based on the same foundation as the actual API, i.e. you have a specification
  • so basically all you're doing is checking if your developers can implement a client (test tool) for the server (API) based on the same specification
  • since all your test cases only differ in the body of your HTTP request, I'm tempted to say that from a technical point of view your tool can handle only 2 test cases: valid and invalid business data

So I'm afraid that this tool generates a lot of manual work (-> writing the CSV files) but isn't worth the effort, because it is destined to miss the errors that are actually likely to occur, e.g. wrong character encoding, JSON format errors, concurrency issues, load problems, connections not being closed / timeouts, memory leaks, ...

But I have to admit that I've built pretty similar tools for testing. They were not as beautiful as yours, not as generic, and definitely not suited to be released as products on their own. My opinion now is that data-driven testing should be renamed to data testing, because the data is the only thing being tested thoroughly.

As long as one doesn't expect a data-driven testing tool to make an API production-ready, I guess it's okay.

Dheeraj Aggarwal Author

Hi Thorsten,

Thanks for the detailed feedback. It was really enlightening to see how readers actually perceive my post.

A few points of clarification:

  1. Data-driven testing is just one small use case, which vREST NG handles beautifully by integrating seamlessly with CSV files.
  2. Starting with an API specification file is also not necessary; it is just one of the use cases.
  3. In my opinion, most of the time goes into checking the API functionality with valid and invalid business data, which vREST NG solves beautifully.
  4. With vREST NG, you can easily validate your complex scenarios via request chaining.
  5. As of now, vREST NG doesn't support load or performance testing, but we have plans to support it in the near future.

I do not agree with the statement that the tool generates a lot of manual work for writing the CSV files. Please correct me if I am wrong. On the contrary, it saves time by separating the test logic from the test data. It even opens up the possibility for the end user to generate those CSV files through an external script.
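To make the logic/data separation concrete, here is a minimal sketch of the pattern being discussed: test data is generated by a script into CSV, and a runner replays each row against the API. The column names and the stubbed endpoint are hypothetical illustrations, not vREST NG's actual CSV format or API.

```python
import csv
import io

# Hypothetical test-data schema; vREST NG's real CSV layout may differ.
# Each row holds a request body plus the expected outcome.
rows = [
    {"name": "valid user", "body": '{"email": "a@example.com"}', "expected_status": "201"},
    {"name": "missing email", "body": "{}", "expected_status": "400"},
]

# Generate the CSV with a script instead of writing it by hand.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["name", "body", "expected_status"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()

# A stand-in for the real API: rejects bodies without an email field.
def fake_endpoint(body: str) -> int:
    return 201 if "email" in body else 400

# The runner (test logic) is written once; new cases are just new CSV rows.
results = []
for row in csv.DictReader(io.StringIO(csv_text)):
    status = fake_endpoint(row["body"])
    results.append((row["name"], status == int(row["expected_status"])))

print(results)
```

The point of the design is that adding a test case touches only the data file (or the script that emits it), never the runner.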

vREST NG is suitable for validating cases like wrong character encoding, JSON format errors, etc. Yes, as of now, it is not possible to validate concurrency, load behaviour, etc., because load testing is not supported yet, but vREST NG is architected to handle that case as well.

Ludek • Edited

I agree with @Thorsten Hirsch.

I don't want to be mean or disrespectful, but this sounds to me like a tool mainly targeted at managers (test cases for everything). The downside of this whole solution is its effectiveness: all of it can be achieved more effectively with Schemathesis, without the need to manually create CSV files! As a bonus, you can use it either as a CLI or in a programmatic way.