Watch the video tutorial for a walkthrough of RapidAPI Testing. Then follow the step-by-step tutorial below to create your own API and start testing!
Start Testing with RapidAPI
In this tutorial, you'll learn how to test an API using RapidAPI Testing. First, we'll create a new API from an API specification file. Then, we will walk through how to quickly generate tests using the testing dashboard. Towards the end, we'll talk about how you can set up testing environments, schedule tests, and set alerts for failed tests.
Download API Spec
This tutorial uses a Swagger API Specification for an example API named Pet Store. Click the link below to download the JSON file: Download Pet Store Swagger API Spec. Once you have the file, we are ready to create our API on the RapidAPI Testing dashboard.
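If you open the spec in a text editor, you'll see the standard Swagger (OpenAPI 2.0) structure. Below is a heavily trimmed sketch of what the relevant parts look like, assuming the classic Pet Store example; your downloaded file will contain more endpoints, definitions, and metadata.

```json
{
  "swagger": "2.0",
  "info": { "title": "Swagger Petstore", "version": "1.0.0" },
  "basePath": "/v2",
  "paths": {
    "/pet": {
      "post": {
        "operationId": "addPet",
        "summary": "Add a new pet to the store"
      }
    },
    "/pet/{petId}": {
      "get": {
        "operationId": "getPetById",
        "summary": "Find pet by ID"
      }
    }
  }
}
```

The addPet and getPetById operations shown here are the two endpoints we'll exercise later in the tutorial.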
Create a New API
Navigate to the testing dashboard at rapidapi.com/testing/dashboard. You may need to sign in to your RapidAPI account. After signing in, select Create API in the top right of the page.
In the pop-up, enter the new API's information:
- Select the Context
- Enter an appropriate API name (Pet Store)
- Select the API specification file
- Choose OpenAPI for Definition type
Click OK.
The new API should appear on your testing dashboard alongside any existing APIs, within the account context you chose when creating it. Click on the new API tile to enter the Pet Store API testing dashboard.
Build a Test
On the next page, click Create Test and name the test Add Pet. After you name the test, you'll be taken to the user interface for building tests. You can build tests manually, but it's easier to use the Request Generator.
Inspect Request Generator
At the bottom of the Add Pet test dashboard, click on the highlighted Request Generator bar.
After clicking on the Request Generator, a new pane will appear that has three sections. On the left, you'll see your endpoints. If you click on an endpoint, the middle pane will populate with the information (and sample data) for that route.
Click the Send button in the middle section next to the route's URL. You should receive a successful response in the right section if your API has valid sample data.
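For reference, the sample body the Request Generator produces for the addPet route, and the object echoed back in the response, typically look something like the JSON below. The exact field values come from the spec's sample data, so treat this as an illustration rather than the exact payload you'll see.

```json
{
  "id": 9222645476172562000,
  "category": { "id": 0, "name": "string" },
  "name": "doggie",
  "photoUrls": ["string"],
  "tags": [{ "id": 0, "name": "string" }],
  "status": "available"
}
```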
Now that you have received a response, we can add this outcome to our Add Pet test. Click Add to Test.
Select Assertions
A new pop-up will appear asking us what parts of the response we want to add to the test. Here, we toggle properties on or off. If a property is _on_, then it will be added to the test. Furthermore, we can use the dropdown on the right to modify the type of assertion. In the image below, I have selected the assertions that I feel fit each property best.
Once we finish selecting assertions, click OK at the bottom of the pop-up. Close the Request Generator and observe all the assertions that have now been added, as individual steps, to our test. You can use the drag-and-drop feature to modify the order of test steps. Additionally, you can modify the content of steps by selecting the angle icon on the right of each step.
Now, we are ready to run the test.
Run Test & View Results
At the top right of the Add Pet test dashboard, click Save & Run. The test should run successfully, and the result will appear in the bottom left of the test dashboard.
Click on the new result to view the Execution Report. The report displays the test conditions and the details of individual steps. For example, the apiResponse.data.id attribute is asserted to be of type number, but its actual value is also displayed in the report. Furthermore, you can compare the response schema with your test schema by selecting Show Details for individual steps.
If a test fails, the value that caused the test to fail is displayed in the Execution Report. This makes it easier to find and fix errors.
Chain Requests Together
Some APIs have endpoints that work together or use the same data. We can test API functionality and integrity by chaining different requests together.
Create Pet
First, navigate back to the Pet Store API dashboard, and create a new test. Name the test Add-Get Pet. Then, in the Request Generator:
- Select the addPet endpoint
- Click on the Body tab, underneath the URL, and select json
- Send a request to get a response
- Select Add to Test
This time, we are only going to assert that the response returns a 200 status code.
Click OK to add it to our test.
Creating Test Variables
It's advantageous to incorporate test variables into chained requests so that the shared value can be updated, or tested, quickly. In this example, we want to use a common ID value to create and then get a pet. There are two ways to set a test variable. One way is to select the Settings option in the navigation bar and add variables to the test context in the Test Variables section.
The second way is to create a step in the test where you set a variable. Back in the Add-Get Pet test, click the Add Step area to add a new step. You will see a list of different types of steps that are available. Select the Set Variable option. Drag this new step to the top of our steps, and open the step editor by clicking the angle-down icon on the right side of the step. For the Key, enter petId, and for the Value enter 9222645476172562000.
This petId test variable will now be available in later steps using the syntax {{petId}}.
Using Test Variables
In the Add Pet request, modify the JSON body to include the petId variable.
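For example, the body for the Add Pet step might look like the snippet below (illustrative; keep whatever other fields your generated sample body already contains). The {{petId}} placeholder is substituted with the variable's value at run time, so the body that is actually sent is valid JSON.

```json
{
  "id": {{petId}},
  "name": "doggie",
  "photoUrls": ["string"],
  "status": "available"
}
```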
Next, open up the Request Generator again and select the getPetById endpoint. Replace the last item (undefined) in the URL with the value {{petId}}. Then, send the request. It's OK if it fails.
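Assuming the default host from the Pet Store spec (your Request Generator may show a different base URL), the URL changes from the first line below to the second:

```text
https://petstore.swagger.io/v2/pet/undefined
https://petstore.swagger.io/v2/pet/{{petId}}
```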
Click Add to Test. Then, deselect all items in the pop-up so that only the HTTP request is added to the test. Next, add a new step that asserts that the status code for our GET request is 200.
Finally, assert that the ID returned in the response is the same as the ID that we set at the beginning of the test.
The full test should now have the following steps:
- Set the test variable petId to 9222645476172562000
- Send a POST request to add a pet (with the petId variable in the JSON body)
- Assert that the response status code is 200
- Send a GET request, using the petId variable, to fetch the new pet
- Assert that the status code is 200
- Assert that the ID returned for the pet is equal to the petId variable
Click Save & Run.
The test should be successful. Next, let's take a look at how to create testing environments.
Using Environments
To create environments, click on the Settings icon in the left navigation.
In the Test Environments section, click the + icon and name your new environment. Environments show up as tabs along the top of the Test Environments section. You define environment variables the same way you define test variables. For this example, you can change the URL used by your tests by specifying the protocol and domain. I specified two variables for each environment.
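As a concrete illustration (the names and values below are just one possible setup, shown as JSON purely for readability, not a file the tool prescribes), a staging and a production environment might each define a protocol and a domain variable:

```json
{
  "Staging": { "protocol": "https", "domain": "staging.example.com" },
  "Production": { "protocol": "https", "domain": "petstore.swagger.io" }
}
```

With these defined, a request URL written as {{protocol}}://{{domain}}/v2/pet resolves against whichever environment you select before running the test.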
Back in the individual tests, I can modify the URLs for each request and select the environment I want to use before I execute a test.
With the environment variables, we can share variables across different tests. With test variables, we can scope values that are important to a specific test.
Scheduling Tests and Setting Alerts
With RapidAPI Testing, you can schedule tests to run from different locations around the world and with different environments. To schedule a test, select the test that you want to schedule from the dashboard. Click on the Schedule option in the side navigation. Choose the:
- Frequency
- Environment
- Locations
Now that your test is scheduled, go to the main API test settings. This is the same location where we defined the test environments. You can choose to receive either Email or SMS alerts when a test for this API fails.
Conclusion
In this brief tutorial, we covered how to test your API using RapidAPI Testing, and we looked at making basic assertions. However, there are many other types of test steps that can be used to handle more complex scenarios, including:
- Custom code
- Looping over responses
- Logical assertions (If/Else checks)
Now that you understand the basics of how to test APIs, I encourage you to try it out with your own API.
Top comments (3)
Hi, Mike here.
Just want to ask if it's possible to compare and test whether the response the API delivers matches the one specified in the API specification file. E.g. the specification file says the response for get: /products has to be an array, but the API's response is an object. How can I make an assertion here?
Sorry if it's a beginner question, but I'm trying to get some help with it.
Thanks.
First time hearing about this API tool; I will take a look at it soon. Thanks.
It's brand new :) Let us know what you think