Johannes Kettmann

Posted on • Originally published at profy.dev

REST APIs - How To Mutate Data From Your React App Like The Pros

Changing data via a REST API is easy: just call axios.post or axios.patch in a React click handler. Done.

But unfortunately, production apps need more than that: you have to handle loading and error states, invalidate the cache, or refetch data. As a result, your code can easily turn into messy spaghetti.

Luckily, libraries like react-query come to our rescue. They not only give us many of these features out of the box, they also allow us to build advanced data-driven features with a snappy user experience without much effort.

On this page, you can see

  • how the “simple” approach quickly gets out of hand and
  • how react-query helps us to build such a snappy data-driven component.

As an example, we’ll build a paginated table component that allows the user to remove single rows by clicking a button. This doesn’t sound too hard at first. But the combination of pagination and removing items quickly leads us to a handful of problems and edge cases.

As always, the devil is in the details. But the end result (using techniques like cache invalidation, request cancellation, and optimistic updates) speaks for itself. Just look how fast the app is even though each click on the button sends a request:

Even a user going in click rage can’t break the component

Note: this is the second part of a series on React and REST APIs. If you want to learn more about fetching data (instead of mutating) based on an advanced example read the first part.

Get the source code

Table Of Contents

  1. The Project
    1. React Frontend
    2. The REST API
    3. The Component
  2. The “Simple” Approach
    1. Data Fetching With useEffect And Click Handlers
    2. Refetching Outdated Data
    3. Optimistic Updates
    4. The Problems Start
  3. The Efficient Approach With react-query
    1. Refetching Data By Invalidating Queries
    2. Cancel Previous Pending Requests
    3. Optimistic Updates With react-query
    4. Populate Missing Table Rows With Prefetched Data
    5. Edge Case: Concurrent Updates To The Cache

The Project

Nobody wants to read through the setup of a new React app, I assume. So I prepared a realistic project that we can use as a slightly more advanced example.

It’s an error-tracking tool similar to Sentry that I created for the React Job Simulator. It includes a React / Next.js frontend and a REST API which we will connect to.

Here's what it looks like.

React Frontend

The application fetches a list of issues from an API and renders them as a table. At the right of each row, you can see a button that “resolves” an issue. The user can click it once they’ve fixed a bug in their application and want to remove the corresponding issue from the list.

Screenshot of the example app used in this tutorial

You can see that this list has several pages (note the “Previous” and “Next” buttons at the bottom). This will give us some headaches later.

The REST API

In this example, we use two REST endpoints.

  1. A GET endpoint to fetch the issue list: prolog-api.profy.dev/v2/issue?status=open (click here to see the JSON response)
  2. A PATCH endpoint to update the status of an issue from “open” to “resolved”: prolog-api.profy.dev/v2/issue/{id}

You can find more details about this REST API in its Swagger API documentation.
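For orientation, the two endpoints can be captured in a couple of tiny URL helpers. The endpoint paths are taken from the API above; the helper names are made up for illustration:

```javascript
// Tiny helpers that build the request URLs used in this article.
// The endpoint paths come from the API docs; the helper names are made up.
function issueListUrl(page = 1) {
  const params = new URLSearchParams({ status: "open", page: String(page) });
  return `https://prolog-api.profy.dev/v2/issue?${params}`;
}

function issueDetailUrl(issueId) {
  return `https://prolog-api.profy.dev/v2/issue/${issueId}`;
}

console.log(issueListUrl(2));
// https://prolog-api.profy.dev/v2/issue?status=open&page=2
```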

The Component

So that we start on the same page, here is the whole component without the data fetching logic. We have a table that shows the issues in its tbody element. At the bottom, you can see the pagination, which isn’t relevant to this article. But if you’re interested in building a paginated table, have a look at the previous part of this series.

import { useState } from "react";

export function IssueList() {
  // state variable used for pagination
  const [page, setPage] = useState(1);

  // this is where our REST API connections go
  const issuePage = ...
  const onClickResolve = ...

  return (
    <Container>
      <Table>
        <thead>
          <HeaderRow>
            <HeaderCell>Issue</HeaderCell>
            <HeaderCell>Level</HeaderCell>
            <HeaderCell>Events</HeaderCell>
            <HeaderCell>Users</HeaderCell>
          </HeaderRow>
        </thead>
        <tbody>
          {(issuePage.items || []).map((issue) => (
            <IssueRow
              key={issue.id}
              issue={issue}
              onClickResolve={() => onClickResolve(issue.id)}
            />
          ))}
        </tbody>
      </Table>
      <Pagination page={page} setPage={setPage} />
    </Container>
  );
}

The “Simple” Approach

Sending requests in a React app isn’t difficult. In our case, we can use a useEffect to fetch data as soon as the component renders. And to send a request when the user clicks on the “Resolve” button we can use a simple click handler.

Data Fetching With useEffect And Click Handlers

Easy peasy.

import axios from "axios";
import { useEffect, useState } from "react";

const requestOptions = { headers: { Authorization: "tutorial-access-token" } };

export function IssueList() {
  const [page, setPage] = useState(1);

  // fetch the data and store it in a state variable
  const [issuePage, setIssuePage] = useState({ items: [], meta: undefined });
  useEffect(() => {
    axios
      .get("https://prolog-api.profy.dev/v2/issue?status=open", requestOptions)
      .then(({ data }) => setIssuePage(data));
  }, []);

  // update the issue status to resolved when clicking the button
  const onClickResolve = (issueId) => {
    axios.patch(
      `https://prolog-api.profy.dev/v2/issue/${issueId}`,
      { status: "resolved" },
      requestOptions,
    );
  };

  return (
    <Container>
      <Table>
        <thead>...</thead>
        <tbody>
          {(issuePage.items || []).map((issue) => (
            <IssueRow
              key={issue.id}
              issue={issue}
              onClickResolve={() => onClickResolve(issue.id)}
            />
          ))}
        </tbody>
      </Table>
      <Pagination page={page} setPage={setPage} />
    </Container>
  );
}

And unsurprisingly, this works. We see the data in the table. And when we click the button we can see in the dev tools’ network tab that a PATCH request has been sent.

The button triggers a PATCH request but the UI doesn’t update yet

The problem is that the UX isn’t that great. The row the user clicked should disappear from the table. And it does… once we reload the page.

Refetching Outdated Data

We should be able to improve this behavior. Let’s try to refetch the data after the patch request.

export function IssueList() {

  // fetch the data and store it in a state variable
  const [issuePage, setIssuePage] = useState({ items: [], meta: undefined });
  const [invalidated, setInvalidated] = useState(0);
  useEffect(() => {
    axios
      .get("https://prolog-api.profy.dev/v2/issue?status=open", requestOptions)
      .then(({ data }) => setIssuePage(data));
  }, [invalidated]);

  // update the issue status to resolved when clicking the button
  const onClickResolve = (issueId) => {
    axios
      .patch(
        `https://prolog-api.profy.dev/v2/issue/${issueId}`,
        { status: "resolved" },
        requestOptions,
      )
      .then(() => {
        setInvalidated((count) => count + 1);
      });
  };

  ...

Cringe… that looks a bit hacky. But it works.

As soon as we click the “Resolve” button, the PATCH request is sent. And once that request resolves the table data is refetched. As a result, the row disappears from the table.

The table data is refetched after the PATCH request

The UX is better but as you can see it’s not great yet. There’s a significant delay between clicking the button and the row being removed. Either we need to show a loading indicator (boring!) or make this snappier.

Optimistic Updates

A common technique to achieve a snappier UX is called “optimistic updates”: the app pretends the request was successful right away.

In our case, that means the issue is removed from the table as soon as we click the “Resolve” button. That’s not hard to do. We simply remove the issue to be resolved from the state before we send the request.

export function IssueList() {
  const [issuePage, setIssuePage] = useState({ items: [], meta: undefined });

  ...

  const onClickResolve = (issueId) => {
    // optimistic update: remove issue from the list
    setIssuePage((data) => ({
      ...data,
      items: data.items.filter((issue) => issue.id !== issueId),
    }));

    axios
      .patch(
        `https://prolog-api.profy.dev/v2/issue/${issueId}`,
        { status: "resolved" },
        requestOptions,
      )
      .then(() => {
        setInvalidated((count) => count + 1);
      });
  };

  ...

The Problems Start

Again, this works. But the code gets messier. In fact, it gets buggy as well.

  • What if the request fails? The user would still see the “optimistic update”. Thus we should restore the previous state.
  • What if the user tries to resolve multiple issues in quick succession? We might end up with multiple concurrent GET requests that we should cancel. Otherwise, there’s a good chance of race conditions and invalid data being shown to the user.
  • Optimistically removing a row from the table means that the number of rows changes. The table might appear wiggly. What if we want to prevent that?
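To make the first bullet point concrete, here is a framework-free sketch of an optimistic update with rollback. Everything here (the function name, the state shape, the fake request) is illustrative, not code from the example app:

```javascript
// Minimal sketch of "optimistic update + rollback" without any framework.
// All names and the fake request are made up for illustration.
async function resolveIssueOptimistically(state, issueId, sendPatch) {
  const previousItems = state.items;

  // optimistic update: remove the issue before the request finishes
  state.items = state.items.filter((issue) => issue.id !== issueId);

  try {
    await sendPatch(issueId);
  } catch (err) {
    // the request failed: restore the previous state
    state.items = previousItems;
  }
  return state;
}

// usage: a PATCH that always fails should leave the list untouched
const state = { items: [{ id: "a" }, { id: "b" }] };
resolveIssueOptimistically(state, "a", () => Promise.reject(new Error("500")))
  .then((s) => console.log(s.items.length)); // 2 (rollback restored the row)
```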


The Efficient Approach With react-query

The initial implementation with react-query isn’t much shorter than our “simple” approach above. But as shown in the previous article we get a lot of things for free (like loading and error states, caching, and so on). For simplicity, we won’t go deeper into these topics here though.

import axios from "axios";
import { useMutation, useQuery } from "@tanstack/react-query";

const requestOptions = { headers: { Authorization: "tutorial-access-token" } };

export function IssueList() {
  const issuePage = useQuery(["issues"], async () => {
    const { data } = await axios.get(
      "https://prolog-api.profy.dev/v2/issue?status=open",
      requestOptions,
    );
    return data;
  });

  const resolveIssueMutation = useMutation((issueId) =>
    axios.patch(
      `https://prolog-api.profy.dev/v2/issue/${issueId}`,
      { status: "resolved" },
      requestOptions,
    )
  );

  const { items, meta } = issuePage.data || {};

  ...
}

To fetch the issues from our GET endpoint we can use the useQuery hook. The first parameter ["issues"] is the identifier for this query in the cache. The second parameter is the function responsible for fetching the data.

To send the PATCH request that updates an issue we can use the useMutation hook. In our case, we can simply pass the function responsible for sending the PATCH request.

Sending the PATCH request is now easy. We simply call the mutate function that is returned by the useMutation hook.

<IssueRow
  key={issue.id}
  issue={issue}
  resolveIssue={() => resolveIssueMutation.mutate(issue.id)}
/>

As in the previous example, this sends the request but doesn’t update the table data until we refresh the page.

The resolve button triggers a PATCH request but the UI doesn’t update yet

Refetching Data By Invalidating Queries

To refetch the table data we luckily don’t need any hacky workarounds as before. We can use one of the callbacks that react-query offers in the mutation options.

The first one that we’ll use is onSettled. This callback fires once a mutation is finished, no matter if it was a success or an error. It’s kind of like the “finally” method of a promise.

To refetch the table data after the patch request we flag it as invalidated.

export function IssueList() {
  const issuePage = useQuery(["issues"], async () => ...);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) =>
      axios.patch(
        `https://prolog-api.profy.dev/v2/issue/${issueId}`,
        { status: "resolved" },
        requestOptions,
      ),
    {
      onSettled: () => {
        // flag the query with key ["issues"] as invalidated
        // this causes a refetch of the issues data
        queryClient.invalidateQueries(["issues"]);
      },
    }
  );

  ...
}

This invalidates all queries whose key contains "issues" (even if additional key parts like a page number are set). We can see now that the data is refetched automatically after we click the button.

The table data is refetched after the PATCH request
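The partial key matching can be illustrated with a tiny sketch. Note that matchesKey is a made-up helper that only mimics react-query's prefix matching, not the library's actual implementation:

```javascript
// Illustrative sketch of invalidation by key prefix.
// `matchesKey` mimics react-query's partial matching; it is NOT the real API.
function matchesKey(queryKey, invalidatedKey) {
  // a query is affected if the invalidated key is a prefix of its key
  return invalidatedKey.every((part, index) => queryKey[index] === part);
}

console.log(matchesKey(["issues"], ["issues"]));    // true
console.log(matchesKey(["issues", 2], ["issues"])); // true
console.log(matchesKey(["projects"], ["issues"]));  // false
```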

As in the previous “simple” approach, we see a delay between the button click and the row being removed from the table. Let’s deal with that in a bit.

First, there’s another issue that we can fix easily: When a user quickly clicks to resolve multiple issues we can see concurrent GET requests being sent to the REST API.

Multiple GET requests are sent in parallel

In this video, we first see the two PATCH requests. These are followed by two GET requests. Depending on the timing of the button clicks we can end up with different scenarios:

  • Both GET requests return the same data. That would make one of them obsolete.
  • The GET requests return different data. In the worst case, this could lead to an inconsistent UI.


Cancel Previous Pending Requests

To get around this problem, we can cancel any GET request that’s still pending when a new mutation is triggered.

First, we need to set up our GET request to support cancellation. This is typically done by passing the AbortSignal from an AbortController to axios (or fetch). And this again means some additional code.
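Done by hand, that wiring could look roughly like this. It's a simplified sketch with made-up names, just to show what we'd otherwise have to maintain ourselves:

```javascript
// Rough sketch of manual request cancellation with an AbortController.
// `fetchIssues` and the fake request are made up; a real app would pass
// the signal to axios or fetch instead.
let currentController = null;

function fetchIssues(fakeRequest) {
  // cancel the previous request if it's still pending
  if (currentController) currentController.abort();

  currentController = new AbortController();
  // real code: axios.get(url, { ...requestOptions, signal: currentController.signal })
  return fakeRequest(currentController.signal);
}

// usage: starting a second fetch aborts the first one's signal
const signals = [];
fetchIssues((signal) => signals.push(signal));
fetchIssues(() => {});
console.log(signals[0].aborted); // true
```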

react-query makes it easier: It already provides an abort signal in the first parameter of the query function.

export function IssueList() {
  // use the AbortSignal that useQuery provides
  const issuePage = useQuery(["issues"], async ({ signal }) => {
    const { data } = await axios.get(
      "https://prolog-api.profy.dev/v2/issue?status=open",
      // pass the abort signal to axios
      { ...requestOptions, signal }
    );
    return data;
  });

  const resolveIssueMutation = useMutation(...);

  ...
}

Now that the query is set up for cancellation we can simply call queryClient.cancelQueries(…) at the right time and we’re done.

The right time to cancel pending GET requests is whenever a new mutation is triggered. Again react-query has our backs: we can use the onMutate callback (a sibling of onSettled):

export function IssueList() {
  const issuePage = useQuery(...);

  // get the query client
  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) =>
      axios.patch(
        `https://prolog-api.profy.dev/v2/issue/${issueId}`,
        { status: "resolved" },
        requestOptions,
      ),
    {
      onMutate: async (issueId) => {
        // cancel all queries that contain the key "issues"
        await queryClient.cancelQueries(["issues"]);
      },
      onSettled: () => {
        queryClient.invalidateQueries(["issues"]);
      },
    }
  );

  ...
}

Let’s try that out.

When we quickly click on two of the issues in our table we can again see two PATCH requests followed by two GET requests. But this time, the first GET request is canceled.

Pending GET requests are canceled

Cool, that was easy to achieve. Didn’t even take a lot of code.

But as mentioned, we still see a delay between clicking the “Resolve” button and the corresponding issue being removed from the table.

Optimistic Updates With react-query

As mentioned before, to update the table immediately after the user clicks the “Resolve” button we can “optimistically update” the data on our frontend. This gives the user the illusion that the action they triggered (resolving the issue) happens instantaneously.

The plan is simple: as soon as the mutation starts we remove the issue from the data. When we control the data ourselves that's easy. But how does it work with react-query?

  1. We get the current data from the cache via queryClient.getQueryData(...).
  2. We remove the selected issue from this data.
  3. We update the cache data via queryClient.setQueryData(...).
export function IssueList() {
  const issuePage = useQuery(["issues"], async () => ...);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) => axios.patch(...),
    {
      // optimistically remove the to-be-resolved issue from the list
      onMutate: async (issueId) => {
        await queryClient.cancelQueries(["issues"]);

        // get the current issues from the cache
        const currentPage = queryClient.getQueryData(["issues"]);

        if (!currentPage) {
          return;
        }

        // remove resolved issue from the cache so it immediately
        // disappears from the UI 
        queryClient.setQueryData(["issues"], {
          ...currentPage,
          items: currentPage.items.filter(({ id }) => id !== issueId),
        });

        // save the current data in the mutation context to be able to
        // restore the previous state in case of an error
        return { currentPage };
      },
      onSettled: () => {
        queryClient.invalidateQueries(["issues"]);
      },
    }
  );

  ...
}

OK, that’s a bit more code than we had in the “simple” approach at the beginning of this page. Still, not very complicated though.

But what if the request fails? With the optimistic update we created the illusion that everything went fine. But we shouldn’t keep the user in the dark if we get an error. We have to restore the previous state when the request fails.

That’s easy with the onError callback. Note that the return value of onMutate is passed to onError as the context parameter. How handy is that?
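To see how the context travels, here is a rough simulation of the callback order in plain JavaScript. runMutation is a made-up stand-in for what react-query does internally, not the library's code:

```javascript
// Made-up simulation of react-query's mutation lifecycle: the value returned
// by onMutate is handed to onError as "context" when the request fails.
async function runMutation(variables, mutationFn, { onMutate, onError, onSettled }) {
  const context = await onMutate(variables);
  try {
    await mutationFn(variables);
  } catch (err) {
    await onError(err, variables, context);
  } finally {
    await onSettled();
  }
}

// usage: a failing PATCH triggers a rollback via the context snapshot
let cache = { items: [{ id: "a" }] };
runMutation("a", () => Promise.reject(new Error("network error")), {
  onMutate: (issueId) => {
    const currentPage = cache;
    cache = { items: cache.items.filter(({ id }) => id !== issueId) };
    return { currentPage }; // snapshot for a potential rollback
  },
  onError: (err, issueId, context) => {
    cache = context.currentPage; // roll back to the snapshot
  },
  onSettled: () => {},
}).then(() => console.log(cache.items.length)); // 1
```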

export function IssueList() {
  const issuePage = useQuery(["issues"], async () => ...);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) => axios.patch(...),
    {
      onMutate: async (issueId) => {
        // optimistically remove the to-be-resolved issue from the list
        ...

        // save the current data in the mutation context to be able to
        // restore the previous state in case of an error
        return { currentPage };
      },
      // restore the previous data in case the request failed
      onError: (err, issueId, context) => {
        if (context?.currentPage) {
          queryClient.setQueryData(["issues"], context.currentPage);
        }
      },
      onSettled: () => {
        queryClient.invalidateQueries(["issues"]);
      },
    }
  );

  ...
}

Now the resolved issue is removed immediately from the table and the data is updated in the background. You have to trust me with the error handling though.

Optimistic update: The resolved issue is removed from the table immediately via

This is all nice, but still not great. For example, we see that the table has one less row while its data is being refetched. So the height of the table changes and the pagination at the bottom jumps around.

On top of that, the changing height also lets the scroll bar disappear. The table width changes as a result, which adds to the wiggly user experience.

Can we make this experience nicer and maybe even snappier?


Populate Missing Table Rows With Prefetched Data

This is the point where the pagination starts to become a headache.

The API endpoint for our GET requests is paginated and only returns 10 issues at a time. So when we click the “Resolve” button to remove an issue from the table there are only 9 issues left in the UI.

But in fact, the backend has more data for us. So once we refetch the issues we again see 10 rows in the table. And that creates the wiggly UX as discussed above.

Now, what if the frontend already had the data for the second page of issues? We could fill the missing row at the bottom with the first issue of the next page. The number of rows in the table would stay constant and we’d have a much cleaner UX.
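Stripped of all react-query plumbing, the backfill idea boils down to a pure function like this (removeAndBackfill is a made-up name; the logic mirrors the approach described above):

```javascript
// Framework-free sketch of "remove a row and backfill from the next page".
// `removeAndBackfill` is a made-up name for illustration.
function removeAndBackfill(currentItems, nextPageItems, resolvedId) {
  const newItems = currentItems.filter(({ id }) => id !== resolvedId);

  if (nextPageItems.length && currentItems.length) {
    const lastIssue = currentItems[currentItems.length - 1];
    // find the first next-page issue that isn't already on the current page
    // (the current page's last row may come from an earlier backfill)
    const index = nextPageItems.findIndex((issue) => issue.id === lastIssue.id);
    const nextIssue = nextPageItems[index + 1];
    if (nextIssue) newItems.push(nextIssue);
  }
  return newItems;
}

// usage: resolving "2" pulls "4" in from the prefetched next page
const current = [{ id: "1" }, { id: "2" }, { id: "3" }];
const next = [{ id: "3" }, { id: "4" }]; // "3" was backfilled earlier
console.log(removeAndBackfill(current, next, "2").map(({ id }) => id));
// [ '1', '3', '4' ]
```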

In the previous article, we already implemented the prefetching logic to create a snappy experience while navigating through the table pages. We extracted the code related to the GET request in a custom hook that looks like this (sorry, I’m just gonna throw this at you without much explanation here):

import axios from "axios";
import { useEffect } from "react";
import { useQuery, useQueryClient } from "@tanstack/react-query";

async function getIssues(page, options) {
  const { data } = await axios.get("https://prolog-api.profy.dev/v2/issue", {
    params: { page, status: "open" },
    signal: options?.signal,
    ...requestOptions,
  });
  return data;
}

export function useIssues(page) {
  const query = useQuery(
    // note that we added the "page" parameter to the query key
    ["issues", page],
    ({ signal }) => getIssues(page, { signal }),
  );

  // Prefetch the next page!
  const queryClient = useQueryClient();
  useEffect(() => {
    if (query.data?.meta.hasNextPage) {
      queryClient.prefetchQuery(
        ["issues", page + 1],
        async ({ signal }) => getIssues(page + 1, { signal }),
      );
    }
  }, [query.data, page, queryClient]);
  return query;
}

We can now use the hook and connect it to a page state.

export function IssueList() {
  // state variable used for pagination
  const [page, setPage] = useState(1);
  const issuePage = useIssues(page);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(...);

  ...
}

Note: The page state variable and its setter are also connected to the pagination component which is not shown here (if you’re curious you can see it in the very first code snippet at the top of this page).

Now we can add the first issue from the next page to the current page during the optimistic update.

export function IssueList() {
  const [page, setPage] = useState(1);
  const issuePage = useIssues(page);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) =>
      axios.patch(...),
    {
      onMutate: async (issueId) => {
        await queryClient.cancelQueries(["issues"]);

        // note that we have to add the page to the query key now
        const currentPage = queryClient.getQueryData([
          "issues",
          page,
        ]);
        // get the prefetched data for the next page 
        const nextPage = queryClient.getQueryData([
          "issues",
          page + 1,
        ]);

        if (!currentPage) {
          return;
        }

        const newItems = currentPage.items.filter(({ id }) => id !== issueId);

        // add the first issue from the next page to the current page
        if (nextPage?.items.length) {
          const lastIssueOnPage =
            currentPage.items[currentPage.items.length - 1];

          // get the first issue on the next page that isn't yet added to the
          // current page (in case a user clicks on multiple issues quickly)
          const indexOnNextPage = nextPage.items.findIndex(
            (issue) => issue.id === lastIssueOnPage.id
          );
          const nextIssue = nextPage.items[indexOnNextPage + 1];

          // there might not be any issues left to add if a user clicks fast
          // and/or the internet connection is slow 
          if (nextIssue) {
            newItems.push(nextIssue);
          }
        }

        queryClient.setQueryData(["issues", page], {
          ...currentPage,
          items: newItems,
        });

        return { currentPage };
      },
      onError: (err, issueId, context) => {
        if (context?.currentPage) {
          queryClient.setQueryData(["issues", page], context.currentPage);
        }
      },
      onSettled: () => {
        // we don't have to add the page to the query key here
        // this invalidates all queries containing the key "issues"
        queryClient.invalidateQueries(["issues"]);
      },
    }
  );

  ...
}

Yes, the code is getting more complex. But the user experience is worth it.

Look at this: when a user clicks the “Resolve” button the row is not only removed but a new row is appended at the bottom to fill the otherwise empty spot. The table layout is stable and we have a super snappy experience.

The removed table row is replaced with an issue from the next page

Looks so simple but took some effort to build. Unfortunately, there’s still one problem left.


Edge Case: Concurrent Updates To The Cache

When a user wants to resolve multiple issues very quickly after one another, we can run into a tricky situation. In the video below, the user clicks twice, removing two rows from the table.

Race condition: A row that was removed already reappears for a short time

It looks like there’s some sort of race condition. Both rows disappear as expected. But then we can see one of the removed rows reappear shortly before it disappears again.

What happened?

  1. The user clicks the button on the first issue which is optimistically removed immediately.
  2. A PATCH request is sent to update the issue. The response arrives right away.
  3. A GET request is sent to refetch the table data.
  4. The second issue is now at the top of the table. The user clicks again to resolve it. This optimistically removes the row from the table.
  5. Again a PATCH request is sent to update the second issue. Shortly after another GET request is sent.
  6. At around the same time, the response to the first GET request arrives before it can be canceled.
  7. The query cache is updated with the data from the first GET response.
  8. The second issue is added again to the top of the table although it was already optimistically removed.
  9. Finally, the response to the second GET request arrives with the final data. The cache is updated.
  10. The second issue disappears again from the table.

So it seems that concurrent GET requests cause this problem, even though pending requests should be canceled. According to my tests, this happens quite frequently and quickly becomes annoying and confusing.

So the goal is to prevent parallel GET requests as much as possible.

One way to achieve this is to invalidate the “issues” query only when there’s no pending mutation (aka PATCH request). That again means we need to keep track of the number of pending mutations.

This might sound like another state variable at first. But we don’t want to trigger a re-render of the component every time a mutation starts or finishes. So a ref is the better fit here.

export function IssueList() {
  ...

  // keep track of the number of pending mutations
  const pendingMutationCount = useRef(0);
  const resolveIssueMutation = useMutation(
    (issueId) =>
      axios.patch(...),
    {
      onMutate: async (issueId) => {
        // increment number of pending mutations
        pendingMutationCount.current += 1;

        ...

        return { currentPage };
      },
      onError: (err, issueId, context) => { ... },
      onSettled: () => {
        // only invalidate queries if there's no pending mutation
        // this makes it unlikely that a previous request updates
        // the cache with outdated data
        pendingMutationCount.current -= 1;
        if (pendingMutationCount.current === 0) {
          queryClient.invalidateQueries(["issues"]);
        }
      },
    }
  );

  ...
}

Not sure if that’s hacky or not but it does the job. Look how snappy this table has become even when a user goes into “click rage”.

Even a user going in click rage can’t break the component

