DEV Community

Edmund O'Connell

Optimising Performance with Next.js [Part 1]

Initial Server Response Time

Next.js offers a vast toolkit for optimising the performance of a website written in React. If SEO is important to your site, optimising its performance will be critical to boosting its SERP rankings, not to mention your bounce and conversion rates.

Analysing performance for your website is most easily achieved using Lighthouse in Chrome. Lighthouse can also be included as part of your Next.js setup.

When using Lighthouse in Chrome remember to run in Incognito mode as any installed extensions can and likely will negatively impact your scores.

There are several ways to improve the performance score achieved in Lighthouse. In this article I shall explain the techniques employed by our team to reduce one of the most critical performance measures - the Initial Server Response Time.


Server-Side Rendering (SSR) vs Static Site Generation (SSG)

Next.js offers two distinct pre-rendering methods to optimise the performance of web applications by minimising the amount of JavaScript sent to the client. These methods are Server-Side Rendering (SSR) and Static Site Generation (SSG).

In the initial development phase of a Next.js website, it is common practice to employ getServerSideProps for constructing the web pages; this was our standard approach.

getServerSideProps is explicitly designed to execute on the server, ensuring that database operations stay in server-side code. Data is fetched and the page rendered at request time, adhering to standard server-side just-in-time rendering principles.
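As a sketch, a request-time page built this way might look like the following. The route, the Merchant shape, and the fetchMerchantFromDb helper are all hypothetical stand-ins for a real database read:

```typescript
// pages/merchant/[slug].tsx — hypothetical dynamic route (Next.js pages router).

type Merchant = { slug: string; name: string };

// Stand-in for a real database read (e.g. an indexed findOne query).
async function fetchMerchantFromDb(slug: string): Promise<Merchant | null> {
  return { slug, name: 'Example Merchant' };
}

// Runs on the server for every request: the database round trip happens
// at request time, so every page view pays that latency.
export async function getServerSideProps(ctx: { params: { slug: string } }) {
  const merchant = await fetchMerchantFromDb(ctx.params.slug);
  if (!merchant) {
    return { notFound: true as const };
  }
  return { props: { merchant } };
}
```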

The need for speed

The issue with this approach is speed. Even a snappy read request against an indexed collection in a low-latency data repository is likely to consume 150-200ms. This will likely be the biggest bottleneck in your initial server response time and will cost you dearly on performance. In our case the database round trip accounted for fully a third of the page load time.

The key to eliminating this latency and obtaining lightning-fast page load speeds is to move all your pages over to static page generation. In Static Generation, the HTML is generated at build time. The database operations run at build time and bake the data into pre-cached pages.
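For a dynamic route, static generation means pairing getStaticProps with getStaticPaths, which enumerates every page to pre-render at build time. A minimal sketch, with hypothetical fetch helpers standing in for real build-time database reads:

```typescript
// pages/merchant/[slug].tsx — hypothetical dynamic route, statically generated.

type Merchant = { slug: string; name: string };

// Stand-ins for real build-time database reads.
async function fetchAllMerchantSlugs(): Promise<string[]> {
  return ['acme', 'globex'];
}
async function fetchMerchantFromDb(slug: string): Promise<Merchant | null> {
  return { slug, name: 'Example Merchant' };
}

// Runs once at build time to enumerate every page to pre-render.
export async function getStaticPaths() {
  const slugs = await fetchAllMerchantSlugs();
  return {
    paths: slugs.map((slug) => ({ params: { slug } })),
    fallback: false,
  };
}

// Also runs at build time: the data is fetched once and baked into the
// pre-rendered HTML, so requests never touch the database.
export async function getStaticProps(ctx: { params: { slug: string } }) {
  const merchant = await fetchMerchantFromDb(ctx.params.slug);
  if (!merchant) {
    return { notFound: true as const };
  }
  return { props: { merchant } };
}
```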

Incremental Static Regeneration

In practical terms this means that your website is static. Changes to the data in your repository will no longer be automatically reflected on your website. This can be an issue for websites with dynamic data and may simply be unsuitable where database updates need to propagate instantly.

Fortunately, Next.js provides a way of regenerating these pages at periodic intervals: Incremental Static Regeneration (ISR), which allows static pages to be updated after build time.

export const getStaticProps = async (ctx) => {
  const props = await getStaticPropsHandler(ctx);

  return {
    props: {
      ...props,
      viewport: 'mobile',
      isSsrMobile: true,
    },
    revalidate: 10800,
  };
};

The revalidate attribute value in the code block here is in seconds. In this example, when a request arrives and the cached page is older than 10800 seconds (3 hours), the cached page is still served, but in the background Next.js regenerates the page, and the fresh version is served for future requests.

The value of this attribute can be reduced to as little as one second. ISR is effectively "free" on Vercel. It doesn't hit your quotas for serverless or edge invocations.

async headers() {
  return [
    {
      source: '/(.*?)',
      headers: [
        {
          key: 'Cache-Control',
          value: `public, max-age=10800, s-maxage=10800, stale-while-revalidate=299`
        },
      ],
    },
  ];
}

Remember to update the Cache-Control value in the headers() section of next.config.js to reflect your chosen cache time.

The tradeoff

The tradeoff involved in moving to getStaticProps will depend on your website. For us it meant a massive increase in build times. Our website singmalls.app is a directory of shopping malls and has well over 30,000 merchant pages. Previously, merchant pages were served from a single dynamic page using getServerSideProps. Now each page needed to be generated at build time.

Since each page involves database fetching, this increased our build time to well over an hour, which proved to be a real problem: Vercel allows a hard maximum of 45 minutes of build time, and anything in excess of this results in a deployment failure.

Navigating Vercel's Build Time per Deployment Limit

Vercel's 45-minute Build Time per Deployment limit applies across all their plans (Hobby, Pro, and Enterprise), so simply upgrading your plan will not help.

This limit will therefore affect any website using getStaticProps that has a large number of database-driven pages.

To circumvent this issue we created a Node.js script that runs at build time and queries the database in parallel batches. This script builds a series of JSON files which reside locally on the build server.

By querying these JSON files at build time rather than a remote database, we massively reduced the time required to build each page, which brought us under Vercel's 45-minute build time limit.
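A simplified sketch of that idea follows. The fetchMerchant helper, the batch size, and the file layout are hypothetical; the point is that each batch is fetched in parallel with Promise.all and persisted locally for getStaticProps to read:

```typescript
// scripts/prefetch.ts — build-time prefetch sketch (names are hypothetical).
import { mkdir, writeFile } from 'node:fs/promises';

type Merchant = { id: number; name: string };

// Stand-in for a real per-merchant database read.
async function fetchMerchant(id: number): Promise<Merchant> {
  return { id, name: `Merchant ${id}` };
}

// Split the full id list into fixed-size batches.
export function toBatches<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Fetch each batch in parallel and write the results to local JSON files,
// which getStaticProps then reads at build time instead of making a
// round trip to the remote database for every page.
export async function prefetch(ids: number[], batchSize: number, outDir: string): Promise<void> {
  await mkdir(outDir, { recursive: true });
  for (const [index, batch] of toBatches(ids, batchSize).entries()) {
    const merchants = await Promise.all(batch.map(fetchMerchant));
    await writeFile(`${outDir}/batch-${index}.json`, JSON.stringify(merchants));
  }
}
```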

Conclusion

Moving from getServerSideProps to getStaticProps was by far the biggest win, improving our Initial Server Response Time metric and lifting our score into the high 90s. By shifting database queries from request time to build time, you move the pain into your build process and away from the end user, who benefits from a much zippier experience.

Your SEO will noticeably improve as a result, as it is well documented that Google rewards web pages with fast load times.
