
Faster static site builds, Part 1: Process only what you need

This post was originally published on contentful.com.


Static sites are gaining popularity. Big publishers like Smashing Magazine rely on static site generation to serve content faster, and they do it without worrying about security fixes or scalable server setups. All you need for static site generation is a CI service that kicks off your build and a static hosting provider to serve your generated static files, which you can then enrich with serverless technologies.

I’m a big believer in the static site approach, but it comes with its own set of challenges depending on what you want to accomplish. One challenge is keeping build times short: file generation takes time, and if you want to generate twenty thousand pages, the build time increases, which leads to frustration and delays in the publishing workflow.

You might say that you won’t run into these challenges with your project or personal website, and I believed the same thing a few months ago. But recently I faced the problem of builds taking longer and longer. My private website uses Contentful and Nuxt.js (based on Vue.js) and is deployed via Netlify. With build times well over ten minutes, I had reached a point that was no longer acceptable.

In this first of two articles on static sites, I will share with you how you can speed up your build process on Netlify with a custom caching layer. The second article will go into the implementation of incremental builds using Nuxt.js.

Beautiful image placeholders with SQIP

Why did the build time increase so much in the first place? A few months ago I came across SQIP. SQIP is a new tool by Tobias Baldauf to generate beautiful SVG placeholder images using Primitive. These placeholders can improve the perceived performance of lazy loaded images. Primitive examines an image and generates an SVG that represents it with primitive shapes, which look surprisingly good when you apply a blur effect.

SQIP image preview in three stages: shapes, blurred and loaded

Thanks to these beautiful preview images, the user knows what to expect when the image loading kicks in, which leads to a better user experience than spinners or random loading graphics.

The way it works is that you place a small SVG graphic where the image will appear; when the real image has loaded, it fades in on top of the placeholder.
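To make the layering concrete, here is a minimal markup and CSS sketch (my illustration, not the actual Nuxt.js component of the site; the class names and the data-loaded attribute are assumptions):

<div class="image-wrapper">
  <!-- the inlined SQIP placeholder, blurred via CSS -->
  <svg class="image-preview">...</svg>
  <!-- the real image, faded in once it has loaded -->
  <img class="image-full" src="photo.jpg" alt="..."
       onload="this.setAttribute('data-loaded', '')">
</div>

<style>
  .image-wrapper { position: relative; }
  .image-preview { filter: blur(12px); }
  .image-full {
    position: absolute;
    top: 0;
    left: 0;
    opacity: 0;
    transition: opacity 0.5s;
  }
  .image-full[data-loaded] { opacity: 1; }
</style>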

If you’re not interested in implementing these good-looking placeholder images, and only want to read about caching on Netlify, you can jump right to the “Caching for the win” section.

Generation of preview images using SQIP

Here is how it works – my images are stored in Contentful, and to generate the SVG previews I go through these steps:

  • Get the information for all assets stored in Contentful
  • Download all the images
  • Generate placeholder SVGs for the images
  • Create a JavaScript file that includes all the SVGs to inline them later

All the following code sections are small parts of a longer script, which will be linked at the end of the article. The code makes heavy use of async functions, which make the handling of asynchronous operations so much better! As a result, whenever you see an await somewhere, it is placed inside of an async function in the overall implementation.

Following best practices, the resulting script requires all the dependencies at the top of the file, whereas in the included code sections I place them right before I use them to make the snippets easier to understand.
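For orientation, here is a minimal sketch of what the script's outer shape might look like (an assumption about the structure; the linked script may organize this differently):

const run = async () => {
  // every snippet in this article lives inside an async function
  // like this one, which is why the awaits are valid
  // 1. fetch asset information from Contentful
  // 2. download the images
  // 3. generate and read the preview SVGs
  // 4. write the JavaScript mapping file
}

run().catch((error) => {
  console.error(error)
  process.exit(1)
})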

Fetch all assets from Contentful

Getting all the asset information from the Contentful API is straightforward. I only have to initialize the Contentful SDK client, and the getAssets function gives me the information I need.

const contentful = require('contentful')
// initialize the client with your space ID and delivery access token
const client = contentful.createClient({  })

// Getting asset information

// Contentful collection responses have a default limit
// of 100 -> increase it to 1000 to avoid the need for
// pagination at this stage
const {items} = await client.getAssets({limit: 1000})
let images = items
  // only treat image files
  // there can also be pdfs and so on
  .filter(
    ({fields}) => fields.file && ['image/png', 'image/jpeg'].indexOf(fields.file.contentType) !== -1
  )
  // strip out useless information
  // and flatten the data structure to the needed information
  .map(({sys, fields}) => ({
    id: sys.id,
    revision: sys.revision,
    url: fields.file.url,
    filename: `${sys.id}-${sys.revision}.${fields.file.contentType.split('/')[1]}`
  }))

First I have to filter all the assets to strip out files that are not PNGs or JPEGs. Then I get rid of all the meta information that I’m not interested in via a map function.

At this point, I have an array images holding the id, revision and url of every image. The collection also includes a filename property, which is the combination of the asset ID and its revision.

Tying these two attributes together is necessary because whenever I update an asset, I also want to generate a new preview SVG. This is where the revision number comes into play: it changes with every update, so an updated asset automatically gets a new filename and its old cached preview is no longer picked up.
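One more note on the limit parameter used above: if a space ever holds more than 1000 assets, the response has to be paged. A sketch of how that could look using the SDK's skip parameter (fetchAllAssets is a hypothetical helper, not part of the actual script):

// hypothetical helper: page through all assets when a space
// holds more than the 1000 items one response can carry
const fetchAllAssets = async () => {
  const all = []
  let skip = 0
  let total = 0

  do {
    const response = await client.getAssets({limit: 1000, skip})
    all.push(...response.items)
    total = response.total
    skip += 1000
  } while (skip < total)

  return all
}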

Download images to create SVGs

With this information about all the assets of my site in place, I continue with downloading all the images. The download package I found on npm is a perfect fit.

const download = require('download')
const IMAGE_FOLDER = '...'

// Downloading images for missing SVGs
await Promise.all(
  // map all image objects to Promises representing
  // the image download
  images.map(({url, filename}) => {
    return download(
      // Contentful asset URLs are protocol-relative -> prefix them
      url.replace(/^\/\//, 'https://'),
      IMAGE_FOLDER,
      { filename }
    )
  })
)

All the asset entries are mapped to promises returned by the download function, and everything is wrapped in a Promise.all so that I can be sure that all the images are downloaded to the predefined IMAGE_FOLDER. This is where async/await shines!

SQIP it

SQIP can be used programmatically, which means that you can require the module and you are good to go.

const path = require('path')
const {writeFile} = require('fs-extra')
const sqip = require('sqip')

// Writing generated preview SVGs to disk
await Promise.all(images.map(({id, revision, filename}) => {
  const {final_svg} = sqip({
    filename: path.join(IMAGE_FOLDER, filename),
    numberOfPrimitives: 10,
    mode: 0,
    blur: 0
  })

  return writeFile(
    path.join(IMAGE_FOLDER, `${id}-${revision}.svg`),
    final_svg
  )
}))

The sqip module doesn’t write files to disk though. It returns an object including the generated SVG in the final_svg property. You may say that I could take that SVG string and store it directly in the images collection, but I went with writing the SVG to disk first.

I also use the fs-extra package, which provides some convenience methods over the native fs module and maps callback functions to their promisified versions, so that I don’t have to promisify functions like writeFile myself.

This has the advantage that I can have a look at the generated SVGs on my hard drive quickly, and it will also come in handy later in the caching section of this article.

The sqip module accepts the following options:

  • numberOfPrimitives defines the number of shapes (10 shapes work for me, resulting in rather small SVG files but a good preview experience)
  • mode defines which type of shapes the generated SVG should include (triangles, squares, circles, or a combination of all of them)
  • blur defines the level of applied blur (I went with no blur inside the SVG, as I discovered that applying a CSS blur instead leads to better results)

Read the SVGs

The next step was to read all the generated SVGs and make them ready to be used in my JavaScript application.

const path = require('path')
const {readFile} = require('fs-extra')

// Reading SVGs
images = await Promise.all(images.map(async (image) => {
  const svg = await readFile(path.join(IMAGE_FOLDER, `${image.id}-${image.revision}.svg`), 'utf8')

  // add the asset ID to the SVG for easier debugging later
  image.svg = svg.replace('<svg', `<svg id="${image.id}"`)

  return image
}))

fs-extra also provides a promisified readFile function, so I can keep the promise-based flow going.

The collection of asset objects gets enriched with the string value of the generated SVG. The asset ID is also injected into that SVG string so that I can later see which asset a particular preview image was generated from.

Map SVGs to JavaScript to have them available in Nuxt.js (or any other JS environment)

The last step: the collection of assets now includes the meta information as well as the generated, stringified SVG in the svg property of every item. Time to make it reusable in a JavaScript environment.

const path = require('path')
const {writeFile} = require('fs-extra')

const JS_DESTINATION = path.resolve(__dirname, 'image-map.js')

// Writing JS mapping file
// (await makes sure the file is written before the script finishes)
await writeFile(
  JS_DESTINATION,
  `export default {\n  ${images.map(({id, svg}) => `'${id}': '${svg}'`).join(', ')}\n}\n`
)

This step writes a JavaScript file that is ignored in my git repository. The JavaScript file exports an object that maps every asset ID to its SVG. This way I can later import the file and use an asset ID to get the generated SVG at run and build time.
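The generated file then looks roughly like this (illustrative IDs and truncated SVG strings, not real output):

// image-map.js (generated, ignored by git)
export default {
  'asset-id-one': '<svg id="asset-id-one" ...>...</svg>',
  'asset-id-two': '<svg id="asset-id-two" ...>...</svg>'
}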

import imageMap from '~/plugins/image-map.js'

const preview = imageMap[this.asset.sys.id] || null

The execution of the resulting script, including nice logging messages, takes two to four minutes on my MacBook Pro for 55 assets (depending on what else is running on my machine).

▶ ./scripts/sqip-it-without-cache               [19:46:49]
Getting asset information
Asset information queried - 55 assets
// --------------------------------------------
Downloading images for SVGs...
Images downloaded
// --------------------------------------------
Creating SVGs...
SVGs created
// --------------------------------------------
Reading SVGs...
SVGs read
// --------------------------------------------
Writing JS mapping file
JS file written
// --------------------------------------------
▶                                                [19:50:46]

When it runs on Netlify though, the script execution could easily take five to seven minutes, resulting in overall build times around the mentioned ten minutes.

Netlify build overview showing builds with 10 minutes build time

The repeated regeneration is not an optimal approach. With this script, every build does the same heavy lifting over and over again. Whenever you repeat operations, be it image optimizations or other massive computations that take several minutes, it’s time to improve.

The beauty of a continuous delivery pipeline is that things can go live regularly and quickly – ten minutes to bring a typo fix into production is not the environment I want to deal with for my small site.

So how do I sort out this mess?

I could generate the image previews myself and upload them to Contentful, but that has the downside of two assets depending on each other (the image and its preview) that I would need to keep in sync – not an option.

I could commit the previews to the git repository, but I always feel bad committing large generated assets to git. Big binary files are not what git is made for, and they increase the size of the repository drastically – not an option either.

Caching for the win

Netlify runs every deploy in a Docker container with no way to reuse things from the previous deploy (except for dependencies, but I don’t want to misuse the node_modules folder for my own files). My initial solution was an S3 bucket acting as a cache layer during my builds.

The cache layer would hold the downloaded images and generated previews from the previous build, and thanks to the ID-and-revision naming convention, a simple file existence check would be enough to figure out which new previews need to be generated. This approach worked fine, but then Phil from Netlify shared a secret with me (be careful though – it’s not documented, and usage is at your own risk).

It turns out there is a folder that persists across builds – /opt/build/cache/. You can use this folder to store files across builds, which adds a few steps to my script but decreases the time of the SVG generation drastically:

  • Get the information for all assets stored in Contentful
  • Check which SVGs have already been generated
  • Download the missing images
  • Generate placeholder SVGs for the missing images
  • Create a JavaScript file that includes all the SVGs to inline them later

Define a caching folder locally and in Netlify

The image folder that I defined in the script now becomes a cache folder (SQIP_CACHE) whose location depends on the environment.

const isProduction = process.env.NODE_ENV === 'production'
const SQIP_CACHE = isProduction
  ? path.join('/', 'opt', 'build', 'cache', 'sqip')
  : path.resolve(__dirname, '.sqip')

This way I can run the script on my development machine, placing all the files in a folder that is ignored by git, while on Netlify the script uses the persistent folder.
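One detail to keep in mind: the folder has to exist before the first read or write. fs-extra's ensureDir takes care of that (a small sketch; the full script may handle this differently):

const {ensureDir} = require('fs-extra')

// create the cache folder if it is missing,
// do nothing if it is already there
await ensureDir(SQIP_CACHE)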

Check for existing generated files

Remember the images collection I used previously?

const {readFile} = require('fs-extra')

// Reading cached SVGs
images = await Promise.all(images.map(async (image) => {
  try {
    const svg = await readFile(`${SQIP_CACHE}/${image.id}-${image.revision}.svg`, 'utf8')
    if (svg.startsWith('<svg')) {
      image.svg = svg
    }
  } catch (e) {
    // the SVG is not in the cache yet -> it will be generated below
  }

  return image
}))

I add another step to the previous script that checks whether an SVG with the matching asset ID and revision combination is available in the cache folder.

If so, it reads the file and sets the svg property of the image entry; if not, the script moves on to generate it.

Generation of new preview SVGs

The generation of the SVG files stays the same, except that I can now skip every image that already has an svg value from the cache. (This variant of the script shells out to the sqip executable with the equivalent flags instead of calling the module.)

// Creating missing SVGs...
await Promise.all(images.map(({id, revision, filename, svg}) => {
  // if there was an SVG in the cache
  // do nothing \o/
  if (!svg) {
    // SQIP_EXEC points at the sqip binary; the flags mirror the
    // options used above (-n numberOfPrimitives, -m mode, -b blur)
    const command = `${SQIP_EXEC} -o ${id}-${revision}.svg -n 10 -m 0 -b 0 ${filename}`

    // execute runs the command in the cache folder and returns a promise
    return execute(
      command,
      {cwd: SQIP_CACHE}
    )
  }

  return Promise.resolve()
}))
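The execute helper is not included in the snippet above. A thin promise wrapper around child_process.exec like the following would do the job (my sketch; the full script may implement it differently):

const {exec} = require('child_process')

// run a shell command in the given working directory
// and resolve with its output when it finishes
const execute = (command, options) =>
  new Promise((resolve, reject) => {
    exec(command, options, (error, stdout) => {
      if (error) return reject(error)
      resolve(stdout)
    })
  })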

With the improved script I avoid repeated computation, and the build times on my local machine and on Netlify went down to less than a second for repeated builds with a filled cache!

SQIP image preview with shapes next to the real image

If you want to play around with it, the provided gist includes all you need to generate and cache beautiful image previews with a Contentful example space.

Think of a kill switch – clearing the cache

There was one last thing though: caching can be hard, and especially when you implement a cache on remote servers that you cannot access, you should be able to throw everything away and start over.

In my case, running on Netlify, I went with a custom webhook: whenever a build is triggered by this webhook, the script clears the caching directory before anything else happens.

const {emptyDir} = require('fs-extra')

// Netlify exposes the title of the incoming build hook
// as the WEBHOOK_TITLE environment variable
if (process.env.WEBHOOK_TITLE === 'CLEAR_CUSTOM_CACHE') {
  console.log(`Clearing ${SQIP_CACHE}`)
  await emptyDir(SQIP_CACHE)
}

Problem solved!

Keep your builds as fast as possible

The addition of the preview cache improved the build experience of my static site drastically. I love the Contentful, Nuxt.js and Netlify setup, and now that build times are back at three minutes, I can start thinking about the next improvement: speeding up the generation of the static HTML files.

My plan is to use Netlify’s cache folder to regenerate only particular files rather than the whole site. For example, when I add a new blog post, just a few pages need an update, not all of the 150 pages and all the JavaScript, image and CSS files. That’s computation that can be avoided.

Contentful's sync endpoint provides granular information about what changed since the last sync, which makes it a perfect fit for this use case and makes incremental builds possible, a topic a lot of big static site generators struggle with. You can read about that soon. I'll let you know!
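To give an idea, here is a minimal sketch of the sync endpoint in the JavaScript SDK (my example, not the upcoming implementation):

const contentful = require('contentful')
// same credentials as before
const client = contentful.createClient({  })

// the initial sync returns all entries and assets plus a sync token
const {entries, assets, nextSyncToken} = await client.sync({initial: true})

// later calls with the stored token only return what changed since then
const changes = await client.sync({nextSyncToken})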
