
Smarter caching with service workers

Codr · 3 min read

If you've ever built a web app, you've probably faced the challenge of making it work offline. In this tutorial I share how I recently made Codr work offline.

Codr is a web app for coding puzzles and challenges, but you don't need to write any code to succeed. It's a great tool for beginners learning to code, and relaxation therapy for experts.

Our stack is Node.js with Express on the backend and HTML/JS/jQuery on the frontend. To make a web app work offline you need a service worker: a special kind of web worker, running as a separate thread in your browser and dedicated to a particular site. If you've never used service workers before, follow a few beginner tutorials first.

A website generally has two types of content: static and dynamic. Static content is images, CSS, HTML, JavaScript and other files. Dynamic content, on the other hand, is loaded from the web server: live stats, blog comments, and so on. Offline users should have access to all static content, and should see an offline status message where appropriate: "to access this page please go online".

At Codr we have practice challenges/puzzles that require no connectivity. The number of practice challenges is limited, and each challenge is stored in a separate file, so it can easily be cached. For ranked mode we do insist on being online, since those challenges are generated dynamically, and we want to keep the reward system fair and square.

Note: there are several good out-of-the-box solutions, such as Workbox by Google, that can handle your offline caching needs. But I needed a custom solution for my specific requirements.

Our server makes a list of all files that need to be cached, and inserts that at the top of the service worker script:

const assetsToCache = <%- assetsToCache %>
// note I am using EJS templates
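On the server side, that placeholder is EJS's unescaped output tag: the rendered file ends up containing a plain JavaScript array literal. The snippet below simulates the substitution with a simple string replace just to show what the browser receives; in the real app EJS performs it during `res.render`.

```javascript
// What the service worker template contains before rendering.
const swTemplate = "const assetsToCache = <%- assetsToCache %>";

function renderServiceWorker(template, assets) {
    // JSON.stringify produces a valid JavaScript array literal,
    // so the rendered file is directly executable by the browser.
    return template.replace('<%- assetsToCache %>', JSON.stringify(assets));
}

const rendered = renderServiceWorker(swTemplate, [
    { file: '/codr/views/home.html', checksum: 'XYZ123' }
]);
```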

All basic service worker tutorials show you how to cache files, but very few explain how to force cache updates. One solution is to change the service worker file slightly, so that the browser triggers an update and does a full re-cache of all the files. But what if only 1 out of 100 files has changed? It's wasteful to re-download the other 99 as well, right?

In my backend I generate a hash/checksum of each cached file (using bcrypt), so that the structure of my assets looks like this:

const assetsToCache = [
  {file: '/codr/views/home.html', checksum: 'XYZ123'},
  ...
]

With this structure, I can use IndexedDB (client-side storage) to keep track of all cached files and their checksums. Whenever the service worker triggers an update, it only re-caches the files whose checksum has changed. Here's the (almost) full code of my service worker:

importScripts('./js/localforage.min.js');
// I am using localforage for IndexedDB access

const CACHE_NAME = 'codr-cache'; // pick any stable cache name

self.addEventListener("install", function(event) {
    console.log("installing");
    self.skipWaiting();
    event.waitUntil(procDB());
});

async function procDB() {
    const cache = await caches.open(CACHE_NAME);
    for (const entry of assetsToCache) {
        const value = await localforage.getItem(entry.file);
        // Re-cache only files whose checksum changed (or was never stored).
        if (!value || value !== entry.checksum) {
            console.log('caching: ' + entry.file);
            await cache.add(entry.file);
            await localforage.setItem(entry.file, entry.checksum);
        }
        // otherwise: checksum unchanged, keep the cached copy
    }
}

Happy coding! :)

Posted on May 7 by Codr (@codr). Improve your coding skills with Codr, a free educational platform founded by Ilya Nevolin.
