DEV Community

SwissGreg for SwissDev Jobs

Posted on • Updated on • Originally published at swissdevjobs.ch

How we improved website performance by 24% with 3 unusual changes

Last weekend we had a chance to fine-tune the performance of a website that we started over a year ago.

It is a job board for Software Developers who are looking for work opportunities in Switzerland. The performance of SwissDevJobs.ch matters for 2 reasons:

  1. Good user experience - which means both the time to load (become interactive) and the feeling of snappiness while using the website.

  2. SEO - our traffic relies heavily on Google Search and, as you probably know, Google favors websites with good performance (they even introduced a speed report in Search Console).

If you search for "website performance basics" you will get many actionable points, like:

  • Use a CDN (Content Delivery Network) for static assets with a reasonable cache time
  • Optimize image size and format
  • Use GZIP or Brotli compression
  • Reduce the size of non-critical JS and CSS-code

We implemented most of those low-hanging fruits.
Additionally, as our main page is basically a filterable list (written in React), we introduced react-window to render only 10 list items at a time, instead of 250.
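The virtualization above can be sketched like this (a sketch, not the site's actual component - the 60px row height, the viewport height, and the `jobs` prop shape are assumptions; the post only states that react-window renders ~10 of 250 rows at a time):

```jsx
// Sketch of list virtualization with react-window.
// Row height and data shape are assumptions for illustration.
import { FixedSizeList } from 'react-window';

function JobList({ jobs }) {
  return (
    <FixedSizeList
      height={600}            // visible viewport: ~10 rows at 60px each
      itemCount={jobs.length} // e.g. 250 jobs in total
      itemSize={60}
      width="100%"
    >
      {({ index, style }) => (
        // Only the rows currently scrolled into view are mounted.
        <div style={style}>{jobs[index].title}</div>
      )}
    </FixedSizeList>
  );
}
```

The browser then lays out and paints roughly a viewport's worth of DOM nodes instead of the full list.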

All of this helped us improve performance considerably, but looking at the speed reports it felt like we could do better.

So we started digging into the more unusual ways in which we could make it faster and... we have been quite successful! Here is the report from this week:

[Image: Speed report, SwissDev Jobs, November]

This report shows that the time of full load decreased by 24%!

What did we do to achieve it?

  1. Use rel="preload" for the JSON data
    [Image: the rel="preload" line in index.html]

    This simple line in the index.html file tells the browser to fetch the JSON data before it is actually requested by an AJAX/fetch call from JavaScript.

    When it comes to the point when the data is needed, it is read from the browser cache instead of being fetched again. It helped us shave off ~0.5s of loading time.

    We wanted to implement this one earlier, but there used to be a problem in the Chrome browser that caused a double download. Now it seems to work.
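    For reference, such a preload hint looks roughly like this (a sketch - the post shows its actual index.html line only as an image; /api/jobs is the endpoint named in the next section):

    ```html
    <!-- Preload the job list JSON so it is already cached when the
         JavaScript bundle eventually calls fetch("/api/jobs").
         as="fetch" plus a crossorigin mode matching the later fetch()
         call is required - a mismatch is a classic cause of exactly
         the double-download problem mentioned above. -->
    <link rel="preload" href="/api/jobs" as="fetch" crossorigin="anonymous">
    ```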

  2. Implement super simple cache on the server side

    After implementing JSON preloading we found that downloading the job list was still the bottleneck (it took around 0.8s to get the response from the server). Therefore, we decided to look into a server-side cache. First, we tried node-cache but, surprisingly, it did not improve the fetch time.

    It is worth mentioning that the /api/jobs endpoint is a simple get-all endpoint, so there is little room for improvement.

    However, we decided to go deeper and built our own simple cache with... a single JS variable. It looks like this:

    [Image: the cachedJobs variable cache code]

    The only thing not visible here is the POST /jobs endpoint, which clears the cache (cachedJobs = undefined).

    As simple as that! Another 0.4s of load time off!
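    In spirit, the cache is just this (a sketch: the handler names and the placeholder database call are assumptions - the post only shows that cachedJobs is a plain variable, cleared again on POST /jobs):

    ```javascript
    // Minimal sketch of the single-variable cache.
    // fetchJobsFromDb() stands in for the real database query (assumption).
    let cachedJobs;

    function fetchJobsFromDb() {
      // Placeholder: imagine a slow database call here.
      return [{ id: 1, title: "Frontend Developer" }];
    }

    // Handler logic behind GET /api/jobs.
    function getJobs() {
      if (cachedJobs === undefined) {
        cachedJobs = fetchJobsFromDb(); // first request pays the full cost
      }
      return cachedJobs; // every later request is served from memory
    }

    // Called from the POST /jobs handler, so newly added jobs appear
    // on the next GET instead of a stale list.
    function invalidateJobsCache() {
      cachedJobs = undefined;
    }
    ```

    The trade-off: each server process holds its own copy and a restart clears it - which is fine for a single-instance job board with rarely changing data.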

  3. The last thing we looked at was the size of the CSS and JS bundles that we load. We noticed that the "font-awesome" bundle weighs over 70kB.

    At the same time, we used only about 20% of the icons.

    How did we approach it? We used icomoon.io to select only the icons we actually use and created our own self-hosted, lean icon package.

    50kB saved!

Those 3 unusual changes helped us speed up the website's loading time by 24%. Or, as some other reports show, by 43% (to 1.2s).

We are quite happy with these changes. However, we believe that we can do better than that!

If you have your own unusual techniques that could help, we would be grateful if you shared them in the comments!

Latest comments (20)

Seth Corker

Which are your preferred tools to monitor performance over time?

venoel

Now it seems to work.

Can we rely on such a conclusion?

hoabuiyt

That's good to read.

Robert • Edited

Have you tried server-side rendering? This is what helps Angular get up to speed. That's all I can think of, considering all the great ideas in the comments and your post.

Ben Halpern

Nice post!

SwissGreg

Thanks!

Harry Lincoln • Edited

Chrome's Lighthouse audit plugin suggests:

  • change your .pngs for .jpgs
  • minify your js
  • quite a bit of unused css

Great site though!

SwissGreg

Thanks for the suggestions.

Are you using the built-in Lighthouse performance audit in Chrome?

Asking because, On My Machine™, it does not show anything about minifying JS (this should be done by default by the CRA npm build).

As for images - do you mean jpgs or maybe some of the new formats like WebP?

And the last thing - CSS: do you know any straightforward solution here for Create React App?

Paweł Kowalski • Edited

It looks like you could use some more webpack dynamic imports to modularize your output JS/CSS further - with HTTP/2 it should help a lot. Maybe even think about separating the "shell" CSS from the details, and loading the details asynchronously. This could make a big difference in that big CSS.

If you are brave, play around with PurgeCSS - I've found it an amazing thing when it works :)

Also, a meta-tag DNS prefetch for your CDN domain could speed things up - SSL handshakes can slow things down a bit.

It looks like your fonts have all the possible characters in them (19KB is pretty big) - you might want to check out this article - florianbrinkmann.com/en/glyphhange... - I found this recipe to work wonders on the fonts I used in one project:
before: 19KB
after: 5KB

Additionally, instead of loading them in the main CSS, inlining them into <style> in the body makes the request start earlier, hence minimizing FOUT.

Paweł Kowalski

Also, normally I wouldn't even mention this, but I think this is one of those rare cases where looking at the DOM depth and size could be a good investment when you are looking into performance issues.

I'm a big fan of SVG, but I'm not entirely sure that copying the whole SVG tree every time it's needed on the map is the way to go - if there is a big, deep DOM tree, a light PNG might be the first easy step to make it a little shallower. But it's very possible that, with React and all that, your wiggle room will be small.

SwissGreg

Hey Paweł,

Thank you for the detailed suggestions, appreciate a lot!

Addressing your points:

  1. We are using Create-React-App, not sure if it supports dynamic imports without a custom webpack config, will have a look.
  2. Gonna try purgeCSS, looks like fun :)
  3. Already doing DNS-prefetch.
  4. I thought that the fonts are already optimized - we use something like: nunito-latin which should already be stripped down to only latin characters - or is it possible to go further?
  5. Good point, will look into unnecessary DOM elements.
  6. What do you mean exactly by the SVG tree copying?
Paweł Kowalski

Fonts: oh yeah, you can strip them down to the characters you need - it's very effective.

SVG - an inlined SVG is just a bunch of XML tags, so for 20 icons (even the same icon) your DOM has 20 copies of that XML structure. I presume a tiny PNG would be much flatter, and I saw that you don't zoom/animate/manipulate icons on the map, so it's less of a sin to migrate. A second option (probably better) would be to use SVG symbols with the <use> tag, so as not to duplicate the tree.

Juan Carlos • Edited

WebP for images is cool.
I just prefer to drop icon fonts if possible, or replace them with inlined <svg> - depends on how you design, though.

SwissGreg

We need to finally look into WebP :)

Juan Carlos

I have been using it, and it is like 25~50% of the size of a JPG, depending on the requirements. What's more, my web framework has built-in support for WebP.

max

Thanks for sharing, duly noted ;)

I'd recommend adding some OpenGraph meta tags to your website (pretty cool!), in order to improve sharing.

And maybe add a filter for remote working/freelancing - I'd be interested :p

Cheers !

SwissGreg

Thanks for the suggestion.

For OpenGraph - already added some tags + Twitter card tags + JSON LD for Google.

The filter for remote work would be nice; unfortunately, there are only a few companies (I mean, like, fewer than 10) in Switzerland that are fully open to remote work.

max

Thanks.

I'd still enjoy seeing those companies haha

Maybe later

Bayu Angora

Is the preload method above good for the service worker and the main JS file?

SwissGreg

I recommend checking: developer.mozilla.org/en-US/docs/W...

Generally it should work with the service worker, too.

Preload is mostly useful for resources that are loaded later in the chain.
In our case, the /api/jobs endpoint is called after the JS code is downloaded and processed, so it makes sense to start loading it earlier.