I'm a guy that's obsessed with web page speed optimisation, so here is what I did to get my Hugo site to load in less than 0.7 seconds.
1# Removing Unused CSS, Minifying the CSS and Adding a Fingerprint:
Yeah, I started by looking closely at the CSS, finding rules that weren't actually used on the website, and removing them.
Then I minified the CSS using the code below and added a fingerprint. Wondering what the fingerprint does?
The fingerprint helps ensure that fresh content is served; MD5 is one of the most common hashes used for this.
An MD5 hash is calculated and appended to the CSS file name, something like "styles.min.c1783285df9c580224cb4b14efea9ff1.css".
The browser caches this file and serves it to users on every visit.
Now let's say you make a few changes to the CSS and redeploy: a new MD5 hash is generated, so the file gets a new name.
That effectively force-clears the cached CSS for all browsers; if you look for the old "styles.min.c1783285df9c580224cb4b14efea9ff1.css" in Chrome DevTools, it now comes back as a 404.
So fingerprinting your JS and CSS assets helps you deliver fresh content to users without asking them to clear their cache.
Below is the code to minify your CSS and add an MD5 fingerprint. Add it to your head partial, and change the path in resources.Get "/styles.css" to your own CSS path.
{{ $style := resources.Get "/styles.css" | minify | fingerprint "md5" }}
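That line gives you a resource variable, but you still need to output a `<link>` tag that points at the fingerprinted file. A minimal sketch of the full partial (the partial name and CSS path are my assumptions, adjust them to your project):

```html
{{/* layouts/partials/css.html: minify, fingerprint, and link the stylesheet */}}
{{ $style := resources.Get "/styles.css" | minify | fingerprint "md5" }}
<link rel="stylesheet" href="{{ $style.RelPermalink }}" integrity="{{ $style.Data.Integrity }}">
```

`$style.RelPermalink` resolves to the hashed file name, and `$style.Data.Integrity` gives you a Subresource Integrity hash for free.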
2# Compress, Minify JS files, Optimize and Add fingerprints:
JavaScript makes up most of the render-blocking assets flagged by PageSpeed Insights, and I have more than one JavaScript file.
So I need to bundle all the JavaScript files into one, to reduce the number of HTTP requests made to the server.
Once the JS files are concatenated into a single bundle.js, let's minify it to strip the unnecessary whitespace and comments using resources.Minify.
We've bundled the JS into one file and we've minified it; now let's add the fingerprint. The lines of code below perform everything explained above:
{{ $main := resources.Get "js/main.js" }}
{{ $menu := resources.Get "js/menu.js" }}
{{ $prism := resources.Get "js/prism.js" }}
{{ $theme := resources.Get "js/theme.js" }}
{{ $secureJS := slice $main $menu $prism $theme | resources.Concat "bundle.js" | resources.Minify | resources.Fingerprint "sha512" }}
<script defer type="text/javascript" src="{{ $secureJS.RelPermalink }}" integrity="{{ $secureJS.Data.Integrity }}"></script>
If you look at the above code closely, you will see I added defer to the
<script.....></script>.
Yeah, loading JavaScript files asynchronously will help your page speed and improve website load time.
You might be wondering why I didn't use async instead of defer: with defer the script still downloads in parallel, but it only executes after the HTML is parsed, and in document order, which seems to be the best practice here.
You can as well look into this guide and make your own decision; the right choice can vary based on what project you are working on.
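For reference, here is what the two attributes look like side by side (the file names are just placeholders). Both download without blocking the HTML parser; they only differ in when the script executes:

```html
<!-- async: executes as soon as the download finishes, order not guaranteed -->
<script async src="/js/analytics.js"></script>

<!-- defer: executes after HTML parsing, in document order -->
<script defer src="/js/bundle.js"></script>
```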
To implement it, create a javascript.html file in layouts/partials/, paste the code above into it, and call the partial wherever you want it using
{{ partial "javascript.html" . }}
We aren't done yet. When you build a web application, you will want metrics on who visits the website, which brings in analytics tools. The common one is Google Analytics, and it's a third-party tool.
Third-party assets are known to slow down web pages because you have no control over them and you can't set caching rules for them, but we will still do something about it anyway.
In Hugo, here is the code that adds Google Analytics to your site and loads it asynchronously:
{{- if .Site.GoogleAnalytics }}
{{ template "_internal/google_analytics_async.html" . }}
{{- end}}
To set your analytics tracking code, just add this to your config.toml file:
googleAnalytics = "Your Tracking Code"
That's all for JS files.
3# Optimize your images:
Images account for roughly half of the average web page's weight, so stop the habit of uploading images to your site without optimising them first. web.dev advises webmasters and web devs to adopt the WebP image format.
But it seems not all browsers support WebP yet; as an alternative, it's better to use JPG images instead of heavy PNGs.
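One way to serve WebP while keeping a fallback for older browsers is the `<picture>` element (the file names here are placeholders):

```html
<picture>
  <source srcset="/images/hero.webp" type="image/webp">
  <!-- Browsers without WebP support fall back to the JPG -->
  <img src="/images/hero.jpg" alt="Hero image" width="1200" height="630">
</picture>
```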
Inlining images can also help your page speed, using inline images like SVG or base64.
Base64-encoded images do increase your page size in KB, but they reduce the number of HTTP requests.
The best candidates for inlining are logos, the favicon if possible, and images above the fold.
So next time you work on a project, always optimise the images you use. You can use squoosh.app to compress images, change image formats, and make other edits, and you can base64-encode images to turn them into inline images.
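If you want to try the base64 approach from the command line, here is a quick sketch. The file name is a stand-in for your real image, and -w0 is GNU base64's no-wrap flag (macOS base64 doesn't wrap by default):

```shell
# Stand-in bytes for the demo; use your real image file instead.
printf 'hello' > logo.png
# Encode the file without line wrapping (-w0 is GNU base64).
b64=$(base64 -w0 logo.png)
# Emit an <img> tag with the image inlined as a data URI.
echo "<img src=\"data:image/png;base64,${b64}\" alt=\"logo\">"
```

Paste the resulting tag straight into your template instead of referencing the file.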
4# Host Fonts Locally:
When building a website, we all want to use fancy fonts, and the option everyone reaches for is Google Fonts. Well, Google Fonts can really hurt your LCP (Largest Contentful Paint), because you can't control a font that isn't hosted on your site.
Since we can host fonts locally, why don't we do just that? Download all the font weights you will be using on your pages.
Here is a correct font declaration for locally hosted fonts; download the font files into a directory on your website first.
@font-face {
  font-family: geomanist;
  font-weight: 300;
  font-style: normal;
  font-display: swap;
  src: url(../fonts/geomanist-light-webfont.woff2) format("woff2"),
       url(../fonts/geomanist-light-webfont.woff) format("woff");
}
Notice the font-display: swap; line. The swap value tells the browser that text using this font should be displayed immediately using a system font; once the custom font is ready, the system font is swapped out.
Preloading or prefetching fonts is also good. For example, the
rel="preload" as="font" attributes ask the browser to start downloading the required resource as soon as possible.
They also tell the browser that the resource is a font, so it can prioritise it appropriately in its resource queue.
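A preload hint for a locally hosted font might look like this (the path matches the @font-face example above; note that crossorigin is required on font preloads even for same-origin requests):

```html
<link rel="preload" as="font" type="font/woff2"
      href="../fonts/geomanist-light-webfont.woff2" crossorigin>
```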
5# Preconnect External resources:
Well, we can't do without external resources in our web apps, and since they are external we have little control over them. Scripts like Google Analytics, Google Fonts and many other external scripts that are common in web apps need to be fetched early so they won't have as much impact on our page speed.
This can be done using dns-prefetch, preconnect and prefetch.
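A typical setup looks like this (the hostnames are just the common Google Fonts and Google Analytics origins; adjust them to the third parties you actually use):

```html
<!-- Open the connection (DNS + TCP + TLS) early for critical third-party origins -->
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<link rel="preconnect" href="https://www.google-analytics.com">
<!-- Cheaper DNS-only fallback hint for older browsers -->
<link rel="dns-prefetch" href="https://fonts.gstatic.com">
```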

Robin has a simple and straightforward guide on preload, prefetch and prebrowsing.
6# Using a CDN:
Cloudflare to the rescue again. Netlify does offer a CDN, called Netlify Edge, and I must say Netlify Edge was awesome when I used it: my page speed increased by about 20%. But it broke some functionality on my website, so I left it for Cloudflare.
Cloudflare is the promising one here, with data centres almost all over the world, and even we free Cloudflare users get access to them. Think of it: an awesome package.
A CDN works like this: it caches your CSS, JS and image files and stores copies in all of its available data centres. So when a user from Australia visits your page, the cached copy of your site's resources in Cloudflare's Australian data centre is what gets delivered to that user.
The page loads fast because your files are served from a data centre near the visitor.
But when you don't use a CDN and your hosting provider doesn't have data centres around the world, some regions will experience slow page loads, because your website will be serving files from, say, a US data centre to users in Russia, which makes it a bit slower.
7# Configure your cache settings well:
This might be tricky at first. All websites deployed on Netlify get a default cache control of public, max-age=0, must-revalidate, which means content may be cached but must be re-validated on every request.
The rationale is that this favors you as a content creator, especially if you use Netlify Edge: you can make changes to any of your content and see the results instantly.
But I am using Cloudflare's CDN and I won't be changing or modifying content all the time, so that cache control isn't good enough for me. I want all my files stored in Cloudflare's cache, and I want the expiry period to be a year.
So I decided to override the Netlify defaults using a netlify.toml file, and I set the following:
[[headers]]
  for = "/*"
  [headers.values]
    Cache-Control = "public"

[[headers]]
  for = "/posts/*" # This defines which paths this specific [[headers]] block will cover.
  [headers.values]
    Cache-Control = "public, s-maxage=604800"
If you notice, I didn't set max-age or s-maxage on the "/*" header, only "public". I did that because I had already set the max-age value in Cloudflare.
You can also choose to set the whole thing in netlify.toml; all you have to do in your Cloudflare dashboard then is set Browser Cache TTL to Respect Existing Headers.
As for the [[headers]] block with for = "/posts/*", that's the path to my blog post pages, where frequent changes can occur, so I set s-maxage to 604800 (a week). s-maxage applies only to shared caches like a CDN, so it modifies the cache setup for just that path.
You can learn more about cache controls and Cloudflare cache with the below resources.
Getting started with Cloudflare