Introduction
If you have ever tried to score 100/100 on PageSpeed Insights (or other website performance audit tools), one of the things you will have come across is critical CSS, and possibly critical JS.
For those of you who don't know: critical CSS is all of the style declarations required to render content "above the fold" (the part of a site you see when you first load the page). The browser needs this in order to render what visitors see first.
Critical JS is the same idea: all of the JS required to make the "above the fold" content function (at least minimally).
To ensure a page displays as quickly as possible, the advice is to add all of the styles needed to render the "above the fold" content inline within a <style> tag.
If you also have any critical JS, you should do the same with inline <script> tags.
This way when a user visits your site for the first time they only have to load the HTML (with your inline styles and scripts) and the page can be rendered without having to wait for any other resources to load.
This is essential for improving your First Contentful Paint times and often your Largest Contentful Paint times, both key factors in Web Vitals.
As an added bonus, inlining your CSS will often fix a lot of problems with Cumulative Layout Shift, another web vital.
Inlining critical CSS and critical JS is especially important on mobile connections where the round trip time to the server can be as high as half a second!
However, there is a big problem with inlining CSS and JS that may already be obvious to you: wasted bytes!
First load times are improved massively, but what about the second page that person visits, or returning visitors?
Every time a person visits more than one page on your site, you have to push all of your critical CSS and JS down the wire, even though by that point the browser will have cached all of your external CSS and JS files.
What we need is a solution that "inlines" CSS and JS the first time someone visits but then utilises browser caching for every visit that happens after that.
Hopefully this article will provide a solution for that!
Note: For the sake of this article I will assume you know how to identify your critical CSS and JS and have that in a file ready to go. This article is purely on how to fix the wasted bytes down the wire described previously.
Creating a plan for tackling inline CSS and JS
The solution is actually quite simple in principle.
We serve inline CSS and JS to people who have never visited us before, cache that CSS and JS on their machine, and then make sure we don't send the critical CSS and JS in the HTML if that machine already has it cached.
At first I thought "this is a job for a service worker" as you gotta love what those little guys can do!
But with a bit of thought I realised that there could be a simpler fix for the majority of sites that is easier to implement.
Instead of using a service worker, we shift all of the work to the server.
So first things first: separating our visitors into first-time visitors and returning visitors.
This one is simple: cookies.
When the server receives a request for our page, we check whether the user has a certain cookie set. If not, we send the page with the inlined critical CSS and JS; if the cookie is set, we send the page without them.
That is simple enough.
Next we need to get the critical CSS and JS cached on the user's browser the first time they visit.
Yet again I jumped to service workers but this one is also simple.
On our server, if there is no cookie set (first-time visitor), we add the critical CSS as the first file in the <head>. As there is no real penalty for declaring the same CSS twice, other than parsing the file (which should take less than 50 ms for most sites), we can leave this as it is.
We do the same for our JavaScript: we make sure it is the first file in our footer, before all of the other JavaScript.
I will circle back to the JavaScript in a minute (you may have noticed a couple of issues with adding the JS twice); for now, let's focus on the CSS.
CSS process
So our process is pretty simple.
Our user requests a page - no cookie is set as they are a first time visitor.
Our server then has a conditional statement along the following lines (I have simplified this and used PHP for the example, as it should be easy for most people to follow):
PHP
$inlineCSS = "";
// Check whether the cookie has NOT been set, so we can set it.
if (!isset($_COOKIE['hasVisitedCSS'])) {
    // Update the $inlineCSS variable to wrap our critical CSS in a `<style>` tag.
    $inlineCSS = '<style>' . file_get_contents('ourInlineStyles.min.css') . '</style>';
    // Set the cookie that we check for, so we don't send the inline styles again.
    // Its value is the current time (for use later); it expires one year from now.
    setcookie('hasVisitedCSS', time(), time() + 31536000);
}
HTML
<html>
  <head>
    <title>Title</title>
    <?php echo $inlineCSS; ?>
    <link rel="stylesheet" type="text/css" href="ourInlineStyles.min.css" />
    <!-- our other styles -->
  </head>
The above works fine until you update your site's styles.
At that point the CSS someone has cached is out of date, so when they return to your site they have to download the new file before the page renders.
That is why we set the value of the cookie to the current time with time().
All we have to do is check when our CSS file was last modified and compare it to that time. If their cookie was set before we last modified the file, we simply inline the CSS for them again and update the time on the cookie.
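The comparison itself is tiny. As a language-agnostic sketch of just that branch logic (written in JavaScript purely for illustration; the function name is made up):

```javascript
// Decide whether to inline the critical asset for this request.
// cookieTime: the visitor's cookie value (a UNIX timestamp), or null if unset.
// fileMtime: the critical file's last-modified time on the server.
function shouldInlineCritical(cookieTime, fileMtime) {
  if (cookieTime === null) {
    return true; // first-time visitor: nothing cached yet
  }
  return cookieTime < fileMtime; // their cached copy predates our last edit
}
```

Whenever this returns true we also refresh the cookie's timestamp, so the visitor only receives the inline copy once per update.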
The HTML doesn't change, so our server side code looks something like:
PHP
$inlineCSS = "";
// Check whether the cookie has NOT been set.
if (!isset($_COOKIE['hasVisitedCSS'])) {
    // Update the $inlineCSS variable to wrap our critical CSS in a `<style>` tag.
    $inlineCSS = '<style>' . file_get_contents('ourInlineStyles.min.css') . '</style>';
    // Set the cookie that we check for, so we don't send the inline styles again.
    // Its value is the current time (for use below); it expires one year from now.
    setcookie('hasVisitedCSS', time(), time() + 31536000);
// If the cookie is already set, compare its time to the file's last-modified time.
} else if ($_COOKIE['hasVisitedCSS'] < filemtime('ourInlineStyles.min.css')) {
    // We have updated the file since the cookie was set, so inline the CSS again...
    $inlineCSS = '<style>' . file_get_contents('ourInlineStyles.min.css') . '</style>';
    // ...and refresh the cookie's timestamp so we only do this once.
    setcookie('hasVisitedCSS', time(), time() + 31536000);
}
Please note: although the above "works", do not use it in production. There are no checks for whether the file exists, the cookies have not been given a SameSite attribute, etc.
Other than that caveat, this is the complete solution. First-time visitors (and visitors who return after we have updated our CSS) get the inline styles; returning visitors and visitors who view more than one page get the CSS served from the browser cache and don't have to download it again.
This solution also plays well with offline-enabled Progressive Web Apps.
Now to the JS
If you have critical JS we can use a very similar method, but there are a couple of "gotchas".
First if we add the exact same JS to the page twice (once inline and once in an external file) this is likely to cause all kinds of problems the second time it executes if we don't account for it.
There is a simple way we can sort this though.
At the start of our script we add a quick check to see whether a marker variable has already been set on the page; if it hasn't, we let the script run (and set the variable so it doesn't run again).
There are much better ways to do the following, this is the simplest example I could think of.
if (!window.mymagicvariable) {
  init1();
}

// All of your functions can go here, assuming redeclaring them will not break your application.
function init1() {
  console.log("hi");
}

window.mymagicvariable = true;
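A slightly sturdier variant of the same guard (just a sketch; the runOnce name and the key are made up) sets the flag in one place, so the script cannot run twice even if the flag assignment at the end of a file is forgotten:

```javascript
// Run fn at most once per page, keyed by a global flag.
// Uses globalThis so the sketch also runs outside a browser.
function runOnce(key, fn) {
  if (globalThis[key]) {
    return false; // already ran (e.g. the inline copy got there first)
  }
  globalThis[key] = true; // set the flag before running
  fn();
  return true;
}

// Both the inline and the external copy of the script can then start with:
// runOnce("mymagicvariable", init1);
```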
The second "gotcha" is that when the inline script is not on the page (because someone is a returning visitor or is viewing a second page), we no longer want the external script to have the defer or async attribute.
This way we deliberately block the rendering of the page until our critical JS has run.
Now that we have a method for differentiating between returning and new visitors, this is simple to do:
PHP
$inlineJS = "";
// We always serve the external script, even if the file has not been updated,
// so we set our default (render-blocking) version here.
$externalJS = '<script src="ourInlineScript.js"></script>';
// Check whether the cookie has been set.
if (!isset($_COOKIE['hasVisitedJS'])) {
    // Update the $inlineJS variable to wrap our critical JS in a `<script>` tag.
    $inlineJS = '<script>' . file_get_contents('ourInlineScript.js') . '</script>';
    // The inline copy will run first, so the external copy can load async.
    $externalJS = '<script src="ourInlineScript.js" async></script>';
    // Set the cookie that we check for, so we don't send the inline script again.
    // Its value is the current time (for use below); it expires one year from now.
    setcookie('hasVisitedJS', time(), time() + 31536000);
// Cookie already set; check whether we have updated the file since then.
} else if ($_COOKIE['hasVisitedJS'] < filemtime('ourInlineScript.js')) {
    // We have updated the file since the cookie was set, so inline the JS again.
    $inlineJS = '<script>' . file_get_contents('ourInlineScript.js') . '</script>';
    // The inline copy is present again, so the external copy can load async...
    $externalJS = '<script src="ourInlineScript.js" async></script>';
    // ...and refresh the cookie's timestamp so we only do this once.
    setcookie('hasVisitedJS', time(), time() + 31536000);
}
HTML
<html>
  <head>
    <title>Title</title>
  </head>
  <body>
    <header></header>
    <main></main>
    <footer></footer>
    <?php echo $inlineJS; ?>
    <?php echo $externalJS; ?>
    <!-- other JS goes here -->
  </body>
</html>
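Pulling the script-side branch together, here is a sketch of a helper (the name is made up) that returns both tags and keeps the async rule consistent: async only when an inline copy is present, render-blocking otherwise.

```javascript
// Build the inline and external script tags for one request.
// src: URL of the critical JS file; inlineSource: its contents;
// needsInline: result of the cookie/last-modified check.
function criticalScriptTags(src, inlineSource, needsInline) {
  return {
    // Inline copy only for first-time (or stale-cache) visitors.
    inline: needsInline ? "<script>" + inlineSource + "</script>" : "",
    // With an inline copy present the external file may load async;
    // without one it must block so the critical JS runs before render.
    external: needsInline
      ? '<script src="' + src + '" async></script>'
      : '<script src="' + src + '"></script>',
  };
}
```

The same shape works for the CSS side, swapping the script tags for a style tag and a stylesheet link.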
Conclusion
Most of you will have noticed that this method means the same data is loaded twice on a first visit. Yes, this will increase the overall page load time ever so slightly for first-time visitors, but the benefits far outweigh the drawbacks, and you will still easily hit 100/100 for Web Vitals with this technique (assuming you have done everything else correctly).
After writing this, I decided to implement the idea using service workers on our more complex websites (SaaS-style sites), as that way I can cache the files without actually adding them to the page (deferring the download until the network and CPU are quiet, which matters on sites with heavy CPU and network load).
With that being said, the above technique works well for sites where the critical JS and CSS are small, with very little impact, and it is much simpler to implement and maintain than service-worker-based solutions. I still use it on simple to mid-complexity sites; they all score above 95 on PageSpeed Insights, and I see no improvement in the first-time visitor score when removing the caching mechanism.
I will release an article on the service worker way of doing things in the future for those of you with complex sites.
Finally, I just want to say: critical CSS and JS are very important, especially with the Google Page Experience update coming in May, which makes heavy use of Web Vitals.
The above is a foundation you can build on to implement your own solution, so that critical CSS and JS don't add extra kilobytes to every request for returning visitors and visitors who view more than one page.
Final warning
The above code examples are not tested and not production ready.
There may be security holes in them, bugs or complete mistakes!
Please only use the code examples as a guide on how to implement this yourself.
If you spot any glaring errors in the code, let me know and I will correct them.
Extra bits and pieces
Want to see an experiment?
The site https://klu.io was designed to score top marks in:
- Page Speed Insights (99 or 100 / 100)
- Mozilla Observatory (website security headers) (125 / 100)
It also scores top marks for Accessibility (100 / 100), Best Practices (100 / 100) and SEO (100 / 100) in Lighthouse (in developer tools on Google Chrome). Sadly the Progressive Web App broke so we had to remove it, but the site also used to work offline.
Despite scoring well in the accessibility tests there are accessibility issues that have developed over time that need fixing. I also never finished all the content for it :-(
I will always have a soft spot for the site design and thought I would share it at the end of my articles for a couple of months as we are building a new site and changing the branding.
It was the best way I could think of to give the KLUIO website and brand "a good send off" before it is retired from service.
I hope you enjoyed reading this article, any questions just ask!