"All good but why is the page speed insights score so low?"
The phrase that web developers just hate to hear.
I have spent the last week trying to find a balance between an SPA (Single Page Application) that loads once and can then be used, and one that is also SEO and crawler friendly, has deep links, and gets good PageSpeed Insights scores.
Over a very short period I have had a crash re-introduction to the whole battleground of SEO vs speed vs functionality (it sounds like a pick-two scenario).
History lesson
In the good old days of web development, PHP and mod_perl or static files, we would shoot the web page from the server to the browser. Life was full of tables, and good. Sort of.
Then along came web development frameworks like React and about three dozen competitors. Suddenly it was all about hydration and the shadow DOM and re-usable components. Recognisable HTML barely arrived at the browser any more. Instead whole chunks of JS arrived, along with the "Flash of Unstyled Content" (FOUC) and many other headaches.
But it also became about web crawlers failing to crawl your extremely JS-heavy content. And it became about poor page speed ("Lighthouse") scores too.
For a while Google promoted a solution: Rendertron! This product (basically Chrome in a box) would sit on your server and render the complex pages, then deliver the final-final-really-final HTML to the dumb crawlers. It sounded good in theory, if it could just be plugged in, but the devil was in the details. Now Rendertron is render-not: deprecated by Google and closed to new GitHub issues.
Solutions
A tempting solution is to just give up and build a static tree of URLs (HTML files) that can be crawled by all the spiders. Unsatisfying, but still a solution many sites use. What else can they really do, when crawlers are overwhelmed by the heavy task of piecing together a web page that displays a few H1 and H3 tags and a paragraph of keywords, via about a million JavaScript instructions?
Enter: server-side rendering. Everything old is new again. Web dev is back to building complete HTML pages on the server and shipping them to the client, where the glamorous parts start operating with sprinkles of late-booted JavaScript.
But if you're NOT already invested in React, or Svelte 5, or Vue, or Next.js, or some other sexy new platform as your main event, then some kind of "enable SSR" switch (not that it is ever as easy as it sounds) is just not an option. What to do?
What I did
In the case of satellitemap.space, the issue I faced is that a satellite information website and tool needs an array of search-engine-friendly URLs that lead directly to satellites! And constellations! That is, after all, how many people start when looking for information: "Where is the ISS?" "Where did that Starlink satellite re-enter?"
I did have deep links to satellites. But there was a problem. PageSpeed Insights refused to believe the site was interactive until all the network traffic settled down, so it would give perfectly usable URLs that opened the SPA in a certain mode a failing grade. A deep link to the ISS would be interactive on my phone in a few seconds, but Google would score the URL as a failure.
Worse, the SEO-rich info related to an SPA deep link should be concentrated mostly in the HEAD section of the page: the page TITLE, the meta tags, and so on. These parts appear first in the index.html that delivers the SPA, and while they can be rewritten later, when the deep link is inspected, dumb crawlers might not wait around for that.
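For illustration, this is roughly the pattern that client-side rewrite boils down to (the endpoint and function names here are hypothetical, not my actual code). The corrected title and description only exist after the bundle has downloaded, booted and fetched its data, which is exactly the window a lazy crawler refuses to wait through:

```ts
// Hypothetical SPA boot step: the head is only corrected AFTER the JS bundle
// has been downloaded, parsed, and the satellite data fetched.
async function applyDeepLinkHead(noradId: string): Promise<void> {
  const res = await fetch(`/api/sat/${noradId}`); // assumed JSON endpoint
  const sat = await res.json();

  // Rewrite the generic title and description that index.html shipped with.
  document.title = `${sat.name} - live position and passes`;
  document
    .querySelector('meta[name="description"]')
    ?.setAttribute('content', `Track ${sat.name} (NORAD ${noradId}) in real time.`);
}
```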
Enter Astro. Astro allowed me to build a page from any URL pattern, server side, include complex JS-driven divs, and still have a 90+ Lighthouse score. I can still use Tailwind CSS. I can still use any JS modules or functions that the home-page SPA already uses. So it was familiar. It did not require a re-think or much duplication of effort.
As a result of Astro I can now add an entire satellite catalog over URLs like https://satellitemap.space/sat/:norad_id with SEO-rich headers and content, and it also loads blindingly fast from the perspective of Google Lighthouse.
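A minimal sketch of what such a page can look like in Astro, assuming a hypothetical JSON API and field names (this is not the real satellitemap.space code). The frontmatter runs on the server, so the title, meta description and visible content are already baked into the HTML the crawler receives:

```astro
---
// src/pages/sat/[norad_id].astro -- rendered on the server per request.
// The API URL and response fields are illustrative placeholders.
export const prerender = false;   // keep this dynamic route server-rendered

const { norad_id } = Astro.params;
const res = await fetch(`https://api.example.com/sat/${norad_id}`);
if (!res.ok) return Astro.redirect('/404');
const sat = await res.json();
---
<html lang="en">
  <head>
    <title>{sat.name} - live position and passes</title>
    <meta name="description" content={`Where is ${sat.name}? Live tracking for NORAD ${norad_id}.`} />
  </head>
  <body class="bg-slate-900 text-white">
    <h1 class="text-2xl font-bold">{sat.name}</h1>
    <p>NORAD ID {norad_id}, launched {sat.launch_date}.</p>
  </body>
</html>
```

Because all of that happens before the first byte leaves the server, there is nothing for the crawler to wait for.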
Apart from being able to do API requests before emitting even a header line back to the browser, Astro also enables the trick of only hydrating sections of the page when they become visible, or when they are first interacted with (there is a sketch of this after the list). In the case of the above-mentioned single-satellite catalog page these were:
- The Search function
- The full monty WebGL globe visuals plugin
- The satellite description accordions
- The Sky transit table
- (and any others I want to add)
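Astro's client directives are what make that selective hydration a one-word decision per component. A rough sketch, with made-up component names and whichever UI framework integration you already use:

```astro
---
// Interactive islands on the satellite page. Component names (and their
// framework) are illustrative, not the real satellitemap.space internals.
import SearchBox from '../../components/SearchBox.jsx';
import GlobeView from '../../components/GlobeView.jsx';
import TransitTable from '../../components/TransitTable.jsx';
const { norad_id } = Astro.params;
---
<!-- Hydrates once the browser is idle, so it never blocks first paint. -->
<SearchBox client:idle />

<!-- The heavy WebGL globe only downloads and boots when it scrolls into view. -->
<GlobeView client:visible noradId={norad_id} />

<!-- Server-rendered HTML first; hydrates lazily if it needs any JS at all. -->
<TransitTable client:visible noradId={norad_id} />
```

`client:visible` is what makes "scrolling down boots the globe" work, and `client:idle` covers the search box without getting in the crawler's way.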
So now the complete page load, before any attempt to use search or scroll down to the globe plugin, looks like this:
That's it. Versus any typical modern website page, this is lean. A web page with Ozempic face.
Scrolling down the page causes the entire globe visualisation to boot up. (And that process was already a lot leaner than the Cesium globe library.) So visually it only takes a second or two.
Pressing the Search icon causes the incremental search process to boot up.
None of these late hydrations are relevant to the page crawlers.
Another solution like Astro is Qwik (Qwik City), which is probably slightly faster still. But perfect is the enemy of good. The goal was not to go from 95 to 100 on page speed scores; it was just to stop getting failing ones.
Anyway. Astro rocks. I will use it next for constellation information leaf pages, and more.