SEO: Score 90-100% on Google PageSpeed Insights

One of the most important Google ranking factors is page speed.

The best way to improve it is to serve only static HTML files to search engine bots, without loading any JavaScript.

How do we detect bots? By checking the User-Agent header on every request made to our website.
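
A minimal sketch of such a check might look like this (the list of crawler substrings is my own illustrative pick, not exhaustive):

```javascript
// Matches the User-Agent strings of the major search engine crawlers.
const BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot|baiduspider|slurp/i;

function isSearchEngineBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}
```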

In this post I'll explain the solution I implemented.

Summary:

I used two S3 buckets:

  1. The first one contains only static HTML files (for search engine bots)
  2. The second one contains my production website (for users)

Both buckets sit behind CloudFront, which routes traffic with a Lambda@Edge function depending on the User-Agent.

Let's start!

Go to CloudFront, select your distribution, and edit its behavior.

Here you need to whitelist two headers: Origin and User-Agent.

Once you have done this, create a Lambda function in the us-east-1 region.
Select the Node.js 10 runtime and give it Lambda@Edge permissions.

In the function code, check the User-Agent value in the request headers; if it matches a search engine bot, switch the origin to the S3 bucket that holds only the static HTML files.

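The original code was only shown as a screenshot, so here is a minimal sketch of what such an origin-request handler could look like. The bucket domain name is a placeholder, and I'm assuming the static bucket allows public reads (authMethod: 'none'). Note that the User-Agent header is only available here because we whitelisted it in the previous step.

```javascript
'use strict';

// Sketch of a Lambda@Edge origin-request handler that sends search engine
// bots to the static-HTML bucket. The bucket domain is a placeholder.
const BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot|baiduspider|slurp/i;
const STATIC_BUCKET = 'my-static-html-bucket.s3.amazonaws.com';

exports.handler = (event, context, callback) => {
  const request = event.Records[0].cf.request;
  const headers = request.headers;

  // The User-Agent header is forwarded only because it was whitelisted
  // in the distribution's behavior settings.
  const userAgent = headers['user-agent']
    ? headers['user-agent'][0].value
    : '';

  if (BOT_PATTERN.test(userAgent)) {
    // Switch the origin to the bucket with the pre-rendered HTML files.
    request.origin = {
      s3: {
        domainName: STATIC_BUCKET,
        region: '',
        authMethod: 'none', // assuming the bucket allows public reads
        path: '',
        customHeaders: {},
      },
    };
    // The Host header has to match the new origin's domain name.
    headers.host = [{ key: 'Host', value: STATIC_BUCKET }];
  }

  // Requests from regular users pass through to the default
  // (production) origin unchanged.
  callback(null, request);
};
```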

After you are done with this, add a trigger to the function: CloudFront trigger > Deploy to Lambda@Edge > select your CloudFront distribution and the origin request event (the only stage where the origin can be changed) > confirm deploy to Lambda@Edge > Deploy.

That's it. Now when Googlebot or any other search engine bot comes to index your website, it will get exactly what it is looking for: the HTML file.

Make sure your CloudFront distribution has access to the files in both S3 buckets.
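
If the buckets are locked down behind a CloudFront Origin Access Identity instead of being public, each bucket needs a policy along these lines (the bucket name and OAI ID are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCloudFrontRead",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity EXAMPLEOAIID"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-static-html-bucket/*"
    }
  ]
}
```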

Adding this simple Lambda function was all it took to score 90-100% on every path of the website; you can check some of my site's paths in Google's PageSpeed test yourself.

Top comments (1)

Molossus Spondee • Edited
  1. This is fraud.
  2. You know they occasionally check without the bot user agent, right?