Last reviewed: Aug 2022
SEO - Search Engine Optimisation - was introduced briefly in post 6.2 in the context of Google's "Mobile First" policy. That post gave guidance on screen layout designs intended to ensure prominence in Google search returns. It's now time to look a bit more deeply into this and examine how entries get into the search engines' indexes in the first place.
In the early days of the web, Search Engine indexing of your site would have happened automatically. Once your html files were saved on a public server they would be regularly reviewed by "search bots" deployed by Google and the like to crawl the web looking for information to put into the indexes that drive their search engines.
Google's SEO Starter Guide shows how, once you've carefully filled out the header information in your html file, potential users of your site will be able to use Google searches to find its url. The bots are also able to look deeper into your site and add more general site content from <p> elements and the like.
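As a sketch of the kind of header information involved (the title and description below are purely illustrative, not taken from the real site):

```html
<!-- Illustrative <head> metadata of the kind the Starter Guide describes -->
<head>
  <meta charset="utf-8">
  <!-- The title is the headline Google typically shows in search results -->
  <title>Ngatesystems - Webapp Development Posts</title>
  <!-- The description often supplies the snippet text beneath the headline -->
  <meta name="description" content="An index of posts on building and deploying webapps with Firebase.">
</head>
```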
But in a modern webapp, while there will still be an html file, its <p> elements and the like are assembled in the browser by JavaScript rather than being present in the file itself, so they are not directly available to the indexing bots as they were in the past. How is this ever going to work?
Happily, Google's crawler can now render JavaScript much as a browser does, so it is able to index the content your webapp generates. All that you then need to do (at least in theory) is to guide the bot by providing a list of the pages that you'd like it to examine. You do this by constructing a "sitemap" file. Here's the sitemap.xml file for the index site for these posts.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://ngatesystems.com/about</loc>
    <changefreq>weekly</changefreq>
    <priority>0.5</priority>
  </url>
  <url>
    <loc>https://ngatesystems.com/waypoints</loc>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://ngatesystems.com/examples</loc>
    <changefreq>weekly</changefreq>
    <priority>0.6</priority>
  </url>
</urlset>
This should be placed in your webapp project's public directory alongside your index.html file so that it is deployed into the root of your Firebase project.
You then need to tell Google where to find the sitemap using the Google Search Console for your url. The performance of Google searches on your site can also be tracked here - see Google's Basic Search Console usage document for advice on how to use the Search Console.
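As well as registering the sitemap in the Search Console, you can advertise its location to crawlers generally through a robots.txt file in the site root. A minimal sketch, assuming the sitemap has been deployed to the root as described above:

```
# public/robots.txt - deployed to the site root alongside sitemap.xml
User-agent: *
Allow: /
Sitemap: https://ngatesystems.com/sitemap.xml
```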
In my experience, however, if you don't have too many pages to register (on large sites you may need to generate your sitemap file programmatically), you are probably better off using the Search Console's URL inspection tab to submit your page addresses directly to the crawler bots. I have found that this produces results within hours, whereas you can spend weeks waiting for action on a sitemap.
A particularly useful benefit of this approach is that, once Google has crawled your page, the inspection tab can show you the rendered html that it actually used - a huge confidence booster when you're building this via props and JSX.
Of course the ultimate test is to do a search yourself to see what Google knows about you. When testing my index webapp I found that the best way of doing this was to do a Google search on site content. So, for example, once Google had indexed my site, a search for "Github is a free Microsoft service that gives you somewhere you can store and advertise source code on the web." - a quote from the site's "Examples" page - instantly returned the site's url. I'm still struggling to comprehend how Google can do that, especially when I recall that that particular chunk of text isn't even coded in html - it's buried in JSON data.
I should mention, however, that I had no success in getting index entries for the site-index webapp while it was still using its initial, free, Firebase url at ngatesystems.web.app. Indexing only started to work after I paid Google £10 to register my ngatesystems.com domain. I don't feel aggrieved about this - it was, after all, a ridiculously small payment for the amazing service that they provide.
A lot depends on getting your site really well indexed. If people can't find you easily they won't use you. But it's not just a question of obtaining a mention in search-engine indexes. Once you're well-indexed you'll want search "hits" for your site to be listed prominently. Welcome to the world of Search Engine Optimisation - the art of ensuring that search engines respect your site.
I've already provided a few hints about how you might convince Google that your site is worth its attention - the "mobile first" policy is one example of a way you can influence its opinion. But this is a deep and complex subject, and since this site is only intended to provide an introduction, I'm going to duck out now and point you in the direction of reputable experts.
You'll find very useful background on all of this in Ahrefs' SEO for React applications document. Ahrefs is a commercial SEO tool provider widely used by major sites to tune their search performance, and their site is full of really helpful advice. Once you're making money yourself, you may find it worthwhile to pay them back by using their tools to improve your SEO so you can make even more money!
For other ways of improving your site's performance and influencing its SEO credentials, you might like to read on to post 6.4 in this series which describes how Next.js technology creates an opportunity for you to optimise response times without adding to your development costs.