Stan Kukučka

Google Search Console can't fetch sitemap on GitHub Pages

You decided to build your website and chose to publish it on GitHub Pages. That's a solid hosting choice for a static website.

Now you want this page and the rest of the site indexed by Google as soon as possible. So you verified the site in Google Search Console with one of the options it offers: uploading an HTML verification file to your repository right next to the other website files, or adding a meta tag to your site's homepage.
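
If you want a quick sanity check that the verification file is actually being served before you hit "Verify", a minimal Python sketch like the one below will do. The filename here is just a placeholder for the one Google generates for your property.

# Minimal sketch (not from the article): check that the Google verification
# file is reachable on your GitHub Pages site. The filename is a placeholder;
# use the one Search Console generated for your property.
import urllib.request

SITE = "https://your-website-url.github.io"
VERIFICATION_FILE = "google0123456789abcdef.html"  # hypothetical example name

with urllib.request.urlopen(f"{SITE}/{VERIFICATION_FILE}") as resp:
    print(resp.status)    # expect 200
    print(resp.read(80))  # the body should start with "google-site-verification:"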

That's great. But what if you submit the website's sitemap XML file and it just won't fetch?

Google Search Console won't fetch your sitemap XML

Indexing delays have been reported when Google is left to discover and process new pages on its own. The best way to get around this is to submit a sitemap XML to Google Search Console, so Googlebot knows your website's structure, that is, the pages your website has. But what if, after you add the sitemap XML file, the submitted sitemaps section just shows "Couldn't fetch"?

There is no need to upload a file called .nojekyll to GitHub Pages, even though some advice out there points to that practice.

Create a robots.txt file on GitHub Pages

To let Googlebot know about your sitemap, simply create a file named robots.txt at the root level of your GitHub Pages site and put the following in it.

The first line says where your sitemap XML file is located, and the rest says that your website is crawlable by any crawler (bot). In other words, it doesn't block specific bots from accessing the content listed in the sitemap XML file.

Sitemap: https://your-website-url.github.io/sitemap.xml

User-agent: *
Disallow:
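
Before resubmitting in Search Console, you can confirm that both files are actually reachable at the root of the published site. Here is a minimal Python sketch, assuming the your-website-url.github.io layout from the example above:

# Minimal sketch: verify robots.txt and sitemap.xml are served at the site root.
# Replace SITE with your own GitHub Pages URL.
import urllib.request

SITE = "https://your-website-url.github.io"

for path in ("/robots.txt", "/sitemap.xml"):
    with urllib.request.urlopen(SITE + path) as resp:
        # Expect HTTP 200 for both; the sitemap should come back as XML.
        print(path, resp.status, resp.headers.get("Content-Type"))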

After you create this robots.txt file, you can submit the sitemap in Google Search Console again, and in roughly 12 hours it should be fetched.

Another way is to submit the sitemap XML file by appending its full URL to https://www.google.com/ping?sitemap=, like this:

https://www.google.com/ping?sitemap=<complete_url_of_sitemap>

For example:

https://www.google.com/ping?sitemap=https://your-website-url.github.io/sitemap.xml

Then enter this URL into your browser and you'll see the confirmation "Sitemap Notification Received". This tells Googlebot to pick up your sitemap XML URL and process it.
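
If you prefer a script over the browser, the same ping can be sent with a few lines of Python. This is just a sketch, assuming the ping endpoint described above is still available and using the example sitemap URL from this article:

# Minimal sketch: send the sitemap ping from a script instead of the browser.
import urllib.parse
import urllib.request

sitemap_url = "https://your-website-url.github.io/sitemap.xml"
ping_url = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

with urllib.request.urlopen(ping_url) as resp:
    # A 200 response corresponds to the "Sitemap Notification Received" page.
    print(resp.status)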

And that's it; the whole magic is in that last step. By the next morning your sitemap XML file should be fetched as needed.

Thanks to Richy Great for the cover image from Unsplash.

Top comments (2)

野生程序员 (devprogramer)

I got "Couldn't fetch" for 30 days.

Stan Kukučka (stankukucka)

@devprogramer (野生程序员) That seems like a "huge" delay. Is it GitHub Pages hosting or something else? Did you try the suggestions from the article above?