Calin Baenen

What am I doing wrong in my robots.txt? - I thought Googlebot used the most specific, least restrictive rule.

The Problem

So, three files on my website aren't being indexed as they should be because, apparently, they are being blocked by my robots.txt file.
According to Googlebot, lines eleven, twelve, and thirteen are the ones causing the trouble:
[Image: the twelfth line of my robots.txt highlighted as an error.]
However, this behavior is confusing because, according to the Order of precedence for rules section of How Google Interprets the robots.txt Specification, Google chooses the most specific and least restrictive rule.
To me, Allow: /resources/furries/faq.html is more specific (and less restrictive) than Disallow: /resources/furries/.
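If I'm reading that section right, "most specific" is judged by the length of the matching path. Here's the comparison as I understand it (the user-agent line and the character counts are my own annotations, not a quote from my actual file):

    User-agent: Googlebot
    Disallow: /resources/furries/           # matching path: 19 characters
    Allow: /resources/furries/faq.html      # matching path: 27 characters

Since the Allow path is longer, it should be the more specific rule, and therefore the one that applies to faq.html.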

The Desired Effect

The intended behavior is that Disallow: /resources/furries/ blocks everything under the directory by default, including the directory listing itself and all of the files it contains, except the pages I explicitly list (faq.html, index.html, etc.), which should remain allowed for indexing.
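In other words, something like this (abridged; per the "etc." above, my real file allows a few more pages than shown here):

    User-agent: *
    # Block the whole directory by default...
    Disallow: /resources/furries/
    # ...then re-allow each page I want crawled.
    Allow: /resources/furries/faq.html
    Allow: /resources/furries/index.html

My understanding is that, under the precedence rules, each full-path Allow should override the shorter directory-level Disallow for that specific page.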
Unfortunately, it seems I don't know how to achieve this behavior.

Thanks in advance.
Cheers!
