Calin Baenen

What am I doing wrong in my robots.txt? - I thought Googlebot used the most specific, least restrictive rule.

The Problem

So, three files on my website aren't being indexed as they should be because, apparently, they're being blocked by my robots.txt file.
According to Googlebot, lines eleven, twelve, and thirteen are the ones causing the trouble:
[Image: the twelfth line of the robots.txt file highlighted as an error.]
However, this behavior is confusing because, according to the "Order of precedence for rules" section of How Google Interprets the robots.txt Specification, Google chooses the most specific, least restrictive rule.
To me, Allow: /resources/furries/faq.html is more specific (and less restrictive) than Disallow: /resources/furries/.
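For reference, here is roughly what the relevant lines of the file look like. Only the two directives quoted above come from the report; the User-agent grouping around them is my assumption:

    User-agent: *
    Disallow: /resources/furries/
    Allow: /resources/furries/faq.html

Google's documented precedence boils down to: the rule with the longest matching path wins, and if an allow rule and a disallow rule tie, the less restrictive (allow) rule wins. Here is a minimal Python model of that lookup, ignoring the * and $ wildcards, just to sanity-check which rule should apply:

    # Minimal model of Google's documented robots.txt precedence:
    # the rule with the longest matching path wins; on a tie, Allow
    # beats Disallow. Ignores the * and $ wildcards real parsers support.
    def applied_rule(rules, path):
        # rules: list of (is_allow, path_prefix) tuples
        matches = [(is_allow, prefix) for is_allow, prefix in rules
                   if path.startswith(prefix)]
        if not matches:
            return True  # no matching rule: crawling is allowed by default
        # Most specific (longest prefix) first; Allow wins ties.
        is_allow, _ = max(matches, key=lambda rule: (len(rule[1]), rule[0]))
        return is_allow

    rules = [
        (False, "/resources/furries/"),          # Disallow
        (True,  "/resources/furries/faq.html"),  # Allow
    ]
    print(applied_rule(rules, "/resources/furries/faq.html"))     # True: allowed
    print(applied_rule(rules, "/resources/furries/secret.html"))  # False: blocked

By that model, the Allow rule should win for faq.html, which is exactly why the report is confusing.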

The Desired Effect

The intended behavior is that Disallow: /resources/furries/ blocks everything by default, the directory itself and every file it contains, except the files I explicitly list (faq.html, index.html, etc.), which should remain allowed to be crawled and indexed.
Unfortunately, I can't seem to figure out how to get this behavior.
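For what it's worth, this is the shape of configuration I believe should express that intent under Google's longest-match precedence (a sketch; the file list is only the examples mentioned above):

    User-agent: *
    # Block the directory and everything in it by default...
    Disallow: /resources/furries/
    # ...then carve out the specific files that should stay crawlable.
    Allow: /resources/furries/faq.html
    Allow: /resources/furries/index.html

Each Allow path is longer, and therefore more specific, than the Disallow path, so per the spec it should take precedence.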

Thanks in advance.
Cheers!
