TL;DR
A robots.txt file tells search engines which pages they can and cannot crawl. It sits in your website’s root directory and follows simple rules. By creating, uploading, and submitting it to Google Search Console, you can make crawling more efficient, protect sensitive content, and keep unnecessary pages out of search results.
This guide shows you exactly how to create, configure, and test your robots.txt file for better SEO.
Basic Guidelines for Creating a Robots.txt File
A robots.txt file is a plain text document that guides search engine bots.
It’s essential for:
- Blocking admin panels and login pages
- Preventing crawling of duplicate or irrelevant pages
- Guiding bots toward important content
Important rules:
- Your site can only have one robots.txt file.
- Place it in the root folder of your domain.
- Save it as robots.txt (lowercase).
- Follow the Robots Exclusion Protocol.
- Rules are case-sensitive.
- It must be accessible at https://yourdomain.com/robots.txt.
Creating a Robots.txt File
Use Notepad (Windows) or TextEdit (Mac) to create your file.
Avoid Microsoft Word — it adds hidden formatting.
Save it with UTF-8 encoding and name it robots.txt.
Example:
User-agent: *
Disallow: /admin/
Allow: /
Sitemap: https://www.yourdomain.com/sitemap.xml
What this means:
- All crawlers are blocked from /admin/
- All other pages are allowed
- Sitemap is linked for indexing support
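You can sanity-check these rules with Python's built-in urllib.robotparser. This is a minimal sketch that parses the example above locally; the URLs are placeholders:

from urllib.robotparser import RobotFileParser

# Parse the example rules directly, without fetching anything over the network.
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://www.yourdomain.com/admin/settings"))  # False: blocked
print(rp.can_fetch("*", "https://www.yourdomain.com/blog/post"))       # True: allowed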
How to Write Robots.txt Rules
Directives:
User-agent
Defines which bot the rule applies to:
User-agent: Googlebot
Use * for all crawlers:
User-agent: *
Disallow
Blocks access to a page or directory:
Disallow: /checkout/
Disallow: /search/
Allow
Overrides Disallow to permit crawling:
Allow: /search/results/
Sitemap
Points bots to your XML sitemap:
Sitemap: https://yourdomain.com/sitemap.xml
Wildcards
- * matches any sequence of characters.
- $ matches the end of a URL.
Example:
Disallow: /*.pdf$
This blocks every URL ending in .pdf.
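To see how these wildcards behave, here is an illustrative Python sketch of the matching logic. It is a simplified stand-in, not Google's actual matcher, and note that Python's built-in urllib.robotparser does not support these wildcards:

import re

def pattern_matches(pattern: str, path: str) -> bool:
    # Illustrative only: * matches any sequence, $ anchors the end of the URL.
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn * back into ".*".
    regex = re.escape(pattern).replace(r"\*", ".*")
    if anchored:
        regex += "$"
    # re.match anchors at the start, mirroring robots.txt prefix matching.
    return re.match(regex, path) is not None

print(pattern_matches("/*.pdf$", "/files/report.pdf"))      # True: blocked
print(pattern_matches("/*.pdf$", "/files/report.pdf?x=1"))  # False: .pdf not at the end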
Uploading the Robots.txt File
- Upload it to your domain’s root folder so it is reachable at https://example.com/robots.txt.
- On hosted platforms, adjust it through the SEO settings.
- In frameworks like Next.js, place it in the public/ folder. For custom applications, see the sketch after this list.
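If your site is a custom server-side application rather than static files, you can serve the file from a route. A minimal sketch using Flask (an assumed framework, not named in this guide), reading the file from the app's static folder:

from flask import Flask, send_from_directory

app = Flask(__name__)

@app.route("/robots.txt")
def robots():
    # Serve the static robots.txt at the site root, where crawlers expect it.
    return send_from_directory(app.static_folder, "robots.txt")

if __name__ == "__main__":
    app.run()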
Testing the Robots.txt Markup
Step 1: Check public access:
Visit https://yourdomain.com/robots.txt in a browser.
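You can also check it from a script. A quick Python sketch (yourdomain.com is a placeholder):

import urllib.request

# A 200 status means the file is publicly reachable.
with urllib.request.urlopen("https://yourdomain.com/robots.txt") as resp:
    print(resp.status)
    print(resp.read().decode("utf-8"))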
Step 2: Use Google Search Console:
- Open robots.txt Tester
- Identify and fix syntax errors
- Google also offers an open-source robots.txt parser
Submitting Robots.txt to Google
Although Google finds your robots.txt automatically, you can speed things up:
- Log into Google Search Console
- Go to robots.txt Tester
- Click Submit
Useful Robots.txt Examples
Block All Crawlers
User-agent: *
Disallow: /
Block a Directory
User-agent: *
Disallow: /private/
Disallow: /admin/
Block a Single Page
User-agent: *
Disallow: /confirmation.html
Allow Only Googlebot
User-agent: Googlebot
Allow: /
User-agent: *
Disallow: /
Block PDFs
User-agent: *
Disallow: /*.pdf$
Allow All Except One Bot
User-agent: BadBot
Disallow: /
User-agent: *
Allow: /
Include Sitemap
Sitemap: https://www.yourdomain.com/sitemap.xml
FAQ About Robots.txt
Q: What does a robots.txt file do?
It tells search engines which pages to crawl and which to skip.
Q: Does robots.txt hide my pages from Google?
Not always. If other sites link to them, they may still appear in results. Use a noindex directive for full removal.
Q: Can I block AI bots and scrapers?
Yes. Disallow their user-agent tokens in robots.txt, as in the example below.
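For instance, GPTBot (OpenAI) and CCBot (Common Crawl) are commonly blocked tokens; check each bot's documentation for its current user-agent name:

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /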
Q: What happens if I don’t have robots.txt?
Bots will assume they can crawl everything.
Q: How often should I update it?
Update when your site structure changes or you add/remove sitemaps.
FAQ About Zenith Point Digital Marketing Agency
Q: What does Zenith Point do?
We help businesses grow online with SEO, content strategy, technical optimization, and social media marketing.
Q: Do you handle robots.txt optimization?
Yes, we create, test, and maintain robots.txt files as part of our technical SEO service.
Q: Is this service good for small businesses?
Absolutely — our strategies scale to fit your needs.
Q: How can I start working with Zenith Point?
Contact us through our website, and we’ll arrange a consultation.
Final Thoughts
A well-structured robots.txt file:
- Guides crawlers efficiently
- Protects sensitive areas of your site
- Improves indexing focus
- Saves crawl budget
Zenith Point ensures your robots.txt is precise, compliant, and fully aligned with your SEO goals — so search engines see exactly what you want them to.