Ordinary Coders

Posted on • Originally published at ordinarycoders.com

Add Robots.txt to your Django Project

1) Create a robots.txt file in the templates > app folder (e.g. templates/main/robots.txt, matching the main app referenced later in urls.py)

User-Agent: [name of search engine crawler]
Disallow: [disallowed URL]
Disallow: [disallowed URL]
Sitemap: https://domain.com/sitemap.xml
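
For the template_name used later (main/robots.txt) to resolve, the templates folder must be on Django's template search path. A minimal sketch of the relevant TEMPLATES entry in settings.py, assuming a project-level templates folder (APP_DIRS also covers the templates folder inside each installed app):

TEMPLATES = [
    {
        "BACKEND": "django.template.backends.django.DjangoTemplates",
        "DIRS": [BASE_DIR / "templates"],  # project-level templates folder (assumption)
        "APP_DIRS": True,  # also search each installed app's templates/ directory
        "OPTIONS": {
            "context_processors": [
                "django.template.context_processors.request",
                "django.contrib.auth.context_processors.auth",
            ],
        },
    },
]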

2) Specify the User-Agent (e.g. Googlebot, Bingbot, Slurp)
Use an asterisk (*) to match all user agents.

User-Agent: *
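
Rules can also be scoped to a single crawler by naming it; the paths below are placeholders. Google, for example, follows only the group whose User-Agent matches it most specifically:

User-Agent: Googlebot
Disallow: /drafts/

User-Agent: *
Disallow: /admin/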

3) Disallow URLs or directories

User-Agent: *
Disallow: /page1
Disallow: /directory/
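
Note that Disallow matches by path prefix, so one rule can cover many URLs. For example, the hypothetical rule below blocks /private, /private/, and /private-reports alike:

User-Agent: *
Disallow: /private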

4) Allow URLs
Use Allow to let crawlers reach a specific page inside an otherwise disallowed directory; Google and Bing honor it, with the more specific rule taking precedence.

User-Agent: *
Disallow: /directory/
Allow: /directory/page

5) Add the sitemap

User-Agent: *
Disallow: /directory/
Disallow: /page1
Disallow: /page2
Sitemap: https://domain.com/sitemap.xml

6) Add the robots.txt to urls.py

from django.urls import path
from django.contrib.sitemaps.views import sitemap
from django.views.generic.base import TemplateView  # import TemplateView

from . import views
from .sitemaps import ArticleSitemap

app_name = "main"

sitemaps = {
    "blog": ArticleSitemap,
}

urlpatterns = [
    path("", views.homepage, name="homepage"),
    path("sitemap.xml", sitemap, {"sitemaps": sitemaps}, name="django.contrib.sitemaps.views.sitemap"),
    path("robots.txt", TemplateView.as_view(template_name="main/robots.txt", content_type="text/plain")),  # add the robots.txt file
]
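
The import of ArticleSitemap above assumes a sitemaps.py module next to urls.py. A minimal sketch of that class, assuming a hypothetical Article model with an updated field and a get_absolute_url() method:

from django.contrib.sitemaps import Sitemap

from .models import Article  # hypothetical model


class ArticleSitemap(Sitemap):
    changefreq = "weekly"  # hint to crawlers for how often entries change
    priority = 0.5

    def items(self):
        # each object returned here becomes a <url> entry in sitemap.xml;
        # Django calls get_absolute_url() on it to build the location
        return Article.objects.all()

    def lastmod(self, obj):
        return obj.updated  # assumes an updated datetime field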
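
To confirm the route works, here is a quick check with the Django test client (a sketch, assuming the URLconf above is included at the project root):

from django.test import SimpleTestCase


class RobotsTxtTest(SimpleTestCase):
    def test_robots_txt_served_as_plain_text(self):
        response = self.client.get("/robots.txt")
        self.assertEqual(response.status_code, 200)
        self.assertTrue(response["Content-Type"].startswith("text/plain"))

You can also just run the development server and open http://127.0.0.1:8000/robots.txt in a browser.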

Detailed Tutorial
