How to add robots.txt to a Django project

29 August, 2014

So we had this nice, unfinished project, publicly shared on a public server and being indexed like a pig's behind by Google. Some Google searches turned up hits with full-length URLs into this unfinished project on the test server. That's easily stopped with a robots.txt, except that it's a Django site, and it's not as simple as dropping the file in the server's root. It's not that hard either, but you have to know how.

Anyway, here are a few solutions I found:

1. Direct insertion

You don't create the file itself; instead, you generate the response on the spot in your urls.py file. (Note that HttpResponse's old mimetype argument was removed in Django 1.7; use content_type instead.)

from django.conf.urls import patterns
from django.http import HttpResponse

urlpatterns = patterns('',
    ...
    # Return a minimal robots.txt that blocks all crawlers from the entire site
    (r'^robots\.txt$', lambda r: HttpResponse("User-agent: *\nDisallow: /",
                                              content_type="text/plain")),
)
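
By the way, whichever approach you pick, Django's test client makes it easy to check that the route actually serves what you expect. This is just a quick sketch assuming the URL pattern from above:

from django.test import TestCase

class RobotsTxtTests(TestCase):
    def test_robots_txt_is_served_as_plain_text(self):
        # Fetch the URL that the pattern above wires up
        response = self.client.get('/robots.txt')
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response['Content-Type'], 'text/plain')
        # The direct-insertion version blocks everything
        self.assertIn('Disallow: /', response.content.decode())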

2. As a template file:

Create the robots.txt file as usual, with all your rules and stuff, in one of your template directories, and wire it up from your urls.py file. Note that the direct_to_template shortcut used in older write-ups was removed in Django 1.5; the class-based TemplateView is its replacement:

from django.conf.urls import patterns
from django.views.generic import TemplateView

urlpatterns = patterns('',
    ...
    # Render the robots.txt template with a text/plain content type
    (r'^robots\.txt$', TemplateView.as_view(template_name='robots.txt',
                                            content_type='text/plain')),
)
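
And for reference, the template itself is just a plain robots.txt; the paths below are made-up placeholders for whatever you actually want to keep crawlers out of:

User-agent: *
Disallow: /admin/
Disallow: /accounts/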

All the credit goes to an article I found online by Fred Wenzel; you can visit it at the link below. Read it, because it explains and extends the subject much better than I do:

Three ways to add a robots.txt to your Django project