Need to add a robots.txt file to your Django project to tell Google and friends what and what not to index on your site?
Here are three ways to add a robots.txt file to Django.
1) The (almost) one-liner
In an article on e-scribe.com, Paul Bissex suggests adding this rule to your urls.py file:
from django.http import HttpResponse

urlpatterns = patterns('',
    ...
    (r'^robots\.txt$', lambda r: HttpResponse("User-agent: *\nDisallow: /", mimetype="text/plain")),
)
The advantage of this solution is that it's a simple one-liner disallowing all bots, with no extra files to create and no clutter anywhere. It's as simple as it gets.
The disadvantage, obviously, is its lack of scalability. The instant you have more than one rule to add, this approach quickly gets out of hand. Also, one could argue that urls.py is not the right place for content of any kind.
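(Note that patterns() and the mimetype argument were dropped in later Django releases. On Django 1.10 and up, the same one-liner can be sketched with path() and content_type instead:)

```python
from django.http import HttpResponse
from django.urls import path

urlpatterns = [
    # ...
    # Same idea: answer robots.txt inline, no template or extra file needed.
    path("robots.txt",
         lambda r: HttpResponse("User-agent: *\nDisallow: /",
                                content_type="text/plain")),
]
```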
2) Direct to template
This one is the most intuitive approach: just drop a robots.txt file into your main templates directory and point to it via direct_to_template:
from django.views.generic.simple import direct_to_template

urlpatterns = patterns('',
    ...
    (r'^robots\.txt$', direct_to_template,
        {'template': 'robots.txt', 'mimetype': 'text/plain'}),
)
Just remember to set the MIME type appropriately to text/plain, and off you go.
The advantage is its simplicity, and if you already have a robots.txt file you want to reuse, there is no extra overhead.
The disadvantage: if your robots file changes somewhat frequently, you have to push changes to your web server every time, which can get tedious. Also, this approach does not save you from typos or the like.
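(direct_to_template was removed in Django 1.5; its class-based replacement is TemplateView, which takes content_type instead of mimetype. On a newer release, the same idea looks roughly like this:)

```python
from django.urls import path
from django.views.generic import TemplateView

urlpatterns = [
    # ...
    # Render templates/robots.txt as plain text, just like direct_to_template did.
    path("robots.txt",
         TemplateView.as_view(template_name="robots.txt",
                              content_type="text/plain")),
]
```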
3) The django-robots app
Finally, there's a full-blown Django app available that you can install and drop into your INSTALLED_APPS: it is called django-robots.
For small projects, this would be overkill, but if you have a lot of rules, or if you need a site admin to change them without pushing changes to the web server, this is your app of choice.
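(A rough sketch of the setup, based on the django-robots README at the time of writing; check the project's documentation for the exact, current steps. The app stores its rules in the database and serves them itself, so it also needs the sites framework:)

```python
# settings.py -- django-robots relies on django.contrib.sites
INSTALLED_APPS = [
    # ...
    "django.contrib.sites",
    "robots",
]
SITE_ID = 1

# urls.py -- hand robots.txt over to the app's own URL conf
from django.urls import include, path

urlpatterns = [
    # ...
    path("robots.txt", include("robots.urls")),
]
```

From then on, site admins can add and edit rules in the Django admin instead of editing files on the server.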
Which one is right for me?
Depending on how complicated your rule set is, any one of these solutions may be the best fit for you. Just choose the one you are most comfortable with and that fits the way you use robots.txt in your application.