A robots.txt file tells search engine crawlers which parts of your site they may visit. It is not strictly required, but most sites should serve one so crawlers know what to do with your pages.
Default robots.txt structure:
User-agent: *
Allow: /
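You can verify what these default rules mean with Python's standard library robots.txt parser. A minimal sketch (the rules string below mirrors the file above):

```python
from urllib.robotparser import RobotFileParser

# The default rules from above: every crawler may fetch every path.
rules = "User-agent: *\nAllow: /\n"

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "/"))               # any crawler, root path
print(parser.can_fetch("Googlebot", "/admin/")) # a specific crawler, deeper path
```

Both checks return True, confirming that "Allow: /" under "User-agent: *" permits all crawlers everywhere.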
To serve this file from a Django app, all you need to do is add robots.txt to your templates folder and update the urls.py file:
from django.urls import path
from django.views.generic.base import TemplateView

urlpatterns = [
    # Render templates/robots.txt as plain text instead of HTML
    path("robots.txt", TemplateView.as_view(template_name="robots.txt", content_type="text/plain")),
]
After that, you can test your robots.txt file in the development server:
http://127.0.0.1:8000/robots.txt