#114: disallow web crawlers by default (robots.txt)
Showing 6 changed files with 18 additions and 1 deletion:

- src/backend/.env.dist (1 addition, 0 deletions)
- src/backend/.env.test (1 addition, 0 deletions)
- src/backend/settings.py (2 additions, 0 deletions)
- src/backend/urls.py (12 additions, 1 deletion)
- src/static/robots_allow.txt (0 additions, 0 deletions)
- src/static/robots_deny.txt (2 additions, 0 deletions)
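Given the files touched, the change presumably reads a toggle from the environment (hence the .env.dist and .env.test additions) and routes /robots.txt to one of the two static files, denying crawlers unless explicitly allowed. Below is a minimal sketch of that wiring, assuming a Django project with a pathlib-style BASE_DIR; the names ALLOW_WEB_CRAWLERS and robots_txt, and the exact static-directory path, are assumptions for illustration, not the MR's actual identifiers.

```python
# settings.py (sketch): read the crawler toggle from the environment.
# ALLOW_WEB_CRAWLERS and its "false" default are assumed names, matching
# the 2 lines added to settings.py and the 1 line added to each .env file.
import os

ALLOW_WEB_CRAWLERS = os.environ.get("ALLOW_WEB_CRAWLERS", "false").lower() == "true"
```

```python
# urls.py (sketch): serve robots_deny.txt by default, robots_allow.txt only
# when the flag is set. The view name and file location are assumptions;
# adjust the path to wherever src/static/ sits relative to BASE_DIR.
from django.conf import settings
from django.http import HttpResponse
from django.urls import path


def robots_txt(request):
    # Deny by default: only hand out the permissive file when opted in.
    name = "robots_allow.txt" if settings.ALLOW_WEB_CRAWLERS else "robots_deny.txt"
    content = (settings.BASE_DIR / "static" / name).read_text()
    return HttpResponse(content, content_type="text/plain")


urlpatterns = [
    path("robots.txt", robots_txt),
    # ... existing routes ...
]
```

The two lines added to robots_deny.txt are most likely the standard blanket block (an assumption consistent with the diff stats), while the empty robots_allow.txt permits all crawlers:

```
User-agent: *
Disallow: /
```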