grandtheftwiki.com
robots.txt

Robots Exclusion Standard data for grandtheftwiki.com

Resource Scan

Scan Details

Site Domain grandtheftwiki.com
Base Domain grandtheftwiki.com
Scan Status Ok
Last Scan 2026-03-09T01:36:41+00:00
Next Scan 2026-03-16T01:36:41+00:00

Last Scan

Scanned 2026-03-09T01:36:41+00:00
URL https://www.grandtheftwiki.com/robots.txt
Domain IPs 172.66.40.223, 172.66.43.33, 2606:4700:3108::ac42:28df, 2606:4700:3108::ac42:2b21
Response IP 172.66.43.33
Found Yes
Hash 0b8e6417320b3962ca64016392655c55f7102aaa8418e26647ae7926f50b3191
SimHash 644c9d21e7b1

Groups

*

Rule Path
Disallow /api.php
Disallow /index.php?
Disallow /*action%3Dedit
Disallow /*action%3Dhistory
Disallow /*action%3Ddelete
Disallow /*action%3Dwatch
Disallow /*diff%3D
Disallow /*oldid%3D
Disallow /*printable%3Dyes
Disallow /Special%3ASearch
Disallow /Special%3ARandom
Disallow /Special%3ARecentChanges
Disallow /Special%3AUserLogin
Disallow /Special%3AUserLogout
Allow /index.php?title=
Allow /$
Allow /Main_Page

Other Records

Field Value
crawl-delay 10
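The group above mixes prefix rules with `*` wildcards and a `$` end anchor, and under RFC 9309 the longest matching rule wins (ties go to Allow). A minimal matcher sketching that resolution logic — note the scan displays the patterns percent-encoded, so this sketch assumes `%3D` decodes to `=` in the live file, and the request paths below are hypothetical examples:

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Test a robots.txt pattern ('*' = any run of characters,
    trailing '$' = end-of-path anchor) against a request path."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Build a regex: escape literals, turn '*' into '.*', anchor at start.
    regex = "^" + "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if anchored:
        regex += "$"
    return re.search(regex, path) is not None

def is_allowed(rules, path: str) -> bool:
    """RFC 9309 resolution: the longest matching pattern wins;
    on a length tie, Allow beats Disallow. No match means allowed."""
    best_directive, best_len = "allow", -1
    for directive, pattern in rules:
        if rule_matches(pattern, path):
            length = len(pattern)
            if length > best_len or (length == best_len and directive == "allow"):
                best_directive, best_len = directive, length
    return best_directive == "allow"

# Subset of the '*' group above, with '%3D' assumed to decode to '='.
RULES = [
    ("disallow", "/api.php"),
    ("disallow", "/index.php?"),
    ("disallow", "/*action=edit"),
    ("allow", "/index.php?title="),
    ("allow", "/$"),
    ("allow", "/Main_Page"),
]
```

For example, `/index.php?title=...` is allowed even though `/index.php?` is disallowed, because the 17-character Allow pattern outranks the 11-character Disallow.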

ahrefsbot

Rule Path
Disallow /

Other Records

Field Value
crawl-delay 30

semrushbot

Rule Path
Disallow /

Other Records

Field Value
crawl-delay 30

dotbot

Rule Path
Disallow /

Other Records

Field Value
crawl-delay 30

mj12bot

Rule Path
Disallow /

Other Records

Field Value
crawl-delay 30

blexbot

Rule Path
Disallow /

Other Records

Field Value
crawl-delay 30

googlebot

Rule Path
Disallow /*action%3Dedit
Disallow /*action%3Dhistory
Disallow /Special%3A

Other Records

Field Value
crawl-delay 5

yandexbot

Rule Path
Disallow /*action%3Dedit
Disallow /*action%3Dhistory
Disallow /Special%3A

Other Records

Field Value
crawl-delay 5

Other Records

Field Value
sitemap https://www.grandtheftwiki.com/sitemap.xml
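The per-agent access, crawl-delay, and sitemap facts this report tabulates can also be queried with Python's standard-library parser. A sketch against an abridged reconstruction of the scanned file (the fragment is typed out from the scan, not fetched live, and `SomeBot` is a hypothetical generic agent):

```python
from urllib.robotparser import RobotFileParser

# Abridged reconstruction of the scanned robots.txt (not fetched live).
ROBOTS_TXT = """\
User-agent: *
Crawl-delay: 10
Disallow: /api.php

User-agent: AhrefsBot
Crawl-delay: 30
Disallow: /

Sitemap: https://www.grandtheftwiki.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

rp.can_fetch("AhrefsBot", "/Main_Page")  # False: AhrefsBot is fully blocked
rp.can_fetch("SomeBot", "/Main_Page")    # True: generic bots may crawl pages
rp.crawl_delay("AhrefsBot")              # 30
rp.crawl_delay("SomeBot")                # 10 (falls back to the '*' group)
rp.site_maps()                           # ['https://www.grandtheftwiki.com/sitemap.xml']
```

`site_maps()` requires Python 3.8+; note that `urllib.robotparser` does not implement `*`/`$` wildcard matching, so it is suitable here only for the prefix rules and the per-agent metadata.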

Comments

  • Grand Theft Wiki - robots.txt
  • Bot crawl control to prevent database overload
  • This file is copied to the container's document root during Docker build.
  • It tells search engines and bots how to crawl the site.
  • Key settings:
  • - 10 second crawl delay for most bots
  • - 5 second delay for Google/Yandex
  • - Block expensive operations (edit, history, API)
  • - Block special pages
  • - Aggressive bots get 30 second delay or full block
  • Crawl rate limits
  • Block expensive operations
  • Allow important pages for SEO
  • Specific aggressive bots - even stricter
  • Googlebot - allow but slower
  • YandexBot - allow but slower
  • Sitemap