sagarjournal.org
robots.txt

Robots Exclusion Standard data for sagarjournal.org

Resource Scan

Scan Details

Site Domain sagarjournal.org
Base Domain sagarjournal.org
Scan Status Ok
Last Scan 2025-10-08T08:22:24+00:00
Next Scan 2025-11-07T08:22:24+00:00

Last Scan

Scanned 2025-10-08T08:22:24+00:00
URL https://sagarjournal.org/robots.txt
Domain IPs 104.21.75.5, 172.67.209.247, 2606:4700:3030::6815:4b05, 2606:4700:3035::ac43:d1f7
Response IP 172.67.209.247
Found Yes
Hash 45be159bd85bf30ddf4cdf42677af0748c13b1a6a818b0036ff7bd888867a44e
SimHash 503468bf63e5
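The Hash field is 64 hexadecimal characters, which is consistent with SHA-256; the report does not name the algorithm, so treat that as an assumption. Under that assumption, a re-fetch can confirm whether the file has changed since the scan:

```python
import hashlib
import urllib.request

# Assumption: the report's Hash field is the SHA-256 digest of the raw
# response body. A mismatch just means the file changed after the scan date.
EXPECTED = "45be159bd85bf30ddf4cdf42677af0748c13b1a6a818b0036ff7bd888867a44e"

with urllib.request.urlopen("https://sagarjournal.org/robots.txt") as resp:
    body = resp.read()

digest = hashlib.sha256(body).hexdigest()
print("unchanged" if digest == EXPECTED else "changed: " + digest)
```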

Groups

*

Rule Path
Disallow (empty path: matches no URLs)
Disallow /3rdparty/
Disallow /admin/
Disallow /admin/admin_index.php
Disallow /backup/
Disallow /cache/
Disallow /install/
Disallow /internal/
Disallow /languages/
Disallow /libs/
Disallow /live/
Disallow /LICENSE.txt
Disallow /logs/
Disallow /modules/
Disallow /plugins/
Disallow /readme.html
Disallow /search.php
Disallow /search/
Disallow /searchurl/
Disallow /tag/
Disallow /templates/
Disallow /new/recent/
Disallow /new/yesterday/
Disallow /new/today/
Disallow /new/week/
Disallow /new/month/
Disallow /new/year/
Disallow /new/alltime/
Disallow /recent/
Disallow /yesterday/
Disallow /today/
Disallow /week/
Disallow /month/
Disallow /year/
Disallow /alltime/
Disallow /upvoted/
Disallow /downvoted/
Disallow /commented/
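The group above applies to the wildcard user agent `*`, so every compliant crawler is subject to the same Disallow list. A minimal sketch of how a crawler would evaluate these rules, using Python's standard-library urllib.robotparser (the sample URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt, then test paths against it the way
# a compliant crawler would.
rp = RobotFileParser("https://sagarjournal.org/robots.txt")
rp.read()

# /cache/ is in the Disallow list above, so this should print False.
print(rp.can_fetch("*", "https://sagarjournal.org/cache/page"))

# A path not covered by any Disallow rule is allowed by default.
print(rp.can_fetch("*", "https://sagarjournal.org/some-story"))
```

Matching is by path prefix, so `Disallow /cache/` blocks everything under that directory, while `Disallow /search.php` blocks that single script (and any path beginning with it).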

Other Records

Field Value
crawl-delay 5
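The crawl-delay of 5 asks crawlers to wait five seconds between requests. Support varies by crawler (Google, for one, ignores Crawl-delay), but Python's RobotFileParser exposes it; a sketch of a polite fetch loop, with hypothetical URLs:

```python
import time
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://sagarjournal.org/robots.txt")
rp.read()

# crawl_delay() returns None when the directive is absent; fall back to the
# value recorded in this scan so the loop stays polite either way.
delay = rp.crawl_delay("*") or 5

for url in ["https://sagarjournal.org/", "https://sagarjournal.org/story-1"]:
    if rp.can_fetch("*", url):
        print("fetching", url)  # placeholder for the real HTTP request
        time.sleep(delay)
```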

Comments

  • 1) this filename (robots.txt) must stay lowercase
  • 2) this file must be in the servers root directory
  • ex: http://www.mydomain.com/pliggsubfolder/ -- you must move the robots.txt from
  • /pliggsubfolder/ to the root folder for http://www.mydomain.com/
  • you must then add your subfolder to each 'Disallow' below
  • ex: Disallow: /cache/ becomes Disallow: /pliggsubfolder/cache/
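These comments match the stock robots.txt shipped with Pligg CMS: if the site is installed in a subfolder, the file must be moved to the web root and every Disallow path given the subfolder prefix. A minimal sketch of that rewrite, assuming the directives use the exact `Disallow: /path` spelling ("pliggsubfolder" is the placeholder name from the comments themselves):

```python
# Prefix each Disallow path with the install subfolder, as the comments
# instruct. Run against the robots.txt before copying it to the web root.
SUBFOLDER = "/pliggsubfolder"  # placeholder taken from the comments

with open("robots.txt") as f:
    lines = f.readlines()

with open("robots.txt", "w") as f:
    for line in lines:
        if line.startswith("Disallow: /"):
            # "Disallow: /cache/" becomes "Disallow: /pliggsubfolder/cache/"
            line = "Disallow: " + SUBFOLDER + line[len("Disallow: "):]
        f.write(line)
```

The bare `Disallow` line (no path) is intentionally left untouched, since it has no path to prefix.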