kicoolo.com
robots.txt

Robots Exclusion Standard data for kicoolo.com

Resource Scan

Scan Details

Site Domain kicoolo.com
Base Domain kicoolo.com
Scan Status Failed
Failure Stage Fetching resource.
Failure Reason Server returned a client error.
Last Scan 2025-10-25T20:30:35+00:00
Next Scan 2025-11-01T20:30:35+00:00
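The failure reason "Server returned a client error" corresponds to a 4xx HTTP status on the robots.txt fetch. A minimal sketch of how a scanner might classify fetch outcomes by status code; the function name and category labels are illustrative, not this service's actual code:

```python
def classify_fetch(status_code: int) -> str:
    """Bucket an HTTP status code into a coarse scan outcome."""
    if 200 <= status_code < 300:
        return "success"
    if 400 <= status_code < 500:
        return "client error"   # e.g. 403 or 404 on /robots.txt
    if 500 <= status_code < 600:
        return "server error"
    return "other"

# A 404 on https://kicoolo.com/robots.txt would land here:
outcome = classify_fetch(404)
```

Note that under RFC 9309, a 4xx response on robots.txt is generally treated by crawlers as "no restrictions", while 5xx responses are treated more conservatively.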

Last Successful Scan

Scanned 2025-10-17T16:45:28+00:00
URL https://kicoolo.com/robots.txt
Domain IPs 209.124.66.22
Response IP 209.124.66.22
Found Yes
Hash 45be159bd85bf30ddf4cdf42677af0748c13b1a6a818b0036ff7bd888867a44e
SimHash 503468bf63e5
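The 64-hex-character Hash is consistent with a SHA-256 digest of the fetched file body (an assumption about the scanner's method; the shorter SimHash is a separate similarity fingerprint). A sketch of computing such a content hash, using a placeholder body rather than the real kicoolo.com file:

```python
import hashlib

# Illustrative stand-in for the fetched robots.txt bytes.
body = b"User-agent: *\nDisallow: /admin/\n"

# SHA-256 yields a 64-character lowercase hex string, matching
# the Hash field's format above.
digest = hashlib.sha256(body).hexdigest()
```

Comparing digests across scans is a cheap way to detect whether the file changed between fetches.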

Groups

*

Rule Path
Disallow
Disallow /3rdparty/
Disallow /admin/
Disallow /admin/admin_index.php
Disallow /backup/
Disallow /cache/
Disallow /install/
Disallow /internal/
Disallow /languages/
Disallow /libs/
Disallow /live/
Disallow /LICENSE.txt
Disallow /logs/
Disallow /modules/
Disallow /plugins/
Disallow /readme.html
Disallow /search.php
Disallow /search/
Disallow /searchurl/
Disallow /tag/
Disallow /templates/
Disallow /new/recent/
Disallow /new/yesterday/
Disallow /new/today/
Disallow /new/week/
Disallow /new/month/
Disallow /new/year/
Disallow /new/alltime/
Disallow /recent/
Disallow /yesterday/
Disallow /today/
Disallow /week/
Disallow /month/
Disallow /year/
Disallow /alltime/
Disallow /upvoted/
Disallow /downvoted/
Disallow /commented/
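The group above applies to all user agents (`*`). The standard-library `urllib.robotparser` can evaluate these rules directly; here is a sketch using a shortened subset of the Disallow list:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# parse() accepts the robots.txt content as an iterable of lines.
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
    "Disallow: /cache/",
    "Disallow: /search/",
])

# Paths under a Disallow prefix are blocked for all agents...
blocked = rp.can_fetch("*", "https://kicoolo.com/admin/")
# ...while unlisted paths remain fetchable.
allowed = rp.can_fetch("*", "https://kicoolo.com/story/123")
```

Each `Disallow` value is a path prefix, so `/admin/` also covers `/admin/admin_index.php`, making that more specific rule redundant but harmless.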

Other Records

Field Value
crawl-delay 5
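The `crawl-delay 5` record asks crawlers to wait five seconds between requests. It is a non-standard directive (omitted from RFC 9309, and ignored by Google, though some other crawlers honor it). `urllib.robotparser` exposes it; a sketch with an illustrative file:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Crawl-delay: 5",
    "Disallow: /admin/",
])

# crawl_delay() returns the per-agent delay, or None if absent.
delay = rp.crawl_delay("*")
```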

Comments

  • 1) This filename (robots.txt) must stay lowercase.
  • 2) This file must be in the server's root directory.
  • Ex: for http://www.mydomain.com/pliggsubfolder/, you must move robots.txt from
  • /pliggsubfolder/ to the root folder for http://www.mydomain.com/.
  • You must then add your subfolder to each 'Disallow' rule below.
  • Ex: Disallow: /cache/ becomes Disallow: /pliggsubfolder/cache/.