coworker.org
robots.txt

Robots Exclusion Standard data for coworker.org

Resource Scan

Scan Details

Site Domain coworker.org
Base Domain coworker.org
Scan Status Ok
Last Scan 2025-07-08T14:08:16+00:00
Next Scan 2025-08-07T14:08:16+00:00

Last Scan

Scanned 2025-07-08T14:08:16+00:00
URL https://coworker.org/robots.txt
Redirect https://www.coworker.org/robots.txt
Redirect Domain www.coworker.org
Redirect Base coworker.org
Domain IPs 104.21.112.1, 104.21.16.1, 104.21.32.1, 104.21.48.1, 104.21.64.1, 104.21.80.1, 104.21.96.1, 2606:4700:3030::6815:1001, 2606:4700:3030::6815:2001, 2606:4700:3030::6815:3001, 2606:4700:3030::6815:4001, 2606:4700:3030::6815:5001, 2606:4700:3030::6815:6001, 2606:4700:3030::6815:7001
Redirect IPs 104.22.38.97, 104.22.39.97, 172.67.29.53, 2606:4700:10::6816:2661, 2606:4700:10::6816:2761, 2606:4700:10::ac43:1d35
Response IP 172.67.29.53
Found Yes
Hash 1d1b1b1b7fd9d4156723a618b38c512f42a6c0ddbacd3fe74e4ace769f2eb9d2
SimHash 26491d09d566

Groups

yahoo! slurp

Rule Path
Disallow /petitions/*/comments
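The `*` in this Disallow path is a wildcard extension (supported by Yahoo! Slurp, Googlebot, and Bingbot, though not part of the original 1994 standard) that matches any sequence of characters. A minimal sketch of how a wildcard-aware crawler might test a URL path against this rule (the function name is illustrative, not from any real crawler):

```python
import re

def rule_matches(rule_path: str, url_path: str) -> bool:
    """Check whether a robots.txt rule path matches a URL path,
    using the common '*' (any sequence) and trailing '$' (end anchor)
    wildcard extensions. Matching is anchored at the start of the
    path, i.e. it is a prefix match unless '$' is present."""
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The scanned rule blocks any petition's comments sub-path:
rule_matches("/petitions/*/comments", "/petitions/raise-wages/comments")  # True
rule_matches("/petitions/*/comments", "/petitions/raise-wages")           # False
```

Note that Python's standard `urllib.robotparser` does plain prefix matching and would not interpret the `*` here; a crawler needs the extended matching above for this rule to behave as intended.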

Comments

  • See https://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
  • Tell Yahoo! Slurp to stop trying to call the AJAX endpoint for the next page of comments
  • Other crawlers seem to be smart enough to not need this.
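Putting the group, rule, and comments above back together, the scanned file plausibly looks like the following (the exact `User-agent` capitalization and line order are assumptions; the scanner reports the group name lowercased):

```
# See https://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
# Tell Yahoo! Slurp to stop trying to call the AJAX endpoint for the next page of comments
# Other crawlers seem to be smart enough to not need this.
User-agent: Yahoo! Slurp
Disallow: /petitions/*/comments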