gsajournals.org robots.txt
Robots Exclusion Standard data for gsajournals.org
Resource Scan
Scan Details
Field | Value |
---|---|
Site Domain | gsajournals.org |
Base Domain | gsajournals.org |
Scan Status | Ok |
Last Scan | 2024-09-15T06:07:54+00:00 |
Next Scan | 2024-10-15T06:07:54+00:00 |
Last Scan
Field | Value |
---|---|
Scanned | 2024-09-15T06:07:54+00:00 |
URL | https://gsajournals.org/robots.txt |
Redirect | https://documentsdelivered.com/robots.txt |
Redirect Domain | documentsdelivered.com |
Redirect Base | documentsdelivered.com |
Domain IPs | 104.21.46.229, 172.67.142.184, 2606:4700:3033::6815:2ee5, 2606:4700:3033::ac43:8eb8 |
Redirect IPs | 172.66.40.59, 172.66.43.197, 2606:4700:3108::ac42:283b, 2606:4700:3108::ac42:2bc5 |
Response IP | 172.66.40.59 |
Found | Yes |
Hash | 5317042ea0cbd09e1e6fdccf77ba535ca3d2b65062b981c6c6aa8ff56b3df35e |
SimHash | 3347527342a7 |
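
The Hash value has the length of a SHA-256 digest (64 hex characters). A minimal sketch of reproducing the fetch and digest, assuming the Hash column is SHA-256 over the raw response body; the redirect to documentsdelivered.com is followed automatically:

```python
import hashlib
import urllib.request

# Fetch the robots.txt; urllib follows the redirect to
# https://documentsdelivered.com/robots.txt by default.
with urllib.request.urlopen("https://gsajournals.org/robots.txt") as resp:
    body = resp.read()
    final_url = resp.geturl()

# Assumption: the report's Hash column is a SHA-256 digest of the raw body.
print(final_url)
print(hashlib.sha256(body).hexdigest())
```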
Groups
mozilla/5.0 (compatible; discobot/1.1; +http://discoveryengine.com/discobot.html)
Rule | Path |
---|---|
Disallow | / |
mozilla/5.0 (compatible; yahoo! slurp china; http://misc.yahoo.com.cn/help.html)
Rule | Path |
---|---|
Disallow | / |
mozilla/5.0+(compatible;+becomebot/3.0;++http://www.become.com/site_owners.html)
Rule | Path |
---|---|
Disallow | / |
vegi bot (we follow your robots.txt settings before crawling, you can slow down the bot by change the crawl-delay parameter in the settings.if you have an enquiry, please email to: abuse-report@terrykyleseoagency.com)
Rule | Path |
---|---|
Disallow | / |
yahooseeker-testing/v3.9 (compatible; mozilla 4.0; msie 5.5; http://search.yahoo.com/)
Rule | Path |
---|---|
Disallow | / |
*
Rule | Path |
---|---|
Disallow | / |
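
Every group, including the wildcard `*`, disallows the entire site. A minimal sketch of evaluating these rules with Python's standard `urllib.robotparser`, reconstructing only the wildcard group (the named groups above carry the same rule):

```python
from urllib.robotparser import RobotFileParser

# Reconstruct the wildcard group from the scan; the named bot groups
# carry the same blanket "Disallow: /" rule.
robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# With "Disallow: /" under "*", every path is blocked for any agent
# that falls through to the wildcard group.
print(parser.can_fetch("discobot", "/any/path"))  # False
print(parser.can_fetch("Googlebot", "/"))         # False
```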
Warnings
- 26 invalid lines.
- `host` is not a known field.
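
`host` is a directive some crawlers have honored historically, but it is not defined by the Robots Exclusion Protocol (RFC 9309); the invalid lines are presumably lines the scanner could not parse as `field: value` records. A minimal sketch of how such checks might work; the field list and logic are illustrative assumptions, not the scanner's actual implementation:

```python
# Illustrative validator; KNOWN_FIELDS and the parsing rules are assumptions.
KNOWN_FIELDS = {"user-agent", "disallow", "allow", "crawl-delay", "sitemap"}

def validate(robots_txt: str) -> tuple[int, set[str]]:
    """Return (count of lines that are not `field: value` records,
    set of field names outside KNOWN_FIELDS)."""
    invalid = 0
    unknown = set()
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        field, sep, _value = line.partition(":")
        if not sep:
            invalid += 1
            continue
        if field.strip().lower() not in KNOWN_FIELDS:
            unknown.add(field.strip().lower())
    return invalid, unknown

print(validate("Host: gsajournals.org\nUser-agent: *\nDisallow: /"))
# -> (0, {'host'})
```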