automat.run
robots.txt

Robots Exclusion Standard data for automat.run

Resource Scan

Scan Details

Site Domain automat.run
Base Domain automat.run
Scan Status Ok
Last Scan 2025-10-20T09:13:15+00:00
Next Scan 2025-11-19T09:13:15+00:00

Last Scan

Scanned 2025-10-20T09:13:15+00:00
URL https://automat.run/robots.txt
Domain IPs 149.202.77.211, 2001:41d0:d:36d3::1
Response IP 149.202.77.211
Found Yes
Hash 57c4af4c39f4dcde3589c02e6016193e2603b62a3f4f1e661724e257747b817a
SimHash e2851df556f1
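
The Hash field is 64 hex characters, the length of a SHA-256 digest. Below is a minimal sketch, assuming (the report does not state this) that Hash is the SHA-256 hex digest of the raw robots.txt response body, for reproducing it from a fresh fetch:

```python
# Minimal sketch: re-fetch the scanned file and hash it.
# Assumption: the report's Hash field is the SHA-256 hex digest of the raw body.
import hashlib
import urllib.request

URL = "https://automat.run/robots.txt"

with urllib.request.urlopen(URL, timeout=10) as resp:
    body = resp.read()

print(hashlib.sha256(body).hexdigest())
# Matches 57c4af4c39f4... only if the file is unchanged since the 2025-10-20 scan.
```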

Groups

googlebot
adsbot-google
adsbot-google-mobile
adsbot-google-mobile-apps
google favicon
googlebot-news
googlebot-image
googlebot-video
mediapartners-google
apis-google
duplexweb-google
bingbot
slurp
duckduckbot
baiduspider
ahrefsbot
rogerbot
yandexbot
dotbot
twitterbot
bingpreview
linkedinbot
yandexbot
facebot
facebookexternalhit
msnbot
msnbot-media

Rule Path
Allow /
Allow /stats.automat.run/matomo.php
Allow /stats.automat.run/piwik.php
Allow /stats.automat.run/matomo.js
Allow /stats.automat.run/piwik.js
Allow /stats.automat.run/js/
Allow /blog/
Allow /steinheuer-art.automat.run/
Allow /dump/
Allow /matomo.php
Allow /piwik.php
Allow /matomo.js
Allow /piwik.js
Allow /js/
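
The rules above can be evaluated with Python's standard-library robots.txt parser. A minimal sketch; the user agents and paths are taken from the Groups and Rule tables above, and the results reflect whatever the live file contains at fetch time:

```python
# Minimal sketch: check a few of the listed agents against the Allow rules.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://automat.run/robots.txt")
rp.read()

checks = [
    ("googlebot", "https://automat.run/blog/"),
    ("bingbot", "https://automat.run/matomo.js"),
    ("ahrefsbot", "https://automat.run/dump/"),
]
for agent, url in checks:
    print(f"{agent}: {url} -> {rp.can_fetch(agent, url)}")
```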

Other Records

Field Value
sitemap https://automat.run/blog/index.php?sitemap_xml=true
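
A minimal sketch for retrieving the advertised sitemap and listing the URLs it contains, assuming it is a standard <urlset> sitemap rather than a sitemap index:

```python
# Minimal sketch: fetch the Sitemap record and print its <loc> entries.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "https://automat.run/blog/index.php?sitemap_xml=true"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP, timeout=10) as resp:
    root = ET.fromstring(resp.read())

for loc in root.findall(".//sm:loc", NS):
    print(loc.text)
```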

Comments

  • See http://www.robotstxt.org/orig.html for documentation on how to use the robots.txt file
  • To ban all spiders from the entire site uncomment the next two lines:
  • User-Agent: *
  • Disallow: /
  • To ban all spiders from only specific directories such as /people /u or /tag etc.
  • User-Agent: *
  • Disallow: /people/
  • Disallow: /u/
  • Disallow: /camo/
  • Allow:/stats.automat.run/
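
The commented-out block above ("User-Agent: *" followed by "Disallow: /") would block every crawler if uncommented. A minimal sketch of that effect, parsing the two directives directly rather than fetching the live file:

```python
# Minimal sketch: effect of uncommenting the "ban all spiders" block.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse(["User-Agent: *", "Disallow: /"])

print(rp.can_fetch("googlebot", "https://automat.run/blog/"))  # False
print(rp.can_fetch("*", "https://automat.run/"))               # False
```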