ecoledirecte.com
robots.txt

Robots Exclusion Standard data for ecoledirecte.com

Resource Scan

Scan Details

Site Domain ecoledirecte.com
Base Domain ecoledirecte.com
Scan Status Ok
Last Scan 2024-06-16T12:16:38+00:00
Next Scan 2024-06-30T12:16:38+00:00

Last Scan

Scanned 2024-06-16T12:16:38+00:00
URL https://ecoledirecte.com/robots.txt
Redirect https://www.ecoledirecte.com/robots.txt
Redirect Domain www.ecoledirecte.com
Redirect Base ecoledirecte.com
Domain IPs 152.228.241.26, 152.228.241.27, 152.228.241.28, 152.228.241.29, 152.228.241.33, 152.228.241.34
Redirect IPs 152.228.241.57, 152.228.241.60, 152.228.241.63, 152.228.241.66, 152.228.241.69, 152.228.241.72
Response IP 152.228.241.57
Found Yes
Hash fd572bd3d7bfa17678f10799f73d3b7f72781b2c8fff0327ddcc743c97d3553c
SimHash 721e54f0e430
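The Hash field is 64 hexadecimal characters, consistent with a SHA-256 digest of the fetched robots.txt body (an assumption; the scanner does not document its hash function). Under that assumption, the change-detection digest could be recomputed with Python's standard library:

```python
import hashlib

def body_hash(body: bytes) -> str:
    """Hex digest used to detect changes between scans.

    Assumption: the report's Hash field is the SHA-256 digest of the raw
    response body. The SimHash field would come from a separate
    locality-sensitive hashing step that is not shown here.
    """
    return hashlib.sha256(body).hexdigest()

# Hypothetical body for illustration; only the full fetched file would
# reproduce the digest shown in the report above.
print(body_hash(b"User-agent: *\nDisallow: /v3/\n"))
```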

Groups

msnbot

Rule Path
Disallow /

mj12bot

Rule Path
Disallow /

ahrefsbot

Rule Path
Disallow /

sogou spider

Rule Path
Disallow /

seokicks-robot

Rule Path
Disallow /

seokicks

Rule Path
Disallow /

discobot

Rule Path
Disallow /

blekkobot

Rule Path
Disallow /

blexbot

Rule Path
Disallow /

sistrix crawler

Rule Path
Disallow /

uptimerobot/2.0

Rule Path
Disallow /

ezooms robot

Rule Path
Disallow /

perl lwp

Rule Path
Disallow /

netestate ne crawler

Rule Path
Disallow /

wiseguys robot

Rule Path
Disallow /

turnitin robot

Rule Path
Disallow /

exabot

Rule Path
Disallow /

yandex

Rule Path
Disallow /

babya discoverer

Rule Path
Disallow /

ccbot

Rule Path
Disallow /

*

Rule Path
Disallow /*?*
Disallow /v3/
Disallow /assets/

Other Records

Field Value
crawl-delay 10
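Mapped back to robots.txt syntax, the groups and records above correspond to a file along these lines. Agent-token capitalization is reconstructed (the scan lowercases tokens), and the repeated per-bot groups are elided, so treat this as a sketch rather than the verbatim file:

```
User-agent: msnbot
Disallow: /

# Block Ahrefs
User-agent: AhrefsBot
Disallow: /

# ... one such group per bot listed above ...

User-agent: *
Disallow: /*?*
Disallow: /v3/
Disallow: /assets/
Crawl-delay: 10
```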

Comments

  • robotstxt.org
  • Block MJ12bot as it is just noise
  • Block Ahrefs
  • Block Sogou
  • Block SEOkicks
  • SEOkicks
  • Discoveryengine.com
  • Blekkobot
  • Block BlexBot
  • Block SISTRIX
  • Block Uptime robot
  • Block Ezooms Robot
  • Block Perl LWP
  • Block netEstate NE Crawler
  • Block WiseGuys Robot
  • Block Turnitin Robot
  • Exabot
  • Yandex
  • Babya Discoverer
  • Common Crawl ChatGPT
  • Block all URLs including query strings (? pattern): contentish objects expose query strings only for actions or status reports, which might confuse search results.
  • Directories
  • Request-rate: defines the pages-per-seconds ratio to be crawled. 1/20 would be 1 page every 20 seconds.
  • Crawl-delay: defines how many seconds to wait after each successful crawl.
  • Visit-time: you can define between which hours you want your pages to be crawled. Example usage: 0100-0330, which means pages will be indexed between 01:00 and 03:30 GMT.
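How a consumer actually interprets these directives varies by parser. As an illustrative sketch (bot names and URLs are made up), Python's standard-library robots.txt parser honors the per-bot groups and crawl-delay, but matches Disallow paths by simple prefix only, so the wildcard rule /*?* is effectively ignored by it, and visit-time is not recognized at all:

```python
import urllib.robotparser

# Condensed reconstruction of the scanned rules (not the verbatim file).
ROBOTS_TXT = """\
User-agent: AhrefsBot
Disallow: /

User-agent: *
Disallow: /*?*
Disallow: /v3/
Disallow: /assets/
Crawl-delay: 10
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# AhrefsBot is blocked from everything by its own group.
print(rp.can_fetch("AhrefsBot", "https://www.ecoledirecte.com/"))          # False
# The catch-all group blocks /v3/ by prefix match.
print(rp.can_fetch("SomeBot", "https://www.ecoledirecte.com/v3/login"))    # False
# The root page is allowed; "/*?*" is treated as a literal prefix, not a wildcard.
print(rp.can_fetch("SomeBot", "https://www.ecoledirecte.com/"))            # True
# Crawl-delay from the catch-all group applies to unlisted agents.
print(rp.crawl_delay("SomeBot"))                                           # 10
```

The same module also parses Request-rate (via `request_rate()`), even though the scanner flags that field as unknown; support for non-standard fields is parser-specific.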

Warnings

  • 2 invalid lines.
  • `request-rate` is not a known field.
  • `visit-time` is not a known field.