
Robots Exclusion Standard data for de.allconstructions.com

Resource Scan

Scan Details

Site Domain de.allconstructions.com
Base Domain allconstructions.com
Scan Status Ok
Last Scan 2024-11-16T17:16:10+00:00
Next Scan 2024-12-16T17:16:10+00:00

Last Scan

Scanned 2024-11-16T17:16:10+00:00
URL https://de.allconstructions.com/robots.txt
Redirect http://de.allconstructions.com/robots.txt
Domain IPs 104.21.32.109, 172.67.185.244, 2606:4700:3030::6815:206d, 2606:4700:3035::ac43:b9f4
Response IP 172.67.185.244
Found Yes
Hash 2a3f9515fd13a66fa0fc8542ffd081e9b69e5d5736b6697198aec24e21610da8
SimHash a41f6815c055
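
The 64-hex-digit Hash is consistent with a SHA-256 digest of the fetched robots.txt body, and the SimHash is a similarity fingerprint used to spot near-duplicate revisions between scans. A minimal sketch for re-verifying the content hash, assuming SHA-256 over the raw response body; the file may have changed since this scan, so a mismatch only means the content has moved on:

    import hashlib
    import urllib.request

    # Assumption: the report's Hash field is SHA-256 over the raw body.
    EXPECTED = "2a3f9515fd13a66fa0fc8542ffd081e9b69e5d5736b6697198aec24e21610da8"

    with urllib.request.urlopen("https://de.allconstructions.com/robots.txt") as resp:
        body = resp.read()

    digest = hashlib.sha256(body).hexdigest()
    print("match" if digest == EXPECTED else "content changed: " + digest)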

Groups

*

Rule Path
Disallow /account
Disallow /account/
Disallow /admin/
Disallow /admin
Disallow /user/
Disallow /user
Disallow /users/
Disallow /users
Disallow /session
Disallow /session/
Disallow /newsletter/
Disallow /portal/viesbuciu-rezervavimas
Disallow /*/article_print/
Disallow /*/product_print/
Disallow /*/company_print/
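
This group mixes plain path prefixes (/account, /admin/) with wildcard patterns (/*/article_print/). Under the widely implemented REP wildcard dialect, * matches any run of characters and a trailing $ anchors the pattern at the end of the path. A minimal matcher sketch, assuming that dialect; the helper names are illustrative, and real parsers additionally apply longest-match precedence between Allow and Disallow rules:

    import re

    def rule_to_regex(path_pattern: str) -> re.Pattern:
        """Translate a robots.txt path pattern to a regex.

        Assumes the REP wildcard dialect: '*' matches any sequence of
        characters and a trailing '$' anchors the match at the end.
        """
        anchored = path_pattern.endswith("$")
        core = path_pattern[:-1] if anchored else path_pattern
        regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
        return re.compile("^" + regex + ("$" if anchored else ""))

    def is_disallowed(url_path: str, disallow_rules: list[str]) -> bool:
        return any(rule_to_regex(r).match(url_path) for r in disallow_rules)

    # A few of the '*' group's rules from this scan:
    rules = ["/account", "/admin/", "/*/article_print/", "/*/product_print/"]
    print(is_disallowed("/de/article_print/123", rules))  # True
    print(is_disallowed("/de/articles/123", rules))       # False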

slurp

Rule Path
Disallow /*.css$

Other Records

Field Value
crawl-delay 5
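
crawl-delay is a non-standard extension asking the named crawler to wait the given number of seconds between requests; the groups below repeat it with different values per agent. A minimal polite-fetch loop honoring the 5-second delay requested of slurp here (the URL list is illustrative):

    import time
    import urllib.request

    CRAWL_DELAY = 5  # seconds, as requested for slurp in this group

    # Illustrative URLs; a real crawler would pull these from its frontier.
    urls = [
        "https://de.allconstructions.com/",
        "https://de.allconstructions.com/portal/",
    ]

    for url in urls:
        with urllib.request.urlopen(url) as resp:
            print(url, resp.status)
        time.sleep(CRAWL_DELAY)  # wait between successive requests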

twiceler.

Rule Path
Disallow /*.css$

Other Records

Field Value
crawl-delay 5

twiceler

Rule Path
Disallow /*.css$

Other Records

Field Value
crawl-delay 5

aport

Rule Path
Disallow /*.css$

Other Records

Field Value
crawl-delay 20

stackrambler

Rule Path
Disallow /*.css$

Other Records

Field Value
crawl-delay 20

msnbot

Rule Path
Disallow /*.css$

Other Records

Field Value
crawl-delay 10

bingbot

Rule Path
Disallow /*.css$

Other Records

Field Value
crawl-delay 10

kalooga

Rule Path
Disallow /
Disallow /*

*

Rule Path
Disallow /*.css$

baiduspider

Rule Path
Disallow /
Disallow /*

baiduspider-image

Rule Path
Disallow /
Disallow /*

baiduspider-ads

Rule Path
Disallow /
Disallow /*

ahrefsbot

Rule Path
Disallow /

yandex

No rules defined. All paths allowed.

Other Records

Field Value
crawl-delay 2

bytespider

Rule Path
Disallow /

Other Records

Field Value
crawl-delay 7
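
Python's standard urllib.robotparser can read both the per-agent groups and these crawl-delay records directly (crawl_delay() is available since Python 3.6). A short sketch; note this parser treats wildcard rules such as /*.css$ as literal path prefixes rather than patterns:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://de.allconstructions.com/robots.txt")
    rp.read()

    # Delays and access as recorded in this scan (values may change over time).
    for agent in ("slurp", "msnbot", "yandex", "bytespider"):
        print(agent, rp.crawl_delay(agent), rp.can_fetch(agent, "/admin/"))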

Comments

  • See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
  • To ban all spiders from the entire site uncomment the next two lines:
  • User-Agent: *
  • Disallow: /
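
The last two comment lines show the blanket ban the file keeps on standby; uncommented, the pair blocks every path for every agent. A quick check of that effect, evaluating the two lines in isolation with the standard-library parser:

    from urllib.robotparser import RobotFileParser

    # The blanket ban described by the file's comments, parsed on its own.
    rp = RobotFileParser()
    rp.parse(["User-Agent: *", "Disallow: /"])

    print(rp.can_fetch("anybot", "/"))         # False: everything blocked
    print(rp.can_fetch("anybot", "/account"))  # False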