przenoszenie-stron.pl
robots.txt

Robots Exclusion Standard data for przenoszenie-stron.pl

Resource Scan

Scan Details

Site Domain przenoszenie-stron.pl
Base Domain przenoszenie-stron.pl
Scan Status Ok
Last Scan 2024-05-14T15:27:23+00:00
Next Scan 2024-06-13T15:27:23+00:00

Last Scan

Scanned 2024-05-14T15:27:23+00:00
URL https://przenoszenie-stron.pl/robots.txt
Domain IPs 109.95.159.40
Response IP 109.95.159.40
Found Yes
Hash 981d60d38b8814de9dbdc799edff43e6bacd1f32d7178acf3a0fd082d9f26998
SimHash ba12d50be140
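
The Hash field is a 64-character hex string, the length of a SHA-256 digest. Assuming the scanner hashes the raw response body with SHA-256 (an assumption; the exact recipe is not documented in this report), a minimal Python sketch to reproduce it would be:

    import hashlib
    import urllib.request

    # Fetch the scanned file and hash its raw bytes.
    # Assumption: the report's "Hash" field is SHA-256 over the
    # response body as delivered; any normalisation the scanner
    # applies before hashing would change the digest.
    url = "https://przenoszenie-stron.pl/robots.txt"
    with urllib.request.urlopen(url) as resp:
        body = resp.read()

    print(hashlib.sha256(body).hexdigest())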

Groups

semrushbot-sa

Rule Path
Disallow /
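
Each group in this section corresponds to one User-agent block in the file, with its rule paths listed beneath it. Rendered back into robots.txt syntax, this first group would read roughly as follows (the report lowercases agent names, so the original casing shown here is an assumption):

    User-agent: SemrushBot-SA
    Disallow: /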

ezooms

Rule Path
Disallow /

proximic

Rule Path
Disallow /php/

mj12bot

Rule Path
Disallow /

ahrefsbot

Rule Path
Disallow /

sogou spider

Rule Path
Disallow /

seokicks-robot

Rule Path
Disallow /

blexbot

Rule Path
Disallow /

sistrix crawler

Rule Path
Disallow /

uptimerobot/2.0

Rule Path
Disallow /

ezooms robot

Rule Path
Disallow /

perl lwp

Rule Path
Disallow /

blexbot

Rule Path
Disallow /

netestate ne crawler (+http://www.website-datenbank.de/)

Rule Path
Disallow /

wiseguys robot

Rule Path
Disallow /

turnitin robot

Rule Path
Disallow /

turnitinbot

Rule Path
Disallow /

turnitin bot

Rule Path
Disallow /

turnitinbot/3.0 (http://www.turnitin.com/robot/crawlerinfo.html)

Rule Path
Disallow /

turnitinbot/3.0

Rule Path
Disallow /

heritrix

Rule Path
Disallow /

pimonster

Rule Path
Disallow /

pimonster

Rule Path
Disallow /

searchmetricsbot

Rule Path
Disallow /

eccp/1.0 (search@eniro.com)

Rule Path
Disallow /

yandex

Rule Path
Disallow /

baiduspider
baiduspider-video
baiduspider-image
mozilla/5.0 (compatible; baiduspider/2.0; +http://www.baidu.com/search/spider.html)
mozilla/5.0 (compatible; baiduspider/3.0; +http://www.baidu.com/search/spider.html)
mozilla/5.0 (compatible; baiduspider/4.0; +http://www.baidu.com/search/spider.html)
mozilla/5.0 (compatible; baiduspider/5.0; +http://www.baidu.com/search/spider.html)
baiduspider/2.0
baiduspider/3.0
baiduspider/4.0
baiduspider/5.0

Rule Path
Disallow /
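
Under the Robots Exclusion Standard, several consecutive User-agent lines share the rule set that follows them, which is presumably how this multi-agent Baidu group appeared in the file, roughly:

    User-agent: Baiduspider
    User-agent: Baiduspider-video
    User-agent: Baiduspider-image
    # (full mozilla/5.0 strings and versioned variants omitted here)
    Disallow: /

Note that crawlers match their short product token (e.g. Baiduspider), not their full HTTP User-Agent header, so the mozilla/5.0 (compatible; ...) entries in this group are effectively redundant.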

sogou spider

Rule Path
Disallow /

youdaobot

Rule Path
Disallow /

gsa-crawler (enterprise; t4-knhh62cdkc2w3; gsa_manage@nikon-sys.co.jp)

Rule Path
Disallow /

megaindex.ru/2.0

Rule Path
Disallow /

megaindex.ru

Rule Path
Disallow /

megaindex.ru

Rule Path
Disallow /
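
To check how a standards-following client would interpret these rules, Python's urllib.robotparser can parse a reconstruction of the file. The snippet below rebuilds two representative groups from this report (a sketch, not a live fetch) and queries them:

    from urllib.robotparser import RobotFileParser

    # Two groups reconstructed from the report above: ahrefsbot is
    # blocked from the whole site, proximic only from /php/.
    rules = """
    User-agent: AhrefsBot
    Disallow: /

    User-agent: proximic
    Disallow: /php/
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    print(rp.can_fetch("AhrefsBot", "https://przenoszenie-stron.pl/"))      # False
    print(rp.can_fetch("proximic", "https://przenoszenie-stron.pl/php/x"))  # False
    print(rp.can_fetch("proximic", "https://przenoszenie-stron.pl/"))       # True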

Comments

  • robots.txt automatically generated by PrestaShop e-commerce open-source solution
  • http://www.prestashop.com - http://www.prestashop.com/forums
  • This file is to prevent the crawling and indexing of certain parts
  • of your site by web crawlers and spiders run by sites like Yahoo!
  • and Google. By telling these "robots" where not to go on your site,
  • you save bandwidth and server resources.
  • For more information about the robots.txt standard, see:
  • http://www.robotstxt.org/robotstxt.html
  • Block MJ12bot as it is just noise
  • Block Ahrefs
  • Block Sogou
  • Block SEOkicks
  • Block BlexBot
  • Block SISTRIX
  • Block Uptime robot
  • Block Ezooms Robot
  • Block Perl LWP
  • Block BlexBot
  • Block netEstate NE Crawler (+http://www.website-datenbank.de/)
  • Block WiseGuys Robot
  • Block Turnitin Robot
  • Block Heritrix
  • Block pricepi
  • Block Searchmetrics Bot
  • Block Eniro
  • Block YandexBot
  • Block Baidu
  • Block SoGou
  • Block Youdao
  • Block Nikon JP Crawler
  • Block MegaIndex.ru

Warnings

  • 2 invalid lines.