Robots Exclusion Standard data for pietzoomers.com

Resource Scan

Scan Details

Site Domain pietzoomers.com
Base Domain pietzoomers.com
Scan Status Ok
Last Scan 2026-02-02T08:21:52+00:00
Next Scan 2026-03-04T08:21:52+00:00

Last Scan

Scanned 2026-02-02T08:21:52+00:00
URL https://pietzoomers.com/robots.txt
Redirect https://www.pietzoomers.com/robots.txt
Redirect Domain www.pietzoomers.com
Redirect Base pietzoomers.com
Domain IPs 104.26.0.78, 104.26.1.78, 172.67.72.99, 2606:4700:20::681a:14e, 2606:4700:20::681a:4e, 2606:4700:20::ac43:4863
Redirect IPs 104.26.0.78, 104.26.1.78, 172.67.72.99, 2606:4700:20::681a:14e, 2606:4700:20::681a:4e, 2606:4700:20::ac43:4863
Response IP 172.67.72.99
Found Yes
Hash c2fac87af355ca4db909066e3c9c4d41725ef4137653aa080f405f75027cab37
SimHash af2cf349cbf1

Groups

*

Rule Path
Allow /*?page=
Allow /*
Disallow *brand%5Bfilter%5D*
Disallow *color_group%5Bfilter%5D*
Disallow *size%5Bfilter%5D*
Disallow *price%5Bfilter%5D*
Disallow *salelabel%5Bfilter%5D*
Disallow *hoofdgroep%5Bfilter%5D*
Disallow /404/
Disallow /app/
Disallow /cgi-bin/
Disallow /downloader/
Disallow /includes/
Disallow /js/
Disallow /lib/
Disallow /magento/
Disallow /pkginfo/
Disallow /report/
Disallow /skin/
Disallow /stats/
Disallow /var/
Disallow /index.php/
Disallow /catalog/product_compare/
Disallow /catalog/category/view/
Disallow /catalog/product/view/
Disallow /catalogsearch/
Disallow /checkout/
Disallow /control/
Disallow /contacts/
Disallow /customer/
Disallow /customize/
Disallow /newsletter/
Disallow /poll/
Disallow /review/
Disallow /sendfriend/
Disallow /tag/
Disallow /wishlist/
Disallow /cron.php
Disallow /cron.sh
Disallow /error_log
Disallow /install.php
Disallow /LICENSE.html
Disallow /LICENSE.txt
Disallow /LICENSE_AFL.txt
Disallow /STATUS.txt
Disallow /*?SID=
Disallow /*.php$
Disallow /*.js$
Disallow /*.css$
Disallow /*.php$
Disallow /*?p=*&
Disallow /*?SID=
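
Several of the rules above use `*` wildcards and the `$` end-anchor from Google's extended robots.txt syntax, which Python's standard `urllib.robotparser` does not implement (it matches rule paths as literal prefixes). A minimal sketch of how such wildcard rules match URL paths, using hypothetical helper names and deliberately ignoring Allow/Disallow precedence:

```python
import re

def rule_to_regex(rule_path: str) -> re.Pattern:
    """Translate a robots.txt rule path with * and $ wildcards into a regex."""
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        # A trailing $ anchors the rule at the end of the URL path
        pattern = pattern[:-2] + "$"
    return re.compile(pattern)

def is_disallowed(url_path: str, disallow_rules) -> bool:
    """True if any Disallow rule matches (Allow precedence deliberately ignored)."""
    return any(rule_to_regex(r).match(url_path) for r in disallow_rules)

# A few of the rules from the group above
rules = ["/checkout/", "/*.php$", "/*?SID=", "*brand%5Bfilter%5D*"]
print(is_disallowed("/checkout/cart", rules))                # True
print(is_disallowed("/index.php", rules))                    # True
print(is_disallowed("/index.php?x=1", rules))                # False: $ anchors .php at the end
print(is_disallowed("/shop?brand%5Bfilter%5D=nike", rules))  # True
```

Note how `Disallow /*.php$` blocks bare `.php` URLs but not ones with a query string, and how the filter-parameter rules (no leading `/`) match anywhere in the path because of the leading `*`.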

Other Records

Field Value
crawl-delay 5
sitemap https://www.pietzoomers.com/sitemap.xml
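
The crawl-delay and sitemap records can be read programmatically with Python's standard `urllib.robotparser` (the `site_maps()` method requires Python 3.8+). A small sketch using an inlined excerpt of the file; only plain-prefix Disallow rules are shown, since this parser treats wildcard rules as literal text:

```python
from urllib.robotparser import RobotFileParser

# Inlined excerpt of https://www.pietzoomers.com/robots.txt (prefix rules only)
robots_txt = """\
User-agent: *
Crawl-delay: 5
Disallow: /checkout/
Sitemap: https://www.pietzoomers.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.crawl_delay("*"))  # 5
print(rp.can_fetch("MyBot", "https://www.pietzoomers.com/checkout/cart"))  # False
print(rp.site_maps())       # ['https://www.pietzoomers.com/sitemap.xml']
```

A polite crawler would sleep for the `crawl_delay` value between successive requests to this host; Google ignores the directive (as the file's own comments note), but Bing honors it.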

Comments

  • Robots.txt last edited by Jos Jonkeren 2022-11-03 15:16
  • Crawlers Setup
  • Google doesn't support the crawl-delay directive, so its crawlers simply ignore it.
  • Bing: crawl at most one page per 5 seconds.
  • Sitemap location
  • Allowable Index
  • Disallow different URL parameters to be indexed
  • Disallow directories
  • Disallow paths (clean URLs)
  • Disallow files
  • Do not index session ID
  • Disallow paths (no clean URLs)