actec.dk
robots.txt

Robots Exclusion Standard data for actec.dk

Resource Scan

Scan Details

Site Domain actec.dk
Base Domain actec.dk
Scan Status Failed
Failure Stage Fetching resource.
Failure Reason Server returned a client error.
Last Scan 2026-01-28T13:20:58+00:00
Next Scan 2026-03-29T13:20:58+00:00

Last Successful Scan

Scanned 2025-11-05T15:45:52+00:00
URL https://actec.dk/robots.txt
Domain IPs 104.26.6.19, 104.26.7.19, 172.67.74.218, 2606:4700:20::681a:613, 2606:4700:20::681a:713, 2606:4700:20::ac43:4ada
Response IP 104.26.6.19
Found Yes
Hash a59368e2cdf8fd80cd4db249e58b96fea5d997be499c1dcb986e017761db9079
SimHash ac8abf5766f5

Groups

googlebot-image

Rule Path
Disallow

yandexbot

Rule Path
Allow /*?p=
Disallow /*?p=*&
Disallow /*?

*

Rule Path
Allow /*?p=
Disallow /404/
Disallow /app/
Disallow /cgi-bin/
Disallow /downloader/
Disallow /errors/
Disallow /includes/
Disallow /magento/
Disallow /media/captcha/
Disallow /media/customer/
Disallow /media/dhl/
Disallow /media/downloadable/
Disallow /media/import/
Disallow /media/pdf/
Disallow /media/sales/
Disallow /media/tmp/
Disallow /media/xmlconnect/
Disallow /pkginfo/
Disallow /report/
Disallow /scripts/
Disallow /shell/
Disallow /stats/
Disallow /var/
Disallow */index.php/
Disallow */catalog/product_compare/
Disallow */catalog/category/view/
Disallow */catalog/product/view/
Disallow */catalog/product/gallery/
Disallow */catalogsearch/
Disallow */control/
Disallow */contacts/
Disallow */customer/
Disallow */customize/
Disallow */newsletter/
Disallow */poll/
Disallow */review/
Disallow */sendfriend/
Disallow */tag/
Disallow */wishlist/
Disallow */checkout/
Disallow */onestepcheckout/
Disallow /cron.php
Disallow /cron.sh
Disallow /error_log
Disallow /install.php
Disallow /LICENSE.html
Disallow /LICENSE.txt
Disallow /LICENSE_AFL.txt
Disallow /STATUS.txt
Disallow /*?dir*
Disallow /*?limit*
Disallow /*?mode*
Disallow /*?___from_store=*
Disallow /*?___store=*
Disallow /*?cat=*
Disallow /*?q=*
Disallow /*?price=*
Disallow /*?availability=*
Disallow /*?brand=*
Disallow /*?p=*&
Disallow /*.php$
Disallow /*?SID=
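The `*` group above is mostly plain path prefixes. A minimal sketch of checking URLs against those prefix rules with Python's standard-library `urllib.robotparser` (wildcard rules are omitted here, since `robotparser` follows the original standard and matches plain prefixes only; the bot name is illustrative):

```python
from urllib import robotparser

# A prefix-only subset of the "*" group reconstructed from the scan above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /var/
Disallow: /cron.php
Disallow: /LICENSE.txt
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Paths under a disallowed prefix are blocked; everything else is allowed.
print(rp.can_fetch("ExampleBot", "https://actec.dk/app/etc/local.xml"))
print(rp.can_fetch("ExampleBot", "https://actec.dk/some-category.html"))
```

`can_fetch` resolves the user agent against the `User-agent: *` group and applies the longest-known prefix semantics of the original exclusion standard.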

Other Records

Field Value
sitemap https://actec.dk/sitemap.xml

Comments

  • robots.txt for Magento 1.9.x / v1.6 2018-08-19 / Peeter Marvet
  • (original version from 2015; edited in 2017 to add filter query parameter disallow samples plus some wildcards,
  • edited in 2018 to add query-param blocking to the Yandex group, since a named User-agent does not read the * group)
  • based on:
  • http://inchoo.net/ecommerce/ultimate-magento-robots-txt-file-examples/
  • http://www.byte.nl/blog/magento-robots-txt/
  • https://astrio.net/blog/optimize-robots-txt-for-magento/
  • comment and clone at https://gist.github.com/petskratt/016c9dbf159a81b9d6aa
  • Keep in mind that, per the standard, robots.txt should NOT contain empty lines, except between User-agent blocks!
  • Sitemap (uncomment and change; add language/shop-specific sitemaps if running on multiple domains.
  • Keep in mind a sitemap can only point to its own domain, so something like sitemapindex.php is needed)
  • Sitemap: http://example.com/sitemap.xml
  • Google Image Crawler Setup - having a crawler-specific section makes that crawler ignore the generic one, e.g. *
  • Yandex tends to be rather aggressive, so it may be worth keeping them at arm's length
  • Crawl-delay: 20
  • Problem is mostly related to layered nav and query params, allow only paging
  • Crawlers Setup
  • Allow paging (unless paging inside a listing with more params, as disallowed below)
  • Directories
  • Disallow: /media/
  • Disallow: /media/catalog/
  • Disallow: /media/wysiwyg/
  • Disallow: /skin/
  • Paths (if using shop id in URL must prefix with * or copy for each)
  • Files
  • Do not crawl sub category pages that are sorted or filtered.
  • This one would be very broad and could hurt the site (incl. SEO):
  • Disallow: /*?*
  • These are more specific, pick what you need - and do not forget to add your custom filters!
  • Paths that can be safely ignored (no clean URLs)
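The wildcard rules this file relies on (e.g. `/*?p=*&`, `/*.php$`) are Google/Yandex extensions to the original exclusion standard: `*` matches any character sequence and a trailing `$` anchors the end of the URL. A minimal sketch of translating such a pattern into a regex (the function name `pattern_to_regex` is illustrative, not part of any library):

```python
import re

def pattern_to_regex(pattern: str) -> "re.Pattern[str]":
    """Translate a robots.txt path pattern ('*' wildcard, trailing '$'
    anchor) into a regex that prefix-matches a URL path plus query."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn each '*' into '.*'.
    body = ".*".join(re.escape(part) for part in pattern.split("*"))
    return re.compile("^" + body + ("$" if anchored else ""))

# '/*?p=*&' blocks paged listings that carry extra query parameters,
# while plain paging ('?p=2' alone) stays crawlable.
print(bool(pattern_to_regex("/*?p=*&").match("/category?p=2&dir=asc")))
print(bool(pattern_to_regex("/*?p=*&").match("/category?p=2")))
# '/*.php$' blocks direct .php hits but not paths continuing past '.php'.
print(bool(pattern_to_regex("/*.php$").match("/cron.php")))
print(bool(pattern_to_regex("/*.php$").match("/index.php/foo")))
```

This is why the Yandex group repeats the query-param rules: crawlers that honor these extensions apply them per matched group, and a named group replaces the `*` group entirely.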