
Robots Exclusion Standard data for netdecker.cl

Resource Scan

Scan Details

Site Domain netdecker.cl
Base Domain netdecker.cl
Scan Status Ok
Last Scan 2025-10-27T12:44:32+00:00
Next Scan 2025-11-03T12:44:32+00:00

Last Scan

Scanned 2025-10-27T12:44:32+00:00
URL https://netdecker.cl/robots.txt
Domain IPs 162.241.105.214
Response IP 162.241.105.214
Found Yes
Hash 976a2cc62b2dcfcfae80a14c4b19f5d27c35bf5e9bbd1dd175388651da930f1c
SimHash ee8321524545
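
The Hash field is a 64-character hex string, which is consistent with a SHA-256 digest of the raw robots.txt body; this is an assumption, since the scan report does not name the algorithm. A minimal offline sketch of how such a content hash could be computed:

```python
# Hypothetical sketch: deriving a content hash like the Hash field above,
# assuming it is the SHA-256 digest of the raw robots.txt bytes.
# The sample body below is illustrative, not the actual fetched file.
import hashlib

sample_body = b"User-agent: *\nAllow: /ads.txt\nDisallow: /ads\n"
digest = hashlib.sha256(sample_body).hexdigest()

print(digest)  # 64 lowercase hex characters
```

Re-fetching https://netdecker.cl/robots.txt and hashing the response body the same way would let you confirm whether the file has changed since the last scan.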

Groups

*

Rule Path
Allow /ads.txt
Disallow /ads
Disallow /classes/
Disallow /config/
Disallow /download/
Disallow /mails/
Disallow /modules/
Disallow /translations/
Disallow /tools/
Disallow /lang-es/
Disallow /addresses.php
Disallow /address.php
Disallow /authentication.php
Disallow /cart.php
Disallow /discount.php
Disallow /footer.php
Disallow /get-file.php
Disallow /header.php
Disallow /history.php
Disallow /identity.php
Disallow /images.inc.php
Disallow /init.php
Disallow /my-account.php
Disallow /order.php
Disallow /order-opc.php
Disallow /order-slip.php
Disallow /order-detail.php
Disallow /order-follow.php
Disallow /order-return.php
Disallow /order-confirmation.php
Disallow /pagination.php
Disallow /password.php
Disallow /pdf-invoice.php
Disallow /pdf-order-return.php
Disallow /pdf-order-slip.php
Disallow /product-sort.php
Disallow /search.php
Disallow /statistics.php
Disallow /attachment.php
Disallow /guest-tracking
Disallow /*orderby%3D
Disallow /*orderway%3D
Disallow /*tag%3D
Disallow /*id_currency%3D
Disallow /*search_query%3D
Disallow /*id_lang%3D
Disallow /*back%3D
Disallow /*utm_source%3D
Disallow /*utm_medium%3D
Disallow /*utm_campaign%3D
Disallow /*n%3D
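
The group above can be tested locally with Python's standard-library robots.txt parser. The sketch below feeds a small excerpt of the rules (the wildcard `/*...%3D` patterns are omitted, since `urllib.robotparser` matches paths by literal prefix and does not implement `*` wildcards) and checks a few URLs; the URLs are illustrative.

```python
# Check sample URLs against an excerpt of the "*" group listed above,
# using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /ads.txt
Disallow: /ads
Disallow: /classes/
Disallow: /cart.php
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Rules are matched in order, so the Allow line for /ads.txt takes
# precedence over the broader Disallow /ads prefix.
print(parser.can_fetch("*", "https://netdecker.cl/ads.txt"))   # True
print(parser.can_fetch("*", "https://netdecker.cl/cart.php"))  # False
```

This mirrors the intent of the group: `/ads.txt` stays crawlable while the cart, account, and back-office paths are excluded.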

Other Records

Field Value
sitemap http://store.netdecker.cl/sitemap.xml

Comments

  • robots.txt automatically generated by PrestaShop e-commerce open-source solution
  • http://www.prestashop.com - http://www.prestashop.com/forums
  • This file is to prevent the crawling and indexing of certain parts
  • of your site by web crawlers and spiders run by sites like Yahoo!
  • and Google. By telling these "robots" where not to go on your site,
  • you save bandwidth and server resources.
  • For more information about the robots.txt standard, see:
  • http://www.robotstxt.org/wc/robots.html
  • Directories
  • Files
  • Sitemap