curaprox.de
robots.txt

Robots Exclusion Standard data for curaprox.de

Resource Scan

Scan Details

Site Domain curaprox.de
Base Domain curaprox.de
Scan Status Ok
Last Scan 2024-06-18T18:37:11+00:00
Next Scan 2024-07-18T18:37:11+00:00

Last Scan

Scanned 2024-06-18T18:37:11+00:00
URL https://curaprox.de/robots.txt
Domain IPs 185.141.21.196
Response IP 185.141.21.196
Found Yes
Hash f81abc06bdcbf0a1a301a6cb90ad8d34ee1a12f13cd9deb1bb7fca8d5f3fec5b
SimHash c060793b0115
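
The Hash field above is 64 hexadecimal characters, which looks like a SHA-256 digest of the fetched file; that is an assumption, since the report does not name the algorithm. A minimal sketch of reproducing such a fingerprint with Python's standard library (the digest will only match the value above if the live file is unchanged since the scan):

```python
import hashlib
import urllib.request

# URL taken from the Last Scan record above.
ROBOTS_URL = "https://curaprox.de/robots.txt"

with urllib.request.urlopen(ROBOTS_URL) as resp:
    body = resp.read()

# Assumption: the report's "Hash" field is a SHA-256 digest of the raw body.
print(hashlib.sha256(body).hexdigest())
```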

Groups

*

Rule Path
Allow */modules/*.css
Allow */modules/*.js
Allow */modules/*.png
Allow */modules/*.jpg
Allow */modules/*.gif
Allow */modules/*.svg
Allow */modules/*.webp
Allow /js/jquery/*
Disallow /*?order=
Disallow /*?tag=
Disallow /*?id_currency=
Disallow /*?search_query=
Disallow /*?back=
Disallow /*?n=
Disallow /*%26order%3D
Disallow /*%26tag%3D
Disallow /*%26id_currency%3D
Disallow /*%26search_query%3D
Disallow /*%26back%3D
Disallow /*%26n%3D
Disallow /*controller%3Daddresses
Disallow /*controller%3Daddress
Disallow /*controller%3Dauthentication
Disallow /*controller%3Dcart
Disallow /*controller%3Ddiscount
Disallow /*controller%3Dfooter
Disallow /*controller%3Dget-file
Disallow /*controller%3Dheader
Disallow /*controller%3Dhistory
Disallow /*controller%3Didentity
Disallow /*controller%3Dimages.inc
Disallow /*controller%3Dinit
Disallow /*controller%3Dmy-account
Disallow /*controller%3Dorder
Disallow /*controller%3Dorder-slip
Disallow /*controller%3Dorder-detail
Disallow /*controller%3Dorder-follow
Disallow /*controller%3Dorder-return
Disallow /*controller%3Dorder-confirmation
Disallow /*controller%3Dpagination
Disallow /*controller%3Dpassword
Disallow /*controller%3Dpdf-invoice
Disallow /*controller%3Dpdf-order-return
Disallow /*controller%3Dpdf-order-slip
Disallow /*controller%3Dproduct-sort
Disallow /*controller%3Dregistration
Disallow /*controller%3Dsearch
Disallow /*controller%3Dstatistics
Disallow /*controller%3Dattachment
Disallow /*controller%3Dguest-tracking
Disallow /app/
Disallow /cache/
Disallow /classes/
Disallow /config/
Disallow /controllers/
Disallow /download/
Disallow /js/
Disallow /localization/
Disallow /log/
Disallow /mails/
Disallow /modules/
Disallow /override/
Disallow /pdf/
Disallow /src/
Disallow /tools/
Disallow /translations/
Disallow /upload/
Disallow /var/
Disallow /vendor/
Disallow /webservice/
Disallow /adresse
Disallow /adressen
Disallow /authentifizierung
Disallow /warenkorb
Disallow /Rabatt
Disallow /auftragsverfolgung-gast
Disallow /bestellungsverlauf
Disallow /kennung
Disallow /mein-Konto
Disallow /bestellung
Disallow /bestellungsverfolgung
Disallow /bestellschein
Disallow /kennwort-wiederherstellung
Disallow /suche
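
The patterns in this group, such as /*?order= and */modules/*.css, rely on the de facto wildcard extension to the Robots Exclusion Standard: '*' matches any run of characters and a trailing '$' anchors the match to the end of the path. Python's urllib.robotparser does not interpret these wildcards, so the sketch below hand-rolls the matching; the example URLs are hypothetical and precedence between Allow and Disallow (longest match wins) is not handled:

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Return True if a robots.txt rule pattern matches a URL path.

    Simplified sketch of the de facto wildcard rules: '*' matches any
    run of characters and a trailing '$' anchors the match to the end
    of the path.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    regex = "^" + ".*".join(re.escape(piece) for piece in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# Hypothetical paths checked against rules from the '*' group above:
print(rule_matches("/*?order=", "/toothbrushes?order=price"))      # True  -> disallowed
print(rule_matches("*/modules/*.css", "/themes/x/modules/a.css"))  # True  -> allowed
print(rule_matches("/*?tag=", "/toothbrushes"))                    # False -> rule does not apply
```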

adstxtcrawler

Rule Path
Disallow /

ahrefsbot

Rule Path
Disallow /

coccocbot-web

Rule Path
Disallow /

dataforseobot

Rule Path
Disallow /

ltx71

Rule Path
Disallow /

mj12bot

Rule Path
Disallow /

yeti

Rule Path
Disallow /

dotbot

Rule Path
Disallow /

pinterestbot

Rule Path
Disallow /

seznambot

Rule Path
Disallow /

sogou web spider

Rule Path
Disallow /

x28-job-bot

Rule Path
Disallow /

obot

Rule Path
Disallow /

trendictionbot

Rule Path
Disallow /

mrgbot

Rule Path
Disallow /

at-bot

Rule Path
Disallow /

serendeputybot

Rule Path
Disallow /

crawlson

Rule Path
Disallow /

ev-crawler

Rule Path
Disallow /

scrapy

Rule Path
Disallow /

barkrowler

Rule Path
Disallow /

seokicks

Rule Path
Disallow /

baiduspider 2.0

Rule Path
Disallow /

zoominfobot

Rule Path
Disallow /

zumbot

Rule Path
Disallow /

sancheezebot

Rule Path
Disallow /

bidswitchbot

Rule Path
Disallow /

mail.ru_bot

Rule Path
Disallow /
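
Each of the per-bot groups above corresponds to a "User-agent: <name>" block containing a single "Disallow: /", i.e. a site-wide block for that crawler. Because these blanket rules contain no wildcards, Python's standard urllib.robotparser evaluates them correctly; a short sketch of how a well-behaved crawler would check them:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://curaprox.de/robots.txt")
rp.read()

# The per-bot groups above disallow the whole site for these crawlers.
print(rp.can_fetch("AhrefsBot", "https://curaprox.de/"))     # expected: False
print(rp.can_fetch("MJ12bot", "https://curaprox.de/suche"))  # expected: False

# Unlisted agents fall back to the '*' group; robotparser does not
# interpret the wildcard patterns there, so those results are only
# approximate.
print(rp.can_fetch("SomeOtherBot", "https://curaprox.de/"))
```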

*

No rules defined. All paths allowed.

Other Records

Field Value
sitemap https://curaprox.de/sitemap-index.xml
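
The sitemap record points crawlers at a sitemap index rather than a single sitemap. A hedged sketch of listing its child sitemaps with the standard library, assuming the file follows the usual sitemaps.org index schema (not verified against the live file):

```python
import urllib.request
import xml.etree.ElementTree as ET

# Sitemap index URL from the record above.
INDEX_URL = "https://curaprox.de/sitemap-index.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(INDEX_URL) as resp:
    tree = ET.parse(resp)

# Print the URL of every child sitemap referenced by the index.
for loc in tree.findall(".//sm:sitemap/sm:loc", NS):
    print(loc.text)
```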

Comments

  • robots.txt automatically generated by PrestaShop e-commerce open-source solution
  • https://www.prestashop.com - https://www.prestashop.com/forums
  • This file is to prevent the crawling and indexing of certain parts
  • of your site by web crawlers and spiders run by sites like Yahoo!
  • and Google. By telling these "robots" where not to go on your site,
  • you save bandwidth and server resources.
  • For more information about the robots.txt standard, see:
  • https://www.robotstxt.org/robotstxt.html
  • Allow Directives
  • Private pages
  • Directories for curaprox.de
  • Files