paulcamper.de
robots.txt

Robots Exclusion Standard data for paulcamper.de

Resource Scan

Scan Details

Site Domain paulcamper.de
Base Domain paulcamper.de
Scan Status Failed
Failure Stage Fetching resource.
Failure Reason Server returned a client error.
Last Scan 2025-11-26T18:02:40+00:00
Next Scan 2025-12-26T18:02:40+00:00

Last Successful Scan

Scanned 2025-10-19T00:55:06+00:00
URL https://paulcamper.de/robots.txt
Redirect https://paulcamper.de/robots-de.txt
Domain IPs 2600:9000:2894:1000:4:af7d:bac0:93a1, 2600:9000:2894:1c00:4:af7d:bac0:93a1, 2600:9000:2894:3200:4:af7d:bac0:93a1, 2600:9000:2894:4600:4:af7d:bac0:93a1, 2600:9000:2894:9a00:4:af7d:bac0:93a1, 2600:9000:2894:a800:4:af7d:bac0:93a1, 2600:9000:2894:b000:4:af7d:bac0:93a1, 2600:9000:2894:be00:4:af7d:bac0:93a1, 3.170.229.15, 3.170.229.7, 3.170.229.84, 3.170.229.86
Response IP 3.170.229.86
Found Yes
Hash dca7c85869611198b69383b4cb99ad510e1464028589f2b8a38e8601061c7283
SimHash eabd4d154d51

Groups

*

Rule Path
Disallow /checkout
Disallow /cart
Disallow /orders
Disallow /user
Disallow /account
Disallow /password
Disallow /api
Disallow /owner-pds
Disallow /.well-known/
Disallow /apple-app-site-association
Disallow /apple-developer-merchantid-domain-association.txt
Disallow /assetlinks.json
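A minimal sketch of how a compliant crawler would evaluate these rules, using Python's standard-library `urllib.robotparser`. The rule text is fed in directly (copied from the group above) rather than fetched over the network, since the live scan currently fails with a client error:

```python
from urllib import robotparser

# Rules as recorded in the last successful scan (group "*").
rules = """
User-agent: *
Disallow: /checkout
Disallow: /cart
Disallow: /orders
Disallow: /user
Disallow: /account
Disallow: /password
Disallow: /api
Disallow: /owner-pds
Disallow: /.well-known/
Disallow: /apple-app-site-association
Disallow: /apple-developer-merchantid-domain-association.txt
Disallow: /assetlinks.json
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed path: checkout flow is blocked for all user agents.
print(rp.can_fetch("*", "https://paulcamper.de/checkout"))  # False
# The site root carries no rule, so it remains crawlable.
print(rp.can_fetch("*", "https://paulcamper.de/"))          # True
```

Because the group is `User-agent: *`, every compliant crawler is subject to the same path prefixes; there are no bot-specific overrides in this file.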

Other Records

Field Value
sitemap https://www.paulcamper.de/sitemap.xml.gz

Comments

  • See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
  • To ban all spiders from the entire site uncomment the next two lines:
  • User-agent: *
  • Disallow: /