jellyfishsurfshop.com
robots.txt

Robots Exclusion Standard data for jellyfishsurfshop.com

Resource Scan

Scan Details

Site Domain jellyfishsurfshop.com
Base Domain jellyfishsurfshop.com
Scan Status Failed
Failure Stage Fetching resource.
Failure Reason Server returned a client error.
Last Scan 4/9/2025, 8:13:07 PM
Next Scan 4/16/2025, 8:13:07 PM
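A client error here means the request for /robots.txt was answered with a 4xx status. A minimal sketch of reproducing that check with a plain HTTP fetch (no scanner-specific behavior is assumed):

  import urllib.error
  import urllib.request

  URL = "https://jellyfishsurfshop.com/robots.txt"

  try:
      with urllib.request.urlopen(URL, timeout=10) as resp:
          print("fetched", resp.status, "-", len(resp.read()), "bytes")
  except urllib.error.HTTPError as exc:
      # 4xx responses raise HTTPError; this is the "client error" failure reason above.
      kind = "client error" if 400 <= exc.code < 500 else "server error"
      print(kind, exc.code, exc.reason)
  except urllib.error.URLError as exc:
      print("fetch failed before any HTTP response:", exc.reason)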

Last Successful Scan

Scanned 3/9/2025, 4:47:11 PM
URL https://jellyfishsurfshop.com/robots.txt
Domain IPs 104.21.76.165, 172.67.197.122, 2606:4700:3034::6815:4ca5, 2606:4700:3034::ac43:c57a
Response IP 104.21.76.165
Found Yes
Hash 5f342153a85068bed901cb193b74c4a378cfee47e88816846a5cfc731d078e2e
SimHash d206790b873d
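The Hash field is 64 hexadecimal characters, consistent with a SHA-256 digest of the response body (an assumption; the scanner does not state the algorithm). A sketch of checking a fresh copy against the recorded value:

  import hashlib
  import urllib.request

  RECORDED_HASH = "5f342153a85068bed901cb193b74c4a378cfee47e88816846a5cfc731d078e2e"

  with urllib.request.urlopen("https://jellyfishsurfshop.com/robots.txt") as resp:
      body = resp.read()

  digest = hashlib.sha256(body).hexdigest()
  print("unchanged since last successful scan" if digest == RECORDED_HASH
        else "content differs: " + digest)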

Groups

*

Rule Path
Allow */modules/*.css
Allow */modules/*.js
Allow */modules/*.png
Allow */modules/*.jpg
Allow /js/jquery/*
Disallow /*?order=
Disallow /*?tag=
Disallow /*?id_currency=
Disallow /*?search_query=
Disallow /*?back=
Disallow /*?n=
Disallow /*%26order%3D
Disallow /*%26tag%3D
Disallow /*%26id_currency%3D
Disallow /*%26search_query%3D
Disallow /*%26back%3D
Disallow /*%26n%3D
Disallow /*controller%3Daddresses
Disallow /*controller%3Daddress
Disallow /*controller%3Dauthentication
Disallow /*controller%3Dcart
Disallow /*controller%3Ddiscount
Disallow /*controller%3Dfooter
Disallow /*controller%3Dget-file
Disallow /*controller%3Dheader
Disallow /*controller%3Dhistory
Disallow /*controller%3Didentity
Disallow /*controller%3Dimages.inc
Disallow /*controller%3Dinit
Disallow /*controller%3Dmy-account
Disallow /*controller%3Dorder
Disallow /*controller%3Dorder-slip
Disallow /*controller%3Dorder-detail
Disallow /*controller%3Dorder-follow
Disallow /*controller%3Dorder-return
Disallow /*controller%3Dorder-confirmation
Disallow /*controller%3Dpagination
Disallow /*controller%3Dpassword
Disallow /*controller%3Dpdf-invoice
Disallow /*controller%3Dpdf-order-return
Disallow /*controller%3Dpdf-order-slip
Disallow /*controller%3Dproduct-sort
Disallow /*controller%3Dsearch
Disallow /*controller%3Dstatistics
Disallow /*controller%3Dattachment
Disallow /*controller%3Dguest-tracking
Disallow /app/
Disallow /cache/
Disallow /classes/
Disallow /config/
Disallow /controllers/
Disallow /download/
Disallow /js/
Disallow /localization/
Disallow /log/
Disallow /mails/
Disallow /modules/
Disallow /override/
Disallow /pdf/
Disallow /src/
Disallow /tools/
Disallow /translations/
Disallow /upload/
Disallow /var/
Disallow /vendor/
Disallow /webservice/
Disallow /en/app/
Disallow /en/cache/
Disallow /en/classes/
Disallow /en/config/
Disallow /en/controllers/
Disallow /en/download/
Disallow /en/js/
Disallow /en/localization/
Disallow /en/log/
Disallow /en/mails/
Disallow /en/modules/
Disallow /en/override/
Disallow /en/pdf/
Disallow /en/src/
Disallow /en/tools/
Disallow /en/translations/
Disallow /en/upload/
Disallow /en/var/
Disallow /en/vendor/
Disallow /en/webservice/
Disallow /*en/password-recovery
Disallow /*en/address
Disallow /*en/addresses
Disallow /*en/login
Disallow /*en/cart
Disallow /*en/discount
Disallow /*en/order-history
Disallow /*en/identity
Disallow /*en/my-account
Disallow /*en/order-follow
Disallow /*en/credit-slip
Disallow /*en/order
Disallow /*en/search
Disallow /*en/guest-tracking
Disallow /*en/order-confirmation
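These rules mix plain directory prefixes with * wildcards, plus URL-encoded variants (%26 is "&", %3D is "="), so the same query parameters are blocked whether they appear first or later in a query string. Python's built-in urllib.robotparser does not expand * wildcards, so the sketch below hand-rolls the RFC 9309 matching: each pattern is translated to a regex, the longest matching pattern wins, and Allow is preferred on ties. The sample URLs are hypothetical.

  import re

  def pattern_to_regex(pattern):
      # '*' matches any run of characters; a trailing '$' anchors the end of the path.
      anchored = pattern.endswith("$")
      body = pattern[:-1] if anchored else pattern
      regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
      return re.compile("^" + regex + ("$" if anchored else ""))

  def is_allowed(path, rules):
      # Longest matching pattern wins; Allow beats Disallow on equal length.
      best_len, allowed = -1, True  # no match at all means the path is allowed
      for directive, pattern in rules:
          if pattern_to_regex(pattern).match(path):
              if len(pattern) > best_len or (len(pattern) == best_len and directive == "Allow"):
                  best_len, allowed = len(pattern), (directive == "Allow")
      return allowed

  # A few rules copied from the '*' group above.
  rules = [
      ("Allow", "*/modules/*.css"),
      ("Disallow", "/modules/"),
      ("Disallow", "/*?order="),
      ("Disallow", "/*en/cart"),
  ]

  print(is_allowed("/modules/theme/front.css", rules))   # True: the longer Allow wins
  print(is_allowed("/modules/theme/config.xml", rules))  # False: /modules/ is disallowed
  print(is_allowed("/en/women?order=price", rules))      # False: ?order= is disallowed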

bingbot

No rules defined. All paths allowed.

Other Records

Field Value
crawl-delay 1
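Crawl-delay is a de-facto extension rather than part of RFC 9309; crawlers that honor it wait the stated number of seconds between successive requests. A minimal sketch of a fetcher applying the 1-second delay declared for bingbot (the example URLs are hypothetical, and a real crawler would also apply the path rules first):

  import time
  import urllib.request

  CRAWL_DELAY = 1.0  # seconds, from the bingbot record above

  def polite_fetch(urls):
      # Fetch URLs one at a time, sleeping CRAWL_DELAY seconds between requests.
      for url in urls:
          with urllib.request.urlopen(url) as resp:
              yield url, resp.status
          time.sleep(CRAWL_DELAY)

  for url, status in polite_fetch(["https://jellyfishsurfshop.com/",
                                   "https://jellyfishsurfshop.com/en/"]):
      print(status, url)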

adsbot-google
amazonbot
anthropic-ai
applebot-extended
bytespider
ccbot
chatgpt-user
claudebot
claude-web
cohere-ai
diffbot
facebookbot
friendlycrawler
google-extended
googleother
gptbot
img2dataset
omgili
omgilibot
peer39_crawler
peer39_crawler/1.0
perplexitybot
youbot
baiduspider
dotbot
imagesiftbot
mj12bot
mojeekbot
petalbot
seekportbot
seznambot
sogou inst spider
sogou web spider
velenpublicwebcrawler
yandexbot
yisouspider

Rule Path
Disallow /
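The report lists these agents separately, but in the underlying robots.txt they most likely share a single record: a run of consecutive User-agent lines followed by one Disallow: /. For rules without wildcards, Python's built-in urllib.robotparser is enough to confirm the effect; the snippet below checks two of the listed agents against a reconstructed group (the exact grouping in the source file is an assumption):

  from urllib import robotparser

  # Reconstructed group: several user agents sharing one blanket Disallow.
  group = [
      "User-agent: GPTBot",
      "User-agent: CCBot",
      "User-agent: anthropic-ai",
      "Disallow: /",
  ]

  rp = robotparser.RobotFileParser()
  rp.parse(group)

  print(rp.can_fetch("GPTBot", "https://jellyfishsurfshop.com/en/"))     # False
  print(rp.can_fetch("anthropic-ai", "https://jellyfishsurfshop.com/"))  # False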

Comments

  • robots.txt automatically generated by PrestaShop e-commerce open-source solution
  • http://www.prestashop.com - http://www.prestashop.com/forums
  • This file is to prevent the crawling and indexing of certain parts
  • of your site by web crawlers and spiders run by sites like Yahoo!
  • and Google. By telling these "robots" where not to go on your site,
  • you save bandwidth and server resources.
  • For more information about the robots.txt standard, see:
  • http://www.robotstxt.org/robotstxt.html
  • Allow Directives
  • Private pages
  • Directories for localhost:8888
  • Files

Warnings

  • 1 invalid line.