javry.com
robots.txt

Robots Exclusion Standard data for javry.com

Resource Scan

Scan Details

Site Domain javry.com
Base Domain javry.com
Scan Status Ok
Last Scan 2026-01-30T14:54:05+00:00
Next Scan 2026-03-01T14:54:05+00:00

Last Scan

Scanned 2026-01-30T14:54:05+00:00
URL https://javry.com/robots.txt
Domain IPs 104.26.0.101, 104.26.1.101, 172.67.70.239, 2606:4700:20::681a:165, 2606:4700:20::681a:65, 2606:4700:20::ac43:46ef
Response IP 104.26.0.101
Found Yes
Hash 403e529616ee34a099cc518000f0be26e3bf28e7821ef0490567c97aaf4c9839
SimHash a0450a1f2e54

Groups

*

Rule Path
Disallow /users/*
Disallow /account
Disallow /account/*
Disallow /invoices
Disallow /invoices/*
Disallow /fr/invoices
Disallow /fr/invoices/*
Disallow /nl/invoices
Disallow /nl/invoices/*
Disallow /en/invoices
Disallow /en/invoices/*
Disallow /cart
Disallow /cart/*
Disallow /fr/cart
Disallow /fr/cart/*
Disallow /nl/cart
Disallow /nl/cart/*
Disallow /en/cart
Disallow /en/cart/*
Disallow /subscriptions
Disallow /subscriptions/*
Disallow /fr/subscriptions
Disallow /fr/subscriptions/*
Disallow /nl/subscriptions
Disallow /nl/subscriptions/*
Disallow /en/subscriptions
Disallow /en/subscriptions/*
Disallow /fr/order
Disallow /fr/order/*
Disallow /nl/order
Disallow /nl/order/*
Disallow /en/order
Disallow /en/order/*
Disallow /planned_orders/*
Disallow /companies/*
Disallow /payments/*
Disallow /fr/payments/*
Disallow /nl/payments/*
Disallow /en/payments/*
Disallow /orders/*
Disallow /order/*
Disallow /fr/qrequest/*
Disallow /nl/qrequest/*
Disallow /en/qrequest/*
Disallow /fr/proposals/*
Disallow /nl/proposals/*
Disallow /en/proposals/*
Disallow /js/lnkobf.js
Disallow /fr/posts/wp-content/plugins/link-juice-optimizer/public/js/link-juice-optimizer.js
Disallow /nl/posts/wp-content/plugins/link-juice-optimizer/public/js/link-juice-optimizer.js
Disallow /coffee_products/batch_pictures
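The wildcard (`*`) group above can be checked programmatically with Python's standard-library robots.txt parser. This is a minimal sketch using an abridged copy of the scanned rules; note that `urllib.robotparser` implements the original 1997 draft and treats `*` inside a path literally, so wildcard rules such as `Disallow: /users/*` will not match the way Google-style crawlers evaluate them. The plain prefix rules (`/account`, `/cart`, `/invoices`, ...) behave as expected.

```python
# Sketch: evaluate a few of the scanned rules locally with the
# stdlib parser (no network fetch needed).
from urllib import robotparser

# Abridged copy of the rules reported in the scan above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /account
Disallow: /cart
Disallow: /invoices

User-agent: adsbot-google
Allow: /fr/lp/*
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A generic crawler falls into the "*" group: /account is blocked,
# the homepage is not.
print(rp.can_fetch("SomeBot/1.0", "https://javry.com/account"))
print(rp.can_fetch("SomeBot/1.0", "https://javry.com/"))
```

Because prefix matching is used, `/cart/items` is also disallowed for a generic crawler, while agents matching `adsbot-google` fall into their own group and are unaffected by the `*` rules.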

adsbot-google

Rule Path
Allow /fr/lp/*
Allow /nl/lp/*
Allow /en/lp/*

Other Records

Field Value
sitemap https://javryproduction.s3.amazonaws.com/sitemaps/sitemap.xml.gz
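For reference, the groups and sitemap record above correspond to a robots.txt file of roughly this shape (the `*` group's Disallow list is abridged here to its first and last entries; the full list appears in the Groups section):

```
User-agent: *
Disallow: /users/*
Disallow: /account
...
Disallow: /coffee_products/batch_pictures

User-agent: adsbot-google
Allow: /fr/lp/*
Allow: /nl/lp/*
Allow: /en/lp/*

Sitemap: https://javryproduction.s3.amazonaws.com/sitemaps/sitemap.xml.gz
```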

Comments

  • See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
  • To ban all spiders from the entire site uncomment the next two lines:
  • User-Agent: *
  • Disallow: /