seedancepro.com robots.txt

Robots Exclusion Standard data for seedancepro.com

Resource Scan

Scan Details

Site Domain seedancepro.com
Base Domain seedancepro.com
Scan Status Ok
Last Scan 2025-12-15T12:40:51+00:00
Next Scan 2026-01-14T12:40:51+00:00

Last Scan

Scanned 2025-12-15T12:40:51+00:00
URL https://seedancepro.com/robots.txt
Redirect https://www.jxp.com/robots.txt
Redirect Domain www.jxp.com
Redirect Base jxp.com
Domain IPs 104.21.5.157, 172.67.133.153, 2606:4700:3036::6815:59d, 2606:4700:3036::ac43:8599
Redirect IPs 216.150.1.129, 216.150.16.129
Response IP 216.150.16.193
Found Yes
Hash 94dfdeb66bd528d3b3adaff7916dbd4c9ca8afcf36bbe5d6398fb9e4c937b9cd
SimHash 3c3831558c65
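
The redirect chain and content hash above can be reproduced with a short fetch. The sketch below (Python, not the scanner's own code; the User-Agent string is a placeholder) assumes the Hash field is a SHA-256 hex digest of the response body; its 64 hex characters are consistent with that, but the report does not state the algorithm.

    # Re-fetch the scanned robots.txt and compare with the report above.
    # Assumption: the Hash field is SHA-256 of the response body.
    import hashlib
    import urllib.request

    SCAN_URL = "https://seedancepro.com/robots.txt"

    req = urllib.request.Request(SCAN_URL, headers={"User-Agent": "robots-scan-check/0.1"})
    with urllib.request.urlopen(req, timeout=10) as resp:    # urlopen follows the redirect
        body = resp.read()
        print("Final URL:", resp.geturl())                   # expect https://www.jxp.com/robots.txt
        print("SHA-256:  ", hashlib.sha256(body).hexdigest())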

Groups

*

Rule Path Comment
Allow / -
Allow /ai-lip-sync/infinitetalk-ai -
Allow /sora/sora2 -
Allow /sora -
Allow /veo -
Allow /veo/veo-3 -
Allow /chronoedit -
Disallow /api/ API endpoints (not meant for indexing)
Disallow /admin/ Admin dashboard (protected)
Disallow /dashboard/ User dashboard (protected)
Disallow /profile/ User profiles (protected)
Disallow /custom-pricing/ Custom pricing page (not for indexing)
Disallow /search?* Search result pages (avoid duplicate content)
Disallow /*?* URLs with parameters (avoid duplicate content)
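
Because the wildcard group mixes a broad Allow / with more specific Disallow paths and parameter patterns, overlaps are resolved by the longest-matching rule under the precedence major crawlers document (Allow wins a length tie). The Python sketch below encodes the table above with hypothetical helper names; it ignores '$' end anchors, which this file does not use.

    # Resolve a path against the wildcard group using longest-match precedence
    # (most specific rule wins; Allow wins a length tie). Hypothetical helper names.
    import re

    WILDCARD_RULES = [
        ("Allow", "/"),
        ("Allow", "/ai-lip-sync/infinitetalk-ai"),
        ("Allow", "/sora/sora2"),
        ("Allow", "/sora"),
        ("Allow", "/veo"),
        ("Allow", "/veo/veo-3"),
        ("Allow", "/chronoedit"),
        ("Disallow", "/api/"),
        ("Disallow", "/admin/"),
        ("Disallow", "/dashboard/"),
        ("Disallow", "/profile/"),
        ("Disallow", "/custom-pricing/"),
        ("Disallow", "/search?*"),
        ("Disallow", "/*?*"),
    ]

    def pattern_to_regex(pattern):
        # '*' matches any run of characters; everything else is a literal prefix.
        return re.compile("".join(".*" if ch == "*" else re.escape(ch) for ch in pattern))

    def is_allowed(path, rules=WILDCARD_RULES):
        matches = [(len(pat), verb == "Allow")
                   for verb, pat in rules
                   if pattern_to_regex(pat).match(path)]
        if not matches:
            return True            # no rule applies -> crawling is allowed
        return max(matches)[1]     # longest pattern wins; Allow wins a length tie

    for path in ("/sora", "/api/generate", "/search?q=veo", "/custom-pricing/enterprise"):
        print(path, "->", "allowed" if is_allowed(path) else "blocked")

Under that precedence, /api/generate, the custom-pricing page, and parameterized search URLs come out blocked for generic crawlers, while the product paths listed in the Allow rows stay crawlable.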

googlebot

Rule Path
Allow /
Disallow /api/
Disallow /custom-pricing/

bingbot

Rule Path
Allow /
Disallow /api/
Disallow /custom-pricing/

baiduspider

Rule Path
Allow /
Disallow /api/
Disallow /custom-pricing/

gptbot
anthropic-ai
claudebot
ccbot
google-extended
bytespider
diffbot

Rule Path
Allow /llms.txt
Disallow /

oai-searchbot
chatgpt-user
perplexitybot
firecrawlagent
andibot
exabot
phindbot
youbot

Rule Path
Allow /
Disallow /api/
Disallow /custom-pricing/
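
The split between the two AI groups (training crawlers restricted to /llms.txt, AI search and agent bots treated like the search engines above) can be checked with the standard-library parser. A minimal sketch, assuming the grouping shown above; urllib.robotparser applies the first matching rule in each group, which agrees with the intended result for these particular checks.

    # Feed the AI-crawler groups above to the stdlib parser and check which
    # agents may fetch which paths.
    from urllib.robotparser import RobotFileParser

    ROBOTS_TXT = """\
    User-agent: gptbot
    User-agent: anthropic-ai
    User-agent: claudebot
    User-agent: ccbot
    User-agent: google-extended
    User-agent: bytespider
    User-agent: diffbot
    Allow: /llms.txt
    Disallow: /

    User-agent: oai-searchbot
    User-agent: chatgpt-user
    User-agent: perplexitybot
    User-agent: firecrawlagent
    User-agent: andibot
    User-agent: exabot
    User-agent: phindbot
    User-agent: youbot
    Allow: /
    Disallow: /api/
    Disallow: /custom-pricing/
    """

    parser = RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    base = "https://www.jxp.com"
    for agent, path in [("GPTBot", "/llms.txt"),      # training bots: only the LLM manifest
                        ("GPTBot", "/sora"),          # ...everything else is blocked
                        ("OAI-SearchBot", "/sora")]:  # AI search bots: site content allowed
        verdict = "allowed" if parser.can_fetch(agent, base + path) else "blocked"
        print(f"{agent:14s} {path:10s} -> {verdict}")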

Other Records

Field Value
sitemap https://www.jxp.com/sitemap.xml
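
The sitemap record points at the redirect domain rather than seedancepro.com. A minimal sketch for inspecting it, assuming a standard <urlset> sitemap rather than a sitemap index:

    # List the first few URLs from the recorded sitemap.
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "https://www.jxp.com/sitemap.xml"
    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

    with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
        root = ET.fromstring(resp.read())

    locs = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
    print(len(locs), "URLs listed")
    print("\n".join(locs[:5]))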

Comments

  • 🤖 Robots.txt for https://www.jxp.com
  • 🌐 General rules for all crawlers
  • Note: Static assets are allowed (no Disallow for /static/)
  • 🚫 External domain restrictions
  • Note: robots.txt only applies to the current domain
  • External domains need their own robots.txt files
  • 🔍 Googlebot rules
  • 🔎 Bingbot rules (search indexing allowed)
  • 🔎 Baiduspider rules
  • 🤖 AI Crawler Rules (LLM-specific bots - restricted)
  • Training data collection bots (restricted)
  • AI search/agent bots (allowed for search but not training)
  • 📌 Sitemap