eternalhosting.cloud
robots.txt

Robots Exclusion Standard data for eternalhosting.cloud

Resource Scan

Scan Details

Site Domain eternalhosting.cloud
Base Domain eternalhosting.cloud
Scan Status Ok
Last Scan 2025-06-15T12:02:20+00:00
Next Scan 2025-06-22T12:02:20+00:00

Last Scan

Scanned 2025-06-15T12:02:20+00:00
URL https://eternalhosting.cloud/robots.txt
Domain IPs 104.21.52.174, 172.67.202.85, 2606:4700:3032::6815:34ae, 2606:4700:3036::ac43:ca55
Response IP 104.21.52.174
Found Yes
Hash f385a4aff03b0af68da187f091c927c4201d9cfe0ba3bb1ef9cf207d364c38b1
SimHash f80c07a1e5df
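The 64-character Hash above looks like a SHA-256 digest of the fetched file (an assumption; the scanner does not state its hash function). A minimal sketch of re-checking whether the robots.txt has changed since this scan, assuming the digest is taken over the raw response body:

```python
# Hypothetical verification sketch, not part of the scan tooling:
# fetch robots.txt and recompute SHA-256 to compare with the reported Hash.
import hashlib
import urllib.request

url = "https://eternalhosting.cloud/robots.txt"
expected = "f385a4aff03b0af68da187f091c927c4201d9cfe0ba3bb1ef9cf207d364c38b1"

with urllib.request.urlopen(url) as resp:
    body = resp.read()

digest = hashlib.sha256(body).hexdigest()
print("unchanged" if digest == expected else f"changed: {digest}")
```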

Groups

*

Rule Path
Allow /
Disallow /components/
Disallow /data/
Disallow /assets/css/
Disallow /assets/js/
Disallow /*.php$
Disallow /*.inc$
Disallow /*.conf$
Disallow /*.sql$
Disallow /*.log$
Disallow /*.bak$
Disallow /admin/
Disallow /administrator/
Disallow /wp-admin/
Disallow /cpanel/
Disallow /plesk/
Disallow /*?
Disallow /*.tmp
Disallow /*.backup
Disallow /*~
Allow /
Allow /minecraft-server-hosting
Allow /blog
Allow /privacy-policy
Allow /terms-of-service
Allow /how-to-set-up-your-first-minecraft-server
Allow /choosing-a-server-host-and-a-server-plan
Allow /how-to-reduce-server-lag-of-your-minecraft-server
Allow /how-to-protect-your-minecraft-server-from-hackers-and-griefers
Allow /securing-your-game-server-against-ddos-attacks
Allow /how-to-monetize-your-game-servers
Allow /the-economics-of-running-a-minecraft-server
Allow /10-minecraft-plugins-to-take-your-server-to-the-next-level
Allow /assets/images/
Allow /*.webp
Allow /*.png
Allow /*.jpg
Allow /*.jpeg
Allow /*.gif
Allow /*.svg
Allow /*.ico
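Several of the rules in this catch-all group use wildcard patterns such as /*.php$ and /*?. A minimal sketch of the usual Google-style matching semantics (assuming '*' matches any run of characters and a trailing '$' anchors the end of the URL path; this is not a full robots.txt parser):

```python
# Sketch of wildcard rule matching as commonly implemented for robots.txt.
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    # Translate a robots.txt path pattern into an anchored regular expression.
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile(regex + ("$" if anchored else ""))

def matches(pattern: str, path: str) -> bool:
    # Rules match from the start of the path, so re.match is appropriate.
    return robots_pattern_to_regex(pattern).match(path) is not None

print(matches("/*.php$", "/index.php"))      # True  -> blocked by /*.php$
print(matches("/*.php$", "/index.php?x=1"))  # False -> '$' stops the match
print(matches("/*?", "/plans?sort=price"))   # True  -> any URL with a query
```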

googlebot

Rule Path
Allow /

Other Records

Field Value
crawl-delay 1

bingbot

Rule Path
Allow /

Other Records

Field Value
crawl-delay 2

slurp

Rule Path
Allow /

Other Records

Field Value
crawl-delay 2

duckduckbot

Rule Path
Allow /

Other Records

Field Value
crawl-delay 2

yandexbot

Rule Path
Allow /

Other Records

Field Value
crawl-delay 3
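The per-engine groups above reduce to one crawl-delay directive per user agent. A short sketch of reading those values with Python's standard library (note that urllib.robotparser only does plain prefix matching for path rules, but its crawl_delay lookup works on the per-agent groups):

```python
# Sketch of honoring the declared per-bot crawl delays.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://eternalhosting.cloud/robots.txt")
rp.read()

for agent in ("googlebot", "bingbot", "slurp", "duckduckbot", "yandexbot"):
    print(agent, rp.crawl_delay(agent))   # e.g. googlebot 1, yandexbot 3
```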

ahrefsbot

Rule Path
Disallow /

mj12bot

Rule Path
Disallow /

dotbot

Rule Path
Disallow /

semrushbot

Rule Path
Disallow /

blexbot

Rule Path
Disallow /

petalbot

Rule Path
Disallow /

facebookexternalhit

Rule Path
Allow /

twitterbot

Rule Path
Allow /

linkedinbot

Rule Path
Allow /

discordbot

Rule Path
Allow /

*

Rule Path
Allow /*minecraft*
Allow /*hosting*
Allow /*server*
Allow /*plan*
Allow /*price*
Allow /*help*
Allow /*support*
Allow /*guide*
Allow /*tutorial*
Allow /*how-to*
Disallow /*customer*
Disallow /*billing*
Disallow /*account*
Disallow /*login*
Disallow /*panel*
Disallow /*search*
Disallow /*filter*
Disallow /*sort*
Disallow /*ajax*
Disallow /*api*
Disallow /*?*
Disallow /*%26*
Disallow /*utm_*
Disallow /*ref%3D*
Disallow /*source%3D*

Other Records

Field Value
crawl-delay 5
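This second catch-all group mixes broad Allow patterns with Disallow patterns that can match the same URL. A minimal sketch of the usual conflict resolution (per RFC 9309 and Google's documentation, the longest matching pattern wins, with ties going to Allow), using a small excerpt of the rules above:

```python
# Sketch of Allow/Disallow precedence for overlapping wildcard rules.
import re

RULES = [  # excerpt of the group above: (directive, pattern)
    ("allow", "/*minecraft*"), ("allow", "/*how-to*"),
    ("disallow", "/*login*"), ("disallow", "/*?*"), ("disallow", "/*utm_*"),
]

def to_regex(pattern: str) -> re.Pattern:
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    return re.compile("".join(".*" if c == "*" else re.escape(c) for c in core)
                      + ("$" if anchored else ""))

def is_allowed(path: str) -> bool:
    hits = [(len(p), d) for d, p in RULES if to_regex(p).match(path)]
    if not hits:
        return True                                  # no rule matches -> allowed
    _, directive = max(hits, key=lambda h: (h[0], h[1] == "allow"))
    return directive == "allow"

print(is_allowed("/minecraft-server-hosting"))    # True  (allow /*minecraft*)
print(is_allowed("/blog?utm_source=newsletter"))  # False (disallow /*utm_*)
```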

Other Records

Field Value
sitemap https://eternalhosting.cloud/sitemap.xml
sitemap https://eternalhosting.cloud/sitemap-blog.xml
sitemap https://eternalhosting.cloud/sitemap-pages.xml
sitemap https://eternalhosting.cloud/sitemap-images.xml
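A short sketch of retrieving the sitemap URLs declared above with the standard library (RobotFileParser.site_maps() is available from Python 3.8 onward and returns None when no Sitemap lines are present):

```python
# Sketch of listing the declared sitemaps.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://eternalhosting.cloud/robots.txt")
rp.read()

for sitemap in rp.site_maps() or []:
    print(sitemap)   # e.g. https://eternalhosting.cloud/sitemap.xml
```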

Comments

  • ==============================================================================
  • ETERNAL HOSTING - ROBOTS.TXT
  • Optimized for SEO and Security
  • ==============================================================================
  • Allow all major search engines to crawl the site
  • ==============================================================================
  • BLOCKED DIRECTORIES AND FILES
  • ==============================================================================
  • Block access to sensitive directories
  • Block sensitive files
  • Block admin and system directories (even if they don't exist)
  • Block temporary and backup files
  • ==============================================================================
  • EXPLICITLY ALLOWED IMPORTANT PAGES
  • ==============================================================================
  • Main pages (ensure these are crawled)
  • Blog posts (high-value content)
  • Important assets for SEO
  • ==============================================================================
  • SEARCH ENGINE SPECIFIC RULES
  • ==============================================================================
  • Google Bot - Full access with high crawl rate
  • Bing Bot - Full access
  • Yahoo Bot - Full access
  • DuckDuckGo Bot - Full access
  • Yandex Bot - Full access
  • ==============================================================================
  • BLOCK MALICIOUS BOTS AND SCRAPERS
  • ==============================================================================
  • Block common scrapers and malicious bots
  • Block AI training bots (optional - uncomment if you want to block)
  • User-agent: GPTBot
  • Disallow: /
  • User-agent: ChatGPT-User
  • Disallow: /
  • User-agent: CCBot
  • Disallow: /
  • User-agent: anthropic-ai
  • Disallow: /
  • User-agent: Claude-Web
  • Disallow: /
  • ==============================================================================
  • SOCIAL MEDIA CRAWLERS (Allow for social sharing)
  • ==============================================================================
  • Facebook crawler
  • Twitter crawler
  • LinkedIn crawler
  • Discord crawler (for link previews)
  • ==============================================================================
  • SITEMAPS
  • ==============================================================================
  • XML Sitemap location
  • Additional sitemaps (create these files)
  • ==============================================================================
  • CRAWL OPTIMIZATION
  • ==============================================================================
  • Default crawl delay for unlisted bots
  • ==============================================================================
  • HOSTING-SPECIFIC OPTIMIZATIONS
  • ==============================================================================
  • Allow crawling of pricing and plan information
  • Allow crawling of help and support content
  • Block crawling of potential customer data
  • ==============================================================================
  • PERFORMANCE CONSIDERATIONS
  • ==============================================================================
  • Prevent crawling of resource-heavy operations
  • Block crawling of duplicate content