easycloud.in
robots.txt

Robots Exclusion Standard data for easycloud.in

Resource Scan

Scan Details

Site Domain easycloud.in
Base Domain easycloud.in
Scan Status Ok
Last Scan 2026-01-20T19:32:43+00:00
Next Scan 2026-02-19T19:32:43+00:00

Last Scan

Scanned 2026-01-20T19:32:43+00:00
URL https://easycloud.in/robots.txt
Redirect https://easycloudsoftware.com/robots.txt
Redirect Domain easycloudsoftware.com
Redirect Base easycloudsoftware.com
Domain IPs 162.241.85.218
Redirect IPs 162.241.85.218
Response IP 162.241.85.218
Found Yes
Hash 38818c7a41c83ecc5b7d527911444b917170a4886430632cbea8a233f08859fe
SimHash ad129e130414
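The Hash above is a 64-character hex digest, which matches the output length of SHA-256; the scanner does not name its hash function, so treating it as SHA-256 of the file body is an assumption. A minimal sketch of fingerprinting a robots.txt body this way (the body below is a placeholder, not the real file content):

```python
import hashlib

# Placeholder body; the actual rules are listed under "Groups" below.
body = b"User-agent: *\nAllow: /\nDisallow: /*.php$\n"

# SHA-256 always yields a 64-character lowercase hex digest,
# the same shape as the Hash field in this report.
digest = hashlib.sha256(body).hexdigest()
print(len(digest))  # → 64
```

Two scans of an unchanged file produce the same digest, which is presumably how the scanner detects that a robots.txt has not changed between visits.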

Groups

*

Rule Path
Allow /
Disallow /*.php$
Disallow /router.php
Disallow /server.js
Disallow /.git/
Disallow /tmp/
Disallow /temp/
Disallow /*.sql
Disallow /*.sql.gz
Disallow /*.htaccess
Disallow /*.htpasswd
Disallow /*.ini
Disallow /*.log
Allow /*.css$
Allow /*.js$
Allow /images/
Allow /*.jpg$
Allow /*.jpeg$
Allow /*.png$
Allow /*.gif$
Allow /*.svg$
Allow /*.webp$
Allow /*.avif$
Allow /*.pdf$
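The `*` group above mixes Allow and Disallow rules using `*` wildcards and `$` end anchors. Under RFC 9309 (the Robots Exclusion Protocol), the most specific (longest) matching rule wins, with Allow preferred on length ties. A minimal sketch of that evaluation against the rules listed above; the helper names are ours, not part of any library:

```python
import re

def pattern_to_regex(pattern):
    # Translate a robots.txt path pattern into a regex:
    # '*' matches any character sequence; a trailing '$'
    # anchors the end of the path (RFC 9309 semantics).
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile("^" + body + ("$" if anchored else ""))

# The '*' group rules as listed in this report.
RULES = [
    ("allow", "/"),
    ("disallow", "/*.php$"), ("disallow", "/router.php"),
    ("disallow", "/server.js"), ("disallow", "/.git/"),
    ("disallow", "/tmp/"), ("disallow", "/temp/"),
    ("disallow", "/*.sql"), ("disallow", "/*.sql.gz"),
    ("disallow", "/*.htaccess"), ("disallow", "/*.htpasswd"),
    ("disallow", "/*.ini"), ("disallow", "/*.log"),
    ("allow", "/*.css$"), ("allow", "/*.js$"),
    ("allow", "/images/"), ("allow", "/*.jpg$"),
    ("allow", "/*.jpeg$"), ("allow", "/*.png$"),
    ("allow", "/*.gif$"), ("allow", "/*.svg$"),
    ("allow", "/*.webp$"), ("allow", "/*.avif$"),
    ("allow", "/*.pdf$"),
]

def is_allowed(path):
    # Longest matching pattern wins; Allow wins a length tie.
    best = ("allow", -1)  # no rule matched -> allowed by default
    for verdict, pattern in RULES:
        if pattern_to_regex(pattern).match(path):
            length = len(pattern)
            if length > best[1] or (length == best[1] and verdict == "allow"):
                best = (verdict, length)
    return best[0] == "allow"

print(is_allowed("/index.php"))        # → False ("/*.php$" beats "/")
print(is_allowed("/styles/app.css"))   # → True  ("/*.css$")
print(is_allowed("/server.js"))        # → False ("/server.js" beats "/*.js$")
print(is_allowed("/images/logo.png"))  # → True
```

Note how `/server.js` is blocked even though `/*.js$` allows JavaScript files: the literal Disallow pattern is longer, so it takes precedence under longest-match.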

googlebot-image

Rule Path
Allow /images/

adsbot-google

Rule Path
Allow /

bingbot

Rule Path
Allow /

Other Records

Field Value
sitemap https://easycloudsoftware.com/sitemap.xml

Comments

  • robots.txt for EasyCloud Software - https://easycloudsoftware.com
  • This file tells search engine crawlers which pages they can and cannot access
  • ==============================================================================
  • USER-AGENT DIRECTIVES
  • ==============================================================================
  • Allow all search engines to crawl the entire site
  • ==============================================================================
  • DISALLOW DIRECTIVES - Pages to exclude from indexing
  • ==============================================================================
  • Block access to admin/backend areas (if any exist in future)
  • Disallow: /admin/
  • Disallow: /backend/
  • Block access to configuration files
  • Block access to internal scripts and utilities
  • Block access to version control
  • Block access to temporary files
  • Block access to database backups (if stored on server)
  • Block access to configuration files
  • Block access to log files
  • ==============================================================================
  • ALLOW DIRECTIVES - Explicitly allow important resources
  • ==============================================================================
  • Allow CSS files (important for rendering previews)
  • Allow JavaScript files (needed for functionality)
  • Allow image files (important for image search)
  • Allow PDF documents (if you have any)
  • ==============================================================================
  • CRAWL-DELAY (Optional - use if server is under heavy load)
  • ==============================================================================
  • Crawl-delay: 1
  • ==============================================================================
  • SITEMAP LOCATION
  • ==============================================================================
  • Point to your XML sitemap for better crawling
  • ==============================================================================
  • SPECIFIC BOT DIRECTIVES (Optional - for future use)
  • ==============================================================================
  • Google Image Bot - allow all images
  • Google Ads Bot - allow crawling for ad targeting
  • Bing Bot - allow all
  • ==============================================================================
  • NOTES:
  • ==============================================================================
  • 1. This file should be placed in the root directory of your website
  • 2. It must be accessible at: https://easycloudsoftware.com/robots.txt
  • 3. Changes take effect when search engines re-crawl the file
  • 4. Test your robots.txt at: https://www.google.com/webmasters/tools/robots-testing-tool
  • ==============================================================================
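Note 4 suggests testing the file. Python's standard library ships a robots.txt checker that can be used for a rough local test; be aware it implements simple first-match prefix matching and does not support the `*`/`$` wildcards used in this file, so rules like `/*.php$` will not be honored and rule order matters. A sketch using a simplified, prefix-only subset of the rules above:

```python
from urllib.robotparser import RobotFileParser

# Prefix-only subset of the rules in this report. Disallow lines
# come first because urllib.robotparser returns the FIRST matching
# rule, not the longest one as RFC 9309 specifies.
robots_txt = """\
User-agent: *
Disallow: /.git/
Disallow: /tmp/
Disallow: /temp/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("TestBot", "https://easycloudsoftware.com/.git/config"))  # → False
print(parser.can_fetch("TestBot", "https://easycloudsoftware.com/index.html"))   # → True
```

For wildcard rules, a tester that implements RFC 9309 longest-match semantics (such as a search engine's own validator) gives more faithful results than the stdlib parser.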