engwing.com
robots.txt

Robots Exclusion Standard data for engwing.com

Resource Scan

Scan Details

Site Domain engwing.com
Base Domain engwing.com
Scan Status Ok
Last Scan 2025-08-18T02:50:47+00:00
Next Scan 2025-09-17T02:50:47+00:00

Last Scan

Scanned 2025-08-18T02:50:47+00:00
URL https://engwing.com/robots.txt
Domain IPs 2a02:4780:38:1572:4815:99e0:7805:efd9, 2a02:4780:39:5f96:ad6e:d779:5b9b:f764, 84.32.84.28, 84.32.84.76
Response IP 77.37.48.247
Found Yes
Hash 32ea85a1d6d6b00cda1373fc328392864d2b606e0202981de81bb0c805d639fd
SimHash 512ecf50e6bb

Groups

*

Rule Path
Disallow /wp-admin/
Disallow /wp-includes/
Disallow /wp-content/plugins/
Disallow /wp-content/themes/
Disallow /wp-content/cache/
Disallow /wp-login.php
Disallow /wp-register.php
Disallow /trackback/
Disallow /feed/
Disallow /comments/
Disallow /author/
Disallow /xmlrpc.php
Disallow /wp-json/
Disallow /?*
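
The catch-all group above keeps every compliant crawler out of the WordPress internals. As a minimal sketch (assuming the live file matches this listing), Python's standard urllib.robotparser can test a URL against these rules:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live file (assumes it matches the listing above).
rp = RobotFileParser("https://engwing.com/robots.txt")
rp.read()

# WordPress internals are disallowed for every user-agent via the "*" group.
print(rp.can_fetch("*", "https://engwing.com/wp-admin/"))      # expected: False
print(rp.can_fetch("*", "https://engwing.com/some-article/"))  # expected: True

# Caveat: urllib.robotparser uses plain prefix matching, so wildcard rules
# such as "Disallow: /?*" may not be evaluated exactly as Google's crawler would.
```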

googlebot

Rule Path
Disallow

bingbot

Rule Path
Disallow

yandexbot

Rule Path
Disallow

baiduspider

Rule Path
Disallow

slurp

Rule Path
Disallow

duckduckgo

Rule Path
Disallow

startpage

Rule Path
Disallow

ecosiabot

Rule Path
Disallow

sogou spider

Rule Path
Disallow

yourcustomuseragent

Rule Path
Disallow /private/
Disallow /config/
Disallow /data/
Disallow /uploads/private/
Disallow /*.sql$
Disallow /*.log$
Disallow /*.ini$
Disallow /*.htaccess$
Disallow /*.htpasswd$
Disallow /administrator/
Disallow /admin/
Disallow /login/
Disallow /panel/
Disallow /dashboard/
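
Several of these rules use Google-style wildcards: * matches any run of characters, and a trailing $ anchors the pattern to the end of the path, so /*.sql$ blocks .sql dumps anywhere on the site. A hedged sketch of how such a pattern can be turned into a regex for testing (an illustration only, not the matcher any particular crawler actually uses):

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a Google-style robots.txt path pattern into a regex.

    '*' matches any run of characters; a trailing '$' anchors the pattern
    to the end of the path. Illustrative only, not the exact matcher
    used by any particular crawler.
    """
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile("^" + body + ("$" if anchored else ""))

rule = robots_pattern_to_regex("/*.sql$")
print(bool(rule.match("/backups/dump.sql")))     # True: path is blocked
print(bool(rule.match("/backups/dump.sqlite")))  # False: the '$' anchor does not match
```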

badbot1

Rule Path
Disallow /

badbot2

Rule Path
Disallow /
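
The two badbot groups use a bare Disallow / to shut those agents out of the entire site, while the empty Disallow lines in the search-engine groups above allow everything. A small offline sketch (the excerpt is reconstructed from this listing; the live file may differ) showing how the flattened rows map back to robots.txt syntax and how a parser evaluates them:

```python
from urllib.robotparser import RobotFileParser

# Excerpt reconstructed from the listing above (assumption: the live file
# uses standard "User-agent:" / "Disallow:" syntax for these groups).
robots_excerpt = """\
User-agent: badbot1
Disallow: /

User-agent: badbot2
Disallow: /

User-agent: googlebot
Disallow:
"""

rp = RobotFileParser()
rp.parse(robots_excerpt.splitlines())

print(rp.can_fetch("badbot1", "https://engwing.com/"))         # False: fully blocked
print(rp.can_fetch("googlebot", "https://engwing.com/feed/"))  # True: empty Disallow allows all
```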

Other Records

Field Value
crawl-delay 10
sitemap https://engwing.com/sitemap.xml
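
Both records can be read back through the same standard parser; a brief sketch (assuming the live file declares them as listed, and noting that a crawl-delay only applies to the user-agent group that declares it, which this report does not identify):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://engwing.com/robots.txt")
rp.read()

# Crawl-delay is reported per user-agent group; None means the queried group
# does not declare one ("*" here is an assumption about where the 10-second
# delay is attached).
print(rp.crawl_delay("*"))

# Sitemap directives are file-level; site_maps() requires Python 3.8+ and
# returns None when no Sitemap lines are present.
print(rp.site_maps())  # expected: ['https://engwing.com/sitemap.xml']
```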

Comments

  • Specify the User-Agent for different search engines.
  • Allow Googlebot to access everything.
  • Allow Bingbot to access everything.
  • Allow YandexBot to access everything.
  • Allow BaiduSpider to access everything.
  • Allow Slurp (Yahoo) to access everything.
  • Allow DuckDuckGo to access everything.
  • Allow Startpage to access everything.
  • Allow Ecosia to access everything.
  • Allow Sogou Spider to access everything.
  • Allow any other custom user-agents here.
  • Block access to sensitive directories or files.
  • Block access to common sensitive files.
  • Block common website administration paths.
  • Block some known bots and scrapers.
  • Specify crawl delay if necessary (in seconds).
  • Sitemap location for search engines (replace with your sitemap URL).