support.twidpay.com
robots.txt

Robots Exclusion Standard data for support.twidpay.com

Resource Scan

Scan Details

Site Domain support.twidpay.com
Base Domain twidpay.com
Scan Status Ok
Last Scan 2025-09-30T03:42:22+00:00
Next Scan 2025-10-30T03:42:22+00:00

Last Scan

Scanned 2025-09-30T03:42:22+00:00
URL https://support.twidpay.com/robots.txt
Domain IPs 162.159.140.147, 172.66.0.145
Response IP 172.66.0.145
Found Yes
Hash e66f9cdf3c03749839a62e7d27c08507333214659d358f2cc1f7bce484dda048
SimHash 269d0d6d7550

Groups

*

Rule Path
Disallow /support/search
Disallow /support/tickets/
Disallow /support/login
Disallow /support/login-verification
Disallow /login/normal/
Allow /helpdesk/attachments
Disallow /helpdesk/
Disallow /public/tickets/
Disallow /*/hit$

Other Records

Field Value
Sitemap https://support.twidpay.com/support/sitemap.xml
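The group above applies to all user agents. As a minimal sketch, the scanned rules and the Sitemap record can be replayed locally with Python's stdlib `urllib.robotparser` (note its limits: it applies rules in file order with first-match-wins semantics and treats the `*`/`$` characters in the last rule literally, not as RFC 9309 wildcards):

```python
import urllib.robotparser

# Reconstruction of the scanned robots.txt (rules and sitemap taken from the report above).
ROBOTS_TXT = """\
User-agent: *
Disallow: /support/search
Disallow: /support/tickets/
Disallow: /support/login
Disallow: /support/login-verification
Disallow: /login/normal/
Allow: /helpdesk/attachments
Disallow: /helpdesk/
Disallow: /public/tickets/
Disallow: /*/hit$
Sitemap: https://support.twidpay.com/support/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())
rp.modified()  # record a parse time so can_fetch() answers instead of defaulting to False

base = "https://support.twidpay.com"
print(rp.can_fetch("*", base + "/support/search"))          # blocked by Disallow /support/search
print(rp.can_fetch("*", base + "/helpdesk/attachments/1"))  # Allow /helpdesk/attachments matches first
print(rp.can_fetch("*", base + "/helpdesk/tickets"))        # falls through to Disallow /helpdesk/
print(rp.site_maps())                                       # Sitemap records (Python 3.8+)
```

A production crawler honoring `Disallow: /*/hit$` as written would need an RFC 9309-aware matcher, since that rule relies on the `*` and `$` wildcard syntax the stdlib parser does not implement.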

Comments

  • See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
  • To ban all spiders from the entire site uncomment the next two lines:
  • User-Agent: *
  • Disallow: /