blowoutmedical.com
robots.txt

Robots Exclusion Standard data for blowoutmedical.com

Resource Scan

Scan Details

Site Domain blowoutmedical.com
Base Domain blowoutmedical.com
Scan Status Ok
Last Scan 2024-05-09T13:55:59+00:00
Next Scan 2024-06-08T13:55:59+00:00

Last Scan

Scanned 2024-05-09T13:55:59+00:00
URL https://blowoutmedical.com/robots.txt
Domain IPs 172.66.40.245, 172.66.43.11, 2606:4700:3108::ac42:28f5, 2606:4700:3108::ac42:2b0b
Response IP 172.66.43.11
Found Yes
Hash a66891d5942ff67b408373902346d2d349b8434fb546bed95962f7c0d110c915
SimHash 39b45f0d055c
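
A note on the Hash field: at 64 hex characters it looks like a SHA-256 digest of the fetched robots.txt body. A minimal sketch of reproducing it, assuming the scanner hashes the raw response bytes (its exact canonicalisation is not documented here):

# Minimal sketch: fetch robots.txt and compute a SHA-256 digest of the body.
# Assumption: the report's "Hash" field is SHA-256 over the raw response bytes.
import hashlib
import urllib.request

URL = "https://blowoutmedical.com/robots.txt"

with urllib.request.urlopen(URL, timeout=10) as resp:
    body = resp.read()

print("SHA-256:", hashlib.sha256(body).hexdigest())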

Groups

*

Rule Path
Disallow /lib/
Disallow /*.php$
Disallow /pkginfo/
Disallow /report/
Disallow /var/
Disallow /catalog/
Disallow /customer/
Disallow /sendfriend/
Disallow /review/
Disallow /*SID%3D
Disallow /ordertracking/
Disallow *?r_p=
Disallow /newsletter/subscriber/new/
Disallow /checkout/
Disallow /onestepcheckout/
Disallow /customer/
Disallow /customer/account/
Disallow /customer/account/loginPost/
Disallow /customer/account/createpost/
Disallow /catalogsearch/
Disallow /catalog/product_compare/
Disallow /catalog/category/view/
Disallow /catalog/product/view/
Disallow /app/
Disallow /bin/
Disallow /dev/
Disallow /lib/
Disallow /phpserver/
Disallow /tag/
Disallow /review/
Disallow /composer.json
Disallow /composer.lock
Disallow /CONTRIBUTING.md
Disallow /CONTRIBUTOR_LICENSE_AGREEMENT.html
Disallow /COPYING.txt
Disallow /Gruntfile.js
Disallow /LICENSE.txt
Disallow /LICENSE_AFL.txt
Disallow /nginx.conf.sample
Disallow /package.json
Disallow /php.ini.sample
Disallow /RELEASE_NOTES.txt
Disallow /*?*product_list_mode=
Disallow /*?*product_list_order=
Disallow /*?*product_list_limit=
Disallow /*?*product_list_dir=
Disallow /*.git
Disallow /*.CVS
Disallow /*.Zip$
Disallow /*.Svn$
Disallow /*.Idea$
Disallow /*.Sql$
Disallow /*.Tgz$
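
Several of these rules use the wildcard extensions that most major crawlers honour: '*' matches any run of characters and a trailing '$' anchors the end of the URL, so /*.php$ blocks any URL ending in .php and /*?*product_list_order= blocks sorted listing pages. A rough sketch of that matching logic (a simplified illustration, not the parser any particular crawler actually uses):

# Rough sketch of robots.txt wildcard matching: '*' matches any characters,
# a trailing '$' anchors the end of the URL; otherwise it is a prefix match.
# Simplified illustration only, not a spec-complete implementation.
import re

def rule_matches(pattern: str, path: str) -> bool:
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then turn '*' back into '.*'.
    regex = "^" + ".*".join(re.escape(part) for part in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.search(regex, path) is not None

# Hypothetical URLs checked against rules from this file:
print(rule_matches("/*.php$", "/index.php"))          # True
print(rule_matches("/*.php$", "/index.php?x=1"))      # False: '$' anchors the end
print(rule_matches("/*?*product_list_order=", "/shoes?product_list_order=price"))  # True
print(rule_matches("/checkout/", "/checkout/cart/"))  # True: plain prefix match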

Other Records

Field Value
sitemap https://www.blowoutmedical.com/sitemap.xml
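
For a crawler written in Python, the standard-library robotparser reads both the rules and this Sitemap record. A quick sketch (caveat: urllib.robotparser only does plain prefix matching, so the '*'/'$' wildcard rules above are not interpreted the way major search engines interpret them):

# Quick sketch: read the live robots.txt with the standard library and query it.
# Caveat: urllib.robotparser does plain prefix matching; it does not interpret
# the '*' / '$' wildcard extensions the way major search engines do.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://blowoutmedical.com/robots.txt")
rp.read()

print(rp.can_fetch("*", "https://blowoutmedical.com/checkout/"))  # expect False
print(rp.can_fetch("*", "https://blowoutmedical.com/"))           # expect True
print(rp.site_maps())  # ['https://www.blowoutmedical.com/sitemap.xml'] on Python 3.8+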

Comments

  • ****************************************************************************
  • robots.txt
  • : Robots, spiders, and search engines use this file to determine which
  • content they should *not* crawl while indexing your website.
  • : This system is called "The Robots Exclusion Standard."
  • : It is strongly encouraged to use a robots.txt validator to check
  • for valid syntax before any robots read it!
  • Examples:
  • Instruct all robots to stay out of the admin area.
  • : User-agent: *
  • : Disallow: /ad6b091c_admin/
  • Restrict Google and MSN from indexing your images.
  • : User-agent: Googlebot
  • : Disallow: /images/
  • : User-agent: MSNBot
  • : Disallow: /images/
  • ****************************************************************************
  • Disable newsletter, checkout & customer account
  • Disable Search pages
  • Disable common folders
  • Disallow: /pub/
  • Disable Tag & Review (Avoid duplicate content)
  • Common files
  • Disable sorting (Avoid duplicate content)
  • Disable version control folders and others