zw3b.com
robots.txt

Robots Exclusion Standard data for zw3b.com

Resource Scan

Scan Details

Site Domain zw3b.com
Base Domain zw3b.com
Scan Status Ok
Last Scan 2024-11-12T16:11:47+00:00
Next Scan 2024-11-19T16:11:47+00:00

Last Scan

Scanned 2024-11-12T16:11:47+00:00
URL https://zw3b.com/robots.txt
Redirect https://www.zw3b.com/robots.txt
Redirect Domain www.zw3b.com
Redirect Base zw3b.com
Domain IPs 158.69.126.137, 2607:5300:60:9389::1
Redirect IPs 158.69.126.137, 2607:5300:60:9389::1
Response IP 158.69.126.137
Found Yes
Hash 991ffc423c4f9eaa8cb0f3126650f09928097d997aed3e96c2f87e7ab52f7782
SimHash 2e311b5567f0
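The Hash and SimHash fields fingerprint the fetched file: if either value changes between scans, the robots.txt has been modified. A minimal sketch of how a content digest like the Hash field could be computed (the sample body is an assumption; in the scan it would be the HTTP response from the redirect target):

```python
import hashlib

# A robots.txt body as raw bytes (an illustrative stand-in for the
# response fetched from https://www.zw3b.com/robots.txt).
body = b"User-agent: *\nDisallow: /\n"

# SHA-256 over the raw bytes -- the same kind of digest as the scan's
# "Hash" field; any edit to the file changes the digest completely.
digest = hashlib.sha256(body).hexdigest()
print(digest)
```

Unlike SHA-256, the SimHash is locality-sensitive: similar files yield similar values, so it can estimate how much the file changed rather than just whether it changed.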

Groups

adsbot-google
adsbot-google-mobile
adsbot-google-mobile-apps
adidxbot
applebot
applenewsbot
baiduspider
baiduspider-image
baiduspider-news
baiduspider-video
bingbot
bingpreview
bublupbot
ccbot
cliqzbot
coccoc
coccocbot-image
coccocbot-web
daumoa
dazoobot
deusu
duckduckbot
duckduckgo-favicons-bot
euripbot
exploratodo
facebot
feedly
findxbot
gooblog
googlebot
googlebot-image
googlebot-mobile
googlebot-news
googlebot-video
haosouspider
ichiro
istellabot
jikespider
lycos
mail.ru
mediapartners-google
mojeekbot
msnbot
msnbot-media
orangebot
pinterest
plukkie
qwantify
rambler
seznambot
sosospider
slurp
sogou blog
sogou inst spider
sogou news spider
sogou orion spider
sogou spider2
sogou web spider
sputnikbot
teoma
twitterbot
wotbox
yacybot
yandex
yandexmobilebot
yeti
yioopbot
yoozbot
youdaobot

Rule Path
Disallow
Disallow /actions/
Disallow /ajax/
Disallow /pub/AWStats/

*

Rule Path
Disallow /
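The two groups above form a whitelist: the named bots may crawl everything except three paths, while every other user-agent is disallowed entirely. This can be checked with Python's standard urllib.robotparser (the rules below are pasted from the scan, with googlebot standing in for the whole whitelisted group; feeding them via parse() rather than fetching is an assumption for illustration):

```python
from urllib.robotparser import RobotFileParser

# The rule set recorded by the scan, reduced to one whitelisted bot.
rules = """\
User-agent: googlebot
Disallow: /actions/
Disallow: /ajax/
Disallow: /pub/AWStats/

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("googlebot", "https://www.zw3b.com/"))        # True: allowed
print(rp.can_fetch("googlebot", "https://www.zw3b.com/ajax/x"))  # False: blocked path
print(rp.can_fetch("somebot", "https://www.zw3b.com/"))          # False: not whitelisted
```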

Other Records

Field Value
sitemap https://www.zw3b.site/sitemap.xml
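Sitemap is one of the few records that sits outside any user-agent group, and it may point to a different host than the robots.txt itself, as it does here (zw3b.site rather than zw3b.com). A sketch of extracting such records from a robots.txt body (line-by-line splitting is a simplifying assumption; real parsers are more lenient):

```python
# Pull "Sitemap:" records out of a robots.txt body; these apply
# globally rather than to any one user-agent group.
robots = """\
User-agent: *
Disallow: /
Sitemap: https://www.zw3b.site/sitemap.xml
"""

sitemaps = [
    line.split(":", 1)[1].strip()
    for line in robots.splitlines()
    if line.lower().startswith("sitemap:")
]
print(sitemaps)  # ['https://www.zw3b.site/sitemap.xml']
```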

Comments

  • ROBOTS.TXT
  • Alphabetically ordered whitelisting of legitimate web robots, which obey the
  • Robots Exclusion Standard (robots.txt). Each bot is briefly described in a
  • comment above the (list of) user-agent(s). Comment out or delete lines which
  • contain User-agents you do not wish to allow on your website.
  • Important: Blank lines are not allowed in the final robots.txt file!
  • Updates can be retrieved from: https://www.ditig.com/robots-txt-template
  • This document is licensed with a CC BY-NC-SA 4.0 license.
  • Last update: 2021-11-04
  • so.com chinese search engine
  • google.com landing page quality checks
  • google.com app resource fetcher
  • bing ads bot
  • apple.com search engine
  • baidu.com chinese search engine
  • bing.com international search engine
  • bublup.com suggestion/search engine
  • commoncrawl.org open repository of web crawl data
  • cliqz.com german in-product search engine
  • coccoc.com vietnamese search engine
  • daum.net korean search engine
  • dazoo.fr french search engine
  • deusu.de german search engine
  • duckduckgo.com international privacy search engine
  • eurip.com european search engine
  • exploratodo.com latin search engine
  • facebook.com social network
  • feedly.com feed fetcher
  • findx.com european search engine
  • goo.ne.jp japanese search engine
  • google.com international search engine
  • so.com chinese search engine
  • goo.ne.jp japanese search engine
  • istella.it italian search engine
  • jike.com / chinaso.com chinese search engine
  • lycos.com & hotbot.com international search engine
  • mail.ru russian search engine
  • google.com adsense bot
  • mojeek.com search engine
  • bing.com international search engine
  • orange.com international search engine
  • pinterest.com social network
  • botje.nl dutch search engine
  • qwant.com french search engine
  • rambler.ru russian search engine
  • seznam.cz czech search engine
  • soso.com chinese search engine
  • yahoo.com international search engine
  • sogou.com chinese search engine
  • sputnik.ru russian search engine
  • ask.com international search engine
  • twitter.com bot
  • wotbox.com international search engine
  • yacy.net p2p search software
  • yandex.com russian search engine
  • search.naver.com south korean search engine
  • yioop.com international search engine
  • yooz.ir iranian search engine
  • youdao.com chinese search engine
  • crawling rule(s) for above bots
  • disallow all other bots

Warnings

  • 3 invalid lines.
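The scan does not say which three lines were invalid, but a scanner's line check typically requires each non-blank, non-comment line to be a "field: value" pair with a recognized field name. A minimal sketch of such a check (the accepted field set is an assumption):

```python
# Simplified robots.txt line validation: flag lines that are neither
# blank, nor comments, nor "field: value" pairs with a known field.
KNOWN_FIELDS = {"user-agent", "disallow", "allow", "sitemap", "crawl-delay"}

def invalid_lines(text: str) -> list[int]:
    bad = []
    for no, line in enumerate(text.splitlines(), start=1):
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # blank lines and comments are always valid
        field, sep, _ = stripped.partition(":")
        if not sep or field.strip().lower() not in KNOWN_FIELDS:
            bad.append(no)
    return bad

print(invalid_lines("User-agent: *\noops\nDisallow: /\n"))  # [2]
```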