wikimultia.org
robots.txt

Robots Exclusion Standard data for wikimultia.org

Resource Scan

Scan Details

Site Domain wikimultia.org
Base Domain wikimultia.org
Scan Status Ok
Last Scan 2026-02-01T16:27:14+00:00
Next Scan 2026-02-08T16:27:14+00:00

Last Scan

Scanned 2026-02-01T16:27:14+00:00
URL https://wikimultia.org/robots.txt
Domain IPs 80.87.194.3
Response IP 80.87.194.3
Found Yes
Hash dcd46b3c9276eca3b7a7a9f78f3bcb76628dcfcc25523e41d44a91ce6c6d39ca
SimHash a2586959edf5
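
The Hash above is 64 hexadecimal characters, which is consistent with SHA-256. A minimal sketch, assuming that digest algorithm, of how the Found and Hash fields of this scan could be reproduced in Python:

    import hashlib
    import urllib.request

    EXPECTED = "dcd46b3c9276eca3b7a7a9f78f3bcb76628dcfcc25523e41d44a91ce6c6d39ca"

    with urllib.request.urlopen("https://wikimultia.org/robots.txt") as resp:
        body = resp.read()

    # Found: the file was served successfully; Hash: digest of its bytes.
    print("found:", resp.status == 200)
    print("hash matches scan:", hashlib.sha256(body).hexdigest() == EXPECTED)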

Groups

mj12bot

Rule Path
Disallow /

israbot

Rule Path
Disallow

orthogaffe

Rule Path
Disallow

ubicrawler

Rule Path
Disallow /

doc

Rule Path
Disallow /

zao

Rule Path
Disallow /

sitecheck.internetseer.com

Rule Path
Disallow /

zealbot

Rule Path
Disallow /

msiecrawler

Rule Path
Disallow /

sitesnagger

Rule Path
Disallow /

webstripper

Rule Path
Disallow /

webcopier

Rule Path
Disallow /

fetch

Rule Path
Disallow /

offline explorer

Rule Path
Disallow /

teleport

Rule Path
Disallow /

teleportpro

Rule Path
Disallow /

webzip

Rule Path
Disallow /

linko

Rule Path
Disallow /

httrack

Rule Path
Disallow /

microsoft.url.control

Rule Path
Disallow /

xenu

Rule Path
Disallow /

larbin

Rule Path
Disallow /

libwww

Rule Path
Disallow /

zyborg

Rule Path
Disallow /

download ninja

Rule Path
Disallow /

fast

Rule Path
Disallow /

wget

Rule Path
Disallow /

grub-client

Rule Path
Disallow /

k2spider

Rule Path
Disallow /

npbot

Rule Path
Disallow /

webreaper

Rule Path
Disallow /

*

Rule Path
Allow /api.php?action=mobileview&
Allow /load.php?
Allow /api/rest_v1/?doc
Allow /sitemap.xml
Disallow /index.php?
Disallow /index.php?diff=
Disallow /index.php?oldid=
Disallow /index.php?title=Help
Disallow /index.php?title=Image
Disallow /index.php?title=MediaWiki
Disallow /index.php?title=Special%3A
Disallow /index.php?title=Template
Disallow /skins/
Disallow /api/
Disallow /trap/
Disallow /wiki/Special%3A
Disallow /wiki/%D0%A1%D0%BF%D0%B5%D1%86%D0%B8%D0%B0%D0%BB%D1%8C%D0%BD%D1%8B%D0%B5%3ASearch
Disallow /wiki/%D0%A3%D1%87%D0%B0%D1%81%D1%82%D0%BD%D0%B8%D0%BA%3A
Disallow /wiki/%D0%A3%D1%87%D0%B0%D1%81%D1%82%D0%BD%D0%B8%D1%86%D0%B0%3A
Disallow /wiki/%D0%9E%D0%B1%D1%81%D1%83%D0%B6%D0%B4%D0%B5%D0%BD%D0%B8%D0%B5_%D1%83%D1%87%D0%B0%D1%81%D1%82%D0%BD%D0%B8%D0%BA%D0%B0%3A
Disallow /wiki/%D0%9E%D0%B1%D1%81%D1%83%D0%B6%D0%B4%D0%B5%D0%BD%D0%B8%D0%B5_%D1%83%D1%87%D0%B0%D1%81%D1%82%D0%BD%D0%B8%D1%86%D1%8B%3A
Disallow /wiki/%D0%A8%D0%B0%D0%B1%D0%BB%D0%BE%D0%BD%3A
Disallow /wiki/%D0%9E%D0%B1%D1%81%D1%83%D0%B6%D0%B4%D0%B5%D0%BD%D0%B8%D0%B5_%D1%88%D0%B0%D0%B1%D0%BB%D0%BE%D0%BD%D0%B0%3A
Disallow /wiki/%D0%A1%D0%B2%D0%BE%D0%B9%D1%81%D1%82%D0%B2%D0%BE%3A
Disallow /wiki/%D0%9E%D0%B1%D1%81%D1%83%D0%B6%D0%B4%D0%B5%D0%BD%D0%B8%D0%B5_%D1%81%D0%B2%D0%BE%D0%B9%D1%81%D1%82%D0%B2%D0%B0%3A
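
One way to sanity-check the * group above is Python's standard-library robots.txt parser, which applies the first matching rule in file order, so the Allow lines take precedence over the broader Disallow lines that follow them. A minimal sketch; the user agent "MyExampleBot" and the test paths are illustrative, not taken from the scan:

    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://wikimultia.org/robots.txt")
    rp.read()  # fetch and parse the live file

    # "MyExampleBot" matches none of the named groups, so the * group applies.
    for path in (
        "/wiki/Main_Page",            # no rule matches: allowed by default
        "/load.php?modules=startup",  # Allow /load.php? applies
        "/index.php?diff=12345",      # Disallow /index.php? applies
        "/trap/",                     # Disallow /trap/ applies
    ):
        print(path, "->", rp.can_fetch("MyExampleBot", path))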

Other Records

Field Value
sitemap https://wikimultia.org/sitemap/sitemap-index-wikimultia.xml
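
The record above points at a sitemap index rather than a single sitemap. A minimal sketch for listing the child sitemaps it references, assuming the file follows the standard sitemaps.org schema:

    import urllib.request
    import xml.etree.ElementTree as ET

    NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    URL = "https://wikimultia.org/sitemap/sitemap-index-wikimultia.xml"

    with urllib.request.urlopen(URL) as resp:
        root = ET.fromstring(resp.read())

    # A sitemap index wraps <sitemap><loc>...</loc></sitemap> entries.
    for loc in root.findall("sm:sitemap/sm:loc", NS):
        print(loc.text)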

Comments

  • Observed spamming large amounts of https://wikimultia.org/?curid=NNNNNN and ignoring 429 ratelimit responses; claims to respect robots: http://mj12bot.com/
  • Work bots:
  • Crawlers that are kind enough to obey, but which we'd rather not have unless they're feeding search engines.
  • Some bots are known to be trouble, particularly those designed to copy entire sites. Please obey robots.txt.
  • Misbehaving: requests much too fast:
  • Sorry, wget in its recursive mode is a frequent problem. Please read the man page and use it properly; there is a --wait option you can use to set the delay between hits, for instance (see the sketch after this list).
  • The 'grub' distributed client has been *very* poorly behaved.
  • Doesn't follow robots.txt anyway, but... hits many times per second, not acceptable: http://www.nameprotect.com/botinfo.html
  • A capture bot, downloads gazillions of pages with no public benefit: http://www.webreaper.net/
  • For everyone:
  • User pages and subpages (the Участник: and Участница: namespaces)
  • User talk pages and subpages
  • Templates and their talk pages
  • Properties and their talk pages
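
Several of the comments above ask crawlers to pace themselves (wget --wait, the 429 responses MJ12bot ignored). A minimal sketch of a fetcher in that spirit; the 2-second delay and retry count are illustrative values, not anything the site prescribes:

    import time
    import urllib.request
    from urllib.error import HTTPError

    def polite_get(url, delay=2.0, retries=3):
        """Fetch url, pausing between hits and backing off on HTTP 429."""
        for attempt in range(retries):
            try:
                with urllib.request.urlopen(url) as resp:
                    return resp.read()
            except HTTPError as err:
                if err.code != 429:
                    raise
                # Rate-limited: wait longer on each retry instead of hammering.
                time.sleep(delay * 2 ** attempt)
            finally:
                # Fixed pause around every request, in the spirit of wget --wait.
                time.sleep(delay)
        raise RuntimeError("still rate-limited after retries: " + url)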

Warnings

  • 13 invalid lines.