fitnesstrener.ru
robots.txt

Robots Exclusion Standard data for fitnesstrener.ru

Resource Scan

Scan Details

Site Domain fitnesstrener.ru
Base Domain fitnesstrener.ru
Scan Status Ok
Last Scan 2024-10-23T11:07:32+00:00
Next Scan 2024-11-22T11:07:32+00:00

Last Scan

Scanned 2024-10-23T11:07:32+00:00
URL https://fitnesstrener.ru/robots.txt
Domain IPs 77.221.130.42
Response IP 77.221.130.42
Found Yes
Hash 7e38903503e80ead8a84f718f8be4cc81865c8f839a044b63d585a1da9258d9f
SimHash f23e8d4807f2

Groups

*

Rule Path
Disallow /administrator/
Disallow /bin/
Disallow /cache/
Disallow /cli/
Disallow /components/
Disallow /includes/
Disallow /installation/
Disallow /language/
Disallow /layouts/
Disallow /libraries/
Disallow /logs/
Disallow /modules/
Disallow /plugins/
Disallow /tmp/
Allow /*.jpg
Allow /*.png
Allow /*.css
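The catch-all group above can be verified programmatically with Python's standard-library `urllib.robotparser`. A minimal sketch, using a short excerpt of the rules from this report (the wildcard `Allow /*.jpg`-style lines are omitted, since `robotparser` applies plain path-prefix matching rather than wildcard expansion):

```python
from urllib import robotparser

# Excerpt of the rules reported above (wildcard Allow lines omitted).
RULES = """\
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /tmp/

User-agent: dotbot
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# Ordinary pages stay fetchable for generic crawlers...
print(rp.can_fetch("*", "https://fitnesstrener.ru/programs"))        # True
# ...the Joomla system folders are blocked for everyone...
print(rp.can_fetch("*", "https://fitnesstrener.ru/administrator/"))  # False
# ...and dotbot is blocked site-wide.
print(rp.can_fetch("dotbot", "https://fitnesstrener.ru/"))           # False
```

The path `/programs` is a hypothetical example URL; any path outside the disallowed folders behaves the same way.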

dotbot

Rule Path
Disallow /

giftghostbot

Rule Path
Disallow /

seznam

Rule Path
Disallow /

paperlibot

Rule Path
Disallow /

genieo

Rule Path
Disallow /

dataprovider/6.101

Rule Path
Disallow /

dataprovidersiteexplorer

Rule Path
Disallow /

dazoobot/1.0

Rule Path
Disallow /

diffbot

Rule Path
Disallow /

domainstatsbot/1.0

Rule Path
Disallow /

dotbot/1.1

Rule Path
Disallow /

dubaiindex

Rule Path
Disallow /

siteexplorer

Rule Path
Disallow /

ecommercebot

Rule Path
Disallow /

expertsearchspider

Rule Path
Disallow /

feedbin

Rule Path
Disallow /

fetch/2.0a

Rule Path
Disallow /

ffbot/1.0

Rule Path
Disallow /

focusbot/1.1

Rule Path
Disallow /

huaweisymantecspider

Rule Path
Disallow /

huaweisymantecspider/1.0

Rule Path
Disallow /

jobdiggerspider

Rule Path
Disallow /

lemurwebcrawler

Rule Path
Disallow /

lipperheylinkexplorer

Rule Path
Disallow /

lssrocketcrawler/1.0

Rule Path
Disallow /

lyt.srv1.5

Rule Path
Disallow /

miadev/0.0.1

Rule Path
Disallow /

najdi.si/3.1

Rule Path
Disallow /

bountiibot

Rule Path
Disallow /

experibot_v1

Rule Path
Disallow /

bixocrawler

Rule Path
Disallow /

bixocrawler testcrawler

Rule Path
Disallow /

crawler4j

Rule Path
Disallow /

crowsnest/0.5

Rule Path
Disallow /

cukbot

Rule Path
Disallow /

dataprovider/6.92

Rule Path
Disallow /

dblbot/1.0

Rule Path
Disallow /

diffbot/0.1

Rule Path
Disallow /

digg deeper/v1

Rule Path
Disallow /

discobot/1.0

Rule Path
Disallow /

discobot/1.1

Rule Path
Disallow /

discobot/2.0

Rule Path
Disallow /

discoverybot/2.0

Rule Path
Disallow /

dlvr.it/1.0

Rule Path
Disallow /

domainstatsbot/1.0

Rule Path
Disallow /

drupact/0.7

Rule Path
Disallow /

ezooms/1.0

Rule Path
Disallow /

fastbot crawler beta 2.0

Rule Path
Disallow /

fastbot crawler beta 4.0

Rule Path
Disallow /

feedly social

Rule Path
Disallow /

feedly/1.0

Rule Path
Disallow /

feedlybot/1.0

Rule Path
Disallow /

feedspot

Rule Path
Disallow /

feedspotbot/1.0

Rule Path
Disallow /

clickagy intelligence bot v2

Rule Path
Disallow /

classbot

Rule Path
Disallow /

cispa vulnerability notification

Rule Path
Disallow /

cirrusexplorer/1.1

Rule Path
Disallow /

checksem/nutch-1.10

Rule Path
Disallow /

catchbot/5.0

Rule Path
Disallow /

catchbot/3.0

Rule Path
Disallow /

catchbot/2.0

Rule Path
Disallow /

catchbot/1.0

Rule Path
Disallow /

camontspider/1.0

Rule Path
Disallow /

buzzbot/1.0

Rule Path
Disallow /

buzzbot

Rule Path
Disallow /

businessseek.biz_spider

Rule Path
Disallow /

bubing

Rule Path
Disallow /

blexbot

Rule Path
Disallow /

fyberspider/1.3

Rule Path
Disallow /

findlinks/1.1.6-beta5

Rule Path
Disallow /

g2reader-bot/1.0

Rule Path
Disallow /

findlinks/1.1.6-beta6

Rule Path
Disallow /

findlinks/2.0

Rule Path
Disallow /

findlinks/2.0.1

Rule Path
Disallow /

findlinks/2.0.2

Rule Path
Disallow /

findlinks/2.0.4

Rule Path
Disallow /

findlinks/2.0.5

Rule Path
Disallow /

findlinks/2.0.9

Rule Path
Disallow /

findlinks/2.1

Rule Path
Disallow /

findlinks/2.1.5

Rule Path
Disallow /

findlinks/2.1.3

Rule Path
Disallow /

findlinks/2.2

Rule Path
Disallow /

findlinks/2.5

Rule Path
Disallow /

findlinks/2.6

Rule Path
Disallow /

ffbot/1.0

Rule Path
Disallow /

findlinks/1.0

Rule Path
Disallow /

findlinks/1.1.3-beta8

Rule Path
Disallow /

findlinks/1.1.3-beta9

Rule Path
Disallow /

findlinks/1.1.4-beta7

Rule Path
Disallow /

findlinks/1.1.6-beta1

Rule Path
Disallow /

findlinks/1.1.6-beta1 yacy

Rule Path
Disallow /

findlinks/1.1.6-beta2

Rule Path
Disallow /

findlinks/1.1.6-beta3

Rule Path
Disallow /

findlinks/1.1.6-beta4

Rule Path
Disallow /

bixo

Rule Path
Disallow /

bixolabs/1.0

Rule Path
Disallow /

crawlera/1.10.2

Rule Path
Disallow /

dataprovider site explorer

Rule Path
Disallow /

vagabondo

Rule Path
Disallow /

seokicks-robot

Rule Path
Disallow /

sistrix

Rule Path
Disallow /

wbsearchbot

Rule Path
Disallow /

proximic

Rule Path
Disallow /

queryseekerspider

Rule Path
Disallow /

ia_archiver

Rule Path
Disallow /

archive.org_bot

Rule Path
Disallow /

special_archiver

Rule Path
Disallow /

heritrix

Rule Path
Disallow /

netestate ne crawler

Rule Path
Disallow /

ahrefsbot

Rule Path
Disallow /

alexibot

Rule Path
Disallow /

mj12bot

Rule Path
Disallow /

surveybot

Rule Path
Disallow /

xenu's

Rule Path
Disallow /

xenu's link sleuth 1.1c

Rule Path
Disallow /

rogerbot

Rule Path
Disallow /

truliabot

Rule Path
Disallow /

feedjira

Rule Path
Disallow /

semrushbot

Rule Path
Disallow /

semrushbot-sa

Rule Path
Disallow /

semrushbot-ba

Rule Path
Disallow /

semrushbot-si

Rule Path
Disallow /

semrushbot-swa

Rule Path
Disallow /

semrushbot-ct

Rule Path
Disallow /

semrushbot-bm

Rule Path
Disallow /

Other Records

Field Value
sitemap http://www.fitnesstrener.ru/index.php?option=com_xmap&view=xml&tmpl=component&id=1

Comments

  • If the Joomla site is installed within a folder, e.g. www.example.com/joomla/, the robots.txt file MUST be moved to the site root at www.example.com/robots.txt, AND the joomla folder name MUST be prefixed to each disallowed path; e.g. the Disallow rule for the /administrator/ folder MUST be changed to read Disallow: /joomla/administrator/
  • For more information about the robots.txt standard, see: http://www.robotstxt.org/orig.html
  • For syntax checking, see: http://tool.motoricerca.info/robots-checker.phtml
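As a concrete illustration of the adjustment these comments describe, for a hypothetical Joomla install under /joomla/ the root robots.txt would prefix the folder name to each rule, e.g.:

```
# robots.txt at www.example.com/robots.txt, Joomla installed in /joomla/
User-agent: *
Disallow: /joomla/administrator/
Disallow: /joomla/cache/
Disallow: /joomla/tmp/
```

The paths shown are examples only; every Disallow line in the shipped Joomla robots.txt would need the same /joomla/ prefix.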

Warnings

  • 8 invalid lines.