laboratoriolinux.es
robots.txt

Robots Exclusion Standard data for laboratoriolinux.es

Resource Scan

Scan Details

Site Domain laboratoriolinux.es
Base Domain laboratoriolinux.es
Scan Status Ok
Last Scan 2025-12-30T03:25:06+00:00
Next Scan 2026-01-29T03:25:06+00:00

Last Scan

Scanned 2025-12-30T03:25:06+00:00
URL https://laboratoriolinux.es/robots.txt
Domain IPs 2001:8d8:100f:f000::200, 217.160.0.132
Response IP 217.160.0.132
Found Yes
Hash 3c474b4265d2603a9ebf258aca491f1dfb781b62972a49b15e48222d5f580ebb
SimHash 221f1d1a43fa

Groups

*

Rule Path
Disallow /administrator/
Disallow /api/
Disallow /bin/
Disallow /cache/
Disallow /cli/
Disallow /components/
Disallow /includes/
Disallow /installation/
Disallow /language/
Disallow /layouts/
Disallow /libraries/
Disallow /logs/
Disallow /modules/
Disallow /plugins/
Disallow /tmp/
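
A minimal sketch of how a crawler could evaluate the wildcard group above, assuming Python's standard urllib.robotparser and a hypothetical local excerpt of the rules (in practice the file would be fetched from https://laboratoriolinux.es/robots.txt):

    from urllib import robotparser

    # Hypothetical excerpt of the "*" group shown above.
    rules = [
        "User-agent: *",
        "Disallow: /administrator/",
        "Disallow: /api/",
        "Disallow: /cache/",
        "Disallow: /tmp/",
    ]

    parser = robotparser.RobotFileParser()
    parser.parse(rules)

    # Paths under a disallowed prefix are blocked; everything else is allowed.
    print(parser.can_fetch("*", "https://laboratoriolinux.es/administrator/index.php"))  # False
    print(parser.can_fetch("*", "https://laboratoriolinux.es/index.php"))                # True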

seokicks

No rules defined. All paths allowed.

Other Records

Field Value
crawl-delay 5

msnbot

No rules defined. All paths allowed.

Other Records

Field Value
crawl-delay 5
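
The seokicks and msnbot groups carry only a non-standard crawl-delay record. A minimal sketch of honouring it, assuming Python's urllib.robotparser (3.6+), which exposes the value via crawl_delay():

    import time
    from urllib import robotparser

    # Hypothetical excerpt of the msnbot group shown above.
    rules = [
        "User-agent: msnbot",
        "Crawl-delay: 5",
    ]

    parser = robotparser.RobotFileParser()
    parser.parse(rules)

    delay = parser.crawl_delay("msnbot") or 0  # 5 for msnbot; None for agents without a record
    print(f"Pausing {delay}s between requests")
    time.sleep(delay)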

msiecrawler

Rule Path
Disallow /

webcopier

Rule Path
Disallow /

httrack

Rule Path
Disallow /

microsoft.url.control

Rule Path
Disallow /

libwww

Rule Path
Disallow /

baiduspider

Rule Path
Disallow /

gurujibot

Rule Path
Disallow /

hl_ftien_spider

Rule Path
Disallow /

sogou spider

Rule Path
Disallow /

yeti

Rule Path
Disallow /

yodaobot

Rule Path
Disallow /

mj12bot

Rule Path
Disallow /

yandexbot

Rule Path
Disallow /

semrushbot-b

Rule Path
Disallow /

ahrefsbot

Rule Path
Disallow /

barkrowler

Rule Path
Disallow /
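
Each of the agent-specific groups above carries a single "Disallow /" rule, which blocks that bot from the entire site while other crawlers fall through to the "*" group. A minimal sketch, assuming Python's urllib.robotparser and a hypothetical two-group excerpt:

    from urllib import robotparser

    rules = [
        "User-agent: ahrefsbot",
        "Disallow: /",
        "",
        "User-agent: *",
        "Disallow: /administrator/",
    ]

    parser = robotparser.RobotFileParser()
    parser.parse(rules)

    # The named bot is denied everywhere; an unlisted bot only loses /administrator/.
    print(parser.can_fetch("ahrefsbot", "https://laboratoriolinux.es/index.php"))     # False
    print(parser.can_fetch("SomeOtherBot", "https://laboratoriolinux.es/index.php"))  # True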

Comments

  • If the Joomla site is installed within a folder, e.g. www.example.com/joomla/, then the robots.txt file MUST be moved to the site root, e.g. www.example.com/robots.txt, AND the joomla folder name MUST be prefixed to all of the paths, e.g. the Disallow rule for the /administrator/ folder MUST be changed to read Disallow: /joomla/administrator/
  • For more information about the robots.txt standard, see: https://www.robotstxt.org/orig.html
  • User-agent: Googlebot
  • Crawl-delay: 5
  • List of blocked bots
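
A minimal sketch of the path prefixing described in the first comment, assuming a hypothetical installation folder named joomla and a small excerpt of the Disallow lines above:

    # Prepend the Joomla folder name to every Disallow path (hypothetical folder "/joomla").
    FOLDER = "/joomla"

    lines = [
        "User-agent: *",
        "Disallow: /administrator/",
        "Disallow: /api/",
        "Disallow: /tmp/",
    ]

    for line in lines:
        if line.lower().startswith("disallow:"):
            field, path = line.split(":", 1)
            print(f"{field}: {FOLDER}{path.strip()}")
        else:
            print(line)
    # e.g. "Disallow: /administrator/" becomes "Disallow: /joomla/administrator/"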