dehesadelaserna.com
robots.txt

Robots Exclusion Standard data for dehesadelaserna.com

Resource Scan

Scan Details

Site Domain dehesadelaserna.com
Base Domain dehesadelaserna.com
Scan Status Ok
Last Scan 2024-09-12T02:37:35+00:00
Next Scan 2024-10-12T02:37:35+00:00

Last Scan

Scanned 2024-09-12T02:37:35+00:00
URL https://dehesadelaserna.com/robots.txt
Domain IPs 82.98.168.209
Response IP 82.98.168.209
Found Yes
Hash 2a2c603c634797976c3c1445498880a5749fc8c015cf91e6c49a9a811e18645c
SimHash a0dc583bc8c5
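The Hash value is 64 hexadecimal characters, consistent with a SHA-256 digest of the fetched file, while the SimHash is a locality-sensitive fingerprint typically used to spot near-duplicate revisions between scans. A minimal sketch of how such a content hash could be reproduced, assuming (the report does not say) that it is SHA-256 over the raw response body:

```python
# Hedged sketch: recompute a content hash for the scanned robots.txt.
# Assumption: the report's Hash is a SHA-256 hex digest of the raw body.
import hashlib
import urllib.request

URL = "https://dehesadelaserna.com/robots.txt"

with urllib.request.urlopen(URL) as resp:
    body = resp.read()

print(hashlib.sha256(body).hexdigest())  # compare with the Hash field above
```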

Groups

*

Rule Path
Disallow /wp-login
Disallow /wp-admin/
Disallow /wp-includes/
Disallow /wp-content/plugins/
Disallow /wp-content/cache/
Disallow /wp-content/themes/
Disallow /*/attachment/
Disallow /author/
Disallow /*/p/
Disallow /p/*/
Disallow /p/
Disallow /downloads/*/
Disallow /*/category/
Disallow /category/*/
Disallow /*/page/
Disallow /page/*/
Disallow /tag/*/page/
Disallow /tag/*/feed/
Disallow /page/
Disallow /comments/
Disallow /xmlrpc.php
Disallow /?attachment_id*
Disallow *.pdf$
Disallow *.xls$
Disallow *.xlsx$
Disallow *.csv$
Disallow /*?

*

Rule Path
Disallow /*?s=
Disallow /search

*

Rule Path
Disallow /trackback
Disallow /*trackback
Disallow /*trackback*
Disallow /*/trackback

*

Rule Path
Allow /feed/$
Disallow /feed/
Disallow /comments/feed/
Disallow /*/feed/
Disallow /*/feed/$
Disallow /*/feed/rss/$
Disallow /*/trackback/$
Disallow /*/*/feed/$
Disallow /*/*/feed/rss/$
Disallow /*/*/trackback/$
Disallow /*/*/*/feed/$
Disallow /*/*/*/feed.xml
Disallow /*/*/*/feed/rss/$
Disallow /*/*/*/trackback/$
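This group is the one place in the file where an Allow rule (`/feed/$`) has to override broader Disallow rules (`/feed/`, `/*/feed/`). Under RFC 9309, wildcard-aware crawlers pick the most specific (longest) matching rule, with Allow winning ties. A small illustrative sketch of that evaluation, not taken from any particular crawler:

```python
# Illustrative sketch of longest-match rule evaluation for one user-agent group.
import re

def rule_to_regex(path: str) -> re.Pattern:
    """Translate a robots.txt rule path ('*' wildcard, '$' end anchor) into a regex."""
    pattern = "".join(
        ".*" if ch == "*" else "$" if ch == "$" else re.escape(ch)
        for ch in path
    )
    return re.compile(pattern)

def is_allowed(url_path, rules):
    """rules: (directive, path) pairs from the matched user-agent group."""
    best_len, best_allow = -1, True      # no matching rule at all -> allowed
    for directive, path in rules:
        if rule_to_regex(path).match(url_path):
            allow = directive.lower() == "allow"
            # longest (most specific) rule wins; Allow wins a tie
            if len(path) > best_len or (len(path) == best_len and allow):
                best_len, best_allow = len(path), allow
    return best_allow

group = [("Allow", "/feed/$"), ("Disallow", "/feed/"), ("Disallow", "/*/feed/")]
print(is_allowed("/feed/", group))       # True  - the bare feed stays crawlable
print(is_allowed("/feed/rss/", group))   # False - deeper feed paths are blocked
print(is_allowed("/blog/feed/", group))  # False - per-post feeds are blocked
```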

noxtrumbot

No rules defined. All paths allowed.

Other Records

Field Value
crawl-delay 20

msnbot

No rules defined. All paths allowed.

Other Records

Field Value
crawl-delay 20

slurp

No rules defined. All paths allowed.

Other Records

Field Value
crawl-delay 20
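The three groups above carry no path rules, only a Crawl-delay of 20 seconds each. Crawl-delay is not part of RFC 9309, but Python's standard-library parser does expose it; a quick way to confirm these values against the live file (note that urllib.robotparser does only simple prefix matching for paths, so it is used here just for the delay fields):

```python
# Check the per-bot Crawl-delay values reported above with the standard library.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser("https://dehesadelaserna.com/robots.txt")
rp.read()

for agent in ("noxtrumbot", "msnbot", "slurp"):
    print(agent, rp.crawl_delay(agent))  # this scan records 20 for each
```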

googlebot

Rule Path
Allow /*.css$
Allow /*.js$

orthogaffe

Rule Path
Disallow /

ubicrawler

Rule Path
Disallow /

doc

Rule Path
Disallow /

zao

Rule Path
Disallow /

twiceler

Rule Path
Disallow /

sitecheck.internetseer.com

Rule Path
Disallow /

zealbot

Rule Path
Disallow /

msiecrawler

Rule Path
Disallow /

sitesnagger

Rule Path
Disallow /

webstripper

Rule Path
Disallow /

webcopier

Rule Path
Disallow /

fetch

Rule Path
Disallow /

offline explorer

Rule Path
Disallow /

teleport

Rule Path
Disallow /

teleportpro

Rule Path
Disallow /

webzip

Rule Path
Disallow /

linko

Rule Path
Disallow /

httrack

Rule Path
Disallow /

microsoft.url.control

Rule Path
Disallow /

xenu

Rule Path
Disallow /

larbin

Rule Path
Disallow /

libwww

Rule Path
Disallow /

zyborg

Rule Path
Disallow /

download ninja

Rule Path
Disallow /

nutch

Rule Path
Disallow /

spock

Rule Path
Disallow /

omniexplorer_bot

Rule Path
Disallow /

turnitinbot

Rule Path
Disallow /

becomebot

Rule Path
Disallow /

geniebot

Rule Path
Disallow /

dotbot

Rule Path
Disallow /

mlbot

Rule Path
Disallow /

linguee bot

Rule Path
Disallow /

aihitbot

Rule Path
Disallow /

exabot

Rule Path
Disallow /

sbider/nutch

Rule Path
Disallow /

jyxobot

Rule Path
Disallow /

magent

Rule Path
Disallow /

mj12bot

Rule Path
Disallow /

speedy spider

Rule Path
Disallow /

shopwiki

Rule Path
Disallow /

huasai

Rule Path
Disallow /

datacha0s

Rule Path
Disallow /

baiduspider

Rule Path
Disallow /

atomic_email_hunter

Rule Path
Disallow /

mp3bot

Rule Path
Disallow /

winhttp

Rule Path
Disallow /

betabot

Rule Path
Disallow /

core-project

Rule Path
Disallow /

panscient.com

Rule Path
Disallow /

java

Rule Path
Disallow /

libwww-perl

Rule Path
Disallow /

wget

Rule Path
Disallow /

webreaper

Rule Path
Disallow /

grub-client

Rule Path
Disallow /

k2spider

Rule Path
Disallow /

npbot

Rule Path
Disallow /

Other Records

Field Value
sitemap https://dehesadelaserna.com/sitemap_index.xml
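The Sitemap record points crawlers at the site's WordPress sitemap index. On Python 3.8+ the same standard-library parser can list it; a brief sketch:

```python
# Read the Sitemap record(s) from the live robots.txt (Python 3.8+).
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser("https://dehesadelaserna.com/robots.txt")
rp.read()
print(rp.site_maps())  # this scan records ['https://dehesadelaserna.com/sitemap_index.xml']
```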

Comments

  • Basic blocking for all bots and crawlers
  • may cause problems due to blocked resources in Google Search Console
  • De-index pages and tags
  • Block dynamic URLs
  • Block searches
  • Block trackbacks
  • Block feeds for crawlers
  • We slow down some bots that tend to go wild
  • Prevents blocked-resource problems in Google Search Console
  • Block bots and crawlers of little use

Warnings

  • 2 invalid lines.
  • `noindex` is not a known field.