seools.com
robots.txt

Robots Exclusion Standard data for seools.com

Resource Scan

Scan Details

Site Domain seools.com
Base Domain seools.com
Scan Status Ok
Last Scan 2025-12-19T10:39:17+00:00
Next Scan 2025-12-26T10:39:17+00:00

Last Scan

Scanned 2025-12-19T10:39:17+00:00
URL https://seools.com/robots.txt
Domain IPs 185.125.78.117
Response IP 185.125.78.117
Found Yes
Hash 3882606f7b3aa070dec0d88ed2eec7876c80c72e3987a8310f8cd682de07cbc8
SimHash 11f65fb04f3e
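The Hash field looks like a SHA-256 digest of the fetched robots.txt body. A minimal sketch for reproducing it locally, assuming that interpretation (the digest semantics are an assumption, not documented by the scan):

```python
# Sketch: reproduce the scan's content hash locally, assuming "Hash"
# is the SHA-256 hex digest of the raw robots.txt response body.
import hashlib
import urllib.request

def sha256_hex(body: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string."""
    return hashlib.sha256(body).hexdigest()

def fetch_robots_hash(url: str = "https://seools.com/robots.txt") -> str:
    """Fetch a robots.txt file and hash its body (network access required)."""
    with urllib.request.urlopen(url) as resp:
        return sha256_hex(resp.read())
```

The result should match the reported hash only while the file is unchanged; any edit to the robots.txt produces a different digest.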

Groups

googlebot

Rule Path
Allow /

googlebot-image

Rule Path
Allow /

googlebot-video

Rule Path
Allow /

googlebot-news

Rule Path
Allow /

adsbot-google

Rule Path
Allow /

adsbot-google-mobile

Rule Path
Allow /

mediapartners-google

Rule Path
Allow /

google-inspectiontool

Rule Path
Allow /

bingbot

Rule Path
Allow /

bingpreview

Rule Path
Allow /

applebot

Rule Path
Allow /

duckduckbot

Rule Path
Allow /

slurp

Rule Path
Allow /

ecosia-bot

Rule Path
Allow /

qwantify

Rule Path
Allow /

mojeekbot

Rule Path
Allow /

seznambot

Rule Path
Allow /

ahrefsbot

Rule Path
Disallow /

semrushbot

Rule Path
Disallow /

dotbot

Rule Path
Disallow /

mj12bot

Rule Path
Disallow /

blexbot

Rule Path
Disallow /

dataforseobot

Rule Path
Disallow /

barkrowler

Rule Path
Disallow /

ccbot

Rule Path
Disallow /

seokicks-robot

Rule Path
Disallow /

sistrix crawler

Rule Path
Disallow /

sistrix

Rule Path
Disallow /

serpstatbot

Rule Path
Disallow /

serankingbot

Rule Path
Disallow /

siteauditbot

Rule Path
Disallow /

screaming frog seo spider

Rule Path
Disallow /

sitebulb

Rule Path
Disallow /

deepcrawl

Rule Path
Disallow /

lumarbot

Rule Path
Disallow /

rytebot

Rule Path
Disallow /

oncrawl

Rule Path
Disallow /

linkdexbot

Rule Path
Disallow /

seoscanners.net

Rule Path
Disallow /

woorank

Rule Path
Disallow /

seobilitybot

Rule Path
Disallow /

rankactivelinkbot

Rule Path
Disallow /

netestate ne crawler

Rule Path
Disallow /

mail.ru_bot

Rule Path
Disallow /

pinterestbot

Rule Path
Disallow /

facebookexternalhit

Rule Path
Disallow /

twitterbot

Rule Path
Disallow /

slackbot

Rule Path
Disallow /

whatsapp

Rule Path
Disallow /

telegrambot

Rule Path
Disallow /

discordbot

Rule Path
Disallow /

crawler

Rule Path
Disallow /

python-requests

Rule Path
Disallow /

python-httpx

Rule Path
Disallow /

go-http-client

Rule Path
Disallow /

java

Rule Path
Disallow /

libwww-perl

Rule Path
Disallow /

curl

Rule Path
Disallow /

wget

Rule Path
Disallow /

*

Rule Path
Disallow /wp-admin/
Allow /wp-admin/admin-ajax.php
Disallow /wp-includes/
Allow /wp-includes/js/
Allow /wp-includes/images/
Disallow /trackback/
Disallow /wp-login.php
Disallow /wp-register.php

*

Rule Path
Disallow /
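In actual robots.txt syntax, the groups above translate one-to-one into User-agent records. A reconstructed excerpt (one allowed engine, one blocked SEO crawler, and the wildcard WordPress group), assuming the scanner reports rules in file order:

```
User-agent: googlebot
Allow: /

User-agent: ahrefsbot
Disallow: /

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-includes/
Allow: /wp-includes/js/
Allow: /wp-includes/images/
Disallow: /trackback/
Disallow: /wp-login.php
Disallow: /wp-register.php
```

Note the file ends with a second wildcard group containing `Disallow: /`; under RFC 9309 a crawler uses the single group (or merged groups) matching its user agent, so the allowed engines listed earlier are unaffected by that catch-all.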

Comments

  • ==========================================
  • robots.txt - Allow only trusted search engines
  • and block scrapers / SEO tools
  • ==========================================
  • --------- GOOGLE ---------
  • --------- BING / MICROSOFT ---------
  • --------- APPLE (Siri / Spotlight) ---------
  • --------- DUCKDUCKGO ---------
  • --------- YAHOO (inherits from Bing, but allowed for compatibility) ---------
  • --------- ECOSIA ---------
  • (uses Bing's results, but has its own bot)
  • --------- QWANT ---------
  • --------- MOJEEK (independent search engine) ---------
  • --------- SEZNAM (Czech Republic) ---------
  • --------- OPTIONAL (uncomment if those markets matter to you) ---------
  • Yandex (RU/Eastern Europe)
  • User-agent: Yandex
  • Allow: /
  • Baidu (China)
  • User-agent: Baiduspider
  • Allow: /
  • Sogou (China)
  • User-agent: Sogou web spider
  • Allow: /
  • Petal (Huawei)
  • User-agent: PetalBot
  • Allow: /
  • --------- ARCHIVE.ORG (Wayback Machine) ---------
  • Leave this commented out if you want to allow public archiving
  • User-agent: ia_archiver
  • Disallow: /
  • ==========================================
  • EXPLICITLY BLOCK UNWANTED BOTS
  • (SEO crawlers, scrapers, and auditors)
  • ==========================================
  • Generic block of "generic" agents and common scraping libraries
  • --------- WORDPRESS-SPECIFIC ---------
  • --------- DEFAULT: BLOCK EVERYTHING NOT ALLOWED ABOVE ---------
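A sketch of how a well-behaved crawler would interpret these rules, using Python's standard `urllib.robotparser` against a reduced reconstruction of the groups reported above. One caveat: `urllib.robotparser` applies rules first-match in file order, unlike Google's longest-match rule, so an `Allow` exception listed after a broader `Disallow` may not take effect in this parser.

```python
# Sketch: interpret a reduced reconstruction of the scanned rules
# with the stdlib robots.txt parser.
import urllib.robotparser

ROBOTS = """\
User-agent: googlebot
Allow: /

User-agent: ahrefsbot
Disallow: /

User-agent: *
Disallow: /wp-admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS.splitlines())

# Allowed search engine: full access.
print(rp.can_fetch("Googlebot", "https://seools.com/"))            # True
# Blocked SEO crawler: no access at all.
print(rp.can_fetch("AhrefsBot", "https://seools.com/"))            # False
# Unlisted bot falls through to the wildcard group.
print(rp.can_fetch("SomeOtherBot", "https://seools.com/wp-admin/"))  # False
```

`can_fetch` matches user agents case-insensitively and by substring, so `Googlebot/2.1 (+http://...)` would match the `googlebot` group the same way.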