uuworld.org
robots.txt

Robots Exclusion Standard data for uuworld.org

Resource Scan

Scan Details

Site Domain uuworld.org
Base Domain uuworld.org
Scan Status Ok
Last Scan 2025-10-14T22:24:30+00:00
Next Scan 2025-11-13T22:24:30+00:00

Last Scan

Scanned 2025-10-14T22:24:30+00:00
URL https://uuworld.org/robots.txt
Redirect https://www.uuworld.org/robots.txt
Redirect Domain www.uuworld.org
Redirect Base uuworld.org
Domain IPs 151.101.130.216, 151.101.194.216, 151.101.2.216, 151.101.66.216
Redirect IPs 151.101.130.216, 151.101.194.216, 151.101.2.216, 151.101.66.216
Response IP 151.101.130.216
Found Yes
Hash 6db542a3686fb5568fa685eb5307b062c7781333fa8cfae49e7d7c47ba8e2bb7
SimHash 38969d01c764
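
The Hash above is a cryptographic digest of the fetched file, so any change produces a completely different value; the SimHash is a locality-sensitive fingerprint whose bits change only slightly when the file changes slightly, letting successive scans detect near-duplicate content. A minimal Python sketch of the general SimHash technique (the scanner's actual tokenization and hash function are undocumented, so the choices below are assumptions):

  import hashlib

  def simhash(text: str, bits: int = 64) -> int:
      # Each token votes on every bit position; the sign of the tally
      # decides the final bit, so similar texts end up with fingerprints
      # that differ in only a few positions.
      votes = [0] * bits
      for token in text.split():
          h = int.from_bytes(hashlib.sha256(token.encode()).digest()[:8], "big")
          for i in range(bits):
              votes[i] += 1 if (h >> i) & 1 else -1
      return sum(1 << i for i in range(bits) if votes[i] > 0)

  def hamming(a: int, b: int) -> int:
      # Differing bit count between two fingerprints; a small distance
      # means the two scanned files are near-duplicates.
      return bin(a ^ b).count("1")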

Groups

*

Rule Path
Allow /core/*.css$
Allow /core/*.css?
Allow /core/*.js$
Allow /core/*.js?
Allow /core/*.gif
Allow /core/*.jpg
Allow /core/*.jpeg
Allow /core/*.png
Allow /core/*.svg
Allow /profiles/*.css$
Allow /profiles/*.css?
Allow /profiles/*.js$
Allow /profiles/*.js?
Allow /profiles/*.gif
Allow /profiles/*.jpg
Allow /profiles/*.jpeg
Allow /profiles/*.png
Allow /profiles/*.svg
Disallow /core/
Disallow /profiles/
Disallow /README.md
Disallow /composer/Metapackage/README.txt
Disallow /composer/Plugin/ProjectMessage/README.md
Disallow /composer/Plugin/Scaffold/README.md
Disallow /composer/Plugin/VendorHardening/README.txt
Disallow /composer/Template/README.txt
Disallow /modules/README.txt
Disallow /sites/README.txt
Disallow /themes/README.txt
Disallow /web.config
Disallow /admin/
Disallow /comment/reply/
Disallow /filter/tips
Disallow /node/add/
Disallow /search/
Disallow /user/register
Disallow /user/password
Disallow /user/login
Disallow /user/logout
Disallow /media/oembed
Disallow /*/media/oembed
Disallow /index.php/admin/
Disallow /index.php/comment/reply/
Disallow /index.php/filter/tips
Disallow /index.php/node/add/
Disallow /index.php/search/
Disallow /index.php/user/password
Disallow /index.php/user/register
Disallow /index.php/user/login
Disallow /index.php/user/logout
Disallow /index.php/media/oembed
Disallow /index.php/*/media/oembed
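
In the rules above, "*" matches any sequence of characters and a trailing "$" anchors the pattern at the end of the URL, so the Allow rules carve CSS, JS, and image assets out of the broader Disallow /core/ and Disallow /profiles/ rules. Under the widely adopted longest-match semantics (documented by Google and RFC 9309), the most specific matching rule wins and Allow beats Disallow on ties. A minimal Python sketch of that evaluation, as an illustration rather than any crawler's actual code:

  import re

  def rule_regex(pattern: str) -> re.Pattern:
      # Translate a robots.txt path pattern into a regex:
      # "*" matches any sequence, a trailing "$" anchors the end.
      anchored = pattern.endswith("$")
      body = pattern[:-1] if anchored else pattern
      return re.compile("^" + re.escape(body).replace(r"\*", ".*")
                        + ("$" if anchored else ""))

  def is_allowed(path: str, allows: list, disallows: list) -> bool:
      # The longest matching pattern decides; Allow wins ties because
      # allows are scanned first and disallows must strictly exceed them.
      best_len, verdict = -1, True
      for patterns, allowed in ((allows, True), (disallows, False)):
          for p in patterns:
              if rule_regex(p).match(path) and len(p) > best_len:
                  best_len, verdict = len(p), allowed
      return verdict

  # Allow /core/*.js$ (length 11) outranks Disallow /core/ (length 6),
  # so the script stays crawlable while the rest of /core/ is blocked:
  assert is_allowed("/core/jquery.js", ["/core/*.js$"], ["/core/"])
  assert not is_allowed("/core/install.php", ["/core/*.js$"], ["/core/"])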

ahrefsbot
bytedance
ezooms
megaindex.ru
mj12bot
semrushbot
spbot
yandex

Rule Path
Disallow /

ai2bot
ai2bot-dolma
amazonbot
anthropic-ai
brightbot 1.0
bytespider
ccbot
chatgpt-user
claude-web
claudebot
cohere-ai
cohere-training-data-crawler
crawlspace
diffbot
duckassistbot
facebookbot
friendlycrawler
gptbot
iaskspider/2.0
icc-crawler
imagesiftbot
img2dataset
isscyberriskcrawler
kangaroo bot
meta-externalagent
meta-externalfetcher
oai-searchbot
omgili
omgilibot
pangubot
perplexitybot
petalbot
scrapy
semrushbot-ocob
semrushbot-swa
sidetrade indexer bot
timpibot
velenpublicwebcrawler
webzio-extended
yisouspider
youbot

Rule Path
Disallow /
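
Each token above is a separate user-agent group header sharing the single Disallow / rule, so these AI and scraper crawlers are barred from the entire site. Per RFC 9309, a crawler compares its product token against group headers case-insensitively and falls back to the "*" group when nothing matches. A minimal sketch of that selection step, over hypothetical abridged group data:

  def select_group(groups: dict, product_token: str) -> list:
      # RFC 9309 group selection: case-insensitive token match,
      # with the wildcard group as the fallback for unlisted crawlers.
      return groups.get(product_token.lower(), groups.get("*", []))

  groups = {
      "*": ["Allow: /core/*.css$", "Disallow: /core/"],  # abridged
      "gptbot": ["Disallow: /"],
      "ccbot": ["Disallow: /"],
  }
  assert select_group(groups, "GPTBot") == ["Disallow: /"]  # blocked outright
  assert select_group(groups, "Googlebot") == groups["*"]   # falls back to *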

*

Rule Path
Allow *?page=*
Disallow *?*=*
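
These two wildcard rules keep crawlers out of arbitrary key=value query strings while leaving pagination crawlable: URLs containing ?page= match the longer Allow pattern, which outranks the shorter Disallow. Checked with the is_allowed sketch above, using hypothetical paths:

  # "*?page=*" (length 8) outranks "*?*=*" (length 5) wherever both match.
  assert is_allowed("/news?page=2", ["*?page=*"], ["*?*=*"])
  assert not is_allowed("/search?q=climate", ["*?page=*"], ["*?*=*"])
  assert is_allowed("/news", ["*?page=*"], ["*?*=*"])  # no query string at all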

*

No rules defined. All paths allowed.

Other Records

Field Value
crawl-delay 10
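
A crawl-delay of 10 asks compliant bots to wait ten seconds between requests. Python's standard library parses this field directly; a minimal polite-fetch sketch (note that urllib.robotparser does not implement the wildcard patterns shown earlier, so it is used here only for the delay and simple path rules; the crawler token and the second URL are hypothetical):

  import time
  import urllib.robotparser

  rp = urllib.robotparser.RobotFileParser()
  rp.set_url("https://www.uuworld.org/robots.txt")
  rp.read()

  delay = rp.crawl_delay("*") or 0  # 10 seconds per this file
  for url in ("https://www.uuworld.org/", "https://www.uuworld.org/about"):
      if rp.can_fetch("examplebot", url):  # hypothetical product token
          pass  # fetch url here, then wait before the next request
      time.sleep(delay)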

*

Rule Path
Disallow /autodiscover/
Disallow /files/includes/
Disallow /framework/
Disallow /generic/

Comments

  • robots.txt
  • This file is to prevent the crawling and indexing of certain parts
  • of your site by web crawlers and spiders run by sites like Yahoo!
  • and Google. By telling these "robots" where not to go on your site,
  • you save bandwidth and server resources.
  • This file will be ignored unless it is at the root of your host:
  • Used: http://example.com/robots.txt
  • Ignored: http://example.com/site/robots.txt
  • For more information about the robots.txt standard, see:
  • http://www.robotstxt.org/robotstxt.html
  • CSS, JS, Images
  • Directories
  • Files
  • Paths (clean URLs)
  • Paths (no clean URLs)
  • Block bots
  • 3/19/25 https://github.com/ai-robots-txt/ai.robots.txt?tab=readme-ov-file
  • https://raw.githubusercontent.com/ai-robots-txt/ai.robots.txt/refs/heads/main/robots.txt
  • Per https://support.platform.sh/hc/en-us/requests/316353
  • Disallow crawling URLs with queries except ?page as used on lists and sitemap.xml
  • Crawl delay
  • Directories