ebsco.com
robots.txt

Robots Exclusion Standard data for ebsco.com

Resource Scan

Scan Details

Site Domain ebsco.com
Base Domain ebsco.com
Scan Status Ok
Last Scan 2024-11-16T03:28:00+00:00
Next Scan 2024-11-23T03:28:00+00:00

Last Scan

Scanned 2024-11-16T03:28:00+00:00
URL https://ebsco.com/robots.txt
Redirect https://www.ebsco.com/robots.txt
Redirect Domain www.ebsco.com
Redirect Base ebsco.com
Domain IPs 23.185.0.3, 2620:12a:8000::3, 2620:12a:8001::3
Redirect IPs 23.185.0.3, 2620:12a:8000::3, 2620:12a:8001::3
Response IP 23.185.0.3
Found Yes
Hash aedd7e368c22fd95267e2626623a62e77778834f9140d3129e0ca2b1427710db
SimHash 3d92bd1bcf64

Groups

User Agent

*

Rule Path
Allow /core/*.css$
Allow /core/*.css?
Allow /core/*.js$
Allow /core/*.js?
Allow /core/*.gif
Allow /core/*.jpg
Allow /core/*.jpeg
Allow /core/*.png
Allow /core/*.svg
Allow /profiles/*.css$
Allow /profiles/*.css?
Allow /profiles/*.js$
Allow /profiles/*.js?
Allow /profiles/*.gif
Allow /profiles/*.jpg
Allow /profiles/*.jpeg
Allow /profiles/*.png
Allow /profiles/*.svg
Disallow /core/
Disallow /profiles/
Disallow /README.md
Disallow /composer/Metapackage/README.txt
Disallow /composer/Plugin/ProjectMessage/README.md
Disallow /composer/Plugin/Scaffold/README.md
Disallow /composer/Plugin/VendorHardening/README.txt
Disallow /composer/Template/README.txt
Disallow /modules/README.txt
Disallow /sites/README.txt
Disallow /themes/README.txt
Disallow /web.config
Disallow /admin/
Disallow /comment/reply/
Disallow /filter/tips
Disallow /node/add/
Disallow /search/
Disallow /user/register
Disallow /user/password
Disallow /user/login
Disallow /user/logout
Disallow /media/oembed
Disallow /*/media/oembed
Disallow /index.php/admin/
Disallow /index.php/comment/reply/
Disallow /index.php/filter/tips
Disallow /index.php/node/add/
Disallow /index.php/search/
Disallow /index.php/user/password
Disallow /index.php/user/register
Disallow /index.php/user/login
Disallow /index.php/user/logout
Disallow /index.php/media/oembed
Disallow /index.php/*/media/oembed
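The Allow rules in this group use Google-style wildcard syntax: `*` matches any run of characters and a trailing `$` anchors the match at the end of the URL path, so `/core/*.css$` admits any `.css` file under `/core/` while still excluding the rest of the directory. A minimal sketch of that matching logic (the helper name `pattern_to_regex` is illustrative, not part of any scanner API):

```python
import re

def pattern_to_regex(path: str) -> re.Pattern:
    # Google-style semantics (assumed here): '*' matches any character
    # sequence; a trailing '$' anchors the pattern at the end of the path.
    anchored = path.endswith("$")
    core = path[:-1] if anchored else path
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile("^" + body + ("$" if anchored else ""))

# A stylesheet under /core/ matches the Allow rule...
assert pattern_to_regex("/core/*.css$").match("/core/themes/base.css")
# ...but a source map does not, so it falls through to "Disallow /core/".
assert not pattern_to_regex("/core/*.css$").match("/core/themes/base.css.map")
# The '?' variant covers cache-busting query strings appended to the path.
assert pattern_to_regex("/core/*.css?").match("/core/base.css?v=10.2")
```

The paired `$` and `?` rules exist because a matched pattern otherwise acts as a prefix: one rule pins the bare filename, the other admits the same file with a query string.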

User Agents

aihitbot
bbot
brands-bot-logo
checkmarknetwork
clarabot
crawler4j
datanyze
dataprovider
daum
ec2linkfinder
experibot
extlinksbot
ezooms
indeedbot
infopath
infopath.2
infotiger
infotigerbot
jobboersebot
mappy
mauibot
mj12bot
neevabot
niocbot
obot
panscient.com
petalbot
psbot
scrapy
seekport
semanticscholarbot
serpstatbot
sistrix
sitebot
swebot
taptubot
the knowledge ai
turnitinbot
twengabot
twiceler
vscooter

Rule Path
Disallow /
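Crawlers pick the group that applies to them by comparing these tokens against their own product token, case-insensitively; every agent listed above falls under the blanket `Disallow /`. A rough sketch of that check (the token set is a small excerpt, and plain substring matching is a simplification of the longest-match rule in RFC 9309):

```python
# Excerpt of the blocked tokens listed above, lowercased for comparison.
BLOCKED = {"aihitbot", "bbot", "mj12bot", "petalbot", "scrapy"}

def is_blocked(user_agent: str) -> bool:
    # Token matching in robots.txt is case-insensitive; here we accept a
    # token appearing anywhere in the UA string as a match (a simplification).
    ua = user_agent.lower()
    return any(token in ua for token in BLOCKED)

assert is_blocked("Mozilla/5.0 (compatible; MJ12bot/v1.4.8)")
assert not is_blocked("Mozilla/5.0 (compatible; Googlebot/2.1)")
```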

User Agents

googlebot
adsbot
ahrefsbot
yandex

Rule Path
Allow /core/*.css$
Allow /core/*.css?
Allow /core/*.js$
Allow /core/*.js?
Allow /core/*.gif
Allow /core/*.jpg
Allow /core/*.jpeg
Allow /core/*.png
Allow /core/*.svg
Allow /profiles/*.css$
Allow /profiles/*.css?
Allow /profiles/*.js$
Allow /profiles/*.js?
Allow /profiles/*.gif
Allow /profiles/*.jpg
Allow /profiles/*.jpeg
Allow /profiles/*.png
Allow /profiles/*.svg
Disallow /core/
Disallow /profiles/
Disallow /README.txt
Disallow /web.config
Disallow /admin/
Disallow /comment/reply/
Disallow /filter/tips
Disallow /node/
Disallow /node/add/
Disallow /taxonomy/
Disallow /search/
Disallow /user/register/
Disallow /user/password/
Disallow /user/logout/
Disallow /index.php/admin/
Disallow /index.php/comment/reply/
Disallow /index.php/filter/tips
Disallow /index.php/node/add/
Disallow /index.php/search/
Disallow /index.php/user/password/
Disallow /index.php/user/register/
Disallow /index.php/user/login/
Disallow /index.php/user/logout/
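The group above can be exercised with Python's standard-library parser. One caveat: `urllib.robotparser` implements the original prefix-only exclusion standard, not the `*`/`$` extensions used elsewhere in this file, so this sketch reproduces only a few of the plain prefix rules:

```python
from urllib import robotparser

# Rebuild a prefix-only subset of the googlebot group from the scan above.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: googlebot",
    "Disallow: /admin/",
    "Disallow: /node/",
    "Disallow: /search/",
])

# Paths under a disallowed prefix are blocked for this agent...
assert not rp.can_fetch("googlebot", "https://www.ebsco.com/admin/content")
# ...while unmatched paths remain fetchable by default.
assert rp.can_fetch("googlebot", "https://www.ebsco.com/products")
```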

Other Records

Field Value
sitemap https://a101334.sitemaphosting3.com/3981381/sitemap.xml
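A sitemap like the one referenced above is a namespaced XML document of `<url>`/`<loc>` entries. A minimal parsing sketch over a hypothetical inline sample (the real file lives at the URL in the record and may instead be a sitemap index):

```python
import xml.etree.ElementTree as ET

# Hypothetical sample data, standing in for the hosted sitemap.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.ebsco.com/</loc></url>
  <url><loc>https://www.ebsco.com/products</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace, so findall
# needs an explicit prefix mapping.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(SAMPLE).findall("sm:url/sm:loc", NS)]
assert urls == ["https://www.ebsco.com/", "https://www.ebsco.com/products"]
```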

Comments

  • robots.txt
  • This file is to prevent the crawling and indexing of certain parts
  • of your site by web crawlers and spiders run by sites like Yahoo!
  • and Google. By telling these "robots" where not to go on your site,
  • you save bandwidth and server resources.
  • This file will be ignored unless it is at the root of your host:
  • Used: http://example.com/robots.txt
  • Ignored: http://example.com/site/robots.txt
  • For more information about the robots.txt standard, see:
  • http://www.robotstxt.org/robotstxt.html
  • CSS, JS, Images
  • Directories
  • Files
  • Paths (clean URLs)
  • Paths (no clean URLs)
  • Disallowed bots
  • Google-specific rule
  • Allow bots as per request
  • CSS, JS, Images
  • Directories
  • Files
  • Paths (clean URLs)
  • temp remove to drop Disallow: /user/login/
  • Paths (no clean URLs)
  • sitemap link