freehackers.org
robots.txt

Robots Exclusion Standard data for freehackers.org

Resource Scan

Scan Details

Site Domain freehackers.org
Base Domain freehackers.org
Scan Status Ok
Last Scan 2024-09-25T13:59:40+00:00
Next Scan 2024-10-09T13:59:40+00:00

Last Scan

Scanned 2024-09-25T13:59:40+00:00
URL https://www.freehackers.org/robots.txt
Redirect https://freehackers.org/robots.txt
Redirect Domain freehackers.org
Redirect Base freehackers.org
Domain IPs 2a01:e0a:df:6ba0:e0c3:d6ff:fe5e:60af, 82.64.38.244
Redirect IPs 2a01:e0a:df:6ba0:e0c3:d6ff:fe5e:60af, 82.64.38.244
Response IP 82.64.38.244
Found Yes
Hash 8307ffec6d28e59b3ac9df6c248b7e6409e8884ced9e89c38fe593f946bdc9ee
SimHash c61299536375

Groups

*

Rule Path
Disallow /zeta/linux/-/tree
Disallow /mirrors/

*

Rule Path
Disallow /autocomplete/users
Disallow /autocomplete/projects
Disallow /search
Disallow /admin
Disallow /profile
Disallow /dashboard
Disallow /users
Disallow /api/v*
Disallow /help
Disallow /s/
Disallow /-/profile
Disallow /-/user_settings/profile
Disallow /-/ide/
Disallow /-/experiment
Allow /users/sign_in
Allow /users/sign_up
Allow /users/*/snippets

*

Rule Path
Disallow /*/new
Disallow /*/edit
Disallow /*/raw
Disallow /*/realtime_changes

*

Rule Path
Disallow /groups/*/analytics
Disallow /groups/*/contribution_analytics
Disallow /groups/*/group_members
Disallow /groups/*/-/saml/sso

*

Rule Path
Disallow /*/*.git$
Disallow /*/archive/
Disallow /*/repository/archive*
Disallow /*/activity
Disallow /*/blame
Disallow /*/commits
Disallow /*/commit
Disallow /*/commit/*.patch
Disallow /*/commit/*.diff
Disallow /*/compare
Disallow /*/network
Disallow /*/graphs
Disallow /*/merge_requests/*.patch
Disallow /*/merge_requests/*.diff
Disallow /*/merge_requests/*/diffs
Disallow /*/deploy_keys
Disallow /*/hooks
Disallow /*/services
Disallow /*/protected_branches
Disallow /*/uploads/
Disallow /*/project_members
Disallow /*/settings
Disallow /*/-/import
Disallow /*/-/environments
Disallow /*/-/jobs
Disallow /*/-/requirements_management
Disallow /*/-/pipelines
Disallow /*/-/pipeline_schedules
Disallow /*/-/dependencies
Disallow /*/-/licenses
Disallow /*/-/metrics
Disallow /*/-/incidents
Disallow /*/-/value_stream_analytics
Disallow /*/-/analytics
Disallow /*/insights
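Several of the rules above use the `*` wildcard and the `$` end-of-path anchor (e.g. /*/*.git$ and /api/v*), pattern syntax defined by the Robots Exclusion Protocol (RFC 9309) rather than the original 1996 draft. As an illustrative sketch only (not the scanner's actual matching code), such a pattern can be translated into a regular expression, where `*` becomes ".*" and a trailing `$` anchors the match at the end of the path:

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern into a compiled regex.

    '*' matches any sequence of characters; a trailing '$' anchors
    the pattern at the end of the URL path (per RFC 9309).
    """
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile(regex + ("$" if anchored else ""))

# Patterns taken from the groups above:
print(bool(robots_pattern_to_regex("/*/*.git$").match("/zeta/linux.git")))    # True
print(bool(robots_pattern_to_regex("/*/*.git$").match("/zeta/linux.github"))) # False
print(bool(robots_pattern_to_regex("/api/v*").match("/api/v4/projects")))     # True
```

Note that crawlers which only implement the older spec treat these rules as plain path prefixes, so a literal `*` in a Disallow line would never match; wildcard support is what makes rules like /*/commit/*.patch effective for Googlebot.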

Comments

  • See http://www.robotstxt.org/robotstxt.html for documentation on how to use the robots.txt file
  • To ban all spiders from the entire site uncomment the next two lines:
  • User-Agent: *
  • Disallow: /
  • orzel, 2024-09-13: excludes some directories
  • Add a 1 second delay between successive requests to the same server; this limits the resources used by the crawler
  • Only some crawlers respect this setting; Googlebot, for example, does not
  • Crawl-delay: 1
  • Based on details in https://gitlab.com/gitlab-org/gitlab/blob/master/config/routes.rb,
  • https://gitlab.com/gitlab-org/gitlab/blob/master/spec/routing, and using application
  • Global routes
  • Restrict allowed routes to avoid very ugly search results
  • Generic resource routes like new, edit, raw
  • This will block routes like:
  • - /projects/new
  • - /gitlab-org/gitlab-foss/issues/123/-/edit
  • Group details
  • Project details
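The literal-prefix rules in the groups above can be checked against concrete URLs with Python's standard urllib.robotparser. This is a minimal sketch using a small excerpt of the rules shown in the Groups section; note that the stdlib parser implements the original 1996 spec and does not expand `*` wildcards inside paths, so rules such as /api/v* are not matched the way RFC 9309-compliant crawlers would match them:

```python
from urllib.robotparser import RobotFileParser

# Excerpt of the literal-prefix rules from the Groups section above.
robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /admin
Disallow: /mirrors/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Prefix rules are honoured; paths with no matching rule are allowed.
print(parser.can_fetch("*", "https://freehackers.org/search"))     # False
print(parser.can_fetch("*", "https://freehackers.org/mirrors/x"))  # False
print(parser.can_fetch("*", "https://freehackers.org/about"))      # True
```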