[PR #734] [CLOSED] Reject crawlers by default #5698

opened 2026-02-05 10:14:34 +03:00 by OVERLORD · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/BookStackApp/BookStack/pull/734
Author: @benrubson
Created: 3/6/2018
Status: Closed

Base: `master` ← Head: `robots`


📝 Commits (1)

  • da5903c Reject crawlers by default

📊 Changes

1 file changed (+1 additions, -1 deletions)


📝 public/robots.txt (+1 -1)

📄 Description

Hi,

This PR modifies the `robots.txt` file so that crawlers are rejected by default.
I think this is a more secure default setting.
An administrator who is deliberately setting up search-engine indexing generally knows what they are doing, and can modify `robots.txt` accordingly.
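For illustration, a "reject by default" `robots.txt` would likely look like the following (a sketch of the intent described above, not necessarily the exact file contents in commit da5903c; the key change is `Disallow: /`, which blocks all paths, versus an empty `Disallow:`, which allows everything):

```
User-agent: *
Disallow: /
```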

Thank you 👍

Ben
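The effect of such a default can be checked with Python's standard-library robots.txt parser. This is a hypothetical verification sketch (the paths and user-agent names are made up for illustration), feeding the "reject by default" rules directly to the parser rather than fetching a live file:

```python
from urllib import robotparser

# Rules matching the "reject crawlers by default" policy described above.
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# With "Disallow: /", no path is crawlable for any user agent.
print(rp.can_fetch("Googlebot", "/books/some-page"))  # False
print(rp.can_fetch("AnyBot", "/"))                    # False
```

An admin who wants indexing would replace `Disallow: /` with an empty `Disallow:` (or targeted rules), after which `can_fetch` returns True for the allowed paths.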


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

OVERLORD added the pull-request label 2026-02-05 10:14:34 +03:00

Reference: starred/BookStack#5698