author    Hans5958 <hans5958@outlook.com>  2025-03-27 19:43:37 +0700
committer GitHub <noreply@github.com>  2025-03-27 08:43:37 -0400
commit  d1d63d9c1878b4567fec6a1d2bb86364de2b513e (patch)
tree    c4d27eb1a8b5d2fa951906d3ec08c0f7b6410ba7
parent  ecc6b47f90bd4ffc48d1555df8012d534fa5d180 (diff)
docs: fix broken link to default policy file (#137)
 docs/docs/admin/policies.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/docs/admin/policies.md b/docs/docs/admin/policies.md
index abd6139..c4034a3 100644
--- a/docs/docs/admin/policies.md
+++ b/docs/docs/admin/policies.md
@@ -52,7 +52,7 @@ Here is a minimal policy file that will protect against most scraper bots:
}
```
-This allows requests to [`/.well-known`](https://en.wikipedia.org/wiki/Well-known_URI), `/favicon.ico`, `/robots.txt`, and challenges any request that has the word `Mozilla` in its User-Agent string. The [default policy file](https://github.com/TecharoHQ/anubis/blob/main/cmd/anubis/botPolicies.json) is a bit more cohesive, but this should be more than enough for most users.
+This allows requests to [`/.well-known`](https://en.wikipedia.org/wiki/Well-known_URI), `/favicon.ico`, `/robots.txt`, and challenges any request that has the word `Mozilla` in its User-Agent string. The [default policy file](https://github.com/TecharoHQ/anubis/blob/main/data/botPolicies.json) is a bit more cohesive, but this should be more than enough for most users.
If no rules match the request, it is allowed through.
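
For reference, a minimal policy matching the behavior described in the changed line might look like the following sketch. The field names (`bots`, `path_regex`, `user_agent_regex`, `action`) are assumed from the Anubis `botPolicies.json` format; consult the default policy file linked in the diff for the authoritative schema.

```json
{
  "bots": [
    {
      "name": "well-known",
      "path_regex": "^/\\.well-known/.*$",
      "action": "ALLOW"
    },
    {
      "name": "favicon",
      "path_regex": "^/favicon\\.ico$",
      "action": "ALLOW"
    },
    {
      "name": "robots-txt",
      "path_regex": "^/robots\\.txt$",
      "action": "ALLOW"
    },
    {
      "name": "generic-browser",
      "user_agent_regex": "Mozilla",
      "action": "CHALLENGE"
    }
  ]
}
```

Rules are evaluated in order: the `ALLOW` entries exempt the well-known paths, and the final entry challenges anything whose User-Agent contains `Mozilla`; any request matching no rule passes through.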