From d1d63d9c1878b4567fec6a1d2bb86364de2b513e Mon Sep 17 00:00:00 2001
From: Hans5958
Date: Thu, 27 Mar 2025 19:43:37 +0700
Subject: [PATCH] docs: fix broken link to default policy file (#137)

---
 docs/docs/admin/policies.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs/admin/policies.md b/docs/docs/admin/policies.md
index abd6139..c4034a3 100644
--- a/docs/docs/admin/policies.md
+++ b/docs/docs/admin/policies.md
@@ -52,7 +52,7 @@ Here is a minimal policy file that will protect against most scraper bots:
 }
 ```
 
-This allows requests to [`/.well-known`](https://en.wikipedia.org/wiki/Well-known_URI), `/favicon.ico`, `/robots.txt`, and challenges any request that has the word `Mozilla` in its User-Agent string. The [default policy file](https://github.com/TecharoHQ/anubis/blob/main/cmd/anubis/botPolicies.json) is a bit more cohesive, but this should be more than enough for most users.
+This allows requests to [`/.well-known`](https://en.wikipedia.org/wiki/Well-known_URI), `/favicon.ico`, `/robots.txt`, and challenges any request that has the word `Mozilla` in its User-Agent string. The [default policy file](https://github.com/TecharoHQ/anubis/blob/main/data/botPolicies.json) is a bit more cohesive, but this should be more than enough for most users.
 
 If no rules match the request, it is allowed through.
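For context, the minimal policy the patched paragraph describes follows Anubis's `bots` rule schema (rules with a `path_regex` or `user_agent_regex` matcher and an `action` of `ALLOW`, `DENY`, or `CHALLENGE`). A sketch of such a file, where the rule names and exact regexes are illustrative rather than copied from the docs page:

```json
{
  "bots": [
    {
      "name": "well-known",
      "path_regex": "^/\\.well-known/.*$",
      "action": "ALLOW"
    },
    {
      "name": "favicon",
      "path_regex": "^/favicon\\.ico$",
      "action": "ALLOW"
    },
    {
      "name": "robots-txt",
      "path_regex": "^/robots\\.txt$",
      "action": "ALLOW"
    },
    {
      "name": "generic-browser",
      "user_agent_regex": "Mozilla",
      "action": "CHALLENGE"
    }
  ]
}
```

Rules are evaluated in order, and any request matching none of them falls through to the default of being allowed, which is why the single `CHALLENGE` rule on `Mozilla` user agents is enough to gate most scraper traffic.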