Weighs the soul of incoming HTTP requests to stop AI crawlers

docs: fix broken link to default policy file (#137)

Authored by Hans5958, committed by GitHub

d1d63d9c ecc6b47f

+1 -1
docs/docs/admin/policies.md
````diff
@@ -52,7 +52,7 @@
 }
 ```
 
-This allows requests to [`/.well-known`](https://en.wikipedia.org/wiki/Well-known_URI), `/favicon.ico`, `/robots.txt`, and challenges any request that has the word `Mozilla` in its User-Agent string. The [default policy file](https://github.com/TecharoHQ/anubis/blob/main/cmd/anubis/botPolicies.json) is a bit more cohesive, but this should be more than enough for most users.
+This allows requests to [`/.well-known`](https://en.wikipedia.org/wiki/Well-known_URI), `/favicon.ico`, `/robots.txt`, and challenges any request that has the word `Mozilla` in its User-Agent string. The [default policy file](https://github.com/TecharoHQ/anubis/blob/main/data/botPolicies.json) is a bit more cohesive, but this should be more than enough for most users.
 
 If no rules match the request, it is allowed through.
````
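The behavior described in the patched paragraph (allow a few well-known paths, challenge anything claiming to be `Mozilla`) might look roughly like this as a policy file; this is a sketch in the shape of Anubis's bot policy format, and the exact field names and rule order should be checked against the default policy file linked above:

```json
{
  "bots": [
    {
      "name": "well-known",
      "path_regex": "^/\\.well-known/.*$",
      "action": "ALLOW"
    },
    {
      "name": "favicon",
      "path_regex": "^/favicon\\.ico$",
      "action": "ALLOW"
    },
    {
      "name": "robots-txt",
      "path_regex": "^/robots\\.txt$",
      "action": "ALLOW"
    },
    {
      "name": "generic-browser",
      "user_agent_regex": "Mozilla",
      "action": "CHALLENGE"
    }
  ]
}
```

Rules are evaluated in order, so the specific path allowances must come before the broad `Mozilla` challenge; as the doc notes, a request matching no rule is allowed through.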