Weighs the soul of incoming HTTP requests to stop AI crawlers

docs(blog/v1.20.0): how did CI not catch this?

Signed-off-by: Xe Iaso <me@xeiaso.net>

Xe Iaso d47a3406 ff5991b5

+1 -1
docs/blog/2025-06-27-release-1.20.0/index.mdx
@@ -171,7 +171,7 @@
 
 Anubis was created because crawler bots don't respect [`robots.txt` files](https://www.robotstxt.org/). Administrators have been working on refining and crafting their `robots.txt` files for years, and one common comment is that people don't know where to start crafting their own rules.
 
-Anubis now ships with a [`robots2policy` tool](http://localhost:3000/docs/admin/robots2policy) that lets you convert your `robots.txt` file to an Anubis policy.
+Anubis now ships with a [`robots2policy` tool](/docs/admin/robots2policy) that lets you convert your `robots.txt` file to an Anubis policy.
 
 ```text
 robots2policy -input https://github.com/robots.txt