From d40b5cfdab11c62dc2ed226bde32b19ea7107f21 Mon Sep 17 00:00:00 2001 From: Xe Iaso Date: Sun, 20 Apr 2025 20:09:27 -0400 Subject: lib: move config to yaml (#307) * lib: move config to yaml Signed-off-by: Xe Iaso * web: run go generate Signed-off-by: Xe Iaso * Add Haiku to known instances (#304) Signed-off-by: Asmodeus <46908100+AsmodeumX@users.noreply.github.com> * Add headers bot rule (#300) * Closes #291: add headers support to bot policy rules * Fix config validator * update docs for JSON -> YAML Signed-off-by: Xe Iaso * docs: document http header based actions Signed-off-by: Xe Iaso * lib: add missing test Signed-off-by: Xe Iaso * Apply suggestions from code review Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com> Signed-off-by: Xe Iaso --------- Signed-off-by: Xe Iaso Signed-off-by: Asmodeus <46908100+AsmodeumX@users.noreply.github.com> Co-authored-by: Asmodeus <46908100+AsmodeumX@users.noreply.github.com> Co-authored-by: Neur0toxine Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com> --- docs/docs/CHANGELOG.md | 1 + docs/docs/admin/environments/docker-compose.mdx | 4 +- docs/docs/admin/installation.mdx | 2 +- docs/docs/admin/native-install.mdx | 8 +- docs/docs/admin/policies.md | 131 ------------- docs/docs/admin/policies.mdx | 241 ++++++++++++++++++++++++ docs/docs/index.mdx | 2 +- 7 files changed, 250 insertions(+), 139 deletions(-) delete mode 100644 docs/docs/admin/policies.md create mode 100644 docs/docs/admin/policies.mdx (limited to 'docs') diff --git a/docs/docs/CHANGELOG.md b/docs/docs/CHANGELOG.md index 45c1f59..1c634a8 100644 --- a/docs/docs/CHANGELOG.md +++ b/docs/docs/CHANGELOG.md @@ -23,6 +23,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 - Added example Apache configuration to the documentation [#277](https://github.com/TecharoHQ/anubis/issues/277) - Move per-environment configuration details into their own pages - Added headers support to bot policy rules +- Moved 
configuration file from JSON to YAML by default - Added documentation on how to use Anubis with Traefik in Docker ## v1.16.0 diff --git a/docs/docs/admin/environments/docker-compose.mdx b/docs/docs/admin/environments/docker-compose.mdx index b40e0ea..6783808 100644 --- a/docs/docs/admin/environments/docker-compose.mdx +++ b/docs/docs/admin/environments/docker-compose.mdx @@ -12,13 +12,13 @@ services: METRICS_BIND: ":9090" SERVE_ROBOTS_TXT: "true" TARGET: "http://nginx" - POLICY_FNAME: "/data/cfg/botPolicy.json" + POLICY_FNAME: "/data/cfg/botPolicy.yaml" OG_PASSTHROUGH: "true" OG_EXPIRY_TIME: "24h" ports: - 8080:8080 volumes: - - "./botPolicy.json:/data/cfg/botPolicy.json:ro" + - "./botPolicy.yaml:/data/cfg/botPolicy.yaml:ro" nginx: image: nginx volumes: diff --git a/docs/docs/admin/installation.mdx b/docs/docs/admin/installation.mdx index 9c88930..2333b1d 100644 --- a/docs/docs/admin/installation.mdx +++ b/docs/docs/admin/installation.mdx @@ -62,7 +62,7 @@ Anubis uses these environment variables for configuration: | `METRICS_BIND_NETWORK` | `tcp` | The address family that the Anubis metrics server listens on. See `BIND_NETWORK` for more information. | | `OG_EXPIRY_TIME` | `24h` | The expiration time for the Open Graph tag cache. | | `OG_PASSTHROUGH` | `false` | If set to `true`, Anubis will enable Open Graph tag passthrough. | -| `POLICY_FNAME` | unset | The file containing [bot policy configuration](./policies.md). See the bot policy documentation for more details. If unset, the default bot policy configuration is used. | +| `POLICY_FNAME` | unset | The file containing [bot policy configuration](./policies.mdx). See the bot policy documentation for more details. If unset, the default bot policy configuration is used. | | `SERVE_ROBOTS_TXT` | `false` | If set `true`, Anubis will serve a default `robots.txt` file that disallows all known AI scrapers by name and then additionally disallows every scraper. 
This is useful if facts and circumstances make it difficult to change the underlying service to serve such a `robots.txt` file. | | `SOCKET_MODE` | `0770` | _Only used when at least one of the `*_BIND_NETWORK` variables are set to `unix`._ The socket mode (permissions) for Unix domain sockets. | | `TARGET` | `http://localhost:3923` | The URL of the service that Anubis should forward valid requests to. Supports Unix domain sockets, set this to a URI like so: `unix:///path/to/socket.sock`. | diff --git a/docs/docs/admin/native-install.mdx b/docs/docs/admin/native-install.mdx index 8faa5cb..a615929 100644 --- a/docs/docs/admin/native-install.mdx +++ b/docs/docs/admin/native-install.mdx @@ -86,20 +86,20 @@ Once it's installed, make a copy of the default configuration file `/etc/anubis/ sudo cp /etc/anubis/default.env /etc/anubis/gitea.env ``` -Copy the default bot policies file to `/etc/anubis/gitea.botPolicies.json`: +Copy the default bot policies file to `/etc/anubis/gitea.botPolicies.yaml`: ```text -sudo cp /usr/share/doc/anubis/botPolicies.json /etc/anubis/gitea.botPolicies.json +sudo cp /usr/share/doc/anubis/botPolicies.yaml /etc/anubis/gitea.botPolicies.yaml ``` ```text -sudo cp ./doc/botPolicies.json /etc/anubis/gitea.botPolicies.json +sudo cp ./doc/botPolicies.yaml /etc/anubis/gitea.botPolicies.yaml ``` @@ -114,7 +114,7 @@ BIND_NETWORK=tcp DIFFICULTY=4 METRICS_BIND=[::1]:8240 METRICS_BIND_NETWORK=tcp -POLICY_FNAME=/etc/anubis/gitea.botPolicies.json +POLICY_FNAME=/etc/anubis/gitea.botPolicies.yaml TARGET=http://localhost:3000 ``` diff --git a/docs/docs/admin/policies.md b/docs/docs/admin/policies.md deleted file mode 100644 index c4034a3..0000000 --- a/docs/docs/admin/policies.md +++ /dev/null @@ -1,131 +0,0 @@ ---- -title: Policy Definitions ---- - -Out of the box, Anubis is pretty heavy-handed. It will aggressively challenge everything that might be a browser (usually indicated by having `Mozilla` in its user agent). 
However, some bots are smart enough to get past the challenge. Some things that look like bots may actually be fine (IE: RSS readers). Some resources need to be visible no matter what. Some resources and remotes are fine to begin with. - -Bot policies let you customize the rules that Anubis uses to allow, deny, or challenge incoming requests. Currently you can set policies by the following matches: - -- Request path -- User agent string - -Here's an example rule that denies [Amazonbot](https://developer.amazon.com/en/amazonbot): - -```json -{ - "name": "amazonbot", - "user_agent_regex": "Amazonbot", - "action": "DENY" -} -``` - -When this rule is evaluated, Anubis will check the `User-Agent` string of the request. If it contains `Amazonbot`, Anubis will send an error page to the user saying that access is denied, but in such a way that makes scrapers think they have correctly loaded the webpage. - -Right now the only kinds of policies you can write are bot policies. Other forms of policies will be added in the future. - -Here is a minimal policy file that will protect against most scraper bots: - -```json -{ - "bots": [ - { - "name": "well-known", - "path_regex": "^/.well-known/.*$", - "action": "ALLOW" - }, - { - "name": "favicon", - "path_regex": "^/favicon.ico$", - "action": "ALLOW" - }, - { - "name": "robots-txt", - "path_regex": "^/robots.txt$", - "action": "ALLOW" - }, - { - "name": "generic-browser", - "user_agent_regex": "Mozilla", - "action": "CHALLENGE" - } - ] -} -``` - -This allows requests to [`/.well-known`](https://en.wikipedia.org/wiki/Well-known_URI), `/favicon.ico`, `/robots.txt`, and challenges any request that has the word `Mozilla` in its User-Agent string. The [default policy file](https://github.com/TecharoHQ/anubis/blob/main/data/botPolicies.json) is a bit more cohesive, but this should be more than enough for most users. - -If no rules match the request, it is allowed through. 
- -## Writing your own rules - -There are three actions that can be returned from a rule: - -| Action | Effects | -| :---------- | :-------------------------------------------------------------------------------- | -| `ALLOW` | Bypass all further checks and send the request to the backend. | -| `DENY` | Deny the request and send back an error message that scrapers think is a success. | -| `CHALLENGE` | Show a challenge page and/or validate that clients have passed a challenge. | - -Name your rules in lower case using kebab-case. Rule names will be exposed in Prometheus metrics. - -### Challenge configuration - -Rules can also have their own challenge settings. These are customized using the `"challenge"` key. For example, here is a rule that makes challenges artificially hard for connections with the substring "bot" in their user agent: - -```json -{ - "name": "generic-bot-catchall", - "user_agent_regex": "(?i:bot|crawler)", - "action": "CHALLENGE", - "challenge": { - "difficulty": 16, - "report_as": 4, - "algorithm": "slow" - } -} -``` - -Challenges can be configured with these settings: - -| Key | Example | Description | -| :----------- | :------- | :--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| `difficulty` | `4` | The challenge difficulty (number of leading zeros) for proof-of-work. See [Why does Anubis use Proof-of-Work?](/docs/design/why-proof-of-work) for more details. | -| `report_as` | `4` | What difficulty the UI should report to the user. Useful for messing with industrial-scale scraping efforts. | -| `algorithm` | `"fast"` | The algorithm used on the client to run proof-of-work calculations. This must be set to `"fast"` or `"slow"`. See [Proof-of-Work Algorithm Selection](./algorithm-selection) for more details. 
| - -### Remote IP based filtering - -The `remote_addresses` field of a Bot rule allows you to set the IP range that this ruleset applies to. - -For example, you can allow a search engine to connect if and only if its IP address matches the ones they published: - -```json -{ - "name": "qwantbot", - "user_agent_regex": "\\+https\\:\\/\\/help\\.qwant\\.com/bot/", - "action": "ALLOW", - "remote_addresses": ["91.242.162.0/24"] -} -``` - -This also works at an IP range level without any other checks: - -```json -{ - "name": "internal-network", - "action": "ALLOW", - "remote_addresses": ["100.64.0.0/10"] -} -``` - -## Risk calculation for downstream services - -In case your service needs it for risk calculation reasons, Anubis exposes information about the rules that any requests match using a few headers: - -| Header | Explanation | Example | -| :---------------- | :--------------------------------------------------- | :--------------- | -| `X-Anubis-Rule` | The name of the rule that was matched | `bot/lightpanda` | -| `X-Anubis-Action` | The action that Anubis took in response to that rule | `CHALLENGE` | -| `X-Anubis-Status` | The status and how strict Anubis was in its checks | `PASS-FULL` | - -Policy rules are matched using [Go's standard library regular expressions package](https://pkg.go.dev/regexp). You can mess around with the syntax at [regex101.com](https://regex101.com), make sure to select the Golang option. diff --git a/docs/docs/admin/policies.mdx b/docs/docs/admin/policies.mdx new file mode 100644 index 0000000..a5f6f1e --- /dev/null +++ b/docs/docs/admin/policies.mdx @@ -0,0 +1,241 @@ +--- +title: Policy Definitions +--- + +import Tabs from "@theme/Tabs"; +import TabItem from "@theme/TabItem"; + +Out of the box, Anubis is pretty heavy-handed. It will aggressively challenge everything that might be a browser (usually indicated by having `Mozilla` in its user agent). However, some bots are smart enough to get past the challenge. 
Some things that look like bots may actually be fine (e.g. RSS readers). Some resources need to be visible no matter what. Some resources and remotes are fine to begin with. + +Bot policies let you customize the rules that Anubis uses to allow, deny, or challenge incoming requests. Currently you can set policies based on the following matches: + +- Request path +- User agent string +- HTTP request header values + +As of v1.17.0, configuration can be written in either JSON or YAML. + +Here's an example rule that denies [Amazonbot](https://developer.amazon.com/en/amazonbot): + + + + +```json +{ + "name": "amazonbot", + "user_agent_regex": "Amazonbot", + "action": "DENY" +} +``` + + + + +```yaml +- name: amazonbot + user_agent_regex: Amazonbot + action: DENY +``` + + + + +When this rule is evaluated, Anubis will check the `User-Agent` string of the request. If it contains `Amazonbot`, Anubis will send an error page to the user saying that access is denied, but in such a way that makes scrapers think they have correctly loaded the webpage. + +Right now the only kinds of policies you can write are bot policies. Other forms of policies will be added in the future. 
+ +Here is a minimal policy file that will protect against most scraper bots: + + + + +```json +{ + "bots": [ + { + "name": "cloudflare-workers", + "headers_regex": { + "CF-Worker": ".*" + }, + "action": "DENY" + }, + { + "name": "well-known", + "path_regex": "^/.well-known/.*$", + "action": "ALLOW" + }, + { + "name": "favicon", + "path_regex": "^/favicon.ico$", + "action": "ALLOW" + }, + { + "name": "robots-txt", + "path_regex": "^/robots.txt$", + "action": "ALLOW" + }, + { + "name": "generic-browser", + "user_agent_regex": "Mozilla", + "action": "CHALLENGE" + } + ] +} +``` + + + + +```yaml +bots: + - name: cloudflare-workers + headers_regex: + CF-Worker: .* + action: DENY + - name: well-known + path_regex: ^/.well-known/.*$ + action: ALLOW + - name: favicon + path_regex: ^/favicon.ico$ + action: ALLOW + - name: robots-txt + path_regex: ^/robots.txt$ + action: ALLOW + - name: generic-browser + user_agent_regex: Mozilla + action: CHALLENGE +``` + + + + +This allows requests to [`/.well-known`](https://en.wikipedia.org/wiki/Well-known_URI), `/favicon.ico`, `/robots.txt`, and challenges any request that has the word `Mozilla` in its User-Agent string. The [default policy file](https://github.com/TecharoHQ/anubis/blob/main/data/botPolicies.json) is a bit more cohesive, but this should be more than enough for most users. + +If no rules match the request, it is allowed through. + +## Writing your own rules + +There are three actions that can be returned from a rule: + +| Action | Effects | +| :---------- | :-------------------------------------------------------------------------------- | +| `ALLOW` | Bypass all further checks and send the request to the backend. | +| `DENY` | Deny the request and send back an error message that scrapers think is a success. | +| `CHALLENGE` | Show a challenge page and/or validate that clients have passed a challenge. | + +Name your rules in lower case using kebab-case. Rule names will be exposed in Prometheus metrics. 
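To make the evaluation model concrete, here is an illustrative sketch — not Anubis's actual implementation — assuming rules are checked in order with the first match deciding the action, and the documented default of allowing requests that match no rule:

```go
// Sketch of first-match policy evaluation. The rule struct and
// evaluate function are invented for illustration; only the default
// ("if no rules match, the request is allowed through") comes from
// the documentation.
package main

import (
	"fmt"
	"regexp"
)

type rule struct {
	name   string
	pathRe string // matched against the request path, if set
	uaRe   string // matched against the User-Agent header, if set
	action string
}

func evaluate(rules []rule, path, ua string) string {
	for _, r := range rules {
		if r.pathRe != "" && !regexp.MustCompile(r.pathRe).MatchString(path) {
			continue
		}
		if r.uaRe != "" && !regexp.MustCompile(r.uaRe).MatchString(ua) {
			continue
		}
		return r.action // first matching rule wins
	}
	return "ALLOW" // no rule matched: allowed through
}

func main() {
	rules := []rule{
		{name: "robots-txt", pathRe: `^/robots.txt$`, action: "ALLOW"},
		{name: "generic-browser", uaRe: `Mozilla`, action: "CHALLENGE"},
	}
	fmt.Println(evaluate(rules, "/robots.txt", "Mozilla/5.0")) // ALLOW
	fmt.Println(evaluate(rules, "/blog", "Mozilla/5.0"))       // CHALLENGE
	fmt.Println(evaluate(rules, "/blog", "curl/8.5"))          // ALLOW
}
```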
+ +### Challenge configuration + +Rules can also have their own challenge settings. These are customized using the `"challenge"` key. For example, here is a rule that makes challenges artificially hard for connections with the substring "bot" in their user agent: + + + + +```json +{ + "name": "generic-bot-catchall", + "user_agent_regex": "(?i:bot|crawler)", + "action": "CHALLENGE", + "challenge": { + "difficulty": 16, + "report_as": 4, + "algorithm": "slow" + } +} +``` + + + + +```yaml +# Punish any bot with "bot" in the user-agent string +- name: generic-bot-catchall + user_agent_regex: (?i:bot|crawler) + action: CHALLENGE + challenge: + difficulty: 16 # impossible + report_as: 4 # lie to the operator + algorithm: slow # intentionally waste CPU cycles and time +``` + + + + +Challenges can be configured with these settings: + +| Key | Example | Description | +| :----------- | :------- | :--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| `difficulty` | `4` | The challenge difficulty (number of leading zeros) for proof-of-work. See [Why does Anubis use Proof-of-Work?](/docs/design/why-proof-of-work) for more details. | +| `report_as` | `4` | What difficulty the UI should report to the user. Useful for messing with industrial-scale scraping efforts. | +| `algorithm` | `"fast"` | The algorithm used on the client to run proof-of-work calculations. This must be set to `"fast"` or `"slow"`. See [Proof-of-Work Algorithm Selection](./algorithm-selection) for more details. | + +### Remote IP based filtering + +The `remote_addresses` field of a Bot rule allows you to set the IP range that this ruleset applies to. 
+ +For example, you can allow a search engine to connect if and only if its IP address matches the ones they published: + + + + +```json +{ + "name": "qwantbot", + "user_agent_regex": "\\+https\\:\\/\\/help\\.qwant\\.com/bot/", + "action": "ALLOW", + "remote_addresses": ["91.242.162.0/24"] +} +``` + + + + +```yaml +- name: qwantbot + user_agent_regex: \+https\://help\.qwant\.com/bot/ + action: ALLOW + # https://help.qwant.com/wp-content/uploads/sites/2/2025/01/qwantbot.json + remote_addresses: ["91.242.162.0/24"] +``` + + + + +This also works at an IP range level without any other checks: + + + + +```json +{ + "name": "internal-network", + "action": "ALLOW", + "remote_addresses": ["100.64.0.0/10"] +} +``` + + + + +```yaml +- name: internal-network + action: ALLOW + remote_addresses: + - 100.64.0.0/10 +``` + + + + +## Risk calculation for downstream services + +If your service needs this information for risk calculations, Anubis exposes the rules that requests match using a few headers: + +| Header | Explanation | Example | +| :---------------- | :--------------------------------------------------- | :--------------- | +| `X-Anubis-Rule` | The name of the rule that was matched | `bot/lightpanda` | +| `X-Anubis-Action` | The action that Anubis took in response to that rule | `CHALLENGE` | +| `X-Anubis-Status` | The status and how strict Anubis was in its checks | `PASS-FULL` | + +Policy rules are matched using [Go's standard library regular expressions package](https://pkg.go.dev/regexp). You can mess around with the syntax at [regex101.com](https://regex101.com); make sure to select the Golang option. 
diff --git a/docs/docs/index.mdx b/docs/docs/index.mdx index 7f00850..04e3f96 100644 --- a/docs/docs/index.mdx +++ b/docs/docs/index.mdx @@ -19,7 +19,7 @@ Anubis [weighs the soul of your connection](https://en.wikipedia.org/wiki/Weighi This program is designed to help protect the small internet from the endless storm of requests that flood in from AI companies. Anubis is as lightweight as possible to ensure that everyone can afford to protect the communities closest to them. -Anubis is a bit of a nuclear response. This will result in your website being blocked from smaller scrapers and may inhibit "good bots" like the Internet Archive. You can configure [bot policy definitions](./admin/policies.md) to explicitly allowlist them and we are working on a curated set of "known good" bots to allow for a compromise between discoverability and uptime. +Anubis is a bit of a nuclear response. This will result in your website being blocked from smaller scrapers and may inhibit "good bots" like the Internet Archive. You can configure [bot policy definitions](./admin/policies.mdx) to explicitly allowlist them and we are working on a curated set of "known good" bots to allow for a compromise between discoverability and uptime. ## Support -- cgit v1.2.3