Compare commits

87 commits

Author SHA1 Message Date
Devon Hudson
8b36740bad
Fix InFlightGauge typing to allow upgrading to prometheus_client 0.24 (#19379)
Fixes #19375 

`prometheus_client` 0.24 makes `Collector` a generic type. 
Previously, `InFlightGauge` inherited from both `Generic[MetricsEntry]`
and `Collector`, resulting in the error `TypeError: cannot create a
consistent MRO` when using `prometheus_client` >= 0.24. Python 3.14
enforces this restriction on multiple `Generic` inheritance more
strictly, but it can also cause issues on earlier versions of Python.

This PR separates runtime and typing inheritance for `InFlightGauge`:
- Runtime: `InFlightGauge` inherits only from `Collector`
- Typing: `InFlightGauge` is generic

This preserves static typing, avoids MRO conflicts, and supports both
`prometheus_client` <0.24 and >=0.24.
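
A minimal sketch of that split (illustrative only, not necessarily the exact code in this PR):

```python
from typing import TYPE_CHECKING, Generic, TypeVar

from prometheus_client.registry import Collector

MetricsEntry = TypeVar("MetricsEntry")

if TYPE_CHECKING:
    # What the type checker sees: InFlightGauge stays generic, so
    # annotations like InFlightGauge[MyEntry] keep working.
    class InFlightGauge(Collector, Generic[MetricsEntry]):
        ...

else:
    # What actually runs: a single base class, so Python never has to
    # linearise two Generic bases (the "cannot create a consistent MRO"
    # error), regardless of whether Collector itself is generic, as it
    # is in prometheus_client >= 0.24.
    class InFlightGauge(Collector):
        # Keep `InFlightGauge[SomeEntry]` valid as a runtime expression.
        def __class_getitem__(cls, item):
            return cls
```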

I tested these changes locally with `prometheus_client` 0.23.1
and 0.24 on Python 3.14, sending a batch of messages over federation
while watching a Grafana dashboard configured to show
`synapse_util_metrics_block_in_flight_total` and
`synapse_util_metrics_block_in_flight_real_time_sum` (the only metrics
set up to use `InFlightGauge`), and things work in each case.
a1e9abc7df/synapse/util/metrics.py (L112-L119)

2026-01-16 20:35:30 +00:00
dependabot[bot]
cb376ee73b
Bump pyasn1 from 0.6.1 to 0.6.2 (#19387)
Bumps [pyasn1](https://github.com/pyasn1/pyasn1) from 0.6.1 to 0.6.2.
**Release notes** (sourced from [pyasn1's releases](https://github.com/pyasn1/pyasn1/releases)):

> **Release 0.6.2**
>
> It's a minor release.
>
> - Fixed continuation octet limits in OID/RELATIVE-OID decoder (CVE-2026-23490).
> - Added support for Python 3.14.
> - Added SECURITY.md policy.
> - Migrated to pyproject.toml packaging.
>
> All changes are noted in the [CHANGELOG](https://github.com/pyasn1/pyasn1/blob/master/CHANGES.rst).
**Changelog** (sourced from [pyasn1's changelog](https://github.com/pyasn1/pyasn1/blob/main/CHANGES.rst)):

> **Revision 0.6.2, released 16-01-2026**
>
> - CVE-2026-23490 (GHSA-63vm-454h-vhhq): Fixed continuation octet limits in OID/RELATIVE-OID decoder (thanks to tsigouris007)
> - Added support for Python 3.14 ([pyasn1/pyasn1#97](https://redirect.github.com/pyasn1/pyasn1/pull/97))
> - Added SECURITY.md policy
> - Fixed unit tests failing due to missing code ([pyasn1/pyasn1#91](https://redirect.github.com/pyasn1/pyasn1/issues/91), [pyasn1/pyasn1#92](https://redirect.github.com/pyasn1/pyasn1/pull/92))
> - Migrated to pyproject.toml packaging ([pyasn1/pyasn1#90](https://redirect.github.com/pyasn1/pyasn1/pull/90))
**Commits**:

- [`e7356f8`](e7356f89cf) Prepare release 0.6.2
- [`3908f14`](3908f14422) Merge commit from fork
- [`0a7e067`](0a7e067674) Add support for Python 3.14 ([#97](https://redirect.github.com/pyasn1/pyasn1/issues/97))
- [`33656e9`](33656e986d) Create Security Policy
- [`fa62307`](fa62307253) fix for issue [#91](https://redirect.github.com/pyasn1/pyasn1/issues/91): unit tests failing due to missing code ([#92](https://redirect.github.com/pyasn1/pyasn1/issues/92))
- [`f1ed02e`](f1ed02e41c) Package pyasn1 with pyproject.toml ([#90](https://redirect.github.com/pyasn1/pyasn1/issues/90))
- [`93c4d4f`](93c4d4f0b6) Switch documentation user to pyasn1 ([#89](https://redirect.github.com/pyasn1/pyasn1/issues/89))
- See full diff in [compare view](https://github.com/pyasn1/pyasn1/compare/v0.6.1...v0.6.2)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-16 20:32:57 +00:00
Eric Eastwood
87d93b1ae6
Latest changes from importing/exporting from Grafana 12.3.1 (#19381)
These are automatic changes from importing/exporting from Grafana
12.3.1.

In order to verify that I'm not sneaking in any changes, you can follow
these steps to get the same output.

Reproduction instructions:

 1. Start [Grafana](https://hub.docker.com/r/grafana/grafana)
    ```
    docker run -d --name=grafana --add-host host.docker.internal:host-gateway -p 3000:3000 grafana/grafana
    ```
1. Visit the Grafana dashboard, http://localhost:3000/ (Credentials:
`admin`/`admin`)
 1. Import the Synapse dashboard: `contrib/grafana/synapse.json`
1. Export the Synapse dashboard. On the dashboard page -> **Export** ->
**Export as code** -> Using the **Classic** model -> Check **Export for
sharing externally** -> Copy
 1. Paste into `contrib/grafana/synapse.json`
 1. `git status`/`git diff` to check if there is any diff

Sanity checked the dashboard itself by importing the dashboard on
https://grafana.matrix.org/ (Grafana 10.4.1 according to
https://grafana.matrix.org/api/health). The process-level metrics won't
work because https://github.com/element-hq/synapse/pull/19337 just
merged and isn't on `matrix.org` yet. Also just generally, this
dashboard works for me locally with the
[load-tests](https://github.com/element-hq/synapse-rust-apps/pull/397)
I've been doing.


### Motivation

There are a few fixes I want to make to the Grafana dashboard, and it
sucks having to manually translate everything back over because we have
different formatting.

Hopefully, after this bulk change, future exports will contain only the
changes we actually intend to make.
2026-01-16 11:36:49 -06:00
Eric Eastwood
13c6476d6e
Always rollback transaction when retrying (#19372)
Previously, because `conn.rollback()` was inside the `if i < MAX_NUMBER_OF_RETRIES:` condition,
it never rolled back on the final retry.
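
A schematic of the loop shape in question (a sketch; `RetryableError` and the function are placeholders, with `MAX_NUMBER_OF_RETRIES` taken from the condition quoted above):

```python
class RetryableError(Exception):
    """Stand-in for the errors that trigger a transaction retry."""

MAX_NUMBER_OF_RETRIES = 5  # placeholder value

def run_with_retries(conn, func):
    for i in range(MAX_NUMBER_OF_RETRIES + 1):
        try:
            return func(conn)
        except RetryableError:
            # The fix: roll back unconditionally. Previously this call sat
            # inside `if i < MAX_NUMBER_OF_RETRIES:`, so the connection was
            # left in a failed transaction after the final attempt.
            conn.rollback()
            if i >= MAX_NUMBER_OF_RETRIES:
                raise
```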

Part of https://github.com/element-hq/synapse/issues/19202

There are other problems mentioned in
https://github.com/element-hq/synapse/issues/19202 but this is a nice
standalone change.
2026-01-15 19:35:51 -06:00
Eric Eastwood
6363d77ba2
Warn about skipping reactor metrics when using unknown reactor type (#19383)
This spawned from not seeing any reactor metrics in the Grafana
dashboard during some load tests: `python_twisted_reactor_tick_time_bucket`
was `0` in Prometheus, and following it back to Synapse showed that we
don't warn about skipping reactor metrics in all cases (specifically,
when using an unknown reactor type).
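
Roughly, the added guard looks like this (a sketch with illustrative names; only the warning message is taken from the logs quoted in the reproduction below):

```python
import logging

from twisted.internet import epollreactor  # Linux-only reactor module

logger = logging.getLogger(__name__)

# Reactor types we know how to instrument (illustrative set).
KNOWN_REACTOR_TYPES = (epollreactor.EPollReactor,)

def configure_reactor_last_seen_metric(reactor) -> None:
    if not isinstance(reactor, KNOWN_REACTOR_TYPES):
        # Previously some unknown-reactor paths were skipped silently; now
        # every skip is logged.
        logger.warning(
            "Skipping configuring ReactorLastSeenMetric: "
            "unexpected reactor type: %r",
            reactor,
        )
        return
    ...  # install the tick-tracking hook for a known reactor type
```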

A follow-up to this would be to actually figure out how to instrument
the `ProxiedReactor` or why `ProxiedReactor` is being chosen in the
first place and see if we can get it to use a more normal type
🤔


### Reproduction instructions

1. Using the Complement scripts **with workers**: `WORKERS=1
./scripts-dev/complement.sh ./tests/csapi`
 1. `docker logs complement_csapi_dirty_hs1 2>&1 | grep -i "reactor"`
1. With these changes, notice `Skipping configuring
ReactorLastSeenMetric: unexpected reactor type: <__main__.ProxiedReactor
object at 0x7fc0adaaea50>` and `Twisted reactor: ProxiedReactor`
 1. Cleanup:
- `docker stop $(docker ps --all --filter "label=complement_context"
--quiet)`
- `docker rm $(docker ps --all --filter "label=complement_context"
--quiet)`

I'm unable to reproduce with the normal Synapse images or
`complement-synapse` without workers. They all use `Twisted reactor:
EPollReactor`.

<details>
<summary>Checking <code>docker/Dockerfile-workers</code></summary>

1. Build the Docker image for Synapse: `docker build -t
matrixdotorg/synapse -f docker/Dockerfile . && docker build -t
matrixdotorg/synapse-workers -f docker/Dockerfile-workers .`
([docs](7a24fafbc3/docker/README-testing.md (building-and-running-the-images-manually)))
 1. Start Synapse:
     ```
    docker run -d --name synapse \
        --mount type=volume,src=synapse-data,dst=/data \
        -e SYNAPSE_SERVER_NAME=my.docker.synapse.server \
        -e SYNAPSE_REPORT_STATS=no \
        -e SYNAPSE_ENABLE_METRICS=1 \
        -p 8008:8008 \
        -p 9469:9469 \
        matrixdotorg/synapse-workers:latest
    ```
 1. `docker logs synapse 2>&1 | grep -i "reactor"`
 1. Says `Twisted reactor: EPollReactor`
 
 </details>
2026-01-15 15:49:10 -06:00
Andrew Ferrazzutti
079c52e16b
MSC4140: delayed event content as text, not bytes (#19360)
Store the JSON content of scheduled delayed events as text instead of a
byte array. This brings it in line with the `event_json` table's `json`
column, and fixes the inability to schedule a delayed event with
non-ASCII characters in its content.
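
To illustrate the practical difference (schematic only, not Synapse's storage code):

```python
import json

content = {"body": "Grüße 👋"}  # delayed event content with non-ASCII text

# Stored as text (matching event_json's `json` column), the serialised
# JSON round-trips losslessly.
as_text = json.dumps(content, ensure_ascii=False)
assert json.loads(as_text) == content

# Stored as a byte array, every writer and reader must agree on the
# encoding; an ASCII assumption anywhere rejects content like the above.
as_bytes = as_text.encode("utf-8")
try:
    as_bytes.decode("ascii")
except UnicodeDecodeError:
    print("non-ASCII content does not survive an ASCII decode")
```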

Fixes #19242
2026-01-15 16:05:19 +00:00
Eric Eastwood
a1e9abc7df
Add Prometheus HTTP service discovery endpoint for easy discovery of all workers in Docker image (#19336)
Add Prometheus [HTTP service discovery](https://prometheus.io/docs/prometheus/latest/http_sd/)
endpoint for easy discovery of all workers in Docker image.

Follow-up to https://github.com/element-hq/synapse/pull/19324

Spawning from wanting to [run a load
test](https://github.com/element-hq/synapse-rust-apps/pull/397) against
the Complement Docker image of Synapse and see metrics from the
homeserver.


`GET http://<synapse_container>:9469/metrics/service_discovery`
```json5
[
  {
    "targets": [ "<host>", ... ],
    "labels": {
      "<labelname>": "<labelvalue>", ...
    }
  },
  ...
]
```

The metrics from each worker can also be accessed via
`http://<synapse_container>:9469/metrics/worker/<worker_name>` which is
what the service discovery response points to behind the scenes. This
way, you only need to expose a single port (9469) to access all metrics.
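
A sketch of how such a payload could be assembled (hypothetical helper, not the code in this PR), matching the response shape shown above:

```python
from typing import Dict, List

def build_sd_response(workers: Dict[str, List[str]], port: int = 9469) -> List[dict]:
    """Build an HTTP SD payload with one entry per worker, all on one port.

    `workers` maps a job name to its worker instance names, e.g.
    {"event_persister": ["event_persister1", "event_persister2"]}.
    """
    return [
        {
            "targets": [f"localhost:{port}"],
            "labels": {
                "job": job,
                "index": str(index),
                # Prometheus scrapes this path on the shared port.
                "__metrics_path__": f"/metrics/worker/{name}",
            },
        }
        for job, names in workers.items()
        for index, name in enumerate(names, start=1)
    ]
```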

<details>
<summary>Real HTTP service discovery response</summary>

```json5
[
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "event_persister",
            "index": "1",
            "__metrics_path__": "/metrics/worker/event_persister1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "event_persister",
            "index": "2",
            "__metrics_path__": "/metrics/worker/event_persister2"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "background_worker",
            "index": "1",
            "__metrics_path__": "/metrics/worker/background_worker1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "event_creator",
            "index": "1",
            "__metrics_path__": "/metrics/worker/event_creator1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "user_dir",
            "index": "1",
            "__metrics_path__": "/metrics/worker/user_dir1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "media_repository",
            "index": "1",
            "__metrics_path__": "/metrics/worker/media_repository1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "federation_inbound",
            "index": "1",
            "__metrics_path__": "/metrics/worker/federation_inbound1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "federation_reader",
            "index": "1",
            "__metrics_path__": "/metrics/worker/federation_reader1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "federation_sender",
            "index": "1",
            "__metrics_path__": "/metrics/worker/federation_sender1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "synchrotron",
            "index": "1",
            "__metrics_path__": "/metrics/worker/synchrotron1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "client_reader",
            "index": "1",
            "__metrics_path__": "/metrics/worker/client_reader1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "appservice",
            "index": "1",
            "__metrics_path__": "/metrics/worker/appservice1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "pusher",
            "index": "1",
            "__metrics_path__": "/metrics/worker/pusher1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "device_lists",
            "index": "1",
            "__metrics_path__": "/metrics/worker/device_lists1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "device_lists",
            "index": "2",
            "__metrics_path__": "/metrics/worker/device_lists2"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "stream_writers",
            "index": "1",
            "__metrics_path__": "/metrics/worker/stream_writers1"
        }
    },
    {
        "targets": [
            "localhost:9469"
        ],
        "labels": {
            "job": "main",
            "index": "1",
            "__metrics_path__": "/metrics/worker/main"
        }
    }
]
```

</details>


And how it ends up as targets in Prometheus
(http://localhost:9090/targets):

(image)


### Testing strategy

1. Make sure your firewall allows the Docker containers to communicate
with the host (`host.docker.internal`) so they can access exposed ports of
other Docker containers. We want to allow Synapse to access the
Prometheus container and Grafana to access the Prometheus container.
- `sudo ufw allow in on docker0 comment "Allow traffic from the default
Docker network to the host machine (host.docker.internal)"`
- `sudo ufw allow in on br-+ comment "(from Matrix Complement testing)
Allow traffic from custom Docker networks to the host machine
(host.docker.internal)"`
- [Complement firewall
docs](ee6acd9154/README.md (potential-conflict-with-firewall-software))
1. Build the Docker image for Synapse: `docker build -t
matrixdotorg/synapse -f docker/Dockerfile . && docker build -t
matrixdotorg/synapse-workers -f docker/Dockerfile-workers .`
([docs](7a24fafbc3/docker/README-testing.md (building-and-running-the-images-manually)))
 1. Start Synapse:
     ```
    docker run -d --name synapse \
        --mount type=volume,src=synapse-data,dst=/data \
        -e SYNAPSE_SERVER_NAME=my.docker.synapse.server \
        -e SYNAPSE_REPORT_STATS=no \
        -e SYNAPSE_ENABLE_METRICS=1 \
        -p 8008:8008 \
        -p 9469:9469 \
        matrixdotorg/synapse-workers:latest
    ```
    - Also try with workers:
       ```
      docker run -d --name synapse \
          --mount type=volume,src=synapse-data,dst=/data \
          -e SYNAPSE_SERVER_NAME=my.docker.synapse.server \
          -e SYNAPSE_REPORT_STATS=no \
          -e SYNAPSE_ENABLE_METRICS=1 \
          -e SYNAPSE_WORKER_TYPES="\
              event_persister:2, \
              background_worker, \
              event_creator, \
              user_dir, \
              media_repository, \
              federation_inbound, \
              federation_reader, \
              federation_sender, \
              synchrotron, \
              client_reader, \
              appservice, \
              pusher, \
              device_lists:2, \
              stream_writers=account_data+presence+receipts+to_device+typing" \
          -p 8008:8008 \
          -p 9469:9469 \
          matrixdotorg/synapse-workers:latest
      ```
1. You should be able to see Prometheus service discovery endpoint at
http://localhost:9469/metrics/service_discovery
 1. Create a Prometheus config (`prometheus.yml`)
    ```yaml
    global:
      scrape_interval: 15s
      scrape_timeout: 15s
      evaluation_interval: 15s

    scrape_configs:
      - job_name: synapse
        scrape_interval: 15s
        metrics_path: /_synapse/metrics
        scheme: http
        # We set `honor_labels` so that each service can set their own `job` label
        #
        # > honor_labels controls how Prometheus handles conflicts between labels that are
        # > already present in scraped data and labels that Prometheus would attach
        # > server-side ("job" and "instance" labels, manually configured target
        # > labels, and labels generated by service discovery implementations).
        # >
        # > *-- https://prometheus.io/docs/prometheus/latest/configuration/configuration/#scrape_config*
        honor_labels: true
        # Use HTTP service discovery
        #
        # Reference:
        #  - https://prometheus.io/docs/prometheus/latest/http_sd/
        #  - https://prometheus.io/docs/prometheus/latest/configuration/configuration/#http_sd_config
        http_sd_configs:
          - url: 'http://localhost:9469/metrics/service_discovery'
    ```
1. Start Prometheus (update the volume bind mount to the config you just
saved somewhere):
    ```
    docker run \
        --detach \
        --name=prometheus \
        --add-host host.docker.internal:host-gateway \
        -p 9090:9090 \
        -v ~/Documents/code/random/prometheus-config/prometheus.yml:/etc/prometheus/prometheus.yml \
        prom/prometheus
    ```
1. Make sure you're seeing some data in Prometheus. On
http://localhost:9090/query, search for `synapse_build_info`
 1. Start [Grafana](https://hub.docker.com/r/grafana/grafana)
    ```
    docker run -d --name=grafana --add-host host.docker.internal:host-gateway -p 3000:3000 grafana/grafana
    ```
1. Visit the Grafana dashboard, http://localhost:3000/ (Credentials:
`admin`/`admin`)
1. **Connections** -> **Data Sources** -> **Add data source** ->
**Prometheus**
     - Prometheus server URL: `http://host.docker.internal:9090`
1. Import the Synapse dashboard:
https://github.com/element-hq/synapse/blob/develop/contrib/grafana/synapse.json
2026-01-14 18:02:55 -06:00
Eric Eastwood
58f59ffbcb
Refactor Grafana dashboard to use server_name label (#19337)
- Update `synapse_xxx` (server-level) metrics to use
`server_name="$server_name",` instead of `instance="$instance"`
- Add `synapse_server_name_info` metric to map Synapse `server_name`s to
the `instance`s they're hosted on.
- For process level metrics, update to use `xxx * on (instance, job,
index) group_left(server_name)
synapse_server_name_info{server_name="$server_name"}`

All of the changes here are backwards compatible with whatever people
were doing before with their Prometheus/Grafana dashboards.

Previously, the recommendation was to use the `instance` label to group
everything under the same server (803e4b4d88/docs/metrics-howto.md (L93-L147))

But the `instance` label has a special meaning, and we're abusing it by
using it that way:

> `instance`: The `<host>:<port>` part of the target's URL that was
scraped.
>
> *--
https://prometheus.io/docs/concepts/jobs_instances/#automatically-generated-labels-and-time-series*

Since https://github.com/element-hq/synapse/issues/18592 (Synapse
`v1.139.0`), we now have the `server_name` label to use instead.


---

Additionally, the assumption that a single process is serving a single
server is no longer true with [Synapse Pro for small
hosts](https://docs.element.io/latest/element-server-suite-pro/synapse-pro-for-small-hosts/overview/).

Part of https://github.com/element-hq/synapse-small-hosts/issues/106

### Motivating use case

Although this change also benefits [Synapse Pro for small
hosts](https://docs.element.io/latest/element-server-suite-pro/synapse-pro-for-small-hosts/overview/)
(https://github.com/element-hq/synapse-small-hosts/issues/106), this is
actually spawning from adding Prometheus metrics to our workerized
Docker image (https://github.com/element-hq/synapse/pull/19324,
https://github.com/element-hq/synapse/pull/19336) with a more correct
label setup (without `instance`) and wanting the dashboard to be better.



### Testing strategy

1. Make sure your firewall allows the Docker containers to communicate
with the host (`host.docker.internal`) so they can access exposed ports of
other Docker containers. We want to allow Synapse to access the
Prometheus container and Grafana to access the Prometheus container.
- `sudo ufw allow in on docker0 comment "Allow traffic from the default
Docker network to the host machine (host.docker.internal)"`
- `sudo ufw allow in on br-+ comment "(from Matrix Complement testing)
Allow traffic from custom Docker networks to the host machine
(host.docker.internal)"`
- [Complement firewall
docs](ee6acd9154/README.md (potential-conflict-with-firewall-software))
1. Build the Docker image for Synapse: `docker build -t
matrixdotorg/synapse -f docker/Dockerfile .`
([docs](7a24fafbc3/docker/README-testing.md (building-and-running-the-images-manually)))
 1. Generate config for Synapse:
    ```
    docker run -it --rm \
        --mount type=volume,src=synapse-data,dst=/data \
        -e SYNAPSE_SERVER_NAME=my.docker.synapse.server \
        -e SYNAPSE_REPORT_STATS=yes \
        -e SYNAPSE_ENABLE_METRICS=1 \
        matrixdotorg/synapse:latest generate
    ```
 1. Start Synapse:
     ```
    docker run -d --name synapse \
        --mount type=volume,src=synapse-data,dst=/data \
        -p 8008:8008 \
        -p 19090:19090 \
        matrixdotorg/synapse:latest
    ```
1. You should be able to see metrics from Synapse at
http://localhost:19090/_synapse/metrics
 1. Create a Prometheus config (`prometheus.yml`)
    ```yaml
    global:
      scrape_interval: 15s
      scrape_timeout: 15s
      evaluation_interval: 15s

    scrape_configs:
      - job_name: prometheus
        scrape_interval: 15s
        metrics_path: /_synapse/metrics
        scheme: http
        static_configs:
          - targets:
              # This should point to the Synapse metrics listener (we're using
              # `host.docker.internal` because this is from within the Prometheus container)
              - host.docker.internal:19090
    ```
1. Start Prometheus (update the volume bind mount to the config you just
saved somewhere):
    ```
    docker run \
        --detach \
        --name=prometheus \
        --add-host host.docker.internal:host-gateway \
        -p 9090:9090 \
        -v ~/Documents/code/random/prometheus-config/prometheus.yml:/etc/prometheus/prometheus.yml \
        prom/prometheus
    ```
1. Make sure you're seeing some data in Prometheus. On
http://localhost:9090/query, search for `synapse_build_info`
 1. Start [Grafana](https://hub.docker.com/r/grafana/grafana)
    ```
    docker run -d --name=grafana --add-host host.docker.internal:host-gateway -p 3000:3000 grafana/grafana
    ```
1. Visit the Grafana dashboard, http://localhost:3000/ (Credentials:
`admin`/`admin`)
1. **Connections** -> **Data Sources** -> **Add data source** ->
**Prometheus**
     - Prometheus server URL: `http://host.docker.internal:9090`
 1. Import the Synapse dashboard: `contrib/grafana/synapse.json`

To test workers, you can use the testing strategy from
https://github.com/element-hq/synapse/pull/19336 (assumes both changes
from this PR and the other PR are combined)
2026-01-14 17:57:42 -06:00
Devon Hudson
9b776c6a48
Minor changelog tweaks post-release (#19376)
2026-01-13 18:25:07 +00:00
Devon Hudson
8eb9d7895d
Merge branch 'master' into develop 2026-01-13 09:51:11 -07:00
Andrew Morgan
9285cdf041
Update usage of deprecated release.title in release script (#19358) 2026-01-13 15:52:01 +00:00
Devon Hudson
27223a349c
1.145.0
2026-01-13 08:38:14 -07:00
Olivier 'reivilibre
8e2e81430c
Tweak docstrings and signatures of auth_types_for_event and get_catchup_room_event_ids. (#19320)
A couple of tiny tweaks pulled out of #18968.
2026-01-13 15:00:35 +00:00
dependabot[bot]
164b980085
Bump the minor-and-patches group with 2 updates (#19339) 2026-01-13 14:25:42 +00:00
Andrew Morgan
daa4398818
Update Element logo to be an absolute URL, so it will render on PyPI (#19368) 2026-01-13 12:03:34 +00:00
timedout
6e80f2c43a
Fall back to checking power levels when sourcing local restricted join users (#19321)
Fix https://github.com/element-hq/synapse/issues/19120 by always falling
back to checking power levels for local users if a local creator cannot
be found in a v12 room.
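
A rough schematic of the fallback described above (all helper names are hypothetical, not Synapse's actual API):

```python
from typing import Optional

def pick_local_user_for_restricted_join(room) -> Optional[str]:
    # In room version 12 the creator has effectively infinite power, so a
    # local creator is always a valid choice when one exists.
    creator = find_local_room_creator(room)  # hypothetical helper
    if creator is not None:
        return creator
    # The fix: when no local creator can be found, always fall back to
    # power levels and pick any local member allowed to issue invites.
    for user_id, level in local_member_power_levels(room).items():  # hypothetical
        if level >= required_invite_level(room):  # hypothetical
            return user_id
    return None
```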

Complement tests: https://github.com/matrix-org/complement/pull/836
2026-01-12 12:00:33 -06:00
Will Hunt
8f42f07bef
Remove MSC2697 (legacy dehydrated devices) (#19346)
Fixes #19347 

This deprecates MSC2697, which has been closed since May 2024. As per
#19347, this seems to be a thing we can just rip out. The crypto team
have moved on to MSC3814 and suggest that developers who rely on
MSC2697 should use MSC3814 instead.

MSC2697 implementation originally introduced by https://github.com/matrix-org/synapse/pull/8380
2026-01-12 10:32:38 -06:00
dependabot[bot]
5a3362c012
Bump authlib from 1.6.5 to 1.6.6 (#19363)
Bumps [authlib](https://github.com/authlib/authlib) from 1.6.5 to 1.6.6.
**Changelog** (sourced from [authlib's changelog](https://github.com/authlib/authlib/blob/main/docs/changelog.rst)):

> **Version 1.6.6**
>
> Released on Dec 12, 2025
>
> - `get_jwt_config` takes a `client` parameter, :pr:`844`.
> - Fix incorrect signature when `Content-Type` is x-www-form-urlencoded for OAuth 1.0 Client, :pr:`778`.
> - Use `expires_in` in `OAuth2Token` when `expires_at` is unparsable, :pr:`842`.
> - Always track `state` in session for OAuth client integrations.
**Commits**:

- [`bb7a315`](bb7a315bef) chore: release 1.6.6
- [`0a423d4`](0a423d4638) Merge pull request [#844](https://redirect.github.com/authlib/authlib/issues/844) from azmeuk/806-get-jwt-config-client
- [`2808378`](2808378611) Merge commit from fork
- [`714502a`](714502a473) feat: get_jwt_config takes a client parameter
- [`260d04e`](260d04edee) Fix: Use `expires_in` when `expires_at` is unparsable
- [`eb37124`](eb37124bbb) Merge pull request [#778](https://redirect.github.com/authlib/authlib/issues/778) from shc261392/fix-httpx-oauth1-form-data-incorrect-s...
- [`0ba9ec4`](0ba9ec4fee) docs: fix guide on requests self signed certificate
- [`a2e9943`](a2e9943815) docs: indicate that [#743](https://redirect.github.com/authlib/authlib/issues/743) needs a migration
- [`06015d2`](06015d2065) test: factorize the token fixture
- See full diff in [compare view](https://github.com/authlib/authlib/compare/v1.6.5...v1.6.6)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-09 14:05:25 +00:00
Devon Hudson
ff0fa0fd51
Merge branch 'release-v1.145' into develop
2026-01-08 12:34:13 -07:00
Devon Hudson
438aa7c876
1.145.0rc4
2026-01-08 12:09:01 -07:00
Devon Hudson
15700e0a32
Only exclude .so files for sdist packaging
2026-01-08 11:22:59 -07:00
Mathieu Velten
d372ab3280
Add cancel_task API to the task scheduler (#19310)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2026-01-08 18:21:24 +00:00
Eric Eastwood
ace2614fad
Remove docs on dead legacy metric names (#19341)
These metrics were [removed completely from the
codebase](444bc56cda/docs/changelogs/CHANGES-2022.md (synapse-1730-2022-12-06))
in Synapse v1.73.0 (2022-12-06). Three years is plenty of time.

The deprecation/removal is still in our [upgrade
notes](444bc56cda/docs/upgrade.md (deprecation-of-legacy-prometheus-metric-names)),
which point to a durable versioned link with the info still available:
https://element-hq.github.io/synapse/v1.69/metrics-howto.html#renaming-of-metrics--deprecation-of-old-names-in-12
2026-01-08 10:03:15 -06:00
dependabot[bot]
da7b32e8df
Bump urllib3 from 2.6.0 to 2.6.3 (#19361)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.6.0 to 2.6.3.
**Release notes** (sourced from [urllib3's releases](https://github.com/urllib3/urllib3/releases)):

> **2.6.3**
>
> 🚀 **urllib3 is fundraising for HTTP/2 support**
>
> [urllib3 is raising ~$40,000 USD](https://sethmlarson.dev/urllib3-is-fundraising-for-http2-support) to release HTTP/2 support and ensure long-term sustainable maintenance of the project after a sharp decline in financial support. If your company or organization uses Python and would benefit from HTTP/2 support in Requests, pip, cloud SDKs, and thousands of other projects, [please consider contributing financially](https://opencollective.com/urllib3) to ensure HTTP/2 support is developed sustainably and maintained for the long-haul.
>
> Thank you for your support.
>
> Changes:
>
> - Fixed a security issue where decompression-bomb safeguards of the streaming API were bypassed when HTTP redirects were followed. (CVE-2026-21441 reported by [@D47A](https://github.com/D47A), 8.9 High, GHSA-38jv-5279-wg99)
> - Started treating `Retry-After` times greater than 6 hours as 6 hours by default. ([urllib3/urllib3#3743](https://redirect.github.com/urllib3/urllib3/issues/3743))
> - Fixed `urllib3.connection.VerifiedHTTPSConnection` on Emscripten. ([urllib3/urllib3#3752](https://redirect.github.com/urllib3/urllib3/issues/3752))
>
> **2.6.2**
>
> Changes:
>
> - Fixed `HTTPResponse.read_chunked()` to properly handle leftover data in the decoder's buffer when reading compressed chunked responses. ([urllib3/urllib3#3734](https://redirect.github.com/urllib3/urllib3/issues/3734))
>
> **2.6.1**
>
> Changes:
>
> - Restore previously removed `HTTPResponse.getheaders()` and `HTTPResponse.getheader()` methods. ([#3731](https://redirect.github.com/urllib3/urllib3/issues/3731))
**Changelog** (sourced from [urllib3's changelog](https://github.com/urllib3/urllib3/blob/main/CHANGES.rst)):

> **2.6.3 (2026-01-07)**
>
> - Fixed a high-severity security issue where decompression-bomb safeguards of the streaming API were bypassed when HTTP redirects were followed. ([GHSA-38jv-5279-wg99](https://github.com/urllib3/urllib3/security/advisories/GHSA-38jv-5279-wg99))
> - Started treating `Retry-After` times greater than 6 hours as 6 hours by default. ([#3743](https://github.com/urllib3/urllib3/issues/3743))
> - Fixed `urllib3.connection.VerifiedHTTPSConnection` on Emscripten. ([#3752](https://github.com/urllib3/urllib3/issues/3752))
>
> **2.6.2 (2025-12-11)**
>
> - Fixed `HTTPResponse.read_chunked()` to properly handle leftover data in the decoder's buffer when reading compressed chunked responses. ([#3734](https://github.com/urllib3/urllib3/issues/3734))
>
> **2.6.1 (2025-12-08)**
>
> - Restore previously removed `HTTPResponse.getheaders()` and `HTTPResponse.getheader()` methods. ([#3731](https://github.com/urllib3/urllib3/issues/3731))
**Commits**:

- [`0248277`](0248277dd7) Release 2.6.3
- [`8864ac4`](8864ac407b) Merge commit from fork
- [`70cecb2`](70cecb27ca) Fix Scorecard issues related to vulnerable dev dependencies ([#3755](https://redirect.github.com/urllib3/urllib3/issues/3755))
- [`41f249a`](41f249abe1) Move "v2.0 Migration Guide" to the end of the table of contents ([#3747](https://redirect.github.com/urllib3/urllib3/issues/3747))
- [`fd4dffd`](fd4dffd2fc) Patch `VerifiedHTTPSConnection` for Emscripten ([#3752](https://redirect.github.com/urllib3/urllib3/issues/3752))
- [`13f0bfd`](13f0bfd55e) Handle massive values in Retry-After when calculating time to sleep for ([#3743](https://redirect.github.com/urllib3/urllib3/issues/3743))
- [`8c480bf`](8c480bf87b) Bump actions/upload-artifact from 5.0.0 to 6.0.0 ([#3748](https://redirect.github.com/urllib3/urllib3/issues/3748))
- [`4b40616`](4b40616e95) Bump actions/cache from 4.3.0 to 5.0.1 ([#3750](https://redirect.github.com/urllib3/urllib3/issues/3750))
- [`82b8479`](82b8479663) Bump actions/download-artifact from 6.0.0 to 7.0.0 ([#3749](https://redirect.github.com/urllib3/urllib3/issues/3749))
- [`34284cb`](34284cb017) Mention experimental features in the security policy ([#3746](https://redirect.github.com/urllib3/urllib3/issues/3746))
- Additional commits viewable in [compare view](https://github.com/urllib3/urllib3/compare/2.6.0...2.6.3)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-08 14:39:05 +00:00
Devon Hudson
3f2887cf80
Merge branch 'release-v1.145' into develop
2026-01-07 15:55:46 -07:00
Devon Hudson
ade89c4317
1.145.0rc3
2026-01-07 15:33:27 -07:00
Devon Hudson
66b1daa679
Limit maturin includes to sdist packaging 2026-01-07 15:23:00 -07:00
Andrew Morgan
1db2302303
Bump mdbook from 0.4.17 -> 0.5.2 and remove custom table-of-contents plugin (#19356) 2026-01-07 18:46:03 +00:00
Lukas Tautz
8ff1960878
Fix: use correct parameter when calling get_local_current_membership_for_user_in_room (#19353)
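
For context, the bug class here is positional arguments passed in the wrong order. A hedged sketch (the stub below is illustrative; the real function lives in Synapse's storage layer) of how keyword arguments make that kind of swap impossible to write silently:

```python
# Illustrative stub, not Synapse's actual implementation.
def get_local_current_membership_for_user_in_room(
    user_id: str, room_id: str
) -> tuple[str | None, str | None]:
    # Would normally hit the database; returns (membership, event_id).
    return ("join", "$event") if user_id.startswith("@") else (None, None)


# A positional call site can silently swap the two strings:
bad = get_local_current_membership_for_user_in_room(
    "!room:example.org", "@alice:example.org"
)
assert bad == (None, None)  # wrong answer, no error raised

# Keyword arguments make the intent explicit at the call site:
good = get_local_current_membership_for_user_in_room(
    user_id="@alice:example.org", room_id="!room:example.org"
)
assert good == ("join", "$event")
```
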
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2026-01-07 18:32:16 +00:00
Devon Hudson
cbc5469113
Merge branch 'release-v1.145' into develop
2026-01-07 10:34:01 -07:00
Devon Hudson
ecd67df49d
1.145.0rc2
2026-01-07 10:11:44 -07:00
Devon Hudson
13dff90b5b
Fix sdist include formatting for maturin 2026-01-07 10:08:03 -07:00
Kierre
7ea78671a3
Drop support for Ubuntu 25.04 'Plucky Puffin', add support for Ubuntu 25.10 'Questing Quokka' (#19348)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2026-01-07 14:10:25 +00:00
Hugh Nimmo-Smith
4dcf113bff
Support for stable m.oauth UIA stage for MSC4312 (#19273) 2026-01-07 12:52:21 +00:00
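
For reference, a hedged sketch of the shape of a User-Interactive Auth challenge advertising the stable `m.oauth` stage from MSC4312 (only the stage name is taken from the commit above; the `params` payload is a made-up placeholder, not taken from the MSC):

```python
# Illustrative UIA challenge body; the parameter contents are invented
# for the example.
uia_challenge = {
    "session": "opaque-session-id",
    "flows": [{"stages": ["m.oauth"]}],
    "params": {"m.oauth": {"url": "https://auth.example.org/authorize"}},
}
print(uia_challenge["flows"][0]["stages"])  # ['m.oauth']
```
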
Patrice Brend'amour
a094d922c9
Implement synapse issue #16751: Treat local_media_directory as optional storage provider (#19204)
2026-01-06 23:29:58 +00:00
Devon Hudson
16bc8c78ba
Update changelog after reverting PR
2026-01-06 14:49:09 -07:00
Devon Hudson
6ac61e4be4
Revert "Add an Admin API endpoint for listing quarantined media (#19268)" (#19351)
Fixes #19349 

This reverts commit 3f636386a6
(https://github.com/element-hq/synapse/pull/19268) as the DB migration
was taking too long and blocking media access while it happened.

See https://github.com/element-hq/synapse/issues/19349 for further
information.

### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [X] Pull request is based on the develop branch
* [X] Pull request includes a [changelog
file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
The entry should:
- Be a short description of your change which makes sense to users.
"Fixed a bug that prevented receiving messages from other servers."
instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
- Feel free to credit yourself, by adding a sentence "Contributed by
@github_username." or "Contributed by [Your Name]." to the end of the
entry.
* [X] [Code
style](https://element-hq.github.io/synapse/latest/code_style.html) is
correct (run the
[linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))

---------

Co-authored-by: Travis Ralston <travpc@gmail.com>
2026-01-06 14:46:58 -07:00
Devon Hudson
987b61a92b
Revert "Add an Admin API endpoint for listing quarantined media (#19268)" (#19351)
2026-01-06 21:37:23 +00:00
dependabot[bot]
18ef7f3085
Bump pynacl from 1.5.0 to 1.6.2 (#19350)
Bumps [pynacl](https://github.com/pyca/pynacl) from 1.5.0 to 1.6.2.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pyca/pynacl/blob/main/CHANGELOG.rst">pynacl's
changelog</a>.</em></p>
<blockquote>
<h2>1.6.2 (2026-01-01)</h2>
<ul>
<li>Updated <code>libsodium</code> to 1.0.20-stable (2025-12-31 build)
to resolve
<code>CVE-2025-69277</code>.</li>
</ul>
<h2>1.6.1 (2025-11-10)</h2>
<ul>
<li>The <code>MAKE</code> environment variable can now be used to
specify the <code>make</code>
binary that should be used in the build process.</li>
</ul>
<h2>1.6.0 (2025-09-11)</h2>
<ul>
<li><strong>BACKWARDS INCOMPATIBLE:</strong> Removed support for Python
3.6 and 3.7.</li>
<li>Added support for the low level AEAD AES bindings.</li>
<li>Added support for
<code>crypto_core_ed25519_from_uniform</code>.</li>
<li>Update <code>libsodium</code> to 1.0.20-stable (2025-08-27
build).</li>
<li>Added support for free-threaded Python 3.14.</li>
<li>Added support for Windows on ARM wheels.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="ecf41f55a3"><code>ecf41f5</code></a>
changelog and version bump for 1.6.2 (<a
href="https://redirect.github.com/pyca/pynacl/issues/923">#923</a>)</li>
<li><a
href="685a5e7277"><code>685a5e7</code></a>
Switch to PyPI trusted publishing (<a
href="https://redirect.github.com/pyca/pynacl/issues/925">#925</a>)</li>
<li><a
href="78e0aa32b1"><code>78e0aa3</code></a>
missed adding these files as part of the libsodium update (<a
href="https://redirect.github.com/pyca/pynacl/issues/924">#924</a>)</li>
<li><a
href="96314884d8"><code>9631488</code></a>
Bump libsodium to the latest 1.0.20 (<a
href="https://redirect.github.com/pyca/pynacl/issues/922">#922</a>)</li>
<li><a
href="563b25bded"><code>563b25b</code></a>
Add script to update vendored libsodium (<a
href="https://redirect.github.com/pyca/pynacl/issues/921">#921</a>)</li>
<li><a
href="d233105618"><code>d233105</code></a>
Include libsodium license in wheels (<a
href="https://redirect.github.com/pyca/pynacl/issues/917">#917</a>)</li>
<li><a
href="cabc3a879d"><code>cabc3a8</code></a>
Bump dessant/lock-threads from 5 to 6 (<a
href="https://redirect.github.com/pyca/pynacl/issues/914">#914</a>)</li>
<li><a
href="f3596177b3"><code>f359617</code></a>
Bump actions/download-artifact from 6.0.0 to 7.0.0 (<a
href="https://redirect.github.com/pyca/pynacl/issues/915">#915</a>)</li>
<li><a
href="fb6e37f76d"><code>fb6e37f</code></a>
Bump actions/upload-artifact from 5 to 6 (<a
href="https://redirect.github.com/pyca/pynacl/issues/916">#916</a>)</li>
<li><a
href="526f992783"><code>526f992</code></a>
Bump actions/checkout from 6.0.0 to 6.0.1 (<a
href="https://redirect.github.com/pyca/pynacl/issues/911">#911</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pyca/pynacl/compare/1.5.0...1.6.2">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=pynacl&package-manager=pip&previous-version=1.5.0&new-version=1.6.2)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-06 19:31:09 +00:00
Tulir Asokan
ac6463c6da
Fix media creation being ratelimited for appservices (#19335)
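
The underlying pattern is an exemption check on the requester. A minimal sketch with stubbed types (not Synapse's actual `Requester` or ratelimiter classes):

```python
from dataclasses import dataclass


@dataclass
class Requester:
    user_id: str
    app_service: str | None = None  # set for appservice-authenticated requests


def should_ratelimit_media_creation(requester: Requester) -> bool:
    # Appservice senders are exempt from the per-user media-creation limit.
    return requester.app_service is None


print(should_ratelimit_media_creation(Requester("@alice:example.org")))          # True
print(should_ratelimit_media_creation(Requester("@bridge:example.org", "as1")))  # False
```
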
Co-authored-by: Andrew Morgan <andrew@amorgan.xyz>
2026-01-06 17:34:38 +00:00
Andrew Morgan
1500733f4a
Replace usage of deprecated assertEquals with assertEqual (#19345) 2026-01-06 17:30:21 +00:00
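
For reference, `assertEquals` has long been a deprecated alias of `assertEqual` in the standard library's `unittest`, so the change is mechanical:

```python
import unittest


class ExampleTest(unittest.TestCase):
    def test_sum(self) -> None:
        # Modern spelling; self.assertEquals(...) emits a DeprecationWarning.
        self.assertEqual(1 + 1, 2)


if __name__ == "__main__":
    unittest.main()
```
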
Devon Hudson
d6d1404a8e
Add nifty titles to top level deprecations 2026-01-06 09:49:48 -07:00
Devon Hudson
39f80296c5
1.145.0rc1 2026-01-06 09:38:44 -07:00
Olivier 'reivilibre
cd252db3f5
Transform events with client metadata before serialising in /event response. (#19340)
Fix the /event/ endpoint not transforming events with per-requester metadata.

Pass notif_event through filter_events_for_client; not aware of an actual issue here, but it seems silly to bypass it.

Rename it to filter_and_transform_events_for_client to make the behaviour more obvious.
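
A minimal, self-contained sketch of the pattern this commit describes: run events through a client-facing filter/transform step before serialising, rather than serialising the raw event. Every name below is an illustrative stand-in, not Synapse's actual API:

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class Event:
    event_id: str
    content: dict[str, Any] = field(default_factory=dict)
    unsigned: dict[str, Any] = field(default_factory=dict)


def filter_and_transform_events_for_client(
    user_id: str, events: list[Event]
) -> list[Event]:
    """Stand-in for the renamed helper: would drop events the user may not
    see, and attaches per-requester metadata (a fake marker here)."""
    for ev in events:
        ev.unsigned["transformed_for"] = user_id
    return events


def serialise_event_for_client(user_id: str, ev: Event) -> dict[str, Any]:
    # The fix: transform first, then serialise.
    (ev,) = filter_and_transform_events_for_client(user_id, [ev])
    return {"event_id": ev.event_id, "content": ev.content, "unsigned": ev.unsigned}


print(serialise_event_for_client("@alice:example.org", Event("$abc")))
```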

---------

Signed-off-by: Olivier 'reivilibre <oliverw@matrix.org>
2026-01-06 15:53:13 +00:00
Mathieu Velten
444bc56cda
Add rate limit conf to user directory endpoint (#19291)
The goal is to prevent a user from scraping the user directory too quickly.
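
As a sketch of the mechanism, here is a generic per-user token bucket (not Synapse's actual `Ratelimiter`; the `per_second`/`burst_count` values are illustrative):

```python
import time
from collections import defaultdict


class TokenBucket:
    def __init__(self, per_second: float, burst_count: int) -> None:
        self.per_second = per_second
        self.burst_count = burst_count
        self._tokens: dict[str, float] = defaultdict(lambda: float(burst_count))
        self._last: dict[str, float] = defaultdict(time.monotonic)

    def allow(self, user_id: str) -> bool:
        now = time.monotonic()
        elapsed = now - self._last[user_id]
        self._last[user_id] = now
        # Refill based on elapsed time, capped at the burst size.
        self._tokens[user_id] = min(
            self.burst_count, self._tokens[user_id] + elapsed * self.per_second
        )
        if self._tokens[user_id] >= 1.0:
            self._tokens[user_id] -= 1.0
            return True
        return False


limiter = TokenBucket(per_second=1.0, burst_count=3)
print([limiter.allow("@scraper:example.org") for _ in range(5)])
# [True, True, True, False, False] -- burst exhausted, refills over time
```
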
2026-01-05 13:35:11 -06:00
dependabot[bot]
6b755f964b
Bump actions/upload-artifact from 5.0.0 to 6.0.0 (#19334)
Bumps
[actions/upload-artifact](https://github.com/actions/upload-artifact)
from 5.0.0 to 6.0.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/actions/upload-artifact/releases">actions/upload-artifact's
releases</a>.</em></p>
<blockquote>
<h2>v6.0.0</h2>
<h2>v6 - What's new</h2>
<blockquote>
<p>[!IMPORTANT]
actions/upload-artifact@v6 now runs on Node.js 24 (<code>runs.using:
node24</code>) and requires a minimum Actions Runner version of 2.327.1.
If you are using self-hosted runners, ensure they are updated before
upgrading.</p>
</blockquote>
<h3>Node.js 24</h3>
<p>This release updates the runtime to Node.js 24. v5 had preliminary
support for Node.js 24, however this action was by default still running
on Node.js 20. Now this action by default will run on Node.js 24.</p>
<h2>What's Changed</h2>
<ul>
<li>Upload Artifact Node 24 support by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/upload-artifact/pull/719">actions/upload-artifact#719</a></li>
<li>fix: update <code>@​actions/artifact</code> for Node.js 24 punycode
deprecation by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/upload-artifact/pull/744">actions/upload-artifact#744</a></li>
<li>prepare release v6.0.0 for Node.js 24 support by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/upload-artifact/pull/745">actions/upload-artifact#745</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/upload-artifact/compare/v5.0.0...v6.0.0">https://github.com/actions/upload-artifact/compare/v5.0.0...v6.0.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b7c566a772"><code>b7c566a</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/upload-artifact/issues/745">#745</a>
from actions/upload-artifact-v6-release</li>
<li><a
href="e516bc8500"><code>e516bc8</code></a>
docs: correct description of Node.js 24 support in README</li>
<li><a
href="ddc45ed9bc"><code>ddc45ed</code></a>
docs: update README to correct action name for Node.js 24 support</li>
<li><a
href="615b319bd2"><code>615b319</code></a>
chore: release v6.0.0 for Node.js 24 support</li>
<li><a
href="017748b48f"><code>017748b</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/upload-artifact/issues/744">#744</a>
from actions/fix-storage-blob</li>
<li><a
href="38d4c7997f"><code>38d4c79</code></a>
chore: rebuild dist</li>
<li><a
href="7d27270e0c"><code>7d27270</code></a>
chore: add missing license cache files for <code>@​actions/core</code>,
<code>@​actions/io</code>, and mi...</li>
<li><a
href="5f643d3c94"><code>5f643d3</code></a>
chore: update license files for <code>@actions/artifact</code>@5.0.1 dependencies</li>
<li><a
href="1df1684032"><code>1df1684</code></a>
chore: update package-lock.json with <code>@actions/artifact</code>@5.0.1</li>
<li><a
href="b5b1a91840"><code>b5b1a91</code></a>
fix: update <code>@​actions/artifact</code> to ^5.0.0 for Node.js 24
punycode fix</li>
<li>Additional commits viewable in <a
href="https://github.com/actions/upload-artifact/compare/v5...v6">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/upload-artifact&package-manager=github_actions&previous-version=5.0.0&new-version=6.0.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Devon Hudson <devonhudson@librem.one>
2026-01-05 14:53:58 +00:00
dependabot[bot]
169d5b9590
Bump reqwest from 0.12.24 to 0.12.25 in the patches group (#19331)
Bumps the patches group with 1 update:
[reqwest](https://github.com/seanmonstar/reqwest).

Updates `reqwest` from 0.12.24 to 0.12.25
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/seanmonstar/reqwest/releases">reqwest's
releases</a>.</em></p>
<blockquote>
<h2>v0.12.25</h2>
<h2>Highlights</h2>
<ul>
<li>Add <code>Error::is_upgrade()</code> to determine if the error was
from an HTTP upgrade.</li>
<li>Fix sending <code>Proxy-Authorization</code> if only username is
configured.</li>
<li>Fix sending <code>Proxy-Authorization</code> to HTTPS proxies when
the target is HTTP.</li>
<li>Refactor internal decompression handling to use tower-http.</li>
</ul>
<h2>What's Changed</h2>
<ul>
<li>tests: fix wasm timeout test with uncached response by <a
href="https://github.com/seanmonstar"><code>@​seanmonstar</code></a> in
<a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2853">seanmonstar/reqwest#2853</a></li>
<li>docs: document connection pooling behavior by <a
href="https://github.com/vinzmyko"><code>@​vinzmyko</code></a> in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2851">seanmonstar/reqwest#2851</a></li>
<li>docs: document WASM client by <a
href="https://github.com/vinzmyko"><code>@​vinzmyko</code></a> in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2859">seanmonstar/reqwest#2859</a></li>
<li>chore: minor improvement for docs by <a
href="https://github.com/black5box"><code>@​black5box</code></a> in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2862">seanmonstar/reqwest#2862</a></li>
<li>fix: send <code>proxy-authorization</code> even with empty
<code>password</code> by <a
href="https://github.com/barjin"><code>@​barjin</code></a> in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2868">seanmonstar/reqwest#2868</a></li>
<li>feat(error): add <code>is_upgrade</code> method to detect protocol
upgrade errors by <a
href="https://github.com/0x676e67"><code>@​0x676e67</code></a> in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2822">seanmonstar/reqwest#2822</a></li>
<li>Use decompression from tower-http by <a
href="https://github.com/ducaale"><code>@​ducaale</code></a> in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2840">seanmonstar/reqwest#2840</a></li>
<li>fix(proxy): forward Proxy-Authorization header to HTTPS proxies for
HTTP targets by <a
href="https://github.com/0x676e67"><code>@​0x676e67</code></a> in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2872">seanmonstar/reqwest#2872</a></li>
<li>v0.12.25 by <a
href="https://github.com/seanmonstar"><code>@​seanmonstar</code></a> in
<a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2880">seanmonstar/reqwest#2880</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a href="https://github.com/vinzmyko"><code>@​vinzmyko</code></a>
made their first contribution in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2851">seanmonstar/reqwest#2851</a></li>
<li><a href="https://github.com/black5box"><code>@​black5box</code></a>
made their first contribution in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2862">seanmonstar/reqwest#2862</a></li>
<li><a href="https://github.com/barjin"><code>@​barjin</code></a> made
their first contribution in <a
href="https://redirect.github.com/seanmonstar/reqwest/pull/2868">seanmonstar/reqwest#2868</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/seanmonstar/reqwest/compare/v0.12.24...v0.12.25">https://github.com/seanmonstar/reqwest/compare/v0.12.24...v0.12.25</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/seanmonstar/reqwest/blob/master/CHANGELOG.md">reqwest's
changelog</a>.</em></p>
<blockquote>
<h2>v0.12.25</h2>
<ul>
<li>Add <code>Error::is_upgrade()</code> to determine if the error was
from an HTTP upgrade.</li>
<li>Fix sending <code>Proxy-Authorization</code> if only username is
configured.</li>
<li>Fix sending <code>Proxy-Authorization</code> to HTTPS proxies when
the target is HTTP.</li>
<li>Refactor internal decompression handling to use tower-http.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="f156a9ffa7"><code>f156a9f</code></a>
v0.12.25</li>
<li><a
href="fc1ff4fc2b"><code>fc1ff4f</code></a>
fix(proxy): forward Proxy-Authorization header to HTTPS proxies for HTTP
targ...</li>
<li><a
href="b7c37121c3"><code>b7c3712</code></a>
Use decompression from tower-http (<a
href="https://redirect.github.com/seanmonstar/reqwest/issues/2840">#2840</a>)</li>
<li><a
href="74e6f84152"><code>74e6f84</code></a>
feat(error): add <code>is_upgrade</code> method to detect protocol
upgrade errors (<a
href="https://redirect.github.com/seanmonstar/reqwest/issues/2822">#2822</a>)</li>
<li><a
href="c0c06b7aef"><code>c0c06b7</code></a>
fix: send <code>proxy-authorization</code> even with empty
<code>password</code> (<a
href="https://redirect.github.com/seanmonstar/reqwest/issues/2868">#2868</a>)</li>
<li><a
href="a2aa5a34e4"><code>a2aa5a3</code></a>
chore: minor improvement for docs (<a
href="https://redirect.github.com/seanmonstar/reqwest/issues/2862">#2862</a>)</li>
<li><a
href="9c4999d607"><code>9c4999d</code></a>
docs: document WASM client (<a
href="https://redirect.github.com/seanmonstar/reqwest/issues/2859">#2859</a>)</li>
<li><a
href="a97e1956dd"><code>a97e195</code></a>
docs: document connection pooling behavior (<a
href="https://redirect.github.com/seanmonstar/reqwest/issues/2851">#2851</a>)</li>
<li><a
href="e3093edad8"><code>e3093ed</code></a>
tests: fix wasm timeout test with uncached response (<a
href="https://redirect.github.com/seanmonstar/reqwest/issues/2853">#2853</a>)</li>
<li>See full diff in <a
href="https://github.com/seanmonstar/reqwest/compare/v0.12.24...v0.12.25">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=reqwest&package-manager=cargo&previous-version=0.12.24&new-version=0.12.25)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-05 14:15:00 +00:00
dependabot[bot]
691e43bac9
Bump actions/cache from 4.3.0 to 5.0.1 (#19332)
Bumps [actions/cache](https://github.com/actions/cache) from 4.3.0 to
5.0.1.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/actions/cache/releases">actions/cache's
releases</a>.</em></p>
<blockquote>
<h2>v5.0.1</h2>
<blockquote>
<p>[!IMPORTANT]
<strong><code>actions/cache@v5</code> runs on the Node.js 24 runtime and
requires a minimum Actions Runner version of
<code>2.327.1</code>.</strong></p>
<p>If you are using self-hosted runners, ensure they are updated before
upgrading.</p>
</blockquote>
<hr />
<h1>v5.0.1</h1>
<h2>What's Changed</h2>
<ul>
<li>fix: update <code>@​actions/cache</code> for Node.js 24 punycode
deprecation by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/cache/pull/1685">actions/cache#1685</a></li>
<li>prepare release v5.0.1 by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/cache/pull/1686">actions/cache#1686</a></li>
</ul>
<h1>v5.0.0</h1>
<h2>What's Changed</h2>
<ul>
<li>Upgrade to use node24 by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/cache/pull/1630">actions/cache#1630</a></li>
<li>Prepare v5.0.0 release by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/cache/pull/1684">actions/cache#1684</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/cache/compare/v5...v5.0.1">https://github.com/actions/cache/compare/v5...v5.0.1</a></p>
<h2>v5.0.0</h2>
<blockquote>
<p>[!IMPORTANT]
<strong><code>actions/cache@v5</code> runs on the Node.js 24 runtime and
requires a minimum Actions Runner version of
<code>2.327.1</code>.</strong></p>
<p>If you are using self-hosted runners, ensure they are updated before
upgrading.</p>
</blockquote>
<hr />
<h2>What's Changed</h2>
<ul>
<li>Upgrade to use node24 by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/cache/pull/1630">actions/cache#1630</a></li>
<li>Prepare v5.0.0 release by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/cache/pull/1684">actions/cache#1684</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/cache/compare/v4.3.0...v5.0.0">https://github.com/actions/cache/compare/v4.3.0...v5.0.0</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/actions/cache/blob/main/RELEASES.md">actions/cache's
changelog</a>.</em></p>
<blockquote>
<h1>Releases</h1>
<h2>Changelog</h2>
<h3>5.0.1</h3>
<ul>
<li>Update <code>@azure/storage-blob</code> to <code>^12.29.1</code> via
<code>@actions/cache@5.0.1</code> <a
href="https://redirect.github.com/actions/cache/pull/1685">#1685</a></li>
</ul>
<h3>5.0.0</h3>
<blockquote>
<p>[!IMPORTANT]
<code>actions/cache@v5</code> runs on the Node.js 24 runtime and
requires a minimum Actions Runner version of <code>2.327.1</code>.
If you are using self-hosted runners, ensure they are updated before
upgrading.</p>
</blockquote>
<h3>4.3.0</h3>
<ul>
<li>Bump <code>@actions/cache</code> to <a
href="https://redirect.github.com/actions/toolkit/pull/2132">v4.1.0</a></li>
</ul>
<h3>4.2.4</h3>
<ul>
<li>Bump <code>@actions/cache</code> to v4.0.5</li>
</ul>
<h3>4.2.3</h3>
<ul>
<li>Bump <code>@actions/cache</code> to v4.0.3 (obfuscates SAS token in
debug logs for cache entries)</li>
</ul>
<h3>4.2.2</h3>
<ul>
<li>Bump <code>@actions/cache</code> to v4.0.2</li>
</ul>
<h3>4.2.1</h3>
<ul>
<li>Bump <code>@actions/cache</code> to v4.0.1</li>
</ul>
<h3>4.2.0</h3>
<p>TLDR; The cache backend service has been rewritten from the ground up
for improved performance and reliability. <a
href="https://github.com/actions/cache">actions/cache</a> now integrates
with the new cache service (v2) APIs.</p>
<p>The new service will gradually roll out as of <strong>February 1st,
2025</strong>. The legacy service will also be sunset on the same date.
Changes in these release are <strong>fully backward
compatible</strong>.</p>
<p><strong>We are deprecating some versions of this action</strong>. We
recommend upgrading to version <code>v4</code> or <code>v3</code> as
soon as possible before <strong>February 1st, 2025.</strong> (Upgrade
instructions below).</p>
<p>If you are using pinned SHAs, please use the SHAs of versions
<code>v4.2.0</code> or <code>v3.4.0</code></p>
<p>If you do not upgrade, all workflow runs using any of the deprecated
<a href="https://github.com/actions/cache">actions/cache</a> will
fail.</p>
<p>Upgrading to the recommended versions will not break your
workflows.</p>
<h3>4.1.2</h3>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9255dc7a25"><code>9255dc7</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/cache/issues/1686">#1686</a>
from actions/cache-v5.0.1-release</li>
<li><a
href="8ff5423e8b"><code>8ff5423</code></a>
chore: release v5.0.1</li>
<li><a
href="9233019a15"><code>9233019</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/cache/issues/1685">#1685</a>
from salmanmkc/node24-storage-blob-fix</li>
<li><a
href="b975f2bb84"><code>b975f2b</code></a>
fix: add peer property to package-lock.json for dependencies</li>
<li><a
href="d0a0e18134"><code>d0a0e18</code></a>
fix: update license files for <code>@​actions/cache</code>,
fast-xml-parser, and strnum</li>
<li><a
href="74de208dcf"><code>74de208</code></a>
fix: update <code>@​actions/cache</code> to ^5.0.1 for Node.js 24
punycode fix</li>
<li><a
href="ac7f1152ea"><code>ac7f115</code></a>
peer</li>
<li><a
href="b0f846b50b"><code>b0f846b</code></a>
fix: update <code>@​actions/cache</code> with storage-blob fix for
Node.js 24 punycode depr...</li>
<li><a
href="a783357455"><code>a783357</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/cache/issues/1684">#1684</a>
from actions/prepare-cache-v5-release</li>
<li><a
href="3bb0d78750"><code>3bb0d78</code></a>
docs: highlight v5 runner requirement in releases</li>
<li>Additional commits viewable in <a
href="0057852bfa...9255dc7a25">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/cache&package-manager=github_actions&previous-version=4.3.0&new-version=5.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-05 14:00:05 +00:00
dependabot[bot]
8f96a83d16
Bump actions/download-artifact from 6.0.0 to 7.0.0 (#19333)
Bumps
[actions/download-artifact](https://github.com/actions/download-artifact)
from 6.0.0 to 7.0.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/actions/download-artifact/releases">actions/download-artifact's
releases</a>.</em></p>
<blockquote>
<h2>v7.0.0</h2>
<h2>v7 - What's new</h2>
<blockquote>
<p>[!IMPORTANT]
actions/download-artifact@v7 now runs on Node.js 24 (<code>runs.using:
node24</code>) and requires a minimum Actions Runner version of 2.327.1.
If you are using self-hosted runners, ensure they are updated before
upgrading.</p>
</blockquote>
<h3>Node.js 24</h3>
<p>This release updates the runtime to Node.js 24. v6 had preliminary
support for Node 24, however this action was by default still running on
Node.js 20. Now this action by default will run on Node.js 24.</p>
<h2>What's Changed</h2>
<ul>
<li>Update GHES guidance to include reference to Node 20 version by <a
href="https://github.com/patrikpolyak"><code>@​patrikpolyak</code></a>
in <a
href="https://redirect.github.com/actions/download-artifact/pull/440">actions/download-artifact#440</a></li>
<li>Download Artifact Node24 support by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/download-artifact/pull/415">actions/download-artifact#415</a></li>
<li>fix: update <code>@​actions/artifact</code> to fix Node.js 24
punycode deprecation by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/download-artifact/pull/451">actions/download-artifact#451</a></li>
<li>prepare release v7.0.0 for Node.js 24 support by <a
href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a> in <a
href="https://redirect.github.com/actions/download-artifact/pull/452">actions/download-artifact#452</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/patrikpolyak"><code>@​patrikpolyak</code></a>
made their first contribution in <a
href="https://redirect.github.com/actions/download-artifact/pull/440">actions/download-artifact#440</a></li>
<li><a href="https://github.com/salmanmkc"><code>@​salmanmkc</code></a>
made their first contribution in <a
href="https://redirect.github.com/actions/download-artifact/pull/415">actions/download-artifact#415</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/download-artifact/compare/v6.0.0...v7.0.0">https://github.com/actions/download-artifact/compare/v6.0.0...v7.0.0</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="37930b1c2a"><code>37930b1</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/download-artifact/issues/452">#452</a>
from actions/download-artifact-v7-release</li>
<li><a
href="72582b9e0a"><code>72582b9</code></a>
doc: update readme</li>
<li><a
href="0d2ec9d4cb"><code>0d2ec9d</code></a>
chore: release v7.0.0 for Node.js 24 support</li>
<li><a
href="fd7ae8fda6"><code>fd7ae8f</code></a>
Merge pull request <a
href="https://redirect.github.com/actions/download-artifact/issues/451">#451</a>
from actions/fix-storage-blob</li>
<li><a
href="d484700543"><code>d484700</code></a>
chore: restore minimatch.dep.yml license file</li>
<li><a
href="03a808050e"><code>03a8080</code></a>
chore: remove obsolete dependency license files</li>
<li><a
href="56fe6d904b"><code>56fe6d9</code></a>
chore: update <code>@​actions/artifact</code> license file to 5.0.1</li>
<li><a
href="8e3ebc4ab4"><code>8e3ebc4</code></a>
chore: update package-lock.json with <code>@actions/artifact</code>@5.0.1</li>
<li><a
href="1e3c4b4d49"><code>1e3c4b4</code></a>
fix: update <code>@​actions/artifact</code> to ^5.0.0 for Node.js 24
punycode fix</li>
<li><a
href="458627d354"><code>458627d</code></a>
chore: use local <code>@​actions/artifact</code> package for Node.js 24
testing</li>
<li>Additional commits viewable in <a
href="018cc2cf5b...37930b1c2a">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/download-artifact&package-manager=github_actions&previous-version=6.0.0&new-version=7.0.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2026-01-05 13:59:36 +00:00
Eric Eastwood
803e4b4d88
Make it more clear how shared_extra_conf is combined in our Docker configuration scripts (#19323)
For reference, this PR used to include the whole `shared_config` block in the diff.

But https://github.com/element-hq/synapse/pull/19324 was merged first, which already introduced parts of it.

Here is what this code used to look like: 566670c363/docker/configure_workers_and_start.py (L865-L868)

---

Original context for why it was changed this way:
https://github.com/matrix-org/synapse/pull/14921#discussion_r1126257933


Previously, this code made me question two things:

1. Do we actually use `worker_config["shared_extra_conf"]` in the
templates?
   - At first glance, I couldn't see why we're updating
`shared_extra_conf` here. It's not used in the `worker.yaml.j2`
template, so all of this seemed a bit pointless.
   - It turns out that updating `shared_extra_conf` itself is pointless;
it's just being used as a convenient place to mix the objects so that
`shared_config` comes out right (confusing).
2. Does it actually do anything?
   - Because `shared_config` starts out as an empty object, my first
glance made me think we were just updating with an empty object and then
re-assigning. But because we're in a loop, we actually accumulate the
`shared_extra_conf` from each worker (see the sketch below).

I'm not sure whether I'm capturing my confusion well here, but this made
me spend time figuring out what we're doing and why, when a clearer
pattern can accomplish the same thing.
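
To make that concrete, here is a minimal sketch of the accumulation pattern, with simplified stand-ins for the real objects in `docker/configure_workers_and_start.py`; it is illustrative only, not the script's actual code.

```python
# Hypothetical, simplified stand-in for the loop in
# docker/configure_workers_and_start.py; not the real code.
shared_config: dict = {}
worker_configs = [
    {"shared_extra_conf": {"enable_media_repo": False}},
    {"shared_extra_conf": {"start_pushers": False}},
]

# Old, confusing pattern: mutate each worker's `shared_extra_conf` in
# place, purely as scratch space for building up `shared_config`. It
# works because each iteration folds the accumulated `shared_config`
# back in before re-assigning.
for worker_config in worker_configs:
    worker_config["shared_extra_conf"].update(shared_config)
    shared_config = worker_config["shared_extra_conf"]

# Clearer pattern: accumulate directly into `shared_config` and leave
# the per-worker config alone. (On key collisions the two patterns can
# disagree about which value wins; this sketch ignores that.)
shared_config = {}
for worker_config in worker_configs:
    shared_config.update(worker_config["shared_extra_conf"])
```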

---

This change is spawning from looking at the
`docker/configure_workers_and_start.py` script in order to add a metrics
listener ([upcoming
PR](https://github.com/element-hq/synapse/pull/19324)).
2026-01-02 12:08:37 -06:00
Eric Eastwood
9dae6cc595
Add a way to expose metrics from the Docker image (SYNAPSE_ENABLE_METRICS) (#19324)
This spawned from wanting to [run a load
test](https://github.com/element-hq/synapse-rust-apps/pull/397) against
the Complement Docker image of Synapse and see metrics from the
homeserver.


### Why not just provide your own homeserver config?

Probably possible, but it gets tricky when you try to use the workers
variant of the Docker image (`docker/Dockerfile-workers`). The
workaround would probably be to `yq`-edit everything in a script:
change `/data/homeserver.yaml` and `/conf/workers/*.yaml` to add the
`metrics` listener, then modify `/conf/workers/shared.yaml` to add
`enable_metrics: true`. Doesn't spark much joy.
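
For a sense of what that manual editing would involve, here is a rough Python sketch standing in for the `yq` script; the file paths come from the description above, the listener shape follows Synapse's documented `metrics` listener config, and the port is an arbitrary example.

```python
# Hypothetical sketch of the workaround this PR avoids: bolt a `metrics`
# listener onto every generated config file, then flip `enable_metrics`.
import glob

import yaml

METRICS_LISTENER = {
    "port": 9100,  # arbitrary example port
    "type": "metrics",
    "bind_addresses": ["0.0.0.0"],
}

for path in ["/data/homeserver.yaml", *glob.glob("/conf/workers/*.yaml")]:
    with open(path) as f:
        config = yaml.safe_load(f) or {}
    config.setdefault("listeners", []).append(METRICS_LISTENER)
    with open(path, "w") as f:
        yaml.safe_dump(config, f)

# Plus `enable_metrics: true` in the shared config:
with open("/conf/workers/shared.yaml") as f:
    shared = yaml.safe_load(f) or {}
shared["enable_metrics"] = True
with open("/conf/workers/shared.yaml", "w") as f:
    yaml.safe_dump(shared, f)
```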
2026-01-01 14:00:00 -06:00
Eric Eastwood
bd94152e06
Stream Complement progress and format logs in a separate step after all tests are done (#19326)
This way we can see what's happening as the tests run, instead of
seeing nothing until the end. It's also useful to split the test output
from the formatting, so we can grab the raw test output before the
formatting gobbles it all up.

Same thing I did in
https://github.com/element-hq/synapse-rust-apps/pull/361
2025-12-31 14:43:04 -06:00
Eric Eastwood
7a24fafbc3
Auto-formatting .github/workflows/tests.yml from VSCode (#19327)
2025-12-29 12:20:58 -06:00
dependabot[bot]
f79acff862
Bump log from 0.4.28 to 0.4.29 in the patches group (#19318)
Bumps the patches group with 1 update:
[log](https://github.com/rust-lang/log).

Updates `log` from 0.4.28 to 0.4.29
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/rust-lang/log/releases">log's
releases</a>.</em></p>
<blockquote>
<h2>0.4.29</h2>
<h2>MSRV</h2>
<p>This release increases <code>log</code>'s MSRV from
<code>1.61.0</code> to <code>1.68.0</code>.</p>
<h2>What's Changed</h2>
<ul>
<li>docs: Add missing impls from README.md by <a
href="https://github.com/AldaronLau"><code>@​AldaronLau</code></a> in <a
href="https://redirect.github.com/rust-lang/log/pull/703">rust-lang/log#703</a></li>
<li>Point to new URLs for favicon and logo by <a
href="https://github.com/AldaronLau"><code>@​AldaronLau</code></a> in <a
href="https://redirect.github.com/rust-lang/log/pull/704">rust-lang/log#704</a></li>
<li>perf: reduce llvm-lines of FromStr for <code>Level</code> and
<code>LevelFilter</code> by <a
href="https://github.com/dishmaker"><code>@​dishmaker</code></a> in <a
href="https://redirect.github.com/rust-lang/log/pull/709">rust-lang/log#709</a></li>
<li>Replace serde with serde_core by <a
href="https://github.com/Thomasdezeeuw"><code>@​Thomasdezeeuw</code></a>
in <a
href="https://redirect.github.com/rust-lang/log/pull/712">rust-lang/log#712</a></li>
<li>Fix clippy lints by <a
href="https://github.com/Thomasdezeeuw"><code>@​Thomasdezeeuw</code></a>
in <a
href="https://redirect.github.com/rust-lang/log/pull/713">rust-lang/log#713</a></li>
<li>Use GitHub Actions to install Rust and cargo-hack by <a
href="https://github.com/Thomasdezeeuw"><code>@​Thomasdezeeuw</code></a>
in <a
href="https://redirect.github.com/rust-lang/log/pull/715">rust-lang/log#715</a></li>
<li>Exclude old unstable_kv features from testing matrix by <a
href="https://github.com/Thomasdezeeuw"><code>@​Thomasdezeeuw</code></a>
in <a
href="https://redirect.github.com/rust-lang/log/pull/716">rust-lang/log#716</a></li>
<li>Fix up CI by <a
href="https://github.com/KodrAus"><code>@​KodrAus</code></a> in <a
href="https://redirect.github.com/rust-lang/log/pull/718">rust-lang/log#718</a></li>
<li>Prepare for 0.4.29 release by <a
href="https://github.com/KodrAus"><code>@​KodrAus</code></a> in <a
href="https://redirect.github.com/rust-lang/log/pull/719">rust-lang/log#719</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/AldaronLau"><code>@​AldaronLau</code></a> made
their first contribution in <a
href="https://redirect.github.com/rust-lang/log/pull/703">rust-lang/log#703</a></li>
<li><a href="https://github.com/dishmaker"><code>@​dishmaker</code></a>
made their first contribution in <a
href="https://redirect.github.com/rust-lang/log/pull/709">rust-lang/log#709</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/rust-lang/log/compare/0.4.28...0.4.29">https://github.com/rust-lang/log/compare/0.4.28...0.4.29</a></p>
</blockquote>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/rust-lang/log/blob/master/CHANGELOG.md">log's
changelog</a>.</em></p>
<blockquote>
<h2>[0.4.29] - 2025-12-02</h2>
<h2>What's Changed</h2>
<ul>
<li>perf: reduce llvm-lines of FromStr for <code>Level</code> and
<code>LevelFilter</code> by <a
href="https://github.com/dishmaker"><code>@​dishmaker</code></a> in <a
href="https://redirect.github.com/rust-lang/log/pull/709">rust-lang/log#709</a></li>
<li>Replace serde with serde_core by <a
href="https://github.com/Thomasdezeeuw"><code>@​Thomasdezeeuw</code></a>
in <a
href="https://redirect.github.com/rust-lang/log/pull/712">rust-lang/log#712</a></li>
</ul>
<h2>New Contributors</h2>
<ul>
<li><a
href="https://github.com/AldaronLau"><code>@​AldaronLau</code></a> made
their first contribution in <a
href="https://redirect.github.com/rust-lang/log/pull/703">rust-lang/log#703</a></li>
<li><a href="https://github.com/dishmaker"><code>@​dishmaker</code></a>
made their first contribution in <a
href="https://redirect.github.com/rust-lang/log/pull/709">rust-lang/log#709</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/rust-lang/log/compare/0.4.28...0.4.29">https://github.com/rust-lang/log/compare/0.4.28...0.4.29</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="b1e2df7bce"><code>b1e2df7</code></a>
Merge pull request <a
href="https://redirect.github.com/rust-lang/log/issues/719">#719</a>
from rust-lang/cargo/0.4.29</li>
<li><a
href="3fe1a546dc"><code>3fe1a54</code></a>
prepare for 0.4.29 release</li>
<li><a
href="7a432d9ab5"><code>7a432d9</code></a>
Merge pull request <a
href="https://redirect.github.com/rust-lang/log/issues/718">#718</a>
from rust-lang/ci/msrv</li>
<li><a
href="0689d56847"><code>0689d56</code></a>
rebump msrv to 1.68.0</li>
<li><a
href="46b448e2a7"><code>46b448e</code></a>
try drop msrv back to 1.61.0</li>
<li><a
href="929ab3812e"><code>929ab38</code></a>
fix up doc test feature gate</li>
<li><a
href="957cece478"><code>957cece</code></a>
bump serde-dependent crates</li>
<li><a
href="bea40c847c"><code>bea40c8</code></a>
bump msrv to 1.68.0</li>
<li><a
href="c540184ee9"><code>c540184</code></a>
Merge pull request <a
href="https://redirect.github.com/rust-lang/log/issues/716">#716</a>
from rust-lang/ci-smaller-matrix2</li>
<li><a
href="c971e636c4"><code>c971e63</code></a>
Merge branch 'master' into ci-smaller-matrix2</li>
<li>Additional commits viewable in <a
href="https://github.com/rust-lang/log/compare/0.4.28...0.4.29">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=log&package-manager=cargo&previous-version=0.4.28&new-version=0.4.29)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.


---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-22 16:37:16 +00:00
dependabot[bot]
50fabc48c3
Bump actions/checkout from 6.0.0 to 6.0.1 in the minor-and-patches group (#19319)
Bumps the minor-and-patches group with 1 update:
[actions/checkout](https://github.com/actions/checkout).

Updates `actions/checkout` from 6.0.0 to 6.0.1
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/actions/checkout/releases">actions/checkout's
releases</a>.</em></p>
<blockquote>
<h2>v6.0.1</h2>
<h2>What's Changed</h2>
<ul>
<li>Update all references from v5 and v4 to v6 by <a
href="https://github.com/ericsciple"><code>@​ericsciple</code></a> in <a
href="https://redirect.github.com/actions/checkout/pull/2314">actions/checkout#2314</a></li>
<li>Add worktree support for persist-credentials includeIf by <a
href="https://github.com/ericsciple"><code>@​ericsciple</code></a> in <a
href="https://redirect.github.com/actions/checkout/pull/2327">actions/checkout#2327</a></li>
<li>Clarify v6 README by <a
href="https://github.com/ericsciple"><code>@​ericsciple</code></a> in <a
href="https://redirect.github.com/actions/checkout/pull/2328">actions/checkout#2328</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a
href="https://github.com/actions/checkout/compare/v6...v6.0.1">https://github.com/actions/checkout/compare/v6...v6.0.1</a></p>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="8e8c483db8"><code>8e8c483</code></a>
Clarify v6 README (<a
href="https://redirect.github.com/actions/checkout/issues/2328">#2328</a>)</li>
<li><a
href="033fa0dc0b"><code>033fa0d</code></a>
Add worktree support for persist-credentials includeIf (<a
href="https://redirect.github.com/actions/checkout/issues/2327">#2327</a>)</li>
<li><a
href="c2d88d3ecc"><code>c2d88d3</code></a>
Update all references from v5 and v4 to v6 (<a
href="https://redirect.github.com/actions/checkout/issues/2314">#2314</a>)</li>
<li>See full diff in <a
href="1af3b93b68...8e8c483db8">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=actions/checkout&package-manager=github_actions&previous-version=6.0.0&new-version=6.0.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)


Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-22 16:04:03 +00:00
Eric Eastwood
41938d6fd2
Log the original bind exception when encountering Failed to listen on 0.0.0.0, continuing because listening on [::] (#19297)
**Before:**

```
WARNING - call_when_running - Failed to listen on 0.0.0.0, continuing because listening on [::]
```

**After:**

```
WARNING - call_when_running - Failed to listen on 0.0.0.0, continuing because listening on [::]. Original exception: CannotListenError: Couldn't listen on 0.0.0.0:8008: [Errno 98] Address already in use.
```
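
A minimal sketch of the shape of the change, under the assumption that the listener setup catches the IPv4 bind failure after `[::]` has already bound; the function and variable names here are illustrative, not Synapse's actual ones.

```python
# Hypothetical sketch: keep the caught exception and include it in the
# warning, rather than logging only the first sentence.
import logging

from twisted.internet import reactor
from twisted.internet.error import CannotListenError

logger = logging.getLogger(__name__)

def listen_dual_stack(port: int, factory) -> None:
    reactor.listenTCP(port, factory, interface="::")
    try:
        reactor.listenTCP(port, factory, interface="0.0.0.0")
    except CannotListenError as e:
        logger.warning(
            "Failed to listen on 0.0.0.0, continuing because listening on"
            " [::]. Original exception: %s: %s",
            type(e).__name__,
            e,
        )
```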
2025-12-19 14:29:04 -06:00
Andrew Ferrazzutti
f4320b5a49
Admin API: worker support for Query User Account (#19281)
2025-12-16 17:42:08 +00:00
Tulir Asokan
3989d22a37
Implement pagination for MSC2666 (#19279)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-12-16 15:24:36 +00:00
Joshua Goins
0395b71e25
Fix Mastodon URL previews not showing anything useful (#19231)
Fixes #18444. Inside `UrlPreviewer`, we need to combine two dicts (one
from oEmbed, and one from OpenGraph metadata in the HTML), and in
Mastodon's case the two were very different.

Single Page Applications (SPAs) sometimes seem to provide better
information in the OpenGraph tags than in the oEmbed stubs, because the
oEmbed stubs are filled in with JavaScript that Synapse does not
execute.

This change improves previews of Mastodon and YouTube (for the same
reason).

Tested to confirm it does not regress previews of Twitter or GitHub.
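
As a rough illustration of the kind of merge involved (this is not `UrlPreviewer`'s actual precedence logic, just the idea of not letting empty oEmbed stubs win):

```python
# Hypothetical sketch: prefer non-empty OpenGraph values over oEmbed
# ones, since an SPA's oEmbed stub may be an empty placeholder that is
# only filled in by JavaScript Synapse never runs.
from typing import Optional

def merge_preview_data(
    oembed: dict[str, Optional[str]],
    opengraph: dict[str, Optional[str]],
) -> dict[str, Optional[str]]:
    merged = dict(oembed)
    for key, value in opengraph.items():
        if value:  # only take OpenGraph values that actually say something
            merged[key] = value
    return merged

print(merge_preview_data(
    {"og:title": "Mastodon", "og:description": None},
    {"og:title": "Alice: hello world", "og:description": "1 reply"},
))
# {'og:title': 'Alice: hello world', 'og:description': '1 reply'}
```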
2025-12-16 13:02:29 +00:00
Denis Kasak
29fd0116a5
Improve proxy support for the federation_client.py dev script (#19300)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-12-16 11:06:07 +00:00
Travis Ralston
0f2b29511f
Allow admins to bypass the quarantine check on media downloads (#19275)
Co-authored-by: turt2live <1190097+turt2live@users.noreply.github.com>
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-12-15 17:23:33 +00:00
Andre Klärner
466994743a
Document importance of public_baseurl for delegation and OIDC (#19270)
I just stumbled across the fact that my config used delegation as
recommended by the docs, and hosted Synapse on a subdomain. However, my
config never had `public_baseurl` set and worked without issues, until
just now when I tried to set up OIDC.

OIDC is initiated by the client instructing the browser to open a URL
on the homeserver. Initially the correct URL is called, but Synapse
does not recognize it without `public_baseurl` being set correctly.
After changing this, it immediately started working.

So, to prevent anybody else from making the same mistake, this adds a
small clarifying block to the OIDC docs.
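
To see why `public_baseurl` matters here, note that the OIDC callback URL advertised to the identity provider is derived from it. A minimal sketch, using the callback path from Synapse's OIDC documentation and a placeholder domain:

```python
# Sketch: the OIDC callback URL Synapse expects is built from
# `public_baseurl` (documented path; the domain is a placeholder).
public_baseurl = "https://matrix.example.com/"  # homeserver.yaml setting

oidc_callback_url = public_baseurl + "_synapse/client/oidc/callback"
print(oidc_callback_url)
# https://matrix.example.com/_synapse/client/oidc/callback
#
# If `public_baseurl` is unset or wrong, the browser still reaches the
# right host, but the redirect does not match what Synapse expects.
```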
2025-12-12 18:07:39 -06:00
Devon Hudson
df24e0f302
Fix support for older versions of zope-interface (#19274)
Fixes #19269 

The versions of zope-interface shipped by RHEL, Ubuntu LTS 22 & 24, and
openSUSE don't support the new Python union syntax (`X | Y`) for
interfaces. This PR partially reverts the change to fully use the new
syntax, adds a minimum supported version of zope-interface to Synapse's
dependency list, and removes the linter auto-upgrades that prefer the
newer syntax.
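
For illustration, the syntax difference looks like this; the interface name is hypothetical, and on the affected zope-interface versions the union spelling fails at import time:

```python
# Hypothetical example: older zope-interface releases cannot handle the
# `X | Y` union syntax when X is an Interface, so the code reverts to
# the `Optional[...]` spelling, which works on every supported version.
from typing import Optional

from zope.interface import Interface

class IExample(Interface):
    """A made-up interface, purely for illustration."""

# On old zope-interface this raises TypeError (no `__or__` support):
#     hint = IExample | None
# The spelling that works everywhere:
hint = Optional[IExample]
print(hint)
```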


---------

Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-12-12 15:34:13 +00:00
Andrew Morgan
048629dd13 minor grammar fix
context: https://github.com/element-hq/synapse/pull/19260#discussion_r2614227743
2025-12-12 13:36:34 +00:00
Mathieu Velten
7347cc436e
Add memberships admin API (#19260)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-12-12 13:35:46 +00:00
Travis Ralston
3f636386a6
Add an Admin API endpoint for listing quarantined media (#19268)
Co-authored-by: turt2live <1190097+turt2live@users.noreply.github.com>
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2025-12-12 13:30:21 +00:00
Andrew Morgan
1f7f16477d
Unpin Rust from 1.82.0 (#19302)
2025-12-12 11:31:55 +00:00
Erik Johnston
dfd00a986f
Fix sliding sync performance slow down for long lived connections. (#19206)
Fixes https://github.com/element-hq/synapse/issues/19175

This PR moves the tracking of which lazy-loaded memberships we've sent
for each room out of the required state table. This stops that table
from growing continuously, which massively helps performance, since we
pull out all matching rows for the connection when we receive a
request.

The new table is only read when we have data to send for a room, so we
end up reading far fewer rows from the DB, though we now read from that
table for every room we have events to return in, rather than once at
the start of the request (sketched below).

For an explanation of how the new table works, see the
[comment](https://github.com/element-hq/synapse/blob/erikj/sss_better_membership_storage2/synapse/storage/schema/main/delta/93/02_sliding_sync_members.sql#L15-L38)
on the table schema.

The table is designed so that we can later prune old entries if we
wish, but that is not implemented in this PR.

Reviewable commit-by-commit.
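
To illustrate the access-pattern tradeoff described above, here is a toy sketch; the "tables" and helper names are made up, not Synapse's actual schema or storage API:

```python
# Hypothetical sketch of the read-pattern change.
FAKE_DB = {
    # Old: one ever-growing set of required-state rows per connection.
    "required_state": [("conn1", f"row{i}") for i in range(1000)],
    # New: sent-membership rows keyed by (connection, room).
    "sent_memberships": {("conn1", "!room:a"): ["@alice:a"]},
}

def old_pattern(connection: str) -> int:
    # Read every matching row up front, once per request, even for rooms
    # with nothing to send. Long-lived connections got slower because
    # this set only ever grew.
    return sum(1 for conn, _ in FAKE_DB["required_state"] if conn == connection)

def new_pattern(connection: str, rooms_with_updates: list[str]) -> int:
    # Read per room, and only for rooms that actually have data to send.
    rows = 0
    for room in rooms_with_updates:
        rows += len(FAKE_DB["sent_memberships"].get((connection, room), []))
    return rows

print(old_pattern("conn1"), new_pattern("conn1", ["!room:a"]))  # 1000 1
```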

---------

Co-authored-by: Eric Eastwood <erice@element.io>
2025-12-12 10:02:57 +00:00
Devon Hudson
cdf286d405
Use uv to test full set of minimum deps in CI (#19289)
Stemming from #19274, this updates the `olddeps` CI job to test against
not just the minimum versions of our explicit dependencies, but also
the minimum versions of all implicit (transitive) dependencies pulled
in by the explicit dependencies themselves.

2025-12-11 17:58:27 +00:00
Andrew Morgan
3aaa2e80b2
Switch the build backend from poetry-core to maturin (#19234)
2025-12-10 14:46:47 +00:00
dependabot[bot]
ba774e2311
Bump ruff from 0.14.5 to 0.14.6 in the minor-and-patches group across 1 directory (#19296)
Bumps the minor-and-patches group with 1 update in the / directory:
[ruff](https://github.com/astral-sh/ruff).

Updates `ruff` from 0.14.5 to 0.14.6
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/releases">ruff's
releases</a>.</em></p>
<blockquote>
<h2>0.14.6</h2>
<h2>Release Notes</h2>
<p>Released on 2025-11-21.</p>
<h3>Preview features</h3>
<ul>
<li>[<code>flake8-bandit</code>] Support new PySNMP API paths
(<code>S508</code>, <code>S509</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21374">#21374</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Adjust own-line comment placement between branches (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21185">#21185</a>)</li>
<li>Avoid syntax error when formatting attribute expressions with outer
parentheses, parenthesized value, and trailing comment on value (<a
href="https://redirect.github.com/astral-sh/ruff/pull/20418">#20418</a>)</li>
<li>Fix panic when formatting comments in unary expressions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21501">#21501</a>)</li>
<li>Respect <code>fmt: skip</code> for compound statements on a single
line (<a
href="https://redirect.github.com/astral-sh/ruff/pull/20633">#20633</a>)</li>
<li>[<code>refurb</code>] Fix <code>FURB103</code> autofix (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21454">#21454</a>)</li>
<li>[<code>ruff</code>] Fix false positive for complex conversion
specifiers in <code>logging-eager-conversion</code>
(<code>RUF065</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21464">#21464</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>ruff</code>] Avoid false positive on <code>ClassVar</code>
reassignment (<code>RUF012</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21478">#21478</a>)</li>
</ul>
<h3>CLI</h3>
<ul>
<li>Render hyperlinks for lint errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21514">#21514</a>)</li>
<li>Add a <code>ruff analyze</code> option to skip over imports in
<code>TYPE_CHECKING</code> blocks (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21472">#21472</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Limit <code>eglot-format</code> hook to eglot-managed Python buffers
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/21459">#21459</a>)</li>
<li>Mention <code>force-exclude</code> in &quot;Configuration &gt;
Python file discovery&quot; (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21500">#21500</a>)</li>
</ul>
<h3>Contributors</h3>
<ul>
<li><a href="https://github.com/ntBre"><code>@​ntBre</code></a></li>
<li><a href="https://github.com/dylwil3"><code>@​dylwil3</code></a></li>
<li><a
href="https://github.com/gauthsvenkat"><code>@​gauthsvenkat</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a href="https://github.com/thamer"><code>@​thamer</code></a></li>
<li><a
href="https://github.com/Ruchir28"><code>@​Ruchir28</code></a></li>
<li><a
href="https://github.com/thejcannon"><code>@​thejcannon</code></a></li>
<li><a
href="https://github.com/danparizher"><code>@​danparizher</code></a></li>
<li><a
href="https://github.com/chirizxc"><code>@​chirizxc</code></a></li>
</ul>
<h2>Install ruff 0.14.6</h2>
<h3>Install prebuilt binaries via shell script</h3>
<pre lang="sh"><code>curl --proto '=https' --tlsv1.2 -LsSf
https://github.com/astral-sh/ruff/releases/download/0.14.6/ruff-installer.sh
| sh
</code></pre>
</blockquote>
<p>... (truncated)</p>
</details>
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/astral-sh/ruff/blob/main/CHANGELOG.md">ruff's
changelog</a>.</em></p>
<blockquote>
<h2>0.14.6</h2>
<p>Released on 2025-11-21.</p>
<h3>Preview features</h3>
<ul>
<li>[<code>flake8-bandit</code>] Support new PySNMP API paths
(<code>S508</code>, <code>S509</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21374">#21374</a>)</li>
</ul>
<h3>Bug fixes</h3>
<ul>
<li>Adjust own-line comment placement between branches (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21185">#21185</a>)</li>
<li>Avoid syntax error when formatting attribute expressions with outer
parentheses, parenthesized value, and trailing comment on value (<a
href="https://redirect.github.com/astral-sh/ruff/pull/20418">#20418</a>)</li>
<li>Fix panic when formatting comments in unary expressions (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21501">#21501</a>)</li>
<li>Respect <code>fmt: skip</code> for compound statements on a single
line (<a
href="https://redirect.github.com/astral-sh/ruff/pull/20633">#20633</a>)</li>
<li>[<code>refurb</code>] Fix <code>FURB103</code> autofix (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21454">#21454</a>)</li>
<li>[<code>ruff</code>] Fix false positive for complex conversion
specifiers in <code>logging-eager-conversion</code>
(<code>RUF065</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21464">#21464</a>)</li>
</ul>
<h3>Rule changes</h3>
<ul>
<li>[<code>ruff</code>] Avoid false positive on <code>ClassVar</code>
reassignment (<code>RUF012</code>) (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21478">#21478</a>)</li>
</ul>
<h3>CLI</h3>
<ul>
<li>Render hyperlinks for lint errors (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21514">#21514</a>)</li>
<li>Add a <code>ruff analyze</code> option to skip over imports in
<code>TYPE_CHECKING</code> blocks (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21472">#21472</a>)</li>
</ul>
<h3>Documentation</h3>
<ul>
<li>Limit <code>eglot-format</code> hook to eglot-managed Python buffers
(<a
href="https://redirect.github.com/astral-sh/ruff/pull/21459">#21459</a>)</li>
<li>Mention <code>force-exclude</code> in &quot;Configuration &gt;
Python file discovery&quot; (<a
href="https://redirect.github.com/astral-sh/ruff/pull/21500">#21500</a>)</li>
</ul>
<h3>Contributors</h3>
<ul>
<li><a href="https://github.com/ntBre"><code>@​ntBre</code></a></li>
<li><a href="https://github.com/dylwil3"><code>@​dylwil3</code></a></li>
<li><a
href="https://github.com/gauthsvenkat"><code>@​gauthsvenkat</code></a></li>
<li><a
href="https://github.com/MichaReiser"><code>@​MichaReiser</code></a></li>
<li><a href="https://github.com/thamer"><code>@​thamer</code></a></li>
<li><a
href="https://github.com/Ruchir28"><code>@​Ruchir28</code></a></li>
<li><a
href="https://github.com/thejcannon"><code>@​thejcannon</code></a></li>
<li><a
href="https://github.com/danparizher"><code>@​danparizher</code></a></li>
<li><a
href="https://github.com/chirizxc"><code>@​chirizxc</code></a></li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="59c6cb521d"><code>59c6cb5</code></a>
Bump 0.14.6 (<a
href="https://redirect.github.com/astral-sh/ruff/issues/21558">#21558</a>)</li>
<li><a
href="54dba15088"><code>54dba15</code></a>
[ty] Improve debug messages when imports fail (<a
href="https://redirect.github.com/astral-sh/ruff/issues/21555">#21555</a>)</li>
<li><a
href="1af318534a"><code>1af3185</code></a>
[ty] Add support for relative import completions</li>
<li><a
href="553e568624"><code>553e568</code></a>
[ty] Refactor detection of import statements for completions</li>
<li><a
href="cdef3f5ab8"><code>cdef3f5</code></a>
[ty] Use dedicated collector for completions</li>
<li><a
href="6178822427"><code>6178822</code></a>
[ty] Attach subdiagnostics to <code>unresolved-import</code> errors for
relative imports...</li>
<li><a
href="6b7adb0537"><code>6b7adb0</code></a>
[ty] support PEP 613 type aliases (<a
href="https://redirect.github.com/astral-sh/ruff/issues/21394">#21394</a>)</li>
<li><a
href="06941c1987"><code>06941c1</code></a>
[ty] More low-hanging fruit for inlay hint goto-definition (<a
href="https://redirect.github.com/astral-sh/ruff/issues/21548">#21548</a>)</li>
<li><a
href="eb7c098d6b"><code>eb7c098</code></a>
[ty] implement <code>TypedDict</code> structural assignment (<a
href="https://redirect.github.com/astral-sh/ruff/issues/21467">#21467</a>)</li>
<li><a
href="1b28fc1f14"><code>1b28fc1</code></a>
[ty] Add more random TypeDetails and tests (<a
href="https://redirect.github.com/astral-sh/ruff/issues/21546">#21546</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/astral-sh/ruff/compare/0.14.5...0.14.6">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=ruff&package-manager=pip&previous-version=0.14.5&new-version=0.14.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-09 23:01:56 +00:00
Devon Hudson
acafac3bb6
Merge branch 'master' into develop
2025-12-09 09:30:32 -07:00
Devon Hudson
8b0083cad9
Respond with useful error codes when Content-Length header/s are invalid (#19212)
Related to https://github.com/element-hq/synapse/issues/17035: when
Synapse receives a request that is larger than the maximum allowed size,
it aborts the connection without ever sending back an HTTP response. I
dug into our usage of Twisted to work out how best to report such an
error, and this is what I came up with.

It would be ideal to report the status from within `handleContentChunk`,
but that is called too early in the Twisted HTTP handling code, before
enough has been set up to properly write a response.

I tested this change locally (with both the C-S and S-S APIs) and
clients now receive a 413 response in addition to the connection being
closed.

Hopefully this will make it quick to detect when
https://github.com/element-hq/synapse/issues/17035 is occurring, as the
current situation makes it very hard to narrow things down to that
specific issue without making a lot of assumptions.

This PR also responds with more meaningful error codes in the following
cases (see the sketch after this list):
- multiple `Content-Length` headers
- invalid `Content-Length` header value
- request content size being larger than the `Content-Length` value
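
As a rough illustration of the validation being described, here is a
hedged sketch using plain Twisted, not Synapse's actual implementation;
`MAX_REQUEST_SIZE` and the resource class are invented for the example:

```python
from twisted.internet import endpoints, reactor
from twisted.web import resource, server

MAX_REQUEST_SIZE = 1024 * 1024  # illustrative limit, not Synapse's real one


class ValidatingResource(resource.Resource):
    isLeaf = True

    def render_POST(self, request):
        headers = request.requestHeaders.getRawHeaders(b"content-length") or []
        if len(headers) > 1:
            # Multiple Content-Length headers are ambiguous: reject outright.
            request.setResponseCode(400)
            return b"multiple Content-Length headers\n"
        try:
            declared = int(headers[0]) if headers else 0
        except ValueError:
            # A non-numeric Content-Length value is malformed.
            request.setResponseCode(400)
            return b"invalid Content-Length value\n"
        if declared > MAX_REQUEST_SIZE:
            # Tell the client the payload is too large instead of just
            # dropping the connection.
            request.setResponseCode(413)
            return b"request too large\n"
        return b"ok\n"


endpoints.TCP4ServerEndpoint(reactor, 8080).listen(
    server.Site(ValidatingResource())
)
reactor.run()
```

The real change has to hook in at a lower level than a resource's
`render_POST` (the size check happens while the request body is still
being read), but the response codes mirror the list above.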

### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [X] Pull request is based on the develop branch
* [X] Pull request includes a [changelog
file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
The entry should:
- Be a short description of your change which makes sense to users.
"Fixed a bug that prevented receiving messages from other servers."
instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
- Feel free to credit yourself, by adding a sentence "Contributed by
@github_username." or "Contributed by [Your Name]." to the end of the
entry.
* [X] [Code
style](https://element-hq.github.io/synapse/latest/code_style.html) is
correct (run the
[linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))

---------

Co-authored-by: Eric Eastwood <erice@element.io>
2025-12-08 21:39:18 +00:00
dependabot[bot]
09fd2645c2
Bump urllib3 from 2.5.0 to 2.6.0 (#19282)
Bumps [urllib3](https://github.com/urllib3/urllib3) from 2.5.0 to 2.6.0.
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/urllib3/urllib3/releases">urllib3's
releases</a>.</em></p>
<blockquote>
<h2>2.6.0</h2>
<h2>🚀 urllib3 is fundraising for HTTP/2 support</h2>
<p><a
href="https://sethmlarson.dev/urllib3-is-fundraising-for-http2-support">urllib3
is raising ~$40,000 USD</a> to release HTTP/2 support and ensure
long-term sustainable maintenance of the project after a sharp decline
in financial support. If your company or organization uses Python and
would benefit from HTTP/2 support in Requests, pip, cloud SDKs, and
thousands of other projects <a
href="https://opencollective.com/urllib3">please consider contributing
financially</a> to ensure HTTP/2 support is developed sustainably and
maintained for the long-haul.</p>
<p>Thank you for your support.</p>
<h2>Security</h2>
<ul>
<li>Fixed a security issue where streaming API could improperly handle
highly compressed HTTP content (&quot;decompression bombs&quot;) leading
to excessive resource consumption even when a small amount of data was
requested. Reading small chunks of compressed data is safer and much
more efficient now. (CVE-2025-66471 reported by <a
href="https://github.com/Cycloctane"><code>@​Cycloctane</code></a>, 8.9
High, GHSA-2xpw-w6gg-jr37)</li>
<li>Fixed a security issue where an attacker could compose an HTTP
response with virtually unlimited links in the
<code>Content-Encoding</code> header, potentially leading to a denial of
service (DoS) attack by exhausting system resources during decoding. The
number of allowed chained encodings is now limited to 5. (CVE-2025-66418
reported by <a
href="https://github.com/illia-v"><code>@​illia-v</code></a>, 8.9 High,
GHSA-gm62-xv2j-4w53)</li>
</ul>
<blockquote>
<p>[!IMPORTANT]</p>
<ul>
<li>If urllib3 is not installed with the optional
<code>urllib3[brotli]</code> extra, but your environment contains a
Brotli/brotlicffi/brotlipy package anyway, make sure to upgrade it to at
least Brotli 1.2.0 or brotlicffi 1.2.0.0 to benefit from the security
fixes and avoid warnings. Prefer using <code>urllib3[brotli]</code> to
install a compatible Brotli package automatically.</li>
<li>If you use custom decompressors, please make sure to update them to
respect the changed API of
<code>urllib3.response.ContentDecoder</code>.</li>
</ul>
</blockquote>
<h2>Features</h2>
<ul>
<li>Enabled retrieval, deletion, and membership testing in
<code>HTTPHeaderDict</code> using bytes keys. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3653">#3653</a>)</li>
<li>Added host and port information to string representations of
<code>HTTPConnection</code>. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3666">#3666</a>)</li>
<li>Added support for Python 3.14 free-threading builds explicitly. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3696">#3696</a>)</li>
</ul>
<h2>Removals</h2>
<ul>
<li>Removed the <code>HTTPResponse.getheaders()</code> method in favor
of <code>HTTPResponse.headers</code>. Removed the
<code>HTTPResponse.getheader(name, default)</code> method in favor of
<code>HTTPResponse.headers.get(name, default)</code>. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3622">#3622</a>)</li>
</ul>
<h2>Bugfixes</h2>
<ul>
<li>Fixed redirect handling in <code>urllib3.PoolManager</code> when an
integer is passed for the retries parameter. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3649">#3649</a>)</li>
<li>Fixed <code>HTTPConnectionPool</code> when used in Emscripten with
no explicit port. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3664">#3664</a>)</li>
<li>Fixed handling of <code>SSLKEYLOGFILE</code> with expandable
variables. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3700">#3700</a>)</li>
</ul>
<h2>Misc</h2>
<ul>
<li>Changed the <code>zstd</code> extra to install
<code>backports.zstd</code> instead of <code>zstandard</code> on Python
3.13 and before. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3693">#3693</a>)</li>
<li>Improved the performance of content decoding by optimizing
<code>BytesQueueBuffer</code> class. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3710">#3710</a>)</li>
<li>Allowed building the urllib3 package with newer setuptools-scm v9.x.
(<a
href="https://redirect.github.com/urllib3/urllib3/issues/3652">#3652</a>)</li>
<li>Ensured successful urllib3 builds by setting Hatchling requirement
to ≥ 1.27.0. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3638">#3638</a>)</li>
</ul>
</blockquote>
</details>
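To make the removals and the bytes-key feature above concrete, a hedged
migration sketch (the URL is a placeholder):

```python
import urllib3

http = urllib3.PoolManager()
resp = http.request("GET", "https://example.org/")

# Removed in 2.6.0:
#   resp.getheaders()
#   resp.getheader("Server", "unknown")

# Replacements:
all_headers = resp.headers                      # an HTTPHeaderDict
server = resp.headers.get("Server", "unknown")

# New in 2.6.0: bytes keys also work for retrieval and membership tests.
has_server = b"Server" in resp.headers
```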
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/urllib3/urllib3/blob/main/CHANGES.rst">urllib3's
changelog</a>.</em></p>
<blockquote>
<h1>2.6.0 (2025-12-05)</h1>
<h2>Security</h2>
<ul>
<li>Fixed a security issue where streaming API could improperly handle
highly compressed HTTP content (&quot;decompression bombs&quot;) leading
to excessive resource consumption even when a small amount of data was
requested. Reading small chunks of compressed data is safer and much
more efficient now. (<a
href="https://github.com/urllib3/urllib3/security/advisories/GHSA-2xpw-w6gg-jr37">GHSA-2xpw-w6gg-jr37</a>)</li>
<li>Fixed a security issue where an attacker could compose an HTTP
response with virtually unlimited links in the
<code>Content-Encoding</code> header, potentially leading to a denial of
service (DoS) attack by exhausting system resources during decoding. The
number of allowed chained encodings is now limited to 5. (<a
href="https://github.com/urllib3/urllib3/security/advisories/GHSA-gm62-xv2j-4w53">GHSA-gm62-xv2j-4w53</a>)</li>
</ul>
<p><strong>Caution:</strong></p>
<ul>
<li>If urllib3 is not installed with the optional
<code>urllib3[brotli]</code> extra, but your environment contains a
Brotli/brotlicffi/brotlipy package anyway, make sure to upgrade it to at
least Brotli 1.2.0 or brotlicffi 1.2.0.0 to benefit from the security
fixes and avoid warnings. Prefer using <code>urllib3[brotli]</code> to
install a compatible Brotli package automatically.</li>
<li>If you use custom decompressors, please make sure to update them to
respect the changed API of
<code>urllib3.response.ContentDecoder</code>.</li>
</ul>
<h2>Features</h2>
<ul>
<li>Enabled retrieval, deletion, and membership testing in
<code>HTTPHeaderDict</code> using bytes keys. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3653">#3653</a>)</li>
<li>Added host and port information to string representations of
<code>HTTPConnection</code>. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3666">#3666</a>)</li>
<li>Added support for Python 3.14 free-threading builds explicitly. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3696">#3696</a>)</li>
</ul>
<h2>Removals</h2>
<ul>
<li>Removed the <code>HTTPResponse.getheaders()</code> method in favor
of <code>HTTPResponse.headers</code>. Removed the
<code>HTTPResponse.getheader(name, default)</code> method in favor of
<code>HTTPResponse.headers.get(name, default)</code>. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3622">#3622</a>)</li>
</ul>
<h2>Bugfixes</h2>
<ul>
<li>Fixed redirect handling in <code>urllib3.PoolManager</code> when an
integer is passed for the retries parameter. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3649">#3649</a>)</li>
<li>Fixed <code>HTTPConnectionPool</code> when used in Emscripten with
no explicit port. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3664">#3664</a>)</li>
<li>Fixed handling of <code>SSLKEYLOGFILE</code> with expandable
variables. (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3700">#3700</a>)</li>
</ul>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
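Both security fixes concern decompression of response bodies; here is a
hedged sketch of the streaming pattern they harden, reading in bounded
chunks instead of preloading a possibly highly compressed body (the URL
is a placeholder):

```python
import urllib3

http = urllib3.PoolManager()
resp = http.request("GET", "https://example.org/", preload_content=False)
try:
    total = 0
    for chunk in resp.stream(64 * 1024):  # bounded reads of decoded data
        total += len(chunk)
finally:
    resp.release_conn()
```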
<details>
<summary>Commits</summary>
<ul>
<li><a
href="720f484b60"><code>720f484</code></a>
Release 2.6.0</li>
<li><a
href="24d7b67eac"><code>24d7b67</code></a>
Merge commit from fork</li>
<li><a
href="c19571de34"><code>c19571d</code></a>
Merge commit from fork</li>
<li><a
href="816fcf0452"><code>816fcf0</code></a>
Bump actions/setup-python from 6.0.0 to 6.1.0 (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3725">#3725</a>)</li>
<li><a
href="18af0a10ef"><code>18af0a1</code></a>
Improve speed of <code>BytesQueueBuffer.get()</code> by using memoryview
(<a
href="https://redirect.github.com/urllib3/urllib3/issues/3711">#3711</a>)</li>
<li><a
href="1f6abac3e6"><code>1f6abac</code></a>
Bump versions of pre-commit hooks (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3716">#3716</a>)</li>
<li><a
href="1c8fbf787b"><code>1c8fbf7</code></a>
Bump actions/checkout from 5.0.0 to 6.0.0 (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3722">#3722</a>)</li>
<li><a
href="7784b9eee9"><code>7784b9e</code></a>
Add Python 3.15 to CI (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3717">#3717</a>)</li>
<li><a
href="0241c9e728"><code>0241c9e</code></a>
Updated docs to reflect change in optional zstd dependency from
<code>zstandard</code> t...</li>
<li><a
href="7afcabb648"><code>7afcabb</code></a>
Expand environment variable of SSLKEYLOGFILE (<a
href="https://redirect.github.com/urllib3/urllib3/issues/3705">#3705</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/urllib3/urllib3/compare/2.5.0...2.6.0">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=urllib3&package-manager=pip&previous-version=2.5.0&new-version=2.6.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)
You can disable automated security fix PRs for this repo from the
[Security Alerts
page](https://github.com/element-hq/synapse/network/alerts).

</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-05 23:51:29 +00:00
dependabot[bot]
891983f3f4
Bump the minor-and-patches group with 3 updates (#19280)
Bumps the minor-and-patches group with 3 updates:
[mypy](https://github.com/python/mypy),
[mypy-zope](https://github.com/Shoobx/mypy-zope) and
[phonenumbers](https://github.com/daviddrysdale/python-phonenumbers).

Updates `mypy` from 1.17.1 to 1.18.2
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/python/mypy/blob/master/CHANGELOG.md">mypy's
changelog</a>.</em></p>
<blockquote>
<h3>Mypy 1.18.2</h3>
<ul>
<li>Fix crash on recursive alias (Ivan Levkivskyi, PR <a
href="https://redirect.github.com/python/mypy/pull/19845">19845</a>)</li>
<li>Add additional guidance for stubtest errors when runtime is
<code>object.__init__</code> (Stephen Morton, PR <a
href="https://redirect.github.com/python/mypy/pull/19733">19733</a>)</li>
<li>Fix handling of None values in f-string expressions in mypyc
(BobTheBuidler, PR <a
href="https://redirect.github.com/python/mypy/pull/19846">19846</a>)</li>
</ul>
<h3>Acknowledgements</h3>
<p>Thanks to all mypy contributors who contributed to this release:</p>
<ul>
<li>Ali Hamdan</li>
<li>Anthony Sottile</li>
<li>BobTheBuidler</li>
<li>Brian Schubert</li>
<li>Chainfire</li>
<li>Charlie Denton</li>
<li>Christoph Tyralla</li>
<li>CoolCat467</li>
<li>Daniel Hnyk</li>
<li>Emily</li>
<li>Emma Smith</li>
<li>Ethan Sarp</li>
<li>Ivan Levkivskyi</li>
<li>Jahongir Qurbonov</li>
<li>Jelle Zijlstra</li>
<li>Joren Hammudoglu</li>
<li>Jukka Lehtosalo</li>
<li>Marc Mueller</li>
<li>Omer Hadari</li>
<li>Piotr Sawicki</li>
<li>PrinceNaroliya</li>
<li>Randolf Scholz</li>
<li>Robsdedude</li>
<li>Saul Shanabrook</li>
<li>Shantanu</li>
<li>Stanislav Terliakov</li>
<li>Stephen Morton</li>
<li>wyattscarpenter</li>
</ul>
<p>I’d also like to thank my employer, Dropbox, for supporting mypy
development.</p>
<h2>Mypy 1.17</h2>
<p>We’ve just uploaded mypy 1.17 to the Python Package Index (<a
href="https://pypi.org/project/mypy/">PyPI</a>).
Mypy is a static type checker for Python. This release includes new
features and bug fixes.
You can install it as follows:</p>
<pre><code>python3 -m pip install -U mypy
</code></pre>
<p>You can read the full documentation for this release on <a
href="http://mypy.readthedocs.io">Read the Docs</a>.</p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
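The mypyc f-string fix in 1.18.2 is easiest to picture with a snippet;
this is my hedged reading of the one-line description above, not a
reproduction from the mypy tracker:

```python
# Before the fix, mypyc-compiled code could render an Optional value
# incorrectly inside an f-string; 1.18.2 makes it match CPython.
maybe: int | None = None
print(f"value={maybe}")  # expected: value=None
```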
<details>
<summary>Commits</summary>
<ul>
<li><a
href="df05f05555"><code>df05f05</code></a>
remove +dev from version</li>
<li><a
href="01a7a1285d"><code>01a7a12</code></a>
Update changelog for 1.18.2 (<a
href="https://redirect.github.com/python/mypy/issues/19873">#19873</a>)</li>
<li><a
href="ca5abf09f3"><code>ca5abf0</code></a>
Typeshed cherry-pick: Make type of <code>unittest.mock.Any</code> a
subclass of <code>Any</code> (<a
href="https://redirect.github.com/python/mypy/issues/1">#1</a>...</li>
<li><a
href="9d794b57d9"><code>9d794b5</code></a>
[mypyc] fix: inappropriate <code>None</code>s in f-strings (<a
href="https://redirect.github.com/python/mypy/issues/19846">#19846</a>)</li>
<li><a
href="2c0510c848"><code>2c0510c</code></a>
stubtest: additional guidance on errors when runtime is
<code>object.__init__</code> (<a
href="https://redirect.github.com/python/mypy/issues/19733">#19733</a>)</li>
<li><a
href="2f3f03c3e3"><code>2f3f03c</code></a>
Bump version to 1.18.2+dev for point release</li>
<li><a
href="76698412bc"><code>7669841</code></a>
Fix crash on recursive alias in indirection.py (<a
href="https://redirect.github.com/python/mypy/issues/19845">#19845</a>)</li>
<li><a
href="03fbaa941b"><code>03fbaa9</code></a>
bump version to 1.18.1 due to wheels failure</li>
<li><a
href="b44a1fbf0c"><code>b44a1fb</code></a>
removed +dev from version</li>
<li><a
href="7197a99d1a"><code>7197a99</code></a>
Removed Unreleased in the Changelog for Release 1.18 (<a
href="https://redirect.github.com/python/mypy/issues/19827">#19827</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/python/mypy/compare/v1.17.1...v1.18.2">compare
view</a></li>
</ul>
</details>
<br />

Updates `mypy-zope` from 1.0.13 to 1.0.14
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/Shoobx/mypy-zope/blob/master/CHANGELOG.md">mypy-zope's
changelog</a>.</em></p>
<blockquote>
<h2>1.0.14 (2025-12-01)</h2>
<hr />
<ul>
<li>Support mypy-1.19</li>
<li>Support mypy-1.18</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="38d22f3f4f"><code>38d22f3</code></a>
Preparing release 1.0.14</li>
<li><a
href="76762ec861"><code>76762ec</code></a>
Maintain changelog</li>
<li><a
href="4971d98ab8"><code>4971d98</code></a>
Merge pull request <a
href="https://redirect.github.com/Shoobx/mypy-zope/issues/134">#134</a>
from Shoobx/dependabot/pip/mypy-gte-1.0.0-and-lt-1.20.0</li>
<li><a
href="47af89d2c7"><code>47af89d</code></a>
Update mypy requirement from &lt;1.19.0,&gt;=1.0.0 to
&gt;=1.0.0,&lt;1.20.0</li>
<li><a
href="0c596ff804"><code>0c596ff</code></a>
Maintain changelog</li>
<li><a
href="dcaa27841d"><code>dcaa278</code></a>
Merge pull request <a
href="https://redirect.github.com/Shoobx/mypy-zope/issues/132">#132</a>
from Shoobx/dependabot/pip/mypy-gte-1.0.0-and-lt-1.19.0</li>
<li><a
href="8f7b6778df"><code>8f7b677</code></a>
Update mypy requirement from &lt;1.18.0,&gt;=1.0.0 to
&gt;=1.0.0,&lt;1.19.0</li>
<li><a
href="91b275b364"><code>91b275b</code></a>
Back to development: 1.0.14</li>
<li>See full diff in <a
href="https://github.com/Shoobx/mypy-zope/compare/1.0.13...1.0.14">compare
view</a></li>
</ul>
</details>
<br />

Updates `phonenumbers` from 9.0.18 to 9.0.19
<details>
<summary>Commits</summary>
<ul>
<li><a
href="38f2ffe1e8"><code>38f2ffe</code></a>
Prep for 9.0.19 release</li>
<li><a
href="cd7f0cc64f"><code>cd7f0cc</code></a>
Generated files for metadata</li>
<li><a
href="40ae18f50a"><code>40ae18f</code></a>
Merge metadata changes from upstream 9.0.19</li>
<li>See full diff in <a
href="https://github.com/daviddrysdale/python-phonenumbers/compare/v9.0.18...v9.0.19">compare
view</a></li>
</ul>
</details>
<br />

**Does not** update `pysaml2` from 7.5.0 to 7.5.4 since this would
downgrade pyOpenSSL
<details>
<summary>Release notes</summary>
<p><em>Sourced from <a
href="https://github.com/IdentityPython/pysaml2/releases">pysaml2's
releases</a>.</em></p>
<blockquote>
<h2>Version v7.5.4</h2>
<h2>v7.5.4 (2025-10-07)</h2>
<ul>
<li>Minor refactor to handle <code>shelve.open</code> and
<code>dbm</code> errors</li>
<li>Remove import of deprecated <code>cgi</code> module</li>
<li>Replace deprecated <code>datetime.utcnow()</code> by
<code>datetime.now(timezone.utc)</code></li>
<li>deps: Remove the <code>importlib_metadata</code> dependency</li>
<li>deps: Remove the <code>importlib_resources</code> dependency</li>
<li>deps: Update dependency versions and lockfile</li>
<li>build: Update pyproject and lockfile to be compatible with PEP
621</li>
<li>docs: Correct spelling mistakes</li>
<li>docs: Fix internal references/links</li>
<li>docs: Clarify units for accepted_time_diff config param</li>
<li>docs: Correct documentation for contact_person</li>
</ul>
<h2>Version 7.5.3</h2>
<h2>7.5.3 (2025-10-04)</h2>
<ul>
<li><a
href="https://redirect.github.com/IdentityPython/pysaml2/issues/973">#973</a>
Fix prepare_for_negotiated_authenticate to avoid double signing redirect
requests</li>
</ul>
<h2>Version 7.5.2</h2>
<h2>7.5.2 (2025-02-10)</h2>
<ul>
<li>Include the XSD of the XML Encryption Syntax and Processing Version
1.1 to the schema validator</li>
</ul>
<h2>Version 7.5.1</h2>
<h2>7.5.1 (2025-02-10)</h2>
<ul>
<li>deps: restrict pyOpenSSL up to v24.2.1 until it is replaced</li>
<li>deps: update dependencies for the lockfile and examples</li>
</ul>
</blockquote>
</details>
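One entry above (replacing <code>datetime.utcnow()</code> with
<code>datetime.now(timezone.utc)</code>) is a standard Python
deprecation swap; for reference, in generic Python (not pysaml2's actual
diff):

```python
from datetime import datetime, timezone

# Deprecated since Python 3.12 (returns a naive datetime, no tzinfo):
# now = datetime.utcnow()

# Replacement (returns a timezone-aware UTC datetime):
now = datetime.now(timezone.utc)
```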
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/IdentityPython/pysaml2/blob/master/CHANGELOG.md">pysaml2's
changelog</a>.</em></p>
<blockquote>
<h2>v7.5.4 (2025-10-07)</h2>
<ul>
<li>Minor refactor to handle <code>shelve.open</code> and
<code>dbm</code> errors</li>
<li>Remove import of deprecated <code>cgi</code> module</li>
<li>Replace deprecated <code>datetime.utcnow()</code> by
<code>datetime.now(timezone.utc)</code></li>
<li>deps: Remove the <code>importlib_metadata</code> dependency</li>
<li>deps: Remove the <code>importlib_resources</code> dependency</li>
<li>deps: Update dependency versions and lockfile</li>
<li>build: Update pyproject and lockfile to be compatible with PEP
621</li>
<li>docs: Correct spelling mistakes</li>
<li>docs: Fix internal references/links</li>
<li>docs: Clarify units for accepted_time_diff config param</li>
<li>docs: Correct documentation for contact_person</li>
</ul>
<h2>7.5.3 (2025-10-04)</h2>
<ul>
<li><a
href="https://redirect.github.com/IdentityPython/pysaml2/issues/973">#973</a>
Fix prepare_for_negotiated_authenticate to avoid double signing redirect
requests</li>
</ul>
<h2>7.5.2 (2025-02-10)</h2>
<ul>
<li>Include the XSD of the XML Encryption Syntax and Processing Version
1.1 to the schema validator</li>
</ul>
<h2>7.5.1 (2025-02-10)</h2>
<ul>
<li>deps: restrict pyOpenSSL up to v24.2.1 until it is replaced</li>
<li>deps: update dependencies for the lockfile and examples</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="9cf71f7f9e"><code>9cf71f7</code></a>
Release version 7.5.4</li>
<li><a
href="c3ec7199d1"><code>c3ec719</code></a>
Refactor _shelve_compat</li>
<li><a
href="1d6ea6024e"><code>1d6ea60</code></a>
Remove import of deprecated cgi module</li>
<li><a
href="c45eb9df82"><code>c45eb9d</code></a>
Replace deprecated datetime.utcnow() by datetime.now(timezone.utc)</li>
<li><a
href="178f6d12b4"><code>178f6d1</code></a>
Remove unneeded dependencies</li>
<li><a
href="1f0a25a5cf"><code>1f0a25a</code></a>
remove importlib_metadata import</li>
<li><a
href="099f716ae7"><code>099f716</code></a>
remove importlib_resources imports</li>
<li><a
href="3fa11ee15d"><code>3fa11ee</code></a>
spelling updates.</li>
<li><a
href="4b7887f59a"><code>4b7887f</code></a>
update link.</li>
<li><a
href="bc8d3b4ecc"><code>bc8d3b4</code></a>
update link.</li>
<li>Additional commits viewable in <a
href="https://github.com/IdentityPython/pysaml2/compare/v7.5.0...v7.5.4">compare
view</a></li>
</ul>
</details>
<br />


Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore <dependency name> major version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's major version (unless you unignore this specific
dependency's major version or upgrade to it yourself)
- `@dependabot ignore <dependency name> minor version` will close this
group update PR and stop Dependabot creating any more for the specific
dependency's minor version (unless you unignore this specific
dependency's minor version or upgrade to it yourself)
- `@dependabot ignore <dependency name>` will close this group update PR
and stop Dependabot creating any more for the specific dependency
(unless you unignore this specific dependency or upgrade to it yourself)
- `@dependabot unignore <dependency name>` will remove all of the ignore
conditions of the specified dependency
- `@dependabot unignore <dependency name> <ignore condition>` will
remove the ignore condition of the specified dependency and ignore
conditions


</details>

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Devon Hudson <devonhudson@librem.one>
2025-12-05 22:11:58 +00:00
Andrew Morgan
a096fba969
Group non-breaking dependabot PRs together to reduce review load (#18402)
2025-12-05 10:48:01 +00:00
Devon Hudson
e8710e7c5e
Don't include debug logs in Clock unless explicitly enabled (#19278)
Fixes #19276

This log, which includes stack traces, produces a lot of noise and is
confusing to users, since it looks like an error in the logs.
This PR removes the stack trace from the log; it can be re-enabled on
demand if deemed necessary in the future.
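
As a hedged sketch of the pattern described above (this is not Synapse's
actual `Clock` code; the `CLOCK_DEBUG_TRACEBACKS` flag is invented for
illustration):

```python
import logging
import traceback

logger = logging.getLogger(__name__)

# Illustrative opt-in flag; Synapse's real gating mechanism may differ.
CLOCK_DEBUG_TRACEBACKS = False


def log_timer_fired(delay_ms: float) -> None:
    if CLOCK_DEBUG_TRACEBACKS:
        # Only capture the (expensive, noisy) call-site stack when the
        # debug behaviour is explicitly enabled.
        logger.debug(
            "timer fired %.0fms late\n%s",
            delay_ms,
            "".join(traceback.format_stack()),
        )
    else:
        logger.debug("timer fired %.0fms late", delay_ms)
```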

### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [X] Pull request is based on the develop branch
* [X] Pull request includes a [changelog
file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
The entry should:
- Be a short description of your change which makes sense to users.
"Fixed a bug that prevented receiving messages from other servers."
instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
- Feel free to credit yourself, by adding a sentence "Contributed by
@github_username." or "Contributed by [Your Name]." to the end of the
entry.
* [X] [Code
style](https://element-hq.github.io/synapse/latest/code_style.html) is
correct (run the
[linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
2025-12-04 23:49:24 +00:00
Devon Hudson
978ae0b080
Merge branch 'release-v1.144' into develop
2025-12-02 15:06:23 -07:00
dependabot[bot]
93e658bd13
Bump cryptography from 45.0.7 to 46.0.3 (#19266)
Bumps [cryptography](https://github.com/pyca/cryptography) from 45.0.7
to 46.0.3.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a
href="https://github.com/pyca/cryptography/blob/main/CHANGELOG.rst">cryptography's
changelog</a>.</em></p>
<blockquote>
<h2>46.0.3 - 2025-10-15</h2>
<ul>
<li>Fixed compilation when using LibreSSL 4.2.0.</li>
</ul>
<h2>46.0.2 - 2025-09-30</h2>
<ul>
<li>Updated Windows, macOS, and Linux wheels to be compiled with OpenSSL
3.5.4.</li>
</ul>
<h2>46.0.1 - 2025-09-16</h2>
<ul>
<li>Fixed an issue where users installing via <code>pip</code> on Python
3.14 development versions would not properly install a dependency.</li>
<li>Fixed an issue building the free-threaded macOS 3.14 wheels.</li>
</ul>
<h2>46.0.0 - 2025-09-16</h2>
<ul>
<li><strong>BACKWARDS INCOMPATIBLE:</strong> Support for Python 3.7 has
been removed.</li>
<li>Support for OpenSSL &lt; 3.0 is deprecated and will be removed in
the next
release.</li>
<li>Support for <code>x86_64</code> macOS (including publishing wheels)
is deprecated
and will be removed in two releases. We will switch to publishing an
<code>arm64</code> only wheel for macOS.</li>
<li>Support for 32-bit Windows (including publishing wheels) is
deprecated
and will be removed in two releases. Users should move to a 64-bit
Python installation.</li>
<li>Updated Windows, macOS, and Linux wheels to be compiled with OpenSSL
3.5.3.</li>
<li>We now build <code>ppc64le</code> <code>manylinux</code> wheels and
publish them to PyPI.</li>
<li>We now build <code>win_arm64</code> (Windows on Arm) wheels and
publish them to PyPI.</li>
<li>Added support for free-threaded Python 3.14.</li>
<li>Removed the deprecated <code>get_attribute_for_oid</code> method on
:class:<code>~cryptography.x509.CertificateSigningRequest</code>. Users
should use
:meth:<code>~cryptography.x509.Attributes.get_attribute_for_oid</code>
instead.</li>
<li>Removed the deprecated <code>CAST5</code>, <code>SEED</code>,
<code>IDEA</code>, and <code>Blowfish</code>
classes from the cipher module. These are still available in
:doc:<code>/hazmat/decrepit/index</code>.</li>
<li>In X.509, when performing a PSS signature with a SHA-3 hash, it is
now
encoded with the official NIST SHA3 OID.</li>
</ul>
</blockquote>
</details>
<details>
<summary>Commits</summary>
<ul>
<li><a
href="c0af4dd7b7"><code>c0af4dd</code></a>
release 46.0.3 (<a
href="https://redirect.github.com/pyca/cryptography/issues/13681">#13681</a>)</li>
<li><a
href="99efe5ad15"><code>99efe5a</code></a>
bump version for 46.0.2 (<a
href="https://redirect.github.com/pyca/cryptography/issues/13531">#13531</a>)</li>
<li><a
href="e735cfc275"><code>e735cfc</code></a>
release 46.0.1 (<a
href="https://redirect.github.com/pyca/cryptography/issues/13450">#13450</a>)</li>
<li><a
href="4e457ffba4"><code>4e457ff</code></a>
Explicitly specify python in mac uv build invocation (<a
href="https://redirect.github.com/pyca/cryptography/issues/13447">#13447</a>)</li>
<li><a
href="2726efdb6d"><code>2726efd</code></a>
Depend on CFFI 2.0.0 or newer on Python &gt; 3.8 (<a
href="https://redirect.github.com/pyca/cryptography/issues/13448">#13448</a>)</li>
<li><a
href="62230623d1"><code>6223062</code></a>
release 46.0.0 (<a
href="https://redirect.github.com/pyca/cryptography/issues/13446">#13446</a>)</li>
<li><a
href="563c4915b0"><code>563c491</code></a>
Update comment for pyopenssl-release tag (<a
href="https://redirect.github.com/pyca/cryptography/issues/13445">#13445</a>)</li>
<li><a
href="d2f6f7face"><code>d2f6f7f</code></a>
Bump downstream dependencies in CI (<a
href="https://redirect.github.com/pyca/cryptography/issues/13439">#13439</a>)</li>
<li><a
href="e7ab02bd67"><code>e7ab02b</code></a>
we'll ship this with 3.5.3 why not (<a
href="https://redirect.github.com/pyca/cryptography/issues/13442">#13442</a>)</li>
<li><a
href="0b68a4bffb"><code>0b68a4b</code></a>
Another pair of bump dependencies fix (<a
href="https://redirect.github.com/pyca/cryptography/issues/13444">#13444</a>)</li>
<li>Additional commits viewable in <a
href="https://github.com/pyca/cryptography/compare/45.0.7...46.0.3">compare
view</a></li>
</ul>
</details>
<br />


[![Dependabot compatibility
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=cryptography&package-manager=pip&previous-version=45.0.7&new-version=46.0.3)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)

Dependabot will resolve any conflicts with this PR as long as you don't
alter it yourself. You can also trigger a rebase manually by commenting
`@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits
that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after
your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge
and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating
it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all
of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop
Dependabot creating any more for this major version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop
Dependabot creating any more for this minor version (unless you reopen
the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop
Dependabot creating any more for this dependency (unless you reopen the
PR or upgrade to it yourself)


</details>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-02 20:27:05 +00:00
Devon Hudson
d688daf41c
Fix bug where Duration was logged incorrectly (#19267)
### Pull Request Checklist

<!-- Please read
https://element-hq.github.io/synapse/latest/development/contributing_guide.html
before submitting your pull request -->

* [X] Pull request is based on the develop branch
* [X] Pull request includes a [changelog
file](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#changelog).
The entry should:
- Be a short description of your change which makes sense to users.
"Fixed a bug that prevented receiving messages from other servers."
instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
  - Use markdown where necessary, mostly for `code blocks`.
  - End with either a period (.) or an exclamation mark (!).
  - Start with a capital letter.
- Feel free to credit yourself, by adding a sentence "Contributed by
@github_username." or "Contributed by [Your Name]." to the end of the
entry.
* [X] [Code
style](https://element-hq.github.io/synapse/latest/code_style.html) is
correct (run the
[linters](https://element-hq.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
2025-12-02 20:08:32 +00:00
dependabot[bot]
aff90a5245
Bump bleach from 6.2.0 to 6.3.0 (#19265)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-12-02 20:03:07 +00:00
Eric Eastwood
83023ce1e0
Be able to shutdown homeserver that failed to start (#19232)
For example, a homeserver can fail to `start` if the port is already in
use or the port number is invalid (not 0-65535).
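
As a hedged sketch of the invariant this fixes (an illustrative
`MiniServer`, not Synapse's actual homeserver class): shutdown must be
safe even when startup raised part-way through.

```python
import socket


class MiniServer:
    """Toy server whose shutdown tolerates a failed start()."""

    def __init__(self) -> None:
        self._sock: socket.socket | None = None

    def start(self, port: int) -> None:
        if not (0 <= port <= 65535):
            raise ValueError(f"invalid port: {port}")
        sock = socket.socket()
        sock.bind(("127.0.0.1", port))  # raises if the port is in use
        sock.listen()
        self._sock = sock

    def shutdown(self) -> None:
        # Guard every resource: start() may never have succeeded.
        if self._sock is not None:
            self._sock.close()
            self._sock = None


server = MiniServer()
try:
    server.start(99999)  # invalid port -> start() raises
except ValueError:
    pass
server.shutdown()  # must not blow up
```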

Fix https://github.com/element-hq/synapse/issues/19189

Follow-up to https://github.com/element-hq/synapse/pull/18828


### Background

As part of Element's plan to support a light form of vhosting (virtual
hosting, i.e. multiple instances of Synapse in the same Python process;
cf. [Synapse Pro for small
hosts](https://docs.element.io/latest/element-server-suite-pro/synapse-pro-for-small-hosts/overview/)),
we're currently diving into the details and implications of running
multiple instances of Synapse in the same Python process.

"Clean tenant deprovisioning" tracked internally by
https://github.com/element-hq/synapse-small-hosts/issues/50
2025-12-02 11:28:46 -06:00
Eric Eastwood
39316672da
Be able to shutdown homeserver that hasn't setup (#19187)
For example, a homeserver can fail to `setup` if it fails to connect to
the database.

Fix https://github.com/element-hq/synapse/issues/19188

Follow-up to https://github.com/element-hq/synapse/pull/18828


### Background

As part of Element's plan to support a light form of vhosting (virtual
hosting, i.e. multiple instances of Synapse in the same Python process;
cf. Synapse Pro for small hosts), we're currently diving into the
details and implications of running multiple instances of Synapse in the
same Python process.

"Clean tenant deprovisioning" tracked internally by
https://github.com/element-hq/synapse-small-hosts/issues/50
2025-12-02 10:58:06 -06:00
Andrew Morgan
f86918e562
Remove the currently broken netlify GHA workflow (#19262) 2025-12-02 16:46:08 +00:00
Andrew Morgan
3d28e2213f
Dependabot: allow 10 open PRs for general updates (#19253) 2025-12-02 16:45:54 +00:00
Andrew Morgan
0dfc21ca9f
Remove "Updates to locked dependencies" section from changelog (#19254) 2025-12-02 16:45:41 +00:00
Andrew Morgan
ffd0b4c079
Add a 14-day cooldown for dependency updates (#19258) 2025-12-02 16:45:28 +00:00
202 changed files with 7261 additions and 9633 deletions

View file

@@ -7,4 +7,4 @@ if command -v yum &> /dev/null; then
 fi
 # Install a Rust toolchain
-curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain 1.82.0 -y --profile minimal
+curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain stable -y --profile minimal

View file

@@ -1,146 +0,0 @@
#!/usr/bin/env python
#
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright (C) 2023 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
#
# Originally licensed under the Apache License, Version 2.0:
# <http://www.apache.org/licenses/LICENSE-2.0>.
#
# [This file includes modifications made by New Vector Limited]
#
#
# Wraps `auditwheel repair` to first check if we're repairing a potentially abi3
# compatible wheel, if so rename the wheel before repairing it.

import argparse
import os
import subprocess
from zipfile import ZipFile

from packaging.tags import Tag
from packaging.utils import parse_wheel_filename
from packaging.version import Version


def check_is_abi3_compatible(wheel_file: str) -> None:
    """Check the contents of the built wheel for any `.so` files that are *not*
    abi3 compatible.
    """
    with ZipFile(wheel_file, "r") as wheel:
        for file in wheel.namelist():
            if not file.endswith(".so"):
                continue

            if not file.endswith(".abi3.so"):
                raise Exception(f"Found non-abi3 lib: {file}")


def cpython(wheel_file: str, name: str, version: Version, tag: Tag) -> str:
    """Replaces the cpython wheel file with a ABI3 compatible wheel"""
    if tag.abi == "abi3":
        # Nothing to do.
        return wheel_file

    check_is_abi3_compatible(wheel_file)

    # HACK: it seems that some older versions of pip will consider a wheel marked
    # as macosx_11_0 as incompatible with Big Sur. I haven't done the full archaeology
    # here; there are some clues in
    #     https://github.com/pantsbuild/pants/pull/12857
    #     https://github.com/pypa/pip/issues/9138
    #     https://github.com/pypa/packaging/pull/319
    # Empirically this seems to work, note that macOS 11 and 10.16 are the same,
    # both versions are valid for backwards compatibility.
    platform = tag.platform.replace("macosx_11_0", "macosx_10_16")
    abi3_tag = Tag(tag.interpreter, "abi3", platform)

    dirname = os.path.dirname(wheel_file)
    new_wheel_file = os.path.join(
        dirname,
        f"{name}-{version}-{abi3_tag}.whl",
    )

    os.rename(wheel_file, new_wheel_file)

    print("Renamed wheel to", new_wheel_file)

    return new_wheel_file


def main(wheel_file: str, dest_dir: str, archs: str | None) -> None:
    """Entry point"""
    # Parse the wheel file name into its parts. Note that `parse_wheel_filename`
    # normalizes the package name (i.e. it converts matrix_synapse ->
    # matrix-synapse), which is not what we want.
    _, version, build, tags = parse_wheel_filename(os.path.basename(wheel_file))
    name = os.path.basename(wheel_file).split("-")[0]

    if len(tags) != 1:
        # We expect only a wheel file with only a single tag
        raise Exception(f"Unexpectedly found multiple tags: {tags}")

    tag = next(iter(tags))

    if build:
        # We don't use build tags in Synapse
        raise Exception(f"Unexpected build tag: {build}")

    # If the wheel is for cpython then convert it into an abi3 wheel.
    if tag.interpreter.startswith("cp"):
        wheel_file = cpython(wheel_file, name, version, tag)

    # Finally, repair the wheel.
    if archs is not None:
        # If we are given archs then we are on macos and need to use
        # `delocate-listdeps`.
        subprocess.run(["delocate-listdeps", wheel_file], check=True)
        subprocess.run(
            ["delocate-wheel", "--require-archs", archs, "-w", dest_dir, wheel_file],
            check=True,
        )
    else:
        subprocess.run(["auditwheel", "repair", "-w", dest_dir, wheel_file], check=True)


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Tag wheel as abi3 and repair it.")

    parser.add_argument(
        "--wheel-dir",
        "-w",
        metavar="WHEEL_DIR",
        help="Directory to store delocated wheels",
        required=True,
    )

    parser.add_argument(
        "--require-archs",
        metavar="archs",
        default=None,
    )

    parser.add_argument(
        "wheel_file",
        metavar="WHEEL_FILE",
    )

    args = parser.parse_args()

    wheel_file = args.wheel_file
    wheel_dir = args.wheel_dir
    archs = args.require_archs

    main(wheel_file, wheel_dir, archs)
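Aside: the retagging step at the heart of the removed wrapper can be illustrated standalone. The following sketch is not part of the diff; the wheel filename is hypothetical, and it only demonstrates the `packaging` calls the script used to rename a CPython-specific tag to abi3:

# Minimal sketch of the abi3 retagging idea; the filename below is made up.
from packaging.tags import Tag
from packaging.utils import parse_wheel_filename

filename = "matrix_synapse-1.99.0-cp310-cp310-manylinux_2_17_x86_64.whl"
_, version, _, tags = parse_wheel_filename(filename)
tag = next(iter(tags))

if tag.interpreter.startswith("cp") and tag.abi != "abi3":
    # Keep the un-normalised name prefix, as the wrapper above did.
    name = filename.split("-")[0]
    abi3_tag = Tag(tag.interpreter, "abi3", tag.platform)
    print(f"{name}-{version}-{abi3_tag}.whl")
    # -> matrix_synapse-1.99.0-cp310-abi3-manylinux_2_17_x86_64.whl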

View file

@@ -1,39 +0,0 @@
#!/usr/bin/env bash
# this script is run by GitHub Actions in a plain `jammy` container; it
# - installs the minimal system requirements, and poetry;
# - patches the project definition file to refer to old versions only;
# - creates a venv with these old versions using poetry; and finally
# - invokes `trial` to run the tests with old deps.

set -ex

# Prevent virtualenv from auto-updating pip to an incompatible version
export VIRTUALENV_NO_DOWNLOAD=1

# TODO: in the future, we could use an implementation of
#   https://github.com/python-poetry/poetry/issues/3527
#   https://github.com/pypa/pip/issues/8085
# to select the lowest possible versions, rather than resorting to this sed script.

# Patch the project definitions in-place:
# - `-E` use extended regex syntax.
# - Don't modify the line that defines required Python versions.
# - Replace all lower and tilde bounds with exact bounds.
# - Replace all caret bounds with exact bounds.
# - Delete all lines referring to psycopg2 - so no testing of postgres support.
# - Use pyopenssl 17.0, which is the oldest version that works with
#   a `cryptography` compiled against OpenSSL 1.1.
# - Omit systemd: we're not logging to journal here.
sed -i -E '
  /^\s*requires-python\s*=/b
  s/[~>]=/==/g
  s/\^/==/g
  /psycopg2/d
  s/pyOpenSSL\s*==\s*16\.0\.0"/pyOpenSSL==17.0.0"/
  /systemd/d
' pyproject.toml

echo "::group::Patched pyproject.toml"
cat pyproject.toml
echo "::endgroup::"
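Aside: the effect of the sed bound-rewriting above can be sketched in Python. The sample specifier strings below are hypothetical, not taken from Synapse's pyproject.toml; the two substitutions mirror the `s/[~>]=/==/g` and `s/\^/==/g` rules:

import re

# Lower/tilde bounds and caret bounds both become exact pins.
for spec in ['attrs >= 23.0.0', 'jsonschema ~= 4.23', 'pydantic = "^2.8"']:
    spec = re.sub(r"[~>]=", "==", spec)
    spec = re.sub(r"\^", "==", spec)
    print(spec)
# attrs == 23.0.0
# jsonschema == 4.23
# pydantic = "==2.8"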

View file

@@ -1,23 +1,92 @@
 version: 2

+# As dependabot is currently only run on a weekly basis, we raise the
+# open-pull-requests-limit to 10 (from the default of 5) to better ensure we
+# don't continuously grow a backlog of updates.
 updates:
   # "pip" is the correct setting for poetry, per https://docs.github.com/en/code-security/dependabot/dependabot-version-updates/configuration-options-for-the-dependabot.yml-file#package-ecosystem
   - package-ecosystem: "pip"
     directory: "/"
+    open-pull-requests-limit: 10
     schedule:
       interval: "weekly"
+    # Group patch updates to packages together into a single PR, as they rarely
+    # if ever contain breaking changes that need to be reviewed separately.
+    #
+    # Less PRs means a streamlined review process.
+    #
+    # Python packages follow semantic versioning, and tend to only introduce
+    # breaking changes in major version bumps. Thus, we'll group minor and patch
+    # versions together.
+    groups:
+      minor-and-patches:
+        applies-to: version-updates
+        patterns:
+          - "*"
+        update-types:
+          - "minor"
+          - "patch"
+    # Prevent pulling packages that were recently updated to help mitigate
+    # supply chain attacks. 14 days was taken from the recommendation at
+    # https://blog.yossarian.net/2025/11/21/We-should-all-be-using-dependency-cooldowns
+    # where the author noted that 9/10 attacks would have been mitigated by a
+    # two week cooldown.
+    #
+    # The cooldown only applies to general updates; security updates will still
+    # be pulled in as soon as possible.
+    cooldown:
+      default-days: 14

   - package-ecosystem: "docker"
     directory: "/docker"
+    open-pull-requests-limit: 10
     schedule:
       interval: "weekly"
+    # For container versions, breaking changes are also typically only introduced in major
+    # package bumps.
+    groups:
+      minor-and-patches:
+        applies-to: version-updates
+        patterns:
+          - "*"
+        update-types:
+          - "minor"
+          - "patch"
+    cooldown:
+      default-days: 14

   - package-ecosystem: "github-actions"
     directory: "/"
+    open-pull-requests-limit: 10
     schedule:
       interval: "weekly"
+    # Similarly for GitHub Actions, breaking changes are typically only introduced in major
+    # package bumps.
+    groups:
+      minor-and-patches:
+        applies-to: version-updates
+        patterns:
+          - "*"
+        update-types:
+          - "minor"
+          - "patch"
+    cooldown:
+      default-days: 14

   - package-ecosystem: "cargo"
     directory: "/"
+    open-pull-requests-limit: 10
     versioning-strategy: "lockfile-only"
     schedule:
       interval: "weekly"
+    # The Rust ecosystem is special in that breaking changes are often introduced
+    # in minor version bumps, as packages typically stay pre-1.0 for a long time.
+    # Thus we specifically keep minor version bumps separate in their own PRs.
+    groups:
+      patches:
+        applies-to: version-updates
+        patterns:
+          - "*"
+        update-types:
+          - "patch"
+    cooldown:
+      default-days: 14

View file

@@ -28,10 +28,10 @@ jobs:
    steps:
      - name: Set up Docker Buildx
        id: buildx
-       uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+       uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
      - name: Checkout repository
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Extract version from pyproject.toml
        # Note: explicitly requesting bash will mean bash is invoked with `-eo pipefail`, see
@@ -75,7 +75,7 @@ jobs:
          touch "${{ runner.temp }}/digests/${digest#sha256:}"
      - name: Upload digest
-       uses: actions/upload-artifact@v5
+       uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
        with:
          name: digests-${{ matrix.suffix }}
          path: ${{ runner.temp }}/digests/*
@@ -95,7 +95,7 @@ jobs:
      - build
    steps:
      - name: Download digests
-       uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
+       uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
        with:
          path: ${{ runner.temp }}/digests
          pattern: digests-*
@@ -117,7 +117,7 @@ jobs:
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Set up Docker Buildx
-       uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
+       uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
      - name: Install Cosign
        uses: sigstore/cosign-installer@faadad0cce49287aee09b3a48701e75088a2c6ad # v4.0.0

View file

@@ -1,34 +0,0 @@
name: Deploy documentation PR preview

on:
  workflow_run:
    workflows: [ "Prepare documentation PR preview" ]
    types:
      - completed

jobs:
  netlify:
    if: github.event.workflow_run.conclusion == 'success' && github.event.workflow_run.event == 'pull_request'
    runs-on: ubuntu-latest
    steps:
      # There's a 'download artifact' action, but it hasn't been updated for the workflow_run action
      # (https://github.com/actions/download-artifact/issues/60) so instead we get this mess:
      - name: 📥 Download artifact
        uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
        with:
          workflow: docs-pr.yaml
          run_id: ${{ github.event.workflow_run.id }}
          name: book
          path: book

      - name: 📤 Deploy to Netlify
        uses: matrix-org/netlify-pr-preview@9805cd123fc9a7e421e35340a05e1ebc5dee46b5 # v3
        with:
          path: book
          owner: ${{ github.event.workflow_run.head_repository.owner.login }}
          branch: ${{ github.event.workflow_run.head_branch }}
          revision: ${{ github.event.workflow_run.head_sha }}
          token: ${{ secrets.NETLIFY_AUTH_TOKEN }}
          site_id: ${{ secrets.NETLIFY_SITE_ID }}
          desc: Documentation preview
          deployment_env: PR Documentation Preview

View file

@@ -13,7 +13,7 @@ jobs:
    name: GitHub Pages
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          # Fetch all history so that the schema_versions script works.
          fetch-depth: 0
@@ -21,7 +21,7 @@ jobs:
      - name: Setup mdbook
        uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08 # v2.0.0
        with:
-         mdbook-version: '0.4.17'
+         mdbook-version: '0.5.2'
      - name: Setup python
        uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
@@ -39,7 +39,7 @@ jobs:
          cp book/welcome_and_overview.html book/index.html
      - name: Upload Artifact
-       uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
+       uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
        with:
          name: book
          path: book
@@ -50,12 +50,12 @@ jobs:
    name: Check links in documentation
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Setup mdbook
        uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08 # v2.0.0
        with:
-         mdbook-version: '0.4.17'
+         mdbook-version: '0.5.2'
      - name: Setup htmltest
        run: |
@@ -64,8 +64,17 @@ jobs:
          tar zxf htmltest_0.17.0_linux_amd64.tar.gz
      - name: Test links with htmltest
-       # Build the book with `./` as the site URL (to make checks on 404.html possible)
-       # Then run htmltest (without checking external links since that involves the network and is slow).
        run: |
+         # Build the book with `./` as the site URL (to make checks on 404.html possible)
          MDBOOK_OUTPUT__HTML__SITE_URL="./" mdbook build
-         ./htmltest book --skip-external
+         # Delete the contents of the print.html file, as it can raise false
+         # positives during link checking.
+         #
+         # We empty out the file, instead of deleting it, as doing so would
+         # just cause htmltest to complain that links to it were invalid.
+         # Ideally `htmltest` would have an option to ignore specific files
+         # instead.
+         echo '<!DOCTYPE HTML>' > book/print.html
+         ./htmltest book --conf docs/.htmltest.yml

View file

@@ -50,7 +50,7 @@ jobs:
    needs:
      - pre
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          # Fetch all history so that the schema_versions script works.
          fetch-depth: 0
@@ -58,7 +58,7 @@ jobs:
      - name: Setup mdbook
        uses: peaceiris/actions-mdbook@ee69d230fe19748b7abf22df32acaa93833fad08 # v2.0.0
        with:
-         mdbook-version: '0.4.17'
+         mdbook-version: '0.5.2'
      - name: Set version of docs
        run: echo 'window.SYNAPSE_VERSION = "${{ needs.pre.outputs.branch-version }}";' > ./docs/website_files/version.js

View file

@@ -18,7 +18,7 @@ jobs:
    steps:
      - name: Checkout repository
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -47,6 +47,6 @@ jobs:
      - run: cargo fmt
        continue-on-error: true
-     - uses: stefanzweifel/git-auto-commit-action@28e16e81777b558cc906c8750092100bbb34c5e3 # v7.0.0
+     - uses: stefanzweifel/git-auto-commit-action@04702edda442b2e678b25b537cec683a1493fcb9 # v7.1.0
        with:
          commit_message: "Attempt to fix linting"

View file

@@ -42,7 +42,7 @@ jobs:
    if: needs.check_repo.outputs.should_run_workflow == 'true'
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
        with:
@@ -77,7 +77,7 @@ jobs:
        postgres-version: "14"
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -152,7 +152,7 @@ jobs:
      BLACKLIST: ${{ matrix.workers && 'synapse-blacklist-with-workers' }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -173,7 +173,7 @@ jobs:
        if: ${{ always() }}
        run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
      - name: Upload SyTest logs
-       uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
+       uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
        if: ${{ always() }}
        with:
          name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
@@ -202,7 +202,7 @@ jobs:
    steps:
      - name: Check out synapse codebase
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          path: synapse
@@ -234,7 +234,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: JasonEtco/create-an-issue@1b14a70e4d8dc185e5cc76d3bec9eab20257b2c5 # v2.9.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View file

@@ -16,7 +16,7 @@ jobs:
    name: "Check locked dependencies have sdists"
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: '3.x'

View file

@@ -33,17 +33,17 @@ jobs:
      packages: write
    steps:
      - name: Checkout specific branch (debug build)
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        if: github.event_name == 'workflow_dispatch'
        with:
          ref: ${{ inputs.branch }}
      - name: Checkout clean copy of develop (scheduled build)
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        if: github.event_name == 'schedule'
        with:
          ref: develop
      - name: Checkout clean copy of master (on-push)
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        if: github.event_name == 'push'
        with:
          ref: master

View file

@@ -5,7 +5,7 @@ name: Build release artifacts
on:
  # we build on PRs and develop to (hopefully) get early warning
  # of things breaking (but only build one set of debs). PRs skip
- # building wheels on macOS & ARM.
+ # building wheels on ARM.
  pull_request:
  push:
    branches: ["develop", "release-*"]
@@ -27,7 +27,7 @@ jobs:
    name: "Calculate list of debian distros"
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.x"
@@ -55,18 +55,16 @@ jobs:
    steps:
      - name: Checkout
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          path: src
      - name: Set up Docker Buildx
        id: buildx
-       uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
-       with:
-         install: true
+       uses: docker/setup-buildx-action@8d2750c68a42422c14e847fe6c8ac0403b4cbd6f # v3.12.0
      - name: Set up docker layer caching
-       uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+       uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
        with:
          path: /tmp/.buildx-cache
          key: ${{ runner.os }}-buildx-${{ github.sha }}
@@ -101,7 +99,7 @@ jobs:
          echo "ARTIFACT_NAME=${DISTRO#*:}" >> "$GITHUB_OUTPUT"
      - name: Upload debs as artifacts
-       uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
+       uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
        with:
          name: debs-${{ steps.artifact-name.outputs.ARTIFACT_NAME }}
          path: debs/*
@@ -125,7 +123,7 @@ jobs:
            os: "ubuntu-24.04-arm"
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
@@ -152,7 +150,7 @@ jobs:
          #   musl: (TODO: investigate).
          CIBW_TEST_SKIP: pp3*-* *musl*
-     - uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
+     - uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
        with:
          name: Wheel-${{ matrix.os }}
          path: ./wheelhouse/*.whl
@@ -163,7 +161,7 @@ jobs:
    if: ${{ !startsWith(github.ref, 'refs/pull/') }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.10"
@@ -173,7 +171,7 @@ jobs:
      - name: Build sdist
        run: python -m build --sdist
-     - uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
+     - uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
        with:
          name: Sdist
          path: dist/*.tar.gz
@@ -189,7 +187,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
      - name: Download all workflow run artifacts
-       uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
+       uses: actions/download-artifact@37930b1c2abaa49bbe596cd826c3c89aef350131 # v7.0.0
      - name: Build a tarball for the debs
        # We need to merge all the debs uploads into one folder, then compress
        # that.

View file

@@ -14,7 +14,7 @@ jobs:
    name: Ensure Synapse config schema is valid
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.x"
@@ -40,7 +40,7 @@ jobs:
    name: Ensure generated documentation is up-to-date
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.x"

View file

@@ -26,59 +26,59 @@ jobs:
      linting: ${{ !startsWith(github.ref, 'refs/pull/') || steps.filter.outputs.linting }}
      linting_readme: ${{ !startsWith(github.ref, 'refs/pull/') || steps.filter.outputs.linting_readme }}
    steps:
      - uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
        id: filter
        # We only check on PRs
        if: startsWith(github.ref, 'refs/pull/')
        with:
          filters: |
            rust:
              - 'rust/**'
              - 'Cargo.toml'
              - 'Cargo.lock'
              - '.rustfmt.toml'
              - '.github/workflows/tests.yml'
            trial:
              - 'synapse/**'
              - 'tests/**'
              - 'rust/**'
              - '.ci/scripts/calculate_jobs.py'
              - 'Cargo.toml'
              - 'Cargo.lock'
              - 'pyproject.toml'
              - 'poetry.lock'
              - '.github/workflows/tests.yml'
            integration:
              - 'synapse/**'
              - 'rust/**'
              - 'docker/**'
              - 'Cargo.toml'
              - 'Cargo.lock'
              - 'pyproject.toml'
              - 'poetry.lock'
              - 'docker/**'
              - '.ci/**'
              - 'scripts-dev/complement.sh'
              - '.github/workflows/tests.yml'
            linting:
              - 'synapse/**'
              - 'docker/**'
              - 'tests/**'
              - 'scripts-dev/**'
              - 'contrib/**'
              - 'synmark/**'
              - 'stubs/**'
              - '.ci/**'
              - 'mypy.ini'
              - 'pyproject.toml'
              - 'poetry.lock'
              - '.github/workflows/tests.yml'
            linting_readme:
              - 'README.rst'

  check-sampleconfig:
    runs-on: ubuntu-latest
@@ -86,7 +86,7 @@ jobs:
    if: ${{ needs.changes.outputs.linting == 'true' }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
        with:
@@ -106,7 +106,7 @@ jobs:
    if: ${{ needs.changes.outputs.linting == 'true' }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.x"
@@ -116,7 +116,7 @@ jobs:
  check-lockfile:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.x"
@@ -129,7 +129,7 @@ jobs:
    steps:
      - name: Checkout repository
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Setup Poetry
        uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
@@ -151,7 +151,7 @@ jobs:
    steps:
      - name: Checkout repository
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -174,7 +174,7 @@ jobs:
      # Cribbed from
      # https://github.com/AustinScola/mypy-cache-github-action/blob/85ea4f2972abed39b33bd02c36e341b28ca59213/src/restore.ts#L10-L17
      - name: Restore/persist mypy's cache
-       uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
+       uses: actions/cache@9255dc7a253b0ccc959486e2bca901246202afeb # v5.0.1
        with:
          path: |
            .mypy_cache
@@ -187,7 +187,7 @@ jobs:
  lint-crlf:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Check line endings
        run: scripts-dev/check_line_terminators.sh
@@ -196,7 +196,7 @@ jobs:
    if: ${{ github.event_name == 'pull_request' && (github.base_ref == 'develop' || contains(github.base_ref, 'release-')) && github.event.pull_request.user.login != 'dependabot[bot]' }}
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          ref: ${{ github.event.pull_request.head.sha }}
          fetch-depth: 0
@@ -214,13 +214,13 @@ jobs:
    if: ${{ needs.changes.outputs.rust == 'true' }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
        with:
          components: clippy
          toolchain: ${{ env.RUST_VERSION }}
      - uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
      - run: cargo clippy -- -D warnings
@@ -233,13 +233,13 @@ jobs:
    if: ${{ needs.changes.outputs.rust == 'true' }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
        with:
          toolchain: nightly-2025-04-23
          components: clippy
      - uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
      - run: cargo clippy --all-features -- -D warnings
@@ -251,7 +251,7 @@ jobs:
    steps:
      - name: Checkout repository
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -287,7 +287,7 @@ jobs:
    if: ${{ needs.changes.outputs.rust == 'true' }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -307,7 +307,7 @@ jobs:
    needs: changes
    if: ${{ needs.changes.outputs.linting_readme == 'true' }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.x"
@@ -349,13 +349,12 @@ jobs:
            lint-rustfmt
            lint-readme

  calculate-test-jobs:
    if: ${{ !cancelled() && !failure() }} # Allow previous steps to be skipped, but not fail
    needs: linting-done
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
          python-version: "3.x"
@@ -373,10 +372,10 @@ jobs:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        job: ${{ fromJson(needs.calculate-test-jobs.outputs.trial_test_matrix) }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - run: sudo apt-get -qq install xmlsec1
      - name: Set up PostgreSQL ${{ matrix.job.postgres-version }}
        if: ${{ matrix.job.postgres-version }}
@@ -432,7 +431,7 @@ jobs:
      - changes
    runs-on: ubuntu-22.04
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -449,17 +448,15 @@ jobs:
      - uses: actions/setup-python@83679a892e2d95755f2dac6acb0bfd1e9ac5d548 # v6.1.0
        with:
-         python-version: '3.10'
+         python-version: "3.10"
      - name: Prepare old deps
-       if: steps.cache-poetry-old-deps.outputs.cache-hit != 'true'
-       run: .ci/scripts/prepare_old_deps.sh
-
-     # Note: we install using `pip` here, not poetry. `poetry install` ignores the
-     # build-system section (https://github.com/python-poetry/poetry/issues/6154), but
-     # we explicitly want to test that you can `pip install` using the oldest version
-     # of poetry-core and setuptools-rust.
-     - run: pip install .[all,test]
+       # Note: we install using `uv` here, not poetry or pip to allow us to test with the
+       # minimum version of all dependencies, both those explicitly specified and those
+       # implicitly brought in by the explicit dependencies.
+       run: |
+         pip install uv
+         uv pip install --system --resolution=lowest .[all,test]

      # We nuke the local copy, as we've installed synapse into the virtualenv
      # (rather than use an editable install, which we no longer support). If we
@@ -497,7 +494,7 @@ jobs:
        extras: ["all"]
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      # Install libs necessary for PyPy to build binary wheels for dependencies
      - run: sudo apt-get -qq install xmlsec1 libxml2-dev libxslt-dev
      - uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
@@ -547,7 +544,7 @@ jobs:
        job: ${{ fromJson(needs.calculate-test-jobs.outputs.sytest_test_matrix) }}
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Prepare test blacklist
        run: cat sytest-blacklist .ci/worker-blacklist > synapse-blacklist-with-workers
@@ -564,7 +561,7 @@ jobs:
        if: ${{ always() }}
        run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
      - name: Upload SyTest logs
-       uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
+       uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
        if: ${{ always() }}
        with:
          name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.job.*, ', ') }})
@@ -594,7 +591,7 @@ jobs:
          --health-retries 5
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - run: sudo apt-get -qq install xmlsec1 postgresql-client
      - uses: matrix-org/setup-python-poetry@5bbf6603c5c930615ec8a29f1b5d7d258d905aa4 # v2.0.0
        with:
@@ -607,7 +604,6 @@ jobs:
          PGPASSWORD: postgres
          PGDATABASE: postgres

  portdb:
    if: ${{ !failure() && !cancelled() && needs.changes.outputs.integration == 'true'}} # Allow previous steps to be skipped, but not fail
    needs:
@@ -638,7 +634,7 @@ jobs:
          --health-retries 5
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Add PostgreSQL apt repository
        # We need a version of pg_dump that can handle the version of
        # PostgreSQL being tested against. The Ubuntu package repository lags
@@ -662,7 +658,7 @@ jobs:
          PGPASSWORD: postgres
          PGDATABASE: postgres
      - name: "Upload schema differences"
-       uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
+       uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
        if: ${{ failure() && !cancelled() && steps.run_tester_script.outcome == 'failure' }}
        with:
          name: Schema dumps
@@ -693,7 +689,7 @@ jobs:
    steps:
      - name: Checkout synapse codebase
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          path: synapse
@@ -712,14 +708,28 @@ jobs:
          go-version-file: complement/go.mod

-     # use p=1 concurrency as GHA boxes are underpowered and don't like running tons of synapses at once.
-     - run: |
+     - name: Run Complement Tests
+       id: run_complement_tests
+       # -p=1: We're using `-p 1` to force the test packages to run serially as GHA boxes
+       # are underpowered and don't like running tons of Synapse instances at once.
+       # -json: Output JSON format so that gotestfmt can parse it.
+       #
+       # tee /tmp/gotest.log: We tee the output to a file so that we can re-process it
+       # later on for better formatting with gotestfmt. But we still want the command
+       # to output to the terminal as it runs so we can see what's happening in
+       # real-time.
+       run: |
          set -o pipefail
-         COMPLEMENT_DIR=`pwd`/complement synapse/scripts-dev/complement.sh -p 1 -json 2>&1 | synapse/.ci/scripts/gotestfmt
+         COMPLEMENT_DIR=`pwd`/complement synapse/scripts-dev/complement.sh -p 1 -json 2>&1 | tee /tmp/gotest.log
        shell: bash
        env:
          POSTGRES: ${{ (matrix.database == 'Postgres') && 1 || '' }}
          WORKERS: ${{ (matrix.arrangement == 'workers') && 1 || '' }}
-       name: Run Complement Tests
+
+     - name: Formatted Complement test logs
+       # Always run this step if we attempted to run the Complement tests.
+       if: always() && steps.run_complement_tests.outcome != 'skipped'
+       run: cat /tmp/gotest.log | gotestfmt -hide "successful-downloads,empty-packages"

  cargo-test:
    if: ${{ needs.changes.outputs.rust == 'true' }}
@@ -729,7 +739,7 @@ jobs:
      - changes
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -749,12 +759,12 @@ jobs:
      - changes
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
        with:
          toolchain: nightly-2022-12-01
      - uses: Swatinem/rust-cache@779680da715d629ac1d338a641029a2f4372abb5 # v2.8.2
      - run: cargo bench --no-run
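Aside: the `tee`-based change in the Complement job above (stream test output live while saving a copy for `gotestfmt`) can be sketched outside of bash as well. The Python equivalent below is illustrative only; the command and log path are stand-ins, not what CI actually runs:

import subprocess
import sys

# Stream a test command's output to the terminal while also keeping a copy
# on disk for later post-processing -- what `... 2>&1 | tee /tmp/gotest.log`
# does in the workflow.
with open("/tmp/gotest.log", "wb") as log:
    proc = subprocess.Popen(
        ["go", "test", "-p", "1", "-json", "./..."],
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
    )
    for line in proc.stdout:
        sys.stdout.buffer.write(line)  # live progress
        log.write(line)                # saved for the formatting step
    sys.exit(proc.wait())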

View file

@@ -22,7 +22,7 @@ jobs:
      # This field is case-sensitive.
      TARGET_STATUS: Needs info
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          # Only clone the script file we care about, instead of the whole repo.
          sparse-checkout: .ci/scripts/triage_labelled_issue.sh

View file

@@ -43,7 +43,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -70,7 +70,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - run: sudo apt-get -qq install xmlsec1
      - name: Install Rust
@@ -117,7 +117,7 @@ jobs:
      - ${{ github.workspace }}:/src
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - name: Install Rust
        uses: dtolnay/rust-toolchain@e97e2d8cc328f1b50210efc529dca0028893a2d9 # master
@@ -147,7 +147,7 @@ jobs:
        if: ${{ always() }}
        run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
      - name: Upload SyTest logs
-       uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
+       uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
        if: ${{ always() }}
        with:
          name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
@@ -175,7 +175,7 @@ jobs:
    steps:
      - name: Run actions/checkout@v4 for synapse
-       uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+       uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
        with:
          path: synapse
@@ -217,7 +217,7 @@ jobs:
    runs-on: ubuntu-latest
    steps:
-     - uses: actions/checkout@1af3b93b6815bc44a9784bd300feb67ff0d1eeb3 # v6.0.0
+     - uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
      - uses: JasonEtco/create-an-issue@1b14a70e4d8dc185e5cc76d3bec9eab20257b2c5 # v2.9.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View file

@@ -1,3 +1,95 @@
# Synapse 1.145.0 (2026-01-13)

No significant changes since 1.145.0rc4.

## End of Life of Ubuntu 25.04 Plucky Puffin

Ubuntu 25.04 (Plucky Puffin) will be end of life on Jan 17, 2026. Synapse will stop building packages for Ubuntu 25.04 shortly thereafter.

## Updates to Locked Dependencies No Longer Included in Changelog

The "Updates to locked dependencies" section has been removed from the changelog due to lack of use and the maintenance burden. ([\#19254](https://github.com/element-hq/synapse/issues/19254))

# Synapse 1.145.0rc4 (2026-01-08)

No significant changes since 1.145.0rc3.

This RC contains a fix specifically for openSUSE packaging and no other changes.

# Synapse 1.145.0rc3 (2026-01-07)

No significant changes since 1.145.0rc2.

This RC strips out unnecessary files from the wheels that were added when fixing the source distribution packaging in the previous RC.

# Synapse 1.145.0rc2 (2026-01-07)

No significant changes since 1.145.0rc1.

This RC fixes the source distribution packaging for uploading to PyPI.

# Synapse 1.145.0rc1 (2026-01-06)

## Features

- Add `memberships` endpoint to the admin API. This is useful for forensics and T&S purposes. ([\#19260](https://github.com/element-hq/synapse/issues/19260))
- Server admins can bypass the quarantine media check when downloading media by setting the `admin_unsafely_bypass_quarantine` query parameter to `true` on Client-Server API media download requests. ([\#19275](https://github.com/element-hq/synapse/issues/19275))
- Implemented pagination for the [MSC2666](https://github.com/matrix-org/matrix-spec-proposals/pull/2666) mutual rooms endpoint. Contributed by @tulir @ Beeper. ([\#19279](https://github.com/element-hq/synapse/issues/19279))
- Admin API: add worker support to `GET /_synapse/admin/v2/users/<user_id>`. ([\#19281](https://github.com/element-hq/synapse/issues/19281))
- Improve proxy support for the `federation_client.py` dev script. Contributed by Denis Kasak (@dkasak). ([\#19300](https://github.com/element-hq/synapse/issues/19300))

## Bugfixes

- Fix a sliding sync performance slowdown for long-lived connections. ([\#19206](https://github.com/element-hq/synapse/issues/19206))
- Fix a bug where Mastodon posts (and possibly other embeds) have the wrong description for URL previews. ([\#19231](https://github.com/element-hq/synapse/issues/19231))
- Fix bug where `Duration` was logged incorrectly. ([\#19267](https://github.com/element-hq/synapse/issues/19267))
- Fix bug introduced in 1.143.0 that broke support for versions of `zope-interface` older than 6.2. ([\#19274](https://github.com/element-hq/synapse/issues/19274))
- Transform events with client metadata before serialising in /event response. ([\#19340](https://github.com/element-hq/synapse/issues/19340))

## Updates to the Docker image

- Add a way to expose metrics from the Docker image (`SYNAPSE_ENABLE_METRICS`). ([\#19324](https://github.com/element-hq/synapse/issues/19324))

## Improved Documentation

- Document the importance of `public_baseurl` when configuring OpenID Connect authentication. ([\#19270](https://github.com/element-hq/synapse/issues/19270))

## Deprecations and Removals

- Ubuntu 25.04 (Plucky Puffin) will be end of life on Jan 17, 2026. Synapse will stop building packages for Ubuntu 25.04 shortly thereafter.
- Remove the "Updates to locked dependencies" section from the changelog due to lack of use and the maintenance burden. ([\#19254](https://github.com/element-hq/synapse/issues/19254))

## Internal Changes

- Group together dependabot update PRs to reduce the review load. ([\#18402](https://github.com/element-hq/synapse/issues/18402))
- Fix `HomeServer.shutdown()` failing if the homeserver hasn't been setup yet. ([\#19187](https://github.com/element-hq/synapse/issues/19187))
- Respond with useful error codes when `Content-Length` header/s are invalid. ([\#19212](https://github.com/element-hq/synapse/issues/19212))
- Fix `HomeServer.shutdown()` failing if the homeserver failed to `start`. ([\#19232](https://github.com/element-hq/synapse/issues/19232))
- Switch the build backend from `poetry-core` to `maturin`. ([\#19234](https://github.com/element-hq/synapse/issues/19234))
- Raise the limit for concurrently-open non-security @dependabot PRs from 5 to 10. ([\#19253](https://github.com/element-hq/synapse/issues/19253))
- Require 14 days to pass before pulling in general dependency updates to help mitigate upstream supply chain attacks. ([\#19258](https://github.com/element-hq/synapse/issues/19258))
- Drop the broken netlify documentation workflow until a new one is implemented. ([\#19262](https://github.com/element-hq/synapse/issues/19262))
- Don't include debug logs in `Clock` unless explicitly enabled. ([\#19278](https://github.com/element-hq/synapse/issues/19278))
- Use `uv` to test olddeps to ensure all transitive dependencies use minimum versions. ([\#19289](https://github.com/element-hq/synapse/issues/19289))
- Add a config to be able to rate limit search in the user directory. ([\#19291](https://github.com/element-hq/synapse/issues/19291))
- Log the original bind exception when encountering `Failed to listen on 0.0.0.0, continuing because listening on [::]`. ([\#19297](https://github.com/element-hq/synapse/issues/19297))
- Unpin the version of Rust we use to build Synapse wheels (was 1.82.0) now that MacOS support has been dropped. ([\#19302](https://github.com/element-hq/synapse/issues/19302))
- Make it more clear how `shared_extra_conf` is combined in our Docker configuration scripts. ([\#19323](https://github.com/element-hq/synapse/issues/19323))
- Update CI to stream Complement progress and format logs in a separate step after all tests are done. ([\#19326](https://github.com/element-hq/synapse/issues/19326))
- Format `.github/workflows/tests.yml`. ([\#19327](https://github.com/element-hq/synapse/issues/19327))

# Synapse 1.144.0 (2025-12-09)

## Deprecation of MacOS Python wheels

Cargo.lock generated
View file

@@ -705,9 +705,9 @@ checksum = "241eaef5fd12c88705a01fc1066c48c4b36e0dd4377dcdc7ec3942cea7a69956"

[[package]]
name = "log"
-version = "0.4.28"
+version = "0.4.29"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "34080505efa8e45a4b816c349525ebe327ceaa8559756f0356cba97ef3bf7432"
+checksum = "5e5032e24019045c762d3c0f28f5b6b8bbf38563a65908389bf7978758920897"

[[package]]
name = "lru-slab"
@@ -1024,9 +1024,9 @@ checksum = "2b15c43186be67a4fd63bee50d0303afffcef381492ebe2c5d87f324e1b8815c"

[[package]]
name = "reqwest"
-version = "0.12.24"
+version = "0.12.26"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9d0946410b9f7b082a427e4ef5c8ff541a88b357bc6c637c40db3a68ac70a36f"
+checksum = "3b4c14b2d9afca6a60277086b0cc6a6ae0b568f6f7916c943a8cdc79f8be240f"
dependencies = [
 "base64",
 "bytes",
@@ -1468,9 +1468,9 @@ dependencies = [

[[package]]
name = "tower-http"
-version = "0.6.6"
+version = "0.6.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "adc82fd73de2a9722ac5da747f12383d2bfdb93591ee6c58486e0097890f05f2"
+checksum = "d4e6559d53cc268e5031cd8429d05415bc4cb4aefc4aa5d6cc35fbf5b924a1f8"
dependencies = [
 "bitflags",
 "bytes",

View file

@@ -1,4 +1,4 @@
-.. image:: ./docs/element_logo_white_bg.svg
+.. image:: https://github.com/element-hq/synapse/raw/develop/docs/element_logo_white_bg.svg
   :height: 60px

**Element Synapse - Matrix homeserver implementation**

View file

@@ -4,7 +4,6 @@
title = "Synapse"
authors = ["The Matrix.org Foundation C.I.C."]
language = "en"
-multilingual = false

# The directory that documentation files are stored in
src = "docs"
@@ -31,13 +30,10 @@ site-url = "/synapse/"
# Additional HTML, JS, CSS that's injected into each page of the book.
# More information available in docs/website_files/README.md
additional-css = [
-    "docs/website_files/table-of-contents.css",
-    "docs/website_files/remove-nav-buttons.css",
    "docs/website_files/indent-section-headers.css",
    "docs/website_files/version-picker.css",
]
additional-js = [
-    "docs/website_files/table-of-contents.js",
    "docs/website_files/version-picker.js",
    "docs/website_files/version.js",
]

View file

@@ -0,0 +1 @@
Add a new config option [`enable_local_media_storage`](https://element-hq.github.io/synapse/latest/usage/configuration/config_documentation.html#enable_local_media_storage) which controls whether media is additionally stored locally when using configured `media_storage_providers`. Setting this to `false` allows off-site media storage without a local cache. Contributed by Patrice Brend'amour @dr.allgood.

View file

@@ -0,0 +1 @@
Stabilise support for [MSC4312](https://github.com/matrix-org/matrix-spec-proposals/pull/4312)'s `m.oauth` User-Interactive Auth stage for resetting cross-signing identity with the OAuth 2.0 API. The old, unstable name (`org.matrix.cross_signing_reset`) is now deprecated and will be removed in a future release.

changelog.d/19310.misc

@ -0,0 +1 @@
Add an internal `cancel_task` API to the task scheduler.

changelog.d/19320.misc

@ -0,0 +1 @@
Tweak docstrings and signatures of `auth_types_for_event` and `get_catchup_room_event_ids`.

changelog.d/19321.bugfix

@ -0,0 +1 @@
Fix joining a restricted v12 room locally when no local room creator is present but local users with sufficient power levels are. Contributed by @nexy7574.

changelog.d/19335.bugfix

@ -0,0 +1 @@
Fixed parallel calls to `/_matrix/media/v1/create` being ratelimited for appservices even if `rate_limited: false` was set in the registration. Contributed by @tulir @ Beeper.

changelog.d/19336.docker

@ -0,0 +1 @@
Add [Prometheus HTTP service discovery](https://prometheus.io/docs/prometheus/latest/configuration/configuration/#http_sd_config) endpoint for easy discovery of all workers when using the `docker/Dockerfile-workers` image (see the [*Metrics* section of our Docker testing docs](docker/README-testing.md#metrics)).


@ -0,0 +1 @@
Refactor Grafana dashboard to use `server_name` label (instead of `instance`).

changelog.d/19341.doc

@ -0,0 +1 @@
Remove docs on legacy metric names (no longer in the codebase since 2022-12-06).

changelog.d/19345.misc

@ -0,0 +1 @@
Replace usage of deprecated `assertEquals` with `assertEqual` in unit test code.


@ -0,0 +1 @@
MSC2697 (Dehydrated devices) has been removed, as the MSC is closed. Developers should migrate to MSC3814.

changelog.d/19348.misc

@ -0,0 +1 @@
Drop support for Ubuntu 25.04 'Plucky Puffin', add support for Ubuntu 25.10 'Questing Quokka'.

changelog.d/19351.misc

@ -0,0 +1 @@
Revert "Add an Admin API endpoint for listing quarantined media (#19268)".

changelog.d/19353.bugfix

@ -0,0 +1 @@
Fix a bug introduced in 1.61.0 where a user's membership in a room was accidentally ignored when considering access to historical state events in rooms with the "shared" history visibility. Contributed by Lukas Tautz.

changelog.d/19356.misc

@ -0,0 +1 @@
Bump `mdbook` from 0.4.17 to 0.5.2 and remove our custom table-of-contents plugin in favour of the new default functionality.

changelog.d/19358.misc

@ -0,0 +1 @@
Replace deprecated usage of PyGitHub's `GitRelease.title` with `.name` in release script.

changelog.d/19360.bugfix

@ -0,0 +1 @@
MSC4140: Store the JSON content of scheduled delayed events as text instead of a byte array. This fixes the inability to schedule a delayed event with non-ASCII characters in its content.

changelog.d/19368.misc

@ -0,0 +1 @@
Update the Element logo in Synapse's README to be an absolute URL, allowing it to render on other sites (such as PyPI).

changelog.d/19372.bugfix

@ -0,0 +1 @@
Always roll back the database transaction when retrying (avoids orphaned connections).

changelog.d/19376.misc

@ -0,0 +1 @@
Apply minor tweaks to v1.145.0 changelog.

changelog.d/19379.bugfix

@ -0,0 +1 @@
Fix `InFlightGauge` typing to allow upgrading to `prometheus_client` 0.24.

changelog.d/19381.misc

@ -0,0 +1 @@
Update Grafana dashboard syntax to use the latest from importing/exporting with Grafana 12.3.1.

changelog.d/19383.misc

@ -0,0 +1 @@
Warn about skipping reactor metrics when using an unknown reactor type.

File diff suppressed because it is too large.

debian/changelog

@ -1,3 +1,33 @@
matrix-synapse-py3 (1.145.0) stable; urgency=medium

  * New Synapse release 1.145.0.

 -- Synapse Packaging team <packages@matrix.org>  Tue, 13 Jan 2026 08:37:42 -0700

matrix-synapse-py3 (1.145.0~rc4) stable; urgency=medium

  * New Synapse release 1.145.0rc4.

 -- Synapse Packaging team <packages@matrix.org>  Thu, 08 Jan 2026 12:06:35 -0700

matrix-synapse-py3 (1.145.0~rc3) stable; urgency=medium

  * New Synapse release 1.145.0rc3.

 -- Synapse Packaging team <packages@matrix.org>  Wed, 07 Jan 2026 15:32:07 -0700

matrix-synapse-py3 (1.145.0~rc2) stable; urgency=medium

  * New Synapse release 1.145.0rc2.

 -- Synapse Packaging team <packages@matrix.org>  Wed, 07 Jan 2026 10:10:07 -0700

matrix-synapse-py3 (1.145.0~rc1) stable; urgency=medium

  * New Synapse release 1.145.0rc1.

 -- Synapse Packaging team <packages@matrix.org>  Tue, 06 Jan 2026 09:29:39 -0700

matrix-synapse-py3 (1.144.0) stable; urgency=medium

  * New Synapse release 1.144.0.


@ -145,6 +145,12 @@ for port in 8080 8081 8082; do
rc_delayed_event_mgmt:
  per_second: 1000
  burst_count: 1000
rc_room_creation:
  per_second: 1000
  burst_count: 1000
rc_user_directory:
  per_second: 1000
  burst_count: 1000
RC
)
echo "${ratelimiting}" >> "$port.config"


@ -188,7 +188,12 @@ COPY --from=builder --exclude=.lock /install /usr/local
COPY ./docker/start.py /start.py
COPY ./docker/conf /conf

-EXPOSE 8008/tcp 8009/tcp 8448/tcp
+# 8008: CS Matrix API port from Synapse
+# 8448: SS Matrix API port from Synapse
+EXPOSE 8008/tcp 8448/tcp
+# 19090: Metrics listener port for the main process (metrics must be enabled with
+# SYNAPSE_ENABLE_METRICS=1).
+EXPOSE 19090/tcp

ENTRYPOINT ["/start.py"]


@ -71,6 +71,15 @@ FROM $FROM
# Expose nginx listener port
EXPOSE 8080/tcp

# Metrics for workers are on ports starting from 19091 but since these are dynamic
# we don't expose them by default (metrics must be enabled with
# SYNAPSE_ENABLE_METRICS=1)
#
# Instead, we expose a single port used for Prometheus HTTP service discovery
# (`http://<synapse_container>:9469/metrics/service_discovery`) and proxy all of the
# workers' metrics endpoints through that
# (`http://<synapse_container>:9469/metrics/worker/<worker_name>`).
EXPOSE 9469/tcp

# A script to read environment variables and create the necessary
# files to run the desired worker configuration. Will start supervisord.


@ -135,3 +135,49 @@ but it does not serve TLS by default.
You can configure `SYNAPSE_TLS_CERT` and `SYNAPSE_TLS_KEY` to point to a
TLS certificate and key (respectively), both in PEM (textual) format.
In this case, Nginx will additionally serve using HTTPS on port 8448.
### Metrics
Set `SYNAPSE_ENABLE_METRICS=1` to configure `enable_metrics: true` and set up the
`metrics` listener on the main and worker processes. Defaults to `0` (disabled). The
main process will listen on port `19090` and workers on port `19091 + <worker index>`.

When using `docker/Dockerfile-workers`, to simplify the metrics setup, a
[Prometheus HTTP service
discovery](https://prometheus.io/docs/prometheus/latest/configuration/configuration/#http_sd_config)
endpoint is also available at `http://<synapse_container>:9469/metrics/service_discovery`.

The metrics from each worker can also be accessed via
`http://<synapse_container>:9469/metrics/worker/<worker_name>`, which is what the service
discovery response points to behind the scenes. This way, you only need to expose a
single port (9469) to access all metrics.
```yaml
global:
  scrape_interval: 15s
  scrape_timeout: 15s
  evaluation_interval: 15s

scrape_configs:
  - job_name: synapse
    scrape_interval: 15s
    metrics_path: /_synapse/metrics
    scheme: http
    # We set `honor_labels` so that each service can set their own `job`/`instance` label
    #
    # > honor_labels controls how Prometheus handles conflicts between labels that are
    # > already present in scraped data and labels that Prometheus would attach
    # > server-side ("job" and "instance" labels, manually configured target
    # > labels, and labels generated by service discovery implementations).
    # >
    # > *-- https://prometheus.io/docs/prometheus/latest/configuration/configuration/#scrape_config*
    honor_labels: true
    # Use HTTP service discovery
    #
    # Reference:
    #  - https://prometheus.io/docs/prometheus/latest/http_sd/
    #  - https://prometheus.io/docs/prometheus/latest/configuration/configuration/#http_sd_config
    http_sd_configs:
      - url: 'http://localhost:9469/metrics/service_discovery'
```
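To sanity-check the discovery endpoint outside of Prometheus, a minimal sketch along
these lines fetches and prints the advertised targets (it assumes the container's
port 9469 is published to `localhost:9469`):

```python
import json
import urllib.request

# Fetch the service discovery document that Prometheus would consume.
with urllib.request.urlopen(
    "http://localhost:9469/metrics/service_discovery"
) as response:
    entries = json.load(response)

# Each entry carries its own `job` label and a per-target metrics path.
for entry in entries:
    labels = entry["labels"]
    print(labels["job"], entry["targets"], labels["__metrics_path__"])
```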


@ -75,6 +75,9 @@ The following environment variables are supported in `generate` mode:
  particularly tricky.
* `SYNAPSE_LOG_TESTING`: if set, Synapse will log additional information useful
  for testing.
* `SYNAPSE_ENABLE_METRICS`: if set to `1`, the metrics listener will be enabled on the
main and worker processes. Defaults to `0` (disabled). The main process will listen on
port `19090` and workers on port `19091 + <worker index>`.
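As a quick smoke test of the port layout described above (a sketch; it assumes you
started the container with `SYNAPSE_ENABLE_METRICS=1` and published port 19090):

```python
import urllib.request

# The main process serves Prometheus metrics on port 19090; workers use
# 19091 + <worker index>. Assumes the port is published to localhost.
metrics = urllib.request.urlopen(
    "http://localhost:19090/_synapse/metrics"
).read().decode()
print("\n".join(metrics.splitlines()[:5]))  # first few metric lines
```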
## Postgres


@ -102,6 +102,10 @@ rc_room_creation:
  per_second: 9999
  burst_count: 9999

rc_user_directory:
  per_second: 9999
  burst_count: 9999

federation_rr_transactions_per_room_per_second: 9999
allow_device_name_lookup_over_federation: true


@ -48,3 +48,5 @@ server {
        proxy_set_header Host $host:$server_port;
    }
}
{{ nginx_prometheus_metrics_service_discovery }}


@ -20,4 +20,9 @@ app_service_config_files:
{%- endfor %}
{%- endif %}

{# Controlled by SYNAPSE_ENABLE_METRICS #}
{% if enable_metrics %}
enable_metrics: true
{% endif %}

{{ shared_worker_config }}


@ -21,6 +21,14 @@ worker_listeners:
  {%- endfor %}
{% endif %}

{# Controlled by SYNAPSE_ENABLE_METRICS #}
{% if metrics_port %}
  - type: metrics
    # Prometheus does not support Unix sockets so we don't bother with
    # `SYNAPSE_USE_UNIX_SOCKET`, https://github.com/prometheus/prometheus/issues/12024
    port: {{ metrics_port }}
{% endif %}

worker_log_config: {{ worker_log_config_filepath }}

{{ worker_extra_conf }}


@ -53,6 +53,15 @@ listeners:
      - names: [federation]
        compress: false

{% if SYNAPSE_ENABLE_METRICS %}
  - type: metrics
    # The main process always uses the same port 19090
    #
    # Prometheus does not support Unix sockets so we don't bother with
    # `SYNAPSE_USE_UNIX_SOCKET`, https://github.com/prometheus/prometheus/issues/12024
    port: 19090
{% endif %}

## Database ##

{% if POSTGRES_PASSWORD %}


@ -49,11 +49,16 @@
#   regardless of the SYNAPSE_LOG_LEVEL setting.
# * SYNAPSE_LOG_TESTING: if set, Synapse will log additional information useful
#   for testing.
# * SYNAPSE_USE_UNIX_SOCKET: TODO
# * `SYNAPSE_ENABLE_METRICS`: if set to `1`, the metrics listener will be enabled on the
#   main and worker processes. Defaults to `0` (disabled). The main process will listen on
#   port `19090` and workers on port `19091 + <worker index>`.
#
# NOTE: According to Complement's ENTRYPOINT expectations for a homeserver image (as defined
# in the project's README), this script may be run multiple times, and functionality should
# continue to work if so.

import json
import os
import platform
import re
@ -71,6 +76,7 @@ from typing import (
    SupportsIndex,
)

import attr
import yaml
from jinja2 import Environment, FileSystemLoader
@ -337,7 +343,7 @@ WORKERS_CONFIG: dict[str, dict[str, Any]] = {
}

# Templates for sections that may be inserted multiple times in config files
-NGINX_LOCATION_CONFIG_BLOCK = """
+NGINX_LOCATION_REGEX_CONFIG_BLOCK = """
    location ~* {endpoint} {{
        proxy_pass {upstream};
        proxy_set_header X-Forwarded-For $remote_addr;

@ -346,6 +352,25 @@ NGINX_LOCATION_CONFIG_BLOCK = """
    }}
"""
# Having both **regex** (`NGINX_LOCATION_REGEX_CONFIG_BLOCK`) match vs **exact**
# (`NGINX_LOCATION_EXACT_CONFIG_BLOCK`) match is necessary because we can't use a URI
# path in `proxy_pass http://localhost:19090/_synapse/metrics` with the regex version.
#
# Example of what happens if you try to use `proxy_pass http://localhost:19090/_synapse/metrics`
# with `NGINX_LOCATION_REGEX_CONFIG_BLOCK`:
# ```
# nginx | 2025/12/31 22:58:34 [emerg] 21#21: "proxy_pass" cannot have URI part in location given by regular expression, or inside named location, or inside "if" statement, or inside "limit_except" block in /etc/nginx/conf.d/matrix-synapse.conf:732
# ```
NGINX_LOCATION_EXACT_CONFIG_BLOCK = """
location = {endpoint} {{
proxy_pass {upstream};
proxy_set_header X-Forwarded-For $remote_addr;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_set_header Host $host;
}}
"""
NGINX_UPSTREAM_CONFIG_BLOCK = """
upstream {upstream_worker_base_name} {{
{body}

@ -353,6 +378,63 @@ upstream {upstream_worker_base_name} {{
"""
PROMETHEUS_METRICS_SERVICE_DISCOVERY_FILE_PATH = (
"/data/prometheus_service_discovery.json"
)
"""
We serve this file with nginx so people can use it with `http_sd_config` in their
Prometheus config.
"""
NGINX_HOST_PLACEHOLDER = "<HOST_PLACEHOLDER>"
"""Will be replaced with the whatever hostname:port used to access the nginx metrics endpoint."""
NGINX_PROMETHEUS_METRICS_SERVICE_DISCOVERY = """
server {{
listen 9469;
location = /metrics/service_discovery {{
alias {service_discovery_file_path};
default_type application/json;
# Find/replace the host placeholder in the response body with the actual
# host used to access this endpoint.
#
# We want to reflect back whatever host the client used to access this file.
# For example, if they accessed it via `localhost:9469`, then they
# can also reach all of the proxied metrics endpoints at the same address.
# Or if it's Prometheus in another container, it will access this via
# `host.docker.internal:9469`, etc. Or perhaps it's even some randomly assigned
# port mapping.
sub_filter '{host_placeholder}' '$http_host';
# By default, `ngx_http_sub_module` only works on `text/html` responses. We want
# to find/replace in `application/JSON`.
sub_filter_types application/json;
# Replace all occurrences
sub_filter_once off;
}}
# Make the service discovery endpoint easy to find; redirect to the correct spot.
location = / {{
return 302 /metrics/service_discovery;
}}
{metrics_proxy_locations}
}}
"""
"""
Setup the nginx config necessary to serve the JSON file for Prometheus HTTP service discovery
(`http_sd_config`). Served at `/metrics/service_discovery`.
Reference:
- https://prometheus.io/docs/prometheus/latest/http_sd/
- https://prometheus.io/docs/prometheus/latest/configuration/configuration/#http_sd_config
We also proxy all of the Synapse metrics endpoints through a central place so that
people only need to expose the single 9469 port and service discovery can take care of
the rest: `/metrics/worker/<worker_name>` -> http://localhost:19090/_synapse/metrics
"""
# Utility functions
def log(txt: str) -> None:
    print(txt)

@ -612,9 +694,42 @@ def generate_base_homeserver_config() -> None:
    subprocess.run([sys.executable, "/start.py", "migrate_config"], check=True)
@attr.s(auto_attribs=True)
class Worker:
    worker_name: str
    """
    ex.
    `event_persister:2` -> `event_persister1` and `event_persister2`
    `stream_writers=account_data+presence+receipts+to_device+typing` -> `stream_writers`
    """

    worker_base_name: str
    """
    ex.
    `event_persister:2` -> `event_persister`
    `stream_writers=account_data+presence+receipts+to_device+typing` -> `stream_writers`
    """

    worker_index: int
    """
    The index of the worker, starting from 1 for each worker type requested.

    ex.
    `event_persister:2` -> `1` and `2`
    `stream_writers=account_data+presence+receipts+to_device+typing` -> `1`
    """

    worker_types: set[str]
    """
    ex.
    `event_persister:2` -> `{"event_persister"}`
    `stream_writers=account_data+presence+receipts+to_device+typing` -> `{"account_data", "presence", "receipts", "to_device", "typing"}`
    """
def parse_worker_types(
    requested_worker_types: list[str],
-) -> dict[str, set[str]]:
+) -> list[Worker]:
"""Read the desired list of requested workers and prepare the data for use in """Read the desired list of requested workers and prepare the data for use in
generating worker config files while also checking for potential gotchas. generating worker config files while also checking for potential gotchas.
@ -622,10 +737,7 @@ def parse_worker_types(
requested_worker_types: The list formed from the split environment variable requested_worker_types: The list formed from the split environment variable
containing the unprocessed requests for workers. containing the unprocessed requests for workers.
Returns: A dict of worker names to set of worker types. Format: Returns: A list of requested workers
{'worker_name':
{'worker_type', 'worker_type2'}
}
""" """
    # A counter of worker_base_name -> int. Used for determining the name for a given
    # worker when generating its config file, as each worker's name is just
# worker when generating its config file, as each worker's name is just # worker when generating its config file, as each worker's name is just
@ -636,8 +748,8 @@ def parse_worker_types(
    # more than a single worker for cases where multiples would be bad (e.g. presence).
    worker_type_shard_counter: dict[str, int] = defaultdict(int)

-    # The final result of all this processing
-    dict_to_return: dict[str, set[str]] = {}
+    # Map from worker name to `Worker`
+    worker_dict: dict[str, Worker] = {}

    # Handle any multipliers requested for given workers.
    multiple_processed_worker_types = apply_requested_multiplier_for_worker(
@ -723,24 +835,29 @@ def parse_worker_types(
        if worker_number > 1:
            # If this isn't the first worker, check that we don't have a confusing
            # mixture of worker types with the same base name.
-            first_worker_with_base_name = dict_to_return[f"{worker_base_name}1"]
-            if first_worker_with_base_name != worker_types_set:
+            first_worker_with_base_name = worker_dict[f"{worker_base_name}1"]
+            if first_worker_with_base_name.worker_types != worker_types_set:
                error(
                    f"Can not use worker_name: '{worker_name}' for worker_type(s): "
                    f"{worker_types_set!r}. It is already in use by "
-                    f"worker_type(s): {first_worker_with_base_name!r}"
+                    f"worker_type(s): {first_worker_with_base_name.worker_types!r}"
                )

-        dict_to_return[worker_name] = worker_types_set
+        worker_dict[worker_name] = Worker(
+            worker_name=worker_name,
+            worker_base_name=worker_base_name,
+            worker_index=worker_number,
+            worker_types=worker_types_set,
+        )

-    return dict_to_return
+    return list(worker_dict.values())
def generate_worker_files(
    environ: Mapping[str, str],
    config_path: str,
    data_dir: str,
-    requested_worker_types: dict[str, set[str]],
+    requested_workers: list[Worker],
) -> None:
    """Read the desired workers (if any) that is passed in and generate shared
    homeserver, nginx and supervisord configs.
@ -750,14 +867,16 @@ def generate_worker_files(
        config_path: The location of the generated Synapse main worker config file.
        data_dir: The location of the synapse data directory. Where log and
            user-facing config files live.
-        requested_worker_types: A Dict containing requested workers in the format of
-            {'worker_name1': {'worker_type', ...}}
+        requested_workers: A list of requested workers
    """
    # Note that yaml cares about indentation, so care should be taken to insert lines
    # into files at the correct indentation below.

    # Convenience helper for if using unix sockets instead of host:port
    using_unix_sockets = environ.get("SYNAPSE_USE_UNIX_SOCKET", False)
    enable_metrics = environ.get("SYNAPSE_ENABLE_METRICS", "0") == "1"

    # First read the original config file and extract the listeners block. Then we'll
    # add another listener for replication. Later we'll write out the result to the
    # shared config file.
@ -789,7 +908,11 @@ def generate_worker_files(
    # base shared worker jinja2 template. This config file will be passed to all
    # workers, including Synapse's main process. It is intended mainly for disabling
    # functionality when certain workers are spun up, and adding a replication listener.
-    shared_config: dict[str, Any] = {"listeners": listeners}
+    shared_config: dict[str, Any] = {
+        "listeners": listeners,
+        # Controls `enable_metrics: true`
+        "enable_metrics": enable_metrics,
+    }

    # List of dicts that describe workers.
    # We pass this to the Supervisor template later to generate the appropriate
@ -816,6 +939,8 @@ def generate_worker_files(
    # Start worker ports from this arbitrary port
    worker_port = 18009
    # The main process metrics port is 19090, so start workers from 19091
    worker_metrics_port = 19091

    # A list of internal endpoints to healthcheck, starting with the main process
    # which exists even if no workers do.
@ -832,7 +957,9 @@ def generate_worker_files(
    healthcheck_urls = ["http://localhost:8080/health"]
    # Get the set of all worker types that we have configured
-    all_worker_types_in_use = set(chain(*requested_worker_types.values()))
+    all_worker_types_in_use = set(
+        chain(*[worker.worker_types for worker in requested_workers])
+    )
    # Map locations to upstreams (corresponding to worker types) in Nginx
    # but only if we use the appropriate worker type
    for worker_type in all_worker_types_in_use:
@ -841,12 +968,13 @@ def generate_worker_files(
    # For each worker type specified by the user, create config values and write its
    # yaml config file
-    for worker_name, worker_types_set in requested_worker_types.items():
+    worker_name_to_metrics_port_map: dict[str, int] = {}
+    for worker in requested_workers:
        # The collected and processed data will live here.
        worker_config: dict[str, Any] = {}

        # Merge all worker config templates for this worker into a single config
-        for worker_type in worker_types_set:
+        for worker_type in worker.worker_types:
            copy_of_template_config = WORKERS_CONFIG[worker_type].copy()

            # Merge worker type template configuration data. It's a combination of lists
@ -856,16 +984,27 @@ def generate_worker_files(
        )

        # Replace placeholder names in the config template with the actual worker name.
-        worker_config = insert_worker_name_for_worker_config(worker_config, worker_name)
-
-        worker_config.update(
-            {"name": worker_name, "port": str(worker_port), "config_path": config_path}
-        )
-
-        # Update the shared config with any worker_type specific options. The first of a
-        # given worker_type needs to stay assigned and not be replaced.
-        worker_config["shared_extra_conf"].update(shared_config)
-        shared_config = worker_config["shared_extra_conf"]
+        worker_config = insert_worker_name_for_worker_config(
+            worker_config, worker.worker_name
+        )
+
+        worker_config.update(
+            {
+                "name": worker.worker_name,
+                "port": str(worker_port),
+                "config_path": config_path,
+            }
+        )
+
+        # Keep the `shared_config` up to date with the `shared_extra_conf` from each
+        # worker.
+        shared_config = {
+            **worker_config["shared_extra_conf"],
+            # We combine `shared_config` second to avoid overwriting existing keys just
+            # for sanity's sake (always use the first worker).
+            **shared_config,
+        }
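        # (Note the double-splat precedence here: keys from the second mapping win,
        # so existing `shared_config` entries are preserved, e.g.
        #     {**{"a": 1, "b": 2}, **{"b": 99}} == {"a": 1, "b": 99}.)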
        if using_unix_sockets:
            healthcheck_urls.append(
                f"--unix-socket /run/worker.{worker_port} http://localhost/health"
@ -877,39 +1016,51 @@ def generate_worker_files(
        # the `events` stream. For other workers, the worker name is the same
        # name of the stream they write to, but for some reason it is not the
        # case for event_persister.
-        if "event_persister" in worker_types_set:
-            worker_types_set.add("events")
+        if "event_persister" in worker.worker_types:
+            worker.worker_types.add("events")
        # Update the shared config with sharding-related options if necessary
        add_worker_roles_to_shared_config(
-            shared_config, worker_types_set, worker_name, worker_port
+            shared_config, worker.worker_types, worker.worker_name, worker_port
        )

        # Enable the worker in supervisord
        worker_descriptors.append(worker_config)

        # Write out the worker's logging config file
-        log_config_filepath = generate_worker_log_config(environ, worker_name, data_dir)
+        log_config_filepath = generate_worker_log_config(
+            environ, worker.worker_name, data_dir
+        )
        worker_name_to_metrics_port_map[worker.worker_name] = worker_metrics_port
        if enable_metrics:
            # Enable prometheus metrics endpoint on this worker
            worker_config["metrics_port"] = worker_metrics_port
        # Then a worker config file
        convert(
            "/conf/worker.yaml.j2",
-            f"/conf/workers/{worker_name}.yaml",
+            f"/conf/workers/{worker.worker_name}.yaml",
            **worker_config,
            worker_log_config_filepath=log_config_filepath,
            using_unix_sockets=using_unix_sockets,
        )

        # Save this worker's port number to the correct nginx upstreams
-        for worker_type in worker_types_set:
+        for worker_type in worker.worker_types:
            nginx_upstreams.setdefault(worker_type, set()).add(worker_port)

        worker_port += 1
worker_metrics_port += 1
    # Build the nginx location config blocks
    nginx_location_config = ""
    for endpoint, upstream in nginx_locations.items():
-        nginx_location_config += NGINX_LOCATION_CONFIG_BLOCK.format(
+        nginx_location_config += NGINX_LOCATION_REGEX_CONFIG_BLOCK.format(
            endpoint=endpoint,
            upstream=upstream,
        )
@ -932,6 +1083,111 @@ def generate_worker_files(
            body=body,
        )
# Provide a Prometheus metrics service discovery endpoint that makes it easy to
# pick up all of the workers
nginx_prometheus_metrics_service_discovery = ""
if enable_metrics:
# Write JSON file for Prometheus service discovery pointing to all of the
# workers. We serve this file with nginx so people can use it with
# `http_sd_config` in their Prometheus config.
#
# > It fetches targets from an HTTP endpoint containing a list of zero or more
# > `<static_config>`s. The target must reply with an HTTP 200 response. The HTTP
# > header `Content-Type` must be `application/json`, and the body must be valid
# > JSON.
# >
# > *-- https://prometheus.io/docs/prometheus/latest/configuration/configuration/#http_sd_config*
#
# Another reference: https://prometheus.io/docs/prometheus/latest/http_sd/
prometheus_http_service_discovery_content = [
{
"targets": [NGINX_HOST_PLACEHOLDER],
"labels": {
# The downstream user should also configure `honor_labels: true` in
# their Prometheus config to prevent Prometheus from overwriting the
# `job` labels.
#
# > honor_labels controls how Prometheus handles conflicts between labels that are
# > already present in scraped data and labels that Prometheus would attach
# > server-side ("job" and "instance" labels, manually configured target
# > labels, and labels generated by service discovery implementations).
# >
# > *-- https://prometheus.io/docs/prometheus/latest/configuration/configuration/#scrape_config*
#
# Reference:
# - https://prometheus.io/docs/concepts/jobs_instances/
# - https://prometheus.io/docs/prometheus/latest/configuration/configuration/#scrape_config
"job": worker.worker_base_name,
"index": f"{worker.worker_index}",
# This allows us to change the `metrics_path` on a per-target basis.
# We want to grab the metrics from our nginx proxied location (setup
# below).
#
# While there doesn't seem to be official docs on these special
# labels (`__metrics_path__`, `__scheme__`, `__scrape_interval__`,
# `__scrape_timeout__`), this discussion best summarizes how this
# works: https://github.com/prometheus/prometheus/discussions/13217
"__metrics_path__": f"/metrics/worker/{worker.worker_name}",
},
}
for worker in requested_workers
]
# Add the main Synapse process as well
prometheus_http_service_discovery_content.append(
{
"targets": [NGINX_HOST_PLACEHOLDER],
"labels": {
# We use `"synapse"` as the job name for the main process because it
# matches what we expect people to use from a monolith setup with
# their static scrape config. It's the `job` name used in our Grafana
# dashboard for the main process.
"job": "synapse",
"index": "1",
"__metrics_path__": "/metrics/worker/main",
},
}
)
# Check to make sure the file doesn't already exist
if os.path.isfile(PROMETHEUS_METRICS_SERVICE_DISCOVERY_FILE_PATH):
error(
f"Prometheus service discovery file "
f"'{PROMETHEUS_METRICS_SERVICE_DISCOVERY_FILE_PATH}' already exists (unexpected)! "
f"This is a problem because the existing file probably doesn't match the "
"Synapse workers we're setting up now."
)
# Write the file
with open(PROMETHEUS_METRICS_SERVICE_DISCOVERY_FILE_PATH, "w") as outfile:
outfile.write(
json.dumps(prometheus_http_service_discovery_content, indent=4)
)
# Proxy all of the Synapse metrics endpoints through a central place so that
# people only need to expose the single 9469 port and service discovery can take
# care of the rest: `/metrics/worker/<worker_name>` ->
# http://localhost:19090/_synapse/metrics
#
# Build the nginx location config blocks
metrics_proxy_locations = ""
for worker in requested_workers:
metrics_proxy_locations += NGINX_LOCATION_EXACT_CONFIG_BLOCK.format(
endpoint=f"/metrics/worker/{worker.worker_name}",
upstream=f"http://localhost:{worker_name_to_metrics_port_map[worker.worker_name]}/_synapse/metrics",
)
# Add the main Synapse process as well
metrics_proxy_locations += NGINX_LOCATION_EXACT_CONFIG_BLOCK.format(
endpoint="/metrics/worker/main",
upstream="http://localhost:19090/_synapse/metrics",
)
# Add a nginx server/location to serve the JSON file
nginx_prometheus_metrics_service_discovery = NGINX_PROMETHEUS_METRICS_SERVICE_DISCOVERY.format(
service_discovery_file_path=PROMETHEUS_METRICS_SERVICE_DISCOVERY_FILE_PATH,
host_placeholder=NGINX_HOST_PLACEHOLDER,
metrics_proxy_locations=metrics_proxy_locations,
)
    # Finally, we'll write out the config files.

    # log config for the master process
@ -949,7 +1205,7 @@ def generate_worker_files(
        if reg_path.suffix.lower() in (".yaml", ".yml")
    ]

-    workers_in_use = len(requested_worker_types) > 0
+    workers_in_use = len(requested_workers) > 0

    # If there are workers, add the main process to the instance_map too.
    if workers_in_use:
@ -984,6 +1240,7 @@ def generate_worker_files(
        tls_cert_path=os.environ.get("SYNAPSE_TLS_CERT"),
        tls_key_path=os.environ.get("SYNAPSE_TLS_KEY"),
        using_unix_sockets=using_unix_sockets,
nginx_prometheus_metrics_service_discovery=nginx_prometheus_metrics_service_discovery,
    )

    # Supervisord config
@ -1084,15 +1341,20 @@ def main(args: list[str], environ: MutableMapping[str, str]) -> None:
    if not worker_types_env:
        # No workers, just the main process
        worker_types = []
-        requested_worker_types: dict[str, Any] = {}
+        requested_workers: list[Worker] = []
    else:
        # Split type names by comma, ignoring whitespace.
        worker_types = split_and_strip_string(worker_types_env, ",")
-        requested_worker_types = parse_worker_types(worker_types)
+        requested_workers = parse_worker_types(worker_types)
    # Always regenerate all other config files
    log("Generating worker config files")
-    generate_worker_files(environ, config_path, data_dir, requested_worker_types)
+    generate_worker_files(
+        environ=environ,
+        config_path=config_path,
+        data_dir=data_dir,
+        requested_workers=requested_workers,
+    )
    # Mark workers as being configured
    with open(mark_filepath, "w") as f:


@ -31,6 +31,25 @@ def flush_buffers() -> None:
    sys.stderr.flush()
def strtobool(val: str) -> bool:
    """Convert a string representation of truth to True or False

    True values are 'y', 'yes', 't', 'true', 'on', and '1'; false values
    are 'n', 'no', 'f', 'false', 'off', and '0'. Raises ValueError if
    'val' is anything else.

    This is lifted from distutils.util.strtobool, with the exception that it actually
    returns a bool, rather than an int.
    """
    val = val.lower()
    if val in ("y", "yes", "t", "true", "on", "1"):
        return True
    elif val in ("n", "no", "f", "false", "off", "0"):
        return False
    else:
        raise ValueError("invalid truth value %r" % (val,))
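
# Examples: strtobool("yes") and strtobool("1") return True; strtobool("off")
# returns False; anything else (e.g. strtobool("maybe")) raises ValueError.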
def convert(src: str, dst: str, environ: Mapping[str, object]) -> None:
    """Generate a file from a template
@ -98,19 +117,16 @@ def generate_config_from_template(
    os.mkdir(config_dir)

    # Convert SYNAPSE_NO_TLS to boolean if exists
if "SYNAPSE_NO_TLS" in environ: tlsanswerstring = environ.get("SYNAPSE_NO_TLS")
tlsanswerstring = str.lower(environ["SYNAPSE_NO_TLS"]) if tlsanswerstring is not None:
if tlsanswerstring in ("true", "on", "1", "yes"): try:
environ["SYNAPSE_NO_TLS"] = True environ["SYNAPSE_NO_TLS"] = strtobool(tlsanswerstring)
else: except ValueError:
if tlsanswerstring in ("false", "off", "0", "no"): error(
environ["SYNAPSE_NO_TLS"] = False 'Environment variable "SYNAPSE_NO_TLS" found but value "'
else: + tlsanswerstring
error( + '" unrecognized; exiting.'
'Environment variable "SYNAPSE_NO_TLS" found but value "' )
+ tlsanswerstring
+ '" unrecognized; exiting.'
)
if "SYNAPSE_LOG_CONFIG" not in environ: if "SYNAPSE_LOG_CONFIG" not in environ:
environ["SYNAPSE_LOG_CONFIG"] = config_dir + "/log.config" environ["SYNAPSE_LOG_CONFIG"] = config_dir + "/log.config"
@ -164,6 +180,18 @@ def run_generate_config(environ: Mapping[str, str], ownership: str | None) -> No
    config_dir = environ.get("SYNAPSE_CONFIG_DIR", "/data")
    config_path = environ.get("SYNAPSE_CONFIG_PATH", config_dir + "/homeserver.yaml")
    data_dir = environ.get("SYNAPSE_DATA_DIR", "/data")

    enable_metrics_raw = environ.get("SYNAPSE_ENABLE_METRICS", "0")
    enable_metrics = False
    if enable_metrics_raw is not None:
        try:
            enable_metrics = strtobool(enable_metrics_raw)
        except ValueError:
            error(
                'Environment variable "SYNAPSE_ENABLE_METRICS" found but value "'
                + enable_metrics_raw
                + '" unrecognized; exiting.'
            )
    # create a suitable log config from our template
    log_config_file = "%s/%s.log.config" % (config_dir, server_name)
@ -190,6 +218,9 @@ def run_generate_config(environ: Mapping[str, str], ownership: str | None) -> No
"--open-private-ports", "--open-private-ports",
] ]
if enable_metrics:
args.append("--enable-metrics")
if ownership is not None: if ownership is not None:
# make sure that synapse has perms to write to the data dir. # make sure that synapse has perms to write to the data dir.
log(f"Setting ownership on {data_dir} to {ownership}") log(f"Setting ownership on {data_dir} to {ownership}")

docs/.htmltest.yml

@ -0,0 +1,5 @@
# Configuration for htmltest, which we run in CI to check that links aren't broken in the built documentation.
# See all config options: https://github.com/wjdp/htmltest#wrench-configuration
# Don't check external links, as that requires network access and is slow.
CheckExternal: false


@ -88,6 +88,20 @@ is quarantined, Synapse will:
- Quarantine any existing cached remote media.
- Quarantine any future remote media.
## Downloading quarantined media
Normally, when media is quarantined, it will return a 404 error when downloaded.
Admins can bypass this by adding `?admin_unsafely_bypass_quarantine=true`
to the [normal download URL](https://spec.matrix.org/v1.16/client-server-api/#get_matrixclientv1mediadownloadservernamemediaid).
Bypassing the quarantine check is not recommended. Media is typically quarantined
to prevent harmful content from being served to users, which includes admins. Only
set the bypass parameter if you intentionally want to access potentially harmful
content.
Non-admin users cannot bypass quarantine checks, even when specifying the above
query parameter.
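
For instance, an admin could fetch a quarantined file like this (a sketch; the
homeserver name, media ID, and access token are placeholders):

```python
import urllib.request

# Placeholders: substitute your homeserver, media ID, and an admin access token.
url = (
    "https://homeserver.example/_matrix/client/v1/media"
    "/download/homeserver.example/abcdef123456"
    "?admin_unsafely_bypass_quarantine=true"
)
request = urllib.request.Request(
    url, headers={"Authorization": "Bearer <admin_access_token>"}
)
with urllib.request.urlopen(request) as response:
    media_bytes = response.read()
```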
## Quarantining media by ID

This API quarantines a single piece of local or remote media.

View file

@ -36,9 +36,10 @@ It returns a JSON body like the following:
- "scheduled" - Task is scheduled but not active - "scheduled" - Task is scheduled but not active
- "active" - Task is active and probably running, and if not will be run on next scheduler loop run - "active" - Task is active and probably running, and if not will be run on next scheduler loop run
- "complete" - Task has completed successfully - "complete" - Task has completed successfully
- "cancelled" - Task has been cancelled
- "failed" - Task is over and either returned a failed status, or had an exception - "failed" - Task is over and either returned a failed status, or had an exception
* `max_timestamp`: int - Is optional. Returns only the scheduled tasks with a timestamp inferior to the specified one. * `max_timestamp`: int - Is optional. Returns only the scheduled tasks with a timestamp (in milliseconds since the unix epoch) inferior to the specified one.
**Response**

View file

@ -505,6 +505,55 @@ with a body of:
}
```
## List room memberships of a user
Gets a list of room memberships for a specific `user_id`. This
endpoint differs from
[`GET /_synapse/admin/v1/users/<user_id>/joined_rooms`](#list-joined-rooms-of-a-user)
in that it returns rooms with memberships other than "join".
The API is:
```
GET /_synapse/admin/v1/users/<user_id>/memberships
```
A response body like the following is returned:
```json
{
"memberships": {
"!DuGcnbhHGaSZQoNQR:matrix.org": "join",
"!ZtSaPCawyWtxfWiIy:matrix.org": "leave",
}
}
```
which is a map of room IDs to the user's membership state in each room. This endpoint can
be used with both local and remote users, with the caveat that the homeserver will
only be aware of the memberships for rooms that one of its local users has joined.
Remote user memberships may also be out of date if all local users have since left
a room. The homeserver will thus no longer receive membership updates about it.
The list includes rooms that the user has since left; other membership states (knock,
invite, etc.) are also possible.
Note that rooms will only disappear from this list if they are
[purged](./rooms.md#delete-room-api) from the homeserver.
**Parameters**
The following parameters should be set in the URL:
- `user_id` - fully qualified: for example, `@user:server.com`.
**Response**
The following fields are returned in the JSON response body:
- `memberships` - A map of `room_id` (string) to `membership` state (string).
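
A quick way to exercise the endpoint (a sketch; the homeserver name, user ID, and
admin token are placeholders):

```python
import json
import urllib.request

# Placeholders: substitute your homeserver, a fully qualified user ID, and an
# admin access token.
request = urllib.request.Request(
    "https://homeserver.example/_synapse/admin/v1"
    "/users/@user:server.com/memberships",
    headers={"Authorization": "Bearer <admin_access_token>"},
)
with urllib.request.urlopen(request) as response:
    memberships = json.load(response)["memberships"]

for room_id, membership in memberships.items():
    print(room_id, membership)
```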
## List joined rooms of a user

Gets a list of all `room_id` that a specific `user_id` is joined to and is a member of (participating in).


@ -123,193 +123,21 @@ Example Prometheus target for Synapse with workers:
  static_configs:
    - targets: ["my.server.here:port"]
      labels:
-       instance: "my.server"
        job: "master"
        index: 1
    - targets: ["my.workerserver.here:port"]
      labels:
-       instance: "my.server"
        job: "generic_worker"
        index: 1
    - targets: ["my.workerserver.here:port"]
      labels:
-       instance: "my.server"
        job: "generic_worker"
        index: 2
    - targets: ["my.workerserver.here:port"]
      labels:
-       instance: "my.server"
        job: "media_repository"
        index: 1
```

-Labels (`instance`, `job`, `index`) can be defined as anything.
+Labels (`job`, `index`) can be defined as anything.
The labels are used to group graphs in grafana.
## Renaming of metrics & deprecation of old names in 1.2
Synapse 1.2 updates the Prometheus metrics to match the naming
convention of the upstream `prometheus_client`. The old names are
considered deprecated and will be removed in a future version of
Synapse.
**The old names will be disabled by default in Synapse v1.71.0 and removed
altogether in Synapse v1.73.0.**
| New Name | Old Name |
| ---------------------------------------------------------------------------- | ---------------------------------------------------------------------- |
| python_gc_objects_collected_total | python_gc_objects_collected |
| python_gc_objects_uncollectable_total | python_gc_objects_uncollectable |
| python_gc_collections_total | python_gc_collections |
| process_cpu_seconds_total | process_cpu_seconds |
| synapse_federation_client_sent_transactions_total | synapse_federation_client_sent_transactions |
| synapse_federation_client_events_processed_total | synapse_federation_client_events_processed |
| synapse_event_processing_loop_count_total | synapse_event_processing_loop_count |
| synapse_event_processing_loop_room_count_total | synapse_event_processing_loop_room_count |
| synapse_util_caches_cache_hits | synapse_util_caches_cache:hits |
| synapse_util_caches_cache_size | synapse_util_caches_cache:size |
| synapse_util_caches_cache_evicted_size | synapse_util_caches_cache:evicted_size |
| synapse_util_caches_cache | synapse_util_caches_cache:total |
| synapse_util_caches_response_cache_size | synapse_util_caches_response_cache:size |
| synapse_util_caches_response_cache_hits | synapse_util_caches_response_cache:hits |
| synapse_util_caches_response_cache_evicted_size | synapse_util_caches_response_cache:evicted_size |
| synapse_util_metrics_block_count_total | synapse_util_metrics_block_count |
| synapse_util_metrics_block_time_seconds_total | synapse_util_metrics_block_time_seconds |
| synapse_util_metrics_block_ru_utime_seconds_total | synapse_util_metrics_block_ru_utime_seconds |
| synapse_util_metrics_block_ru_stime_seconds_total | synapse_util_metrics_block_ru_stime_seconds |
| synapse_util_metrics_block_db_txn_count_total | synapse_util_metrics_block_db_txn_count |
| synapse_util_metrics_block_db_txn_duration_seconds_total | synapse_util_metrics_block_db_txn_duration_seconds |
| synapse_util_metrics_block_db_sched_duration_seconds_total | synapse_util_metrics_block_db_sched_duration_seconds |
| synapse_background_process_start_count_total | synapse_background_process_start_count |
| synapse_background_process_ru_utime_seconds_total | synapse_background_process_ru_utime_seconds |
| synapse_background_process_ru_stime_seconds_total | synapse_background_process_ru_stime_seconds |
| synapse_background_process_db_txn_count_total | synapse_background_process_db_txn_count |
| synapse_background_process_db_txn_duration_seconds_total | synapse_background_process_db_txn_duration_seconds |
| synapse_background_process_db_sched_duration_seconds_total | synapse_background_process_db_sched_duration_seconds |
| synapse_storage_events_persisted_events_total | synapse_storage_events_persisted_events |
| synapse_storage_events_persisted_events_sep_total | synapse_storage_events_persisted_events_sep |
| synapse_storage_events_state_delta_total | synapse_storage_events_state_delta |
| synapse_storage_events_state_delta_single_event_total | synapse_storage_events_state_delta_single_event |
| synapse_storage_events_state_delta_reuse_delta_total | synapse_storage_events_state_delta_reuse_delta |
| synapse_federation_server_received_pdus_total | synapse_federation_server_received_pdus |
| synapse_federation_server_received_edus_total | synapse_federation_server_received_edus |
| synapse_handler_presence_notified_presence_total | synapse_handler_presence_notified_presence |
| synapse_handler_presence_federation_presence_out_total | synapse_handler_presence_federation_presence_out |
| synapse_handler_presence_presence_updates_total | synapse_handler_presence_presence_updates |
| synapse_handler_presence_timers_fired_total | synapse_handler_presence_timers_fired |
| synapse_handler_presence_federation_presence_total | synapse_handler_presence_federation_presence |
| synapse_handler_presence_bump_active_time_total | synapse_handler_presence_bump_active_time |
| synapse_federation_client_sent_edus_total | synapse_federation_client_sent_edus |
| synapse_federation_client_sent_pdu_destinations_count_total | synapse_federation_client_sent_pdu_destinations:count |
| synapse_federation_client_sent_pdu_destinations_total | synapse_federation_client_sent_pdu_destinations:total |
| synapse_handlers_appservice_events_processed_total | synapse_handlers_appservice_events_processed |
| synapse_notifier_notified_events_total | synapse_notifier_notified_events |
| synapse_push_bulk_push_rule_evaluator_push_rules_invalidation_counter_total | synapse_push_bulk_push_rule_evaluator_push_rules_invalidation_counter |
| synapse_push_bulk_push_rule_evaluator_push_rules_state_size_counter_total | synapse_push_bulk_push_rule_evaluator_push_rules_state_size_counter |
| synapse_http_httppusher_http_pushes_processed_total | synapse_http_httppusher_http_pushes_processed |
| synapse_http_httppusher_http_pushes_failed_total | synapse_http_httppusher_http_pushes_failed |
| synapse_http_httppusher_badge_updates_processed_total | synapse_http_httppusher_badge_updates_processed |
| synapse_http_httppusher_badge_updates_failed_total | synapse_http_httppusher_badge_updates_failed |
| synapse_admin_mau_current | synapse_admin_mau:current |
| synapse_admin_mau_max | synapse_admin_mau:max |
| synapse_admin_mau_registered_reserved_users | synapse_admin_mau:registered_reserved_users |
Removal of deprecated metrics & time based counters becoming histograms in 0.31.0
---------------------------------------------------------------------------------
The duplicated metrics deprecated in Synapse 0.27.0 have been removed.
All time duration-based metrics have been changed to be seconds. This
affects:
| msec -> sec metrics |
| -------------------------------------- |
| python_gc_time |
| python_twisted_reactor_tick_time |
| synapse_storage_query_time |
| synapse_storage_schedule_time |
| synapse_storage_transaction_time |
Several metrics have been changed to be histograms, which sort entries
into buckets and allow better analysis. The following metrics are now
histograms:
| Altered metrics |
| ------------------------------------------------ |
| python_gc_time |
| python_twisted_reactor_pending_calls |
| python_twisted_reactor_tick_time |
| synapse_http_server_response_time_seconds |
| synapse_storage_query_time |
| synapse_storage_schedule_time |
| synapse_storage_transaction_time |
Block and response metrics renamed for 0.27.0
---------------------------------------------
Synapse 0.27.0 begins the process of rationalising the duplicate
`*:count` metrics reported for the resource tracking for code blocks and
HTTP requests.
At the same time, the corresponding `*:total` metrics are being renamed,
as the `:total` suffix no longer makes sense in the absence of a
corresponding `:count` metric.
To enable a graceful migration path, this release just adds new names
for the metrics being renamed. A future release will remove the old
ones.
The following table shows the new metrics, and the old metrics which
they are replacing.
| New name | Old name |
| ------------------------------------------------------------- | ---------------------------------------------------------- |
| synapse_util_metrics_block_count | synapse_util_metrics_block_timer:count |
| synapse_util_metrics_block_count | synapse_util_metrics_block_ru_utime:count |
| synapse_util_metrics_block_count | synapse_util_metrics_block_ru_stime:count |
| synapse_util_metrics_block_count | synapse_util_metrics_block_db_txn_count:count |
| synapse_util_metrics_block_count | synapse_util_metrics_block_db_txn_duration:count |
| synapse_util_metrics_block_time_seconds | synapse_util_metrics_block_timer:total |
| synapse_util_metrics_block_ru_utime_seconds | synapse_util_metrics_block_ru_utime:total |
| synapse_util_metrics_block_ru_stime_seconds | synapse_util_metrics_block_ru_stime:total |
| synapse_util_metrics_block_db_txn_count | synapse_util_metrics_block_db_txn_count:total |
| synapse_util_metrics_block_db_txn_duration_seconds | synapse_util_metrics_block_db_txn_duration:total |
| synapse_http_server_response_count | synapse_http_server_requests |
| synapse_http_server_response_count | synapse_http_server_response_time:count |
| synapse_http_server_response_count | synapse_http_server_response_ru_utime:count |
| synapse_http_server_response_count | synapse_http_server_response_ru_stime:count |
| synapse_http_server_response_count | synapse_http_server_response_db_txn_count:count |
| synapse_http_server_response_count | synapse_http_server_response_db_txn_duration:count |
| synapse_http_server_response_time_seconds | synapse_http_server_response_time:total |
| synapse_http_server_response_ru_utime_seconds | synapse_http_server_response_ru_utime:total |
| synapse_http_server_response_ru_stime_seconds | synapse_http_server_response_ru_stime:total |
| synapse_http_server_response_db_txn_count | synapse_http_server_response_db_txn_count:total |
| synapse_http_server_response_db_txn_duration_seconds | synapse_http_server_response_db_txn_duration:total |
Standard Metric Names
---------------------
As of synapse version 0.18.2, the format of the process-wide metrics has
been changed to fit prometheus standard naming conventions. Additionally
the units have been changed to seconds, from milliseconds.
| New name | Old name |
| ---------------------------------------- | --------------------------------- |
| process_cpu_user_seconds_total | process_resource_utime / 1000 |
| process_cpu_system_seconds_total | process_resource_stime / 1000 |
| process_open_fds (no \'type\' label) | process_fds |
The python-specific counts of garbage collector performance have been
renamed.
| New name | Old name |
| -------------------------------- | -------------------------- |
| python_gc_time | reactor_gc_time |
| python_gc_unreachable_total | reactor_gc_unreachable |
| python_gc_counts | reactor_gc_counts |
The twisted-specific reactor metrics have been renamed.
| New name | Old name |
| -------------------------------------- | ----------------------- |
| python_twisted_reactor_pending_calls | reactor_pending_calls |
| python_twisted_reactor_tick_time | reactor_tick_time |


@ -50,6 +50,11 @@ setting in your configuration file.
See the [configuration manual](usage/configuration/config_documentation.md#oidc_providers) for some sample settings, as well as
the text below for example configurations for specific providers.
For setups using [`.well-known` delegation](delegate.md), make sure
[`public_baseurl`](usage/configuration/config_documentation.md#public_baseurl) is set
appropriately. If unset, Synapse defaults to `https://<server_name>/` which is used in
the OIDC callback URL.
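
For illustration, the redirect URI that Synapse advertises to the identity provider
follows from `public_baseurl` (a sketch; it assumes the usual Synapse OIDC callback
path of `/_synapse/client/oidc/callback`):

```python
# Sketch: how the OIDC redirect URI is derived from public_baseurl.
public_baseurl = "https://matrix.example.com/"  # placeholder
oidc_callback_url = public_baseurl.rstrip("/") + "/_synapse/client/oidc/callback"
print(oidc_callback_url)  # https://matrix.example.com/_synapse/client/oidc/callback
```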
## OIDC Back-Channel Logout

Synapse supports receiving [OpenID Connect Back-Channel Logout](https://openid.net/specs/openid-connect-backchannel-1_0.html) notifications.


@ -24,14 +24,18 @@
server_name: "SERVERNAME" server_name: "SERVERNAME"
pid_file: DATADIR/homeserver.pid pid_file: DATADIR/homeserver.pid
listeners: listeners:
- port: 8008 - bind_addresses:
- ::1
- 127.0.0.1
port: 8008
resources:
- compress: false
names:
- client
- federation
tls: false tls: false
type: http type: http
x_forwarded: true x_forwarded: true
bind_addresses: ['::1', '127.0.0.1']
resources:
- names: [client, federation]
compress: false
database: database:
name: sqlite3 name: sqlite3
args: args:


@ -117,6 +117,22 @@ each upgrade are complete before moving on to the next upgrade, to avoid
stacking them up. You can monitor the currently running background updates with
[the Admin API](usage/administration/admin_api/background_updates.html#status).
# Upgrading to v1.146.0
## Drop support for Ubuntu 25.04 Plucky Puffin, and add support for 25.10 Questing Quokka
Ubuntu 25.04 Plucky Puffin [is end-of-life as of 17 Jan
2026](https://endoflife.date/ubuntu). This release drops support for Ubuntu
25.04, and in its place adds support for Ubuntu 25.10 Questing Quokka.
## Removal of MSC2697 (Legacy) Dehydrated devices
The endpoints for
[MSC2697](https://github.com/matrix-org/matrix-spec-proposals/pull/2697) have now
been removed, since the MSC is closed. Developers who rely on this feature should
migrate to [MSC3814](https://github.com/matrix-org/matrix-spec-proposals/pull/3814)
which introduces support for a newer version of dehydrated devices.
# Upgrading to v1.144.0
## Worker support for unstable MSC4140 `/restart` endpoint
@ -828,7 +844,7 @@ the names of Prometheus metrics.
If you want to test your changes before legacy names are disabled by default, If you want to test your changes before legacy names are disabled by default,
you may specify `enable_legacy_metrics: false` in your homeserver configuration. you may specify `enable_legacy_metrics: false` in your homeserver configuration.
A list of affected metrics is available on the [Metrics How-to page](https://element-hq.github.io/synapse/v1.69/metrics-howto.html?highlight=metrics%20deprecated#renaming-of-metrics--deprecation-of-old-names-in-12). A list of affected metrics is available on the [Metrics How-to page](https://element-hq.github.io/synapse/v1.69/metrics-howto.html#renaming-of-metrics--deprecation-of-old-names-in-12).
## Deprecation of the `generate_short_term_login_token` module API method
@ -2423,7 +2439,7 @@ back to v1.3.1, subject to the following:
Some counter metrics have been renamed, with the old names deprecated.
See [the metrics documentation](https://element-hq.github.io/synapse/v1.69/metrics-howto.html#renaming-of-metrics--deprecation-of-old-names-in-12)
for details.
# Upgrading to v1.1.0


@ -2041,6 +2041,25 @@ rc_room_creation:
  burst_count: 5.0
```
---
### `rc_user_directory`
*(object)* This option allows admins to ratelimit searches in the user directory.
_Added in Synapse 1.145.0._
This setting has the following sub-options:
* `per_second` (number): Maximum number of requests a client can send per second.
* `burst_count` (number): Maximum number of requests a client can send before being throttled.
Default configuration:
```yaml
rc_user_directory:
per_second: 0.016
burst_count: 200.0
```
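As with Synapse's other `rc_*` settings, these two values describe a token-bucket-style limiter: `burst_count` caps how many requests can be made back-to-back, and `per_second` is the sustained refill rate. A standalone toy sketch of those semantics (not Synapse's actual ratelimiter):

```python
import time

class TokenBucket:
    """Toy limiter: capacity burst_count, refilled at per_second tokens/s."""

    def __init__(self, per_second: float, burst_count: float) -> None:
        self.per_second = per_second
        self.burst_count = burst_count
        self.tokens = burst_count
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        self.tokens = min(
            self.burst_count, self.tokens + (now - self.last) * self.per_second
        )
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# With the defaults above, a client can issue up to 200 searches in a burst,
# then roughly one further search per minute thereafter (1 / 0.016 ≈ 62s).
limiter = TokenBucket(per_second=0.016, burst_count=200.0)
assert limiter.allow()
```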
---
### `federation_rr_transactions_per_room_per_second`
*(integer)* Sets outgoing federation transaction frequency for sending read-receipts, per-room.
@ -2092,6 +2111,16 @@ Example configuration:
enable_media_repo: false
```
---
### `enable_local_media_storage`
*(boolean)* Enable the local on-disk media storage provider. When disabled, media is stored only in configured `media_storage_providers` and temporary files are used for processing.
**Warning:** If this option is set to `false` and no `media_storage_providers` are configured, all media requests will return 404 errors as there will be no storage backend available. Defaults to `true`.
Example configuration:
```yaml
enable_local_media_storage: false
```
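To illustrate the warning above, here is a rough sketch of the resulting lookup order (a hypothetical helper, not Synapse's media code):

```python
def usable_backends(
    enable_local_media_storage: bool, media_storage_providers: list[str]
) -> list[str]:
    """Which storage backends can serve a media request."""
    backends = list(media_storage_providers)
    if enable_local_media_storage:
        backends.insert(0, "local disk (media_store_path)")
    return backends

# Disabling local storage with no providers configured leaves nothing to
# serve media from, hence the 404s mentioned in the warning:
assert usable_backends(False, []) == []
assert usable_backends(False, ["s3"]) == ["s3"]
```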
---
### `media_store_path`
*(string)* Directory where uploaded images and attachments are stored. Defaults to `"media_store"`.


@ -9,27 +9,18 @@ point to additional JS/CSS in this directory that are added on each page load. I
 addition, the `theme` directory contains files that overwrite their counterparts in
 each of the default themes included with mdbook.
-Currently we use these files to generate a floating Table of Contents panel. The code for
-which was partially taken from
-[JorelAli/mdBook-pagetoc](https://github.com/JorelAli/mdBook-pagetoc/)
-before being modified such that it scrolls with the content of the page. This is handled
-by the `table-of-contents.js/css` files. The table of contents panel only appears on pages
-that have more than one header, as well as only appearing on desktop-sized monitors.
-We remove the navigation arrows which typically appear on the left and right side of the
-screen on desktop as they interfere with the table of contents. This is handled by
-the `remove-nav-buttons.css` file.
-Finally, we also stylise the chapter titles in the left sidebar by indenting them
-slightly so that they are more visually distinguishable from the section headers
-(the bold titles). This is done through the `indent-section-headers.css` file.
-In addition to these modifications, we have added a version picker to the documentation.
-Users can switch between documentations for different versions of Synapse.
-This functionality was implemented through the `version-picker.js` and
-`version-picker.css` files.
+Currently we use these files to make a few modifications:
+* We stylise the chapter titles in the left sidebar by indenting them
+  slightly so that they are more visually distinguishable from the section headers
+  (the bold titles). This is done through the `indent-section-headers.css` file.
+* We add a version picker pertaining to the different documentation versions
+  shipped with each version of Synapse. This functionality was implemented through
+  the `version-picker.js` and `version-picker.css` files, and is currently the only
+  requirement for the custom `theme/`.
 More information can be found in mdbook's official documentation for
 [injecting page JS/CSS](https://rust-lang.github.io/mdBook/format/config.html)
 and
 [customising the default themes](https://rust-lang.github.io/mdBook/format/theme/index.html).


@ -1,8 +0,0 @@
/* Remove the prev, next chapter buttons as they interfere with the
* table of contents.
* Note that the table of contents only appears on desktop, thus we
* only remove the desktop (wide) chapter buttons.
*/
.nav-wide-wrapper {
display: none
}


@ -1,47 +0,0 @@
:root {
--pagetoc-width: 250px;
}
@media only screen and (max-width:1439px) {
.sidetoc {
display: none;
}
}
@media only screen and (min-width:1440px) {
main {
position: relative;
margin-left: 100px !important;
margin-right: var(--pagetoc-width) !important;
}
.sidetoc {
margin-left: auto;
margin-right: auto;
left: calc(100% + (var(--content-max-width))/4 - 140px);
position: absolute;
text-align: right;
}
.pagetoc {
position: fixed;
width: var(--pagetoc-width);
overflow: auto;
right: 20px;
height: calc(100% - var(--menu-bar-height));
}
.pagetoc a {
color: var(--fg) !important;
display: block;
padding: 5px 15px 5px 10px;
text-align: left;
text-decoration: none;
}
.pagetoc a:hover,
.pagetoc a.active {
background: var(--sidebar-bg) !important;
color: var(--sidebar-fg) !important;
}
.pagetoc .active {
background: var(--sidebar-bg);
color: var(--sidebar-fg);
}
}


@ -1,148 +0,0 @@
const getPageToc = () => document.getElementsByClassName('pagetoc')[0];
const pageToc = getPageToc();
const pageTocChildren = [...pageToc.children];
const headers = [...document.getElementsByClassName('header')];
// Select highlighted item in ToC when clicking an item
pageTocChildren.forEach(child => {
child.addEventListener('click', () => {
pageTocChildren.forEach(child => {
child.classList.remove('active');
});
child.classList.add('active');
});
});
/**
* Test whether a node is in the viewport
*/
function isInViewport(node) {
const rect = node.getBoundingClientRect();
return rect.top >= 0 && rect.left >= 0 && rect.bottom <= (window.innerHeight || document.documentElement.clientHeight) && rect.right <= (window.innerWidth || document.documentElement.clientWidth);
}
/**
* Set a new ToC entry.
* Clear any previously highlighted ToC items, set the new one,
* and adjust the ToC scroll position.
*/
function setTocEntry() {
let activeEntry;
const pageTocChildren = [...getPageToc().children];
// Calculate which header is the current one at the top of screen
headers.forEach(header => {
if (window.pageYOffset >= header.offsetTop) {
activeEntry = header;
}
});
// Update selected item in ToC when scrolling
pageTocChildren.forEach(child => {
if (activeEntry.href.localeCompare(child.href) === 0) {
child.classList.add('active');
} else {
child.classList.remove('active');
}
});
let tocEntryForLocation = document.querySelector(`nav a[href="${activeEntry.href}"]`);
if (tocEntryForLocation) {
const headingForLocation = document.querySelector(activeEntry.hash);
if (headingForLocation && isInViewport(headingForLocation)) {
// Update ToC scroll
const nav = getPageToc();
const content = document.querySelector('html');
if (content.scrollTop !== 0) {
nav.scrollTo({
top: tocEntryForLocation.offsetTop - 100,
left: 0,
behavior: 'smooth',
});
} else {
nav.scrollTop = 0;
}
}
}
}
/**
* Populate sidebar on load
*/
window.addEventListener('load', () => {
// Prevent rendering the table of contents of the "print book" page, as it
// will end up being rendered into the output (in a broken-looking way)
// Get the name of the current page (i.e. 'print.html')
const pageNameExtension = window.location.pathname.split('/').pop();
// Split off the extension (as '.../print' is also a valid page name), which
// should result in 'print'
const pageName = pageNameExtension.split('.')[0];
if (pageName === "print") {
// Don't render the table of contents on this page
return;
}
// Only create table of contents if there is more than one header on the page
if (headers.length <= 1) {
return;
}
// Create an entry in the page table of contents for each header in the document
headers.forEach((header, index) => {
const link = document.createElement('a');
// Indent shows hierarchy
let indent = '0px';
switch (header.parentElement.tagName) {
case 'H1':
indent = '5px';
break;
case 'H2':
indent = '20px';
break;
case 'H3':
indent = '30px';
break;
case 'H4':
indent = '40px';
break;
case 'H5':
indent = '50px';
break;
case 'H6':
indent = '60px';
break;
default:
break;
}
let tocEntry;
if (index == 0) {
// Create a bolded title for the first element
tocEntry = document.createElement("strong");
tocEntry.innerHTML = header.text;
} else {
// All other elements are non-bold
tocEntry = document.createTextNode(header.text);
}
link.appendChild(tocEntry);
link.style.paddingLeft = indent;
link.href = header.href;
pageToc.appendChild(link);
});
setTocEntry.call();
});
// Handle active headers on scroll, if there is more than one header on the page
if (headers.length > 1) {
window.addEventListener('scroll', setTocEntry);
}


@ -1,11 +1,11 @@
<!DOCTYPE HTML> <!DOCTYPE HTML>
<html lang="{{ language }}" class="sidebar-visible no-js {{ default_theme }}"> <html lang="{{ language }}" class="{{ default_theme }} sidebar-visible" dir="{{ text_direction }}">
<head> <head>
<!-- Book generated using mdBook --> <!-- Book generated using mdBook -->
<meta charset="UTF-8"> <meta charset="UTF-8">
<title>{{ title }}</title> <title>{{ title }}</title>
{{#if is_print }} {{#if is_print }}
<meta name="robots" content="noindex" /> <meta name="robots" content="noindex">
{{/if}} {{/if}}
{{#if base_url}} {{#if base_url}}
<base href="{{ base_url }}"> <base href="{{ base_url }}">
@ -15,60 +15,78 @@
<!-- Custom HTML head --> <!-- Custom HTML head -->
{{> head}} {{> head}}
<meta content="text/html; charset=utf-8" http-equiv="Content-Type">
<meta name="description" content="{{ description }}"> <meta name="description" content="{{ description }}">
<meta name="viewport" content="width=device-width, initial-scale=1"> <meta name="viewport" content="width=device-width, initial-scale=1">
<meta name="theme-color" content="#ffffff" /> <meta name="theme-color" content="#ffffff">
{{#if favicon_svg}} {{#if favicon_svg}}
<link rel="icon" href="{{ path_to_root }}favicon.svg"> <link rel="icon" href="{{ resource "favicon.svg" }}">
{{/if}} {{/if}}
{{#if favicon_png}} {{#if favicon_png}}
<link rel="shortcut icon" href="{{ path_to_root }}favicon.png"> <link rel="shortcut icon" href="{{ resource "favicon.png" }}">
{{/if}} {{/if}}
<link rel="stylesheet" href="{{ path_to_root }}css/variables.css"> <link rel="stylesheet" href="{{ resource "css/variables.css" }}">
<link rel="stylesheet" href="{{ path_to_root }}css/general.css"> <link rel="stylesheet" href="{{ resource "css/general.css" }}">
<link rel="stylesheet" href="{{ path_to_root }}css/chrome.css"> <link rel="stylesheet" href="{{ resource "css/chrome.css" }}">
{{#if print_enable}} {{#if print_enable}}
<link rel="stylesheet" href="{{ path_to_root }}css/print.css" media="print"> <link rel="stylesheet" href="{{ resource "css/print.css" }}" media="print">
{{/if}} {{/if}}
<!-- Fonts --> <!-- Fonts -->
<link rel="stylesheet" href="{{ path_to_root }}FontAwesome/css/font-awesome.css"> <link rel="stylesheet" href="{{ resource "fonts/fonts.css" }}">
{{#if copy_fonts}}
<link rel="stylesheet" href="{{ path_to_root }}fonts/fonts.css">
{{/if}}
<!-- Highlight.js Stylesheets --> <!-- Highlight.js Stylesheets -->
<link rel="stylesheet" href="{{ path_to_root }}highlight.css"> <link rel="stylesheet" id="mdbook-highlight-css" href="{{ resource "highlight.css" }}">
<link rel="stylesheet" href="{{ path_to_root }}tomorrow-night.css"> <link rel="stylesheet" id="mdbook-tomorrow-night-css" href="{{ resource "tomorrow-night.css" }}">
<link rel="stylesheet" href="{{ path_to_root }}ayu-highlight.css"> <link rel="stylesheet" id="mdbook-ayu-highlight-css" href="{{ resource "ayu-highlight.css" }}">
<!-- Custom theme stylesheets --> <!-- Custom theme stylesheets -->
{{#each additional_css}} {{#each additional_css}}
<link rel="stylesheet" href="{{ ../path_to_root }}{{ this }}"> <link rel="stylesheet" href="{{ resource this }}">
{{/each}} {{/each}}
{{#if mathjax_support}} {{#if mathjax_support}}
<!-- MathJax --> <!-- MathJax -->
<script async type="text/javascript" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS-MML_HTMLorMML"></script> <script async src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS-MML_HTMLorMML"></script>
{{/if}} {{/if}}
<!-- Provide site root and default themes to javascript -->
<script>
const path_to_root = "{{ path_to_root }}";
const default_light_theme = "{{ default_theme }}";
const default_dark_theme = "{{ preferred_dark_theme }}";
{{#if search_js}}
window.path_to_searchindex_js = "{{ resource "searchindex.js" }}";
{{/if}}
</script>
<!-- Start loading toc.js asap -->
<script src="{{ resource "toc.js" }}"></script>
</head> </head>
<body> <body>
<!-- Provide site root to javascript --> <div id="mdbook-help-container">
<script type="text/javascript"> <div id="mdbook-help-popup">
var path_to_root = "{{ path_to_root }}"; <h2 class="mdbook-help-title">Keyboard shortcuts</h2>
var default_theme = window.matchMedia("(prefers-color-scheme: dark)").matches ? "{{ preferred_dark_theme }}" : "{{ default_theme }}"; <div>
</script> <p>Press <kbd>←</kbd> or <kbd>→</kbd> to navigate between chapters</p>
{{#if search_enabled}}
<p>Press <kbd>S</kbd> or <kbd>/</kbd> to search in the book</p>
{{/if}}
<p>Press <kbd>?</kbd> to show this help</p>
<p>Press <kbd>Esc</kbd> to hide this help</p>
</div>
</div>
</div>
<div id="mdbook-body-container">
<!-- Work around some values being stored in localStorage wrapped in quotes --> <!-- Work around some values being stored in localStorage wrapped in quotes -->
<script type="text/javascript"> <script>
try { try {
var theme = localStorage.getItem('mdbook-theme'); let theme = localStorage.getItem('mdbook-theme');
var sidebar = localStorage.getItem('mdbook-sidebar'); let sidebar = localStorage.getItem('mdbook-sidebar');
if (theme.startsWith('"') && theme.endsWith('"')) { if (theme.startsWith('"') && theme.endsWith('"')) {
localStorage.setItem('mdbook-theme', theme.slice(1, theme.length - 1)); localStorage.setItem('mdbook-theme', theme.slice(1, theme.length - 1));
} }
if (sidebar.startsWith('"') && sidebar.endsWith('"')) { if (sidebar.startsWith('"') && sidebar.endsWith('"')) {
localStorage.setItem('mdbook-sidebar', sidebar.slice(1, sidebar.length - 1)); localStorage.setItem('mdbook-sidebar', sidebar.slice(1, sidebar.length - 1));
} }
@ -76,91 +94,107 @@
</script> </script>
<!-- Set the theme before any content is loaded, prevents flash --> <!-- Set the theme before any content is loaded, prevents flash -->
<script type="text/javascript"> <script>
var theme; const default_theme = window.matchMedia("(prefers-color-scheme: dark)").matches ? default_dark_theme : default_light_theme;
let theme;
try { theme = localStorage.getItem('mdbook-theme'); } catch(e) { } try { theme = localStorage.getItem('mdbook-theme'); } catch(e) { }
if (theme === null || theme === undefined) { theme = default_theme; } if (theme === null || theme === undefined) { theme = default_theme; }
var html = document.querySelector('html'); const html = document.documentElement;
html.classList.remove('no-js')
html.classList.remove('{{ default_theme }}') html.classList.remove('{{ default_theme }}')
html.classList.add(theme); html.classList.add(theme);
html.classList.add('js'); html.classList.add("js");
</script> </script>
<input type="checkbox" id="mdbook-sidebar-toggle-anchor" class="hidden">
<!-- Hide / unhide sidebar before it is displayed --> <!-- Hide / unhide sidebar before it is displayed -->
<script type="text/javascript"> <script>
var html = document.querySelector('html'); let sidebar = null;
var sidebar = 'hidden'; const sidebar_toggle = document.getElementById("mdbook-sidebar-toggle-anchor");
if (document.body.clientWidth >= 1080) { if (document.body.clientWidth >= 1080) {
try { sidebar = localStorage.getItem('mdbook-sidebar'); } catch(e) { } try { sidebar = localStorage.getItem('mdbook-sidebar'); } catch(e) { }
sidebar = sidebar || 'visible'; sidebar = sidebar || 'visible';
} else {
sidebar = 'hidden';
sidebar_toggle.checked = false;
}
if (sidebar === 'visible') {
sidebar_toggle.checked = true;
} else {
html.classList.remove('sidebar-visible');
} }
html.classList.remove('sidebar-visible');
html.classList.add("sidebar-" + sidebar);
</script> </script>
<nav id="sidebar" class="sidebar" aria-label="Table of contents"> <nav id="mdbook-sidebar" class="sidebar" aria-label="Table of contents">
<div class="sidebar-scrollbox"> <!-- populated by js -->
{{#toc}}{{/toc}} <mdbook-sidebar-scrollbox class="sidebar-scrollbox"></mdbook-sidebar-scrollbox>
<noscript>
<iframe class="sidebar-iframe-outer" src="{{ path_to_root }}toc.html"></iframe>
</noscript>
<div id="mdbook-sidebar-resize-handle" class="sidebar-resize-handle">
<div class="sidebar-resize-indicator"></div>
</div> </div>
<div id="sidebar-resize-handle" class="sidebar-resize-handle"></div>
</nav> </nav>
<div id="page-wrapper" class="page-wrapper"> <div id="mdbook-page-wrapper" class="page-wrapper">
<div class="page"> <div class="page">
{{> header}} {{> header}}
<div id="menu-bar-hover-placeholder"></div> <div id="mdbook-menu-bar-hover-placeholder"></div>
<div id="menu-bar" class="menu-bar sticky bordered"> <div id="mdbook-menu-bar" class="menu-bar sticky">
<div class="left-buttons"> <div class="left-buttons">
<button id="sidebar-toggle" class="icon-button" type="button" title="Toggle Table of Contents" aria-label="Toggle Table of Contents" aria-controls="sidebar"> <label id="mdbook-sidebar-toggle" class="icon-button" for="mdbook-sidebar-toggle-anchor" title="Toggle Table of Contents" aria-label="Toggle Table of Contents" aria-controls="mdbook-sidebar">
<i class="fa fa-bars"></i> {{fa "solid" "bars"}}
</label>
<button id="mdbook-theme-toggle" class="icon-button" type="button" title="Change theme" aria-label="Change theme" aria-haspopup="true" aria-expanded="false" aria-controls="mdbook-theme-list">
{{fa "solid" "paintbrush"}}
</button> </button>
<button id="theme-toggle" class="icon-button" type="button" title="Change theme" aria-label="Change theme" aria-haspopup="true" aria-expanded="false" aria-controls="theme-list"> <ul id="mdbook-theme-list" class="theme-popup" aria-label="Themes" role="menu">
<i class="fa fa-paint-brush"></i> <li role="none"><button role="menuitem" class="theme" id="mdbook-theme-default_theme">Auto</button></li>
</button> <li role="none"><button role="menuitem" class="theme" id="mdbook-theme-light">Light</button></li>
<ul id="theme-list" class="theme-popup" aria-label="Themes" role="menu"> <li role="none"><button role="menuitem" class="theme" id="mdbook-theme-rust">Rust</button></li>
<li role="none"><button role="menuitem" class="theme" id="light">{{ theme_option "Light" }}</button></li> <li role="none"><button role="menuitem" class="theme" id="mdbook-theme-coal">Coal</button></li>
<li role="none"><button role="menuitem" class="theme" id="rust">{{ theme_option "Rust" }}</button></li> <li role="none"><button role="menuitem" class="theme" id="mdbook-theme-navy">Navy</button></li>
<li role="none"><button role="menuitem" class="theme" id="coal">{{ theme_option "Coal" }}</button></li> <li role="none"><button role="menuitem" class="theme" id="mdbook-theme-ayu">Ayu</button></li>
<li role="none"><button role="menuitem" class="theme" id="navy">{{ theme_option "Navy" }}</button></li>
<li role="none"><button role="menuitem" class="theme" id="ayu">{{ theme_option "Ayu" }}</button></li>
</ul> </ul>
{{#if search_enabled}} {{#if search_enabled}}
<button id="search-toggle" class="icon-button" type="button" title="Search. (Shortkey: s)" aria-label="Toggle Searchbar" aria-expanded="false" aria-keyshortcuts="S" aria-controls="searchbar"> <button id="mdbook-search-toggle" class="icon-button" type="button" title="Search (`/`)" aria-label="Toggle Searchbar" aria-expanded="false" aria-keyshortcuts="/ s" aria-controls="mdbook-searchbar">
<i class="fa fa-search"></i> {{fa "solid" "magnifying-glass"}}
</button> </button>
{{/if}} {{/if}}
<div class="version-picker"> </div>
<div class="dropdown">
<div class="select"> <!-- BEGIN CUSTOM SYNAPSE MODIFICATIONS -->
<span></span> <div class="version-picker">
<i class="fa fa-chevron-down"></i> <div class="dropdown">
</div> <div class="select">
<input type="hidden" name="version"> <span></span>
<ul class="dropdown-menu"> <i class="fa fa-chevron-down"></i>
<!-- Versions will be added dynamically in version-picker.js -->
</ul>
</div> </div>
<input type="hidden" name="version">
<ul class="dropdown-menu">
<!-- Versions will be added dynamically in version-picker.js -->
</ul>
</div> </div>
</div> </div>
<!-- END CUSTOM SYNAPSE MODIFICATIONS -->
<h1 class="menu-title">{{ book_title }}</h1> <h1 class="menu-title">{{ book_title }}</h1>
<div class="right-buttons"> <div class="right-buttons">
{{#if print_enable}} {{#if print_enable}}
<a href="{{ path_to_root }}print.html" title="Print this book" aria-label="Print this book"> <a href="{{ path_to_root }}print.html" title="Print this book" aria-label="Print this book">
<i id="print-button" class="fa fa-print"></i> {{fa "solid" "print" "print-button"}}
</a> </a>
{{/if}} {{/if}}
{{#if git_repository_url}} {{#if git_repository_url}}
<a href="{{git_repository_url}}" title="Git repository" aria-label="Git repository"> <a href="{{git_repository_url}}" title="Git repository" aria-label="Git repository">
<i id="git-repository-button" class="fa {{git_repository_icon}}"></i> {{fa git_repository_icon_class git_repository_icon}}
</a> </a>
{{/if}} {{/if}}
{{#if git_repository_edit_url}} {{#if git_repository_edit_url}}
<a href="{{git_repository_edit_url}}" title="Suggest an edit" aria-label="Suggest an edit"> <a href="{{git_repository_edit_url}}" title="Suggest an edit" aria-label="Suggest an edit" rel="edit">
<i id="git-edit-button" class="fa fa-edit"></i> {{fa "solid" "pencil" "git-edit-button"}}
</a> </a>
{{/if}} {{/if}}
@ -168,50 +202,58 @@
</div> </div>
{{#if search_enabled}} {{#if search_enabled}}
<div id="search-wrapper" class="hidden"> <div id="mdbook-search-wrapper" class="hidden">
<form id="searchbar-outer" class="searchbar-outer"> <form id="mdbook-searchbar-outer" class="searchbar-outer">
<input type="search" id="searchbar" name="searchbar" placeholder="Search this book ..." aria-controls="searchresults-outer" aria-describedby="searchresults-header"> <div class="search-wrapper">
<input type="search" id="mdbook-searchbar" name="searchbar" placeholder="Search this book ..." aria-controls="mdbook-searchresults-outer" aria-describedby="searchresults-header">
<div class="spinner-wrapper">
{{fa "solid" "spinner" "fa-spin"}}
</div>
</div>
</form> </form>
<div id="searchresults-outer" class="searchresults-outer hidden"> <div id="mdbook-searchresults-outer" class="searchresults-outer hidden">
<div id="searchresults-header" class="searchresults-header"></div> <div id="mdbook-searchresults-header" class="searchresults-header"></div>
<ul id="searchresults"> <ul id="mdbook-searchresults">
</ul> </ul>
</div> </div>
</div> </div>
{{/if}} {{/if}}
<!-- Apply ARIA attributes after the sidebar and the sidebar toggle button are added to the DOM --> <!-- Apply ARIA attributes after the sidebar and the sidebar toggle button are added to the DOM -->
<script type="text/javascript"> <script>
document.getElementById('sidebar-toggle').setAttribute('aria-expanded', sidebar === 'visible'); document.getElementById('mdbook-sidebar-toggle').setAttribute('aria-expanded', sidebar === 'visible');
document.getElementById('sidebar').setAttribute('aria-hidden', sidebar !== 'visible'); document.getElementById('mdbook-sidebar').setAttribute('aria-hidden', sidebar !== 'visible');
Array.from(document.querySelectorAll('#sidebar a')).forEach(function(link) { Array.from(document.querySelectorAll('#mdbook-sidebar a')).forEach(function(link) {
link.setAttribute('tabIndex', sidebar === 'visible' ? 0 : -1); link.setAttribute('tabIndex', sidebar === 'visible' ? 0 : -1);
}); });
</script> </script>
<div id="content" class="content"> <div id="mdbook-content" class="content">
<main> <main>
<!-- Page table of contents -->
<div class="sidetoc">
<nav class="pagetoc"></nav>
</div>
{{{ content }}} {{{ content }}}
</main> </main>
<nav class="nav-wrapper" aria-label="Page navigation"> <nav class="nav-wrapper" aria-label="Page navigation">
<!-- Mobile navigation buttons --> <!-- Mobile navigation buttons -->
{{#previous}} {{#if previous}}
<a rel="prev" href="{{ path_to_root }}{{link}}" class="mobile-nav-chapters previous" title="Previous chapter" aria-label="Previous chapter" aria-keyshortcuts="Left"> <a rel="prev" href="{{ path_to_root }}{{previous.link}}" class="mobile-nav-chapters previous" title="Previous chapter" aria-label="Previous chapter" aria-keyshortcuts="Left">
<i class="fa fa-angle-left"></i> {{#if (eq ../text_direction "rtl")}}
{{fa "solid" "angle-right"}}
{{else}}
{{fa "solid" "angle-left"}}
{{/if}}
</a> </a>
{{/previous}} {{/if}}
{{#next}} {{#if next}}
<a rel="next" href="{{ path_to_root }}{{link}}" class="mobile-nav-chapters next" title="Next chapter" aria-label="Next chapter" aria-keyshortcuts="Right"> <a rel="next prefetch" href="{{ path_to_root }}{{next.link}}" class="mobile-nav-chapters next" title="Next chapter" aria-label="Next chapter" aria-keyshortcuts="Right">
<i class="fa fa-angle-right"></i> {{#if (eq ../text_direction "rtl")}}
{{fa "solid" "angle-left"}}
{{else}}
{{fa "solid" "angle-right"}}
{{/if}}
</a> </a>
{{/next}} {{/if}}
<div style="clear: both"></div> <div style="clear: both"></div>
</nav> </nav>
@ -219,92 +261,92 @@
</div> </div>
<nav class="nav-wide-wrapper" aria-label="Page navigation"> <nav class="nav-wide-wrapper" aria-label="Page navigation">
{{#previous}} {{#if previous}}
<a rel="prev" href="{{ path_to_root }}{{link}}" class="nav-chapters previous" title="Previous chapter" aria-label="Previous chapter" aria-keyshortcuts="Left"> <a rel="prev" href="{{ path_to_root }}{{previous.link}}" class="nav-chapters previous" title="Previous chapter" aria-label="Previous chapter" aria-keyshortcuts="Left">
<i class="fa fa-angle-left"></i> {{#if (eq ../text_direction "rtl")}}
{{fa "solid" "angle-right"}}
{{else}}
{{fa "solid" "angle-left"}}
{{/if}}
</a> </a>
{{/previous}} {{/if}}
{{#next}} {{#if next}}
<a rel="next" href="{{ path_to_root }}{{link}}" class="nav-chapters next" title="Next chapter" aria-label="Next chapter" aria-keyshortcuts="Right"> <a rel="next prefetch" href="{{ path_to_root }}{{next.link}}" class="nav-chapters next" title="Next chapter" aria-label="Next chapter" aria-keyshortcuts="Right">
<i class="fa fa-angle-right"></i> {{#if (eq text_direction "rtl")}}
{{fa "solid" "angle-left"}}
{{else}}
{{fa "solid" "angle-right"}}
{{/if}}
</a> </a>
{{/next}} {{/if}}
</nav> </nav>
</div> </div>
{{#if livereload}} <template id=fa-eye>{{fa "solid" "eye"}}</template>
<template id=fa-eye-slash>{{fa "solid" "eye-slash"}}</template>
<template id=fa-copy>{{fa "regular" "copy"}}</template>
<template id=fa-play>{{fa "solid" "play"}}</template>
<template id=fa-clock-rotate-left>{{fa "solid" "clock-rotate-left"}}</template>
{{#if live_reload_endpoint}}
<!-- Livereload script (if served using the cli tool) --> <!-- Livereload script (if served using the cli tool) -->
<script type="text/javascript"> <script>
var socket = new WebSocket("{{{livereload}}}"); const wsProtocol = location.protocol === 'https:' ? 'wss:' : 'ws:';
const wsAddress = wsProtocol + "//" + location.host + "/" + "{{{live_reload_endpoint}}}";
const socket = new WebSocket(wsAddress);
socket.onmessage = function (event) { socket.onmessage = function (event) {
if (event.data === "reload") { if (event.data === "reload") {
socket.close(); socket.close();
location.reload(); location.reload();
} }
}; };
window.onbeforeunload = function() { window.onbeforeunload = function() {
socket.close(); socket.close();
} }
</script> </script>
{{/if}} {{/if}}
{{#if google_analytics}}
<!-- Google Analytics Tag -->
<script type="text/javascript">
var localAddrs = ["localhost", "127.0.0.1", ""];
// make sure we don't activate google analytics if the developer is
// inspecting the book locally...
if (localAddrs.indexOf(document.location.hostname) === -1) {
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', '{{google_analytics}}', 'auto');
ga('send', 'pageview');
}
</script>
{{/if}}
{{#if playground_line_numbers}} {{#if playground_line_numbers}}
<script type="text/javascript"> <script>
window.playground_line_numbers = true; window.playground_line_numbers = true;
</script> </script>
{{/if}} {{/if}}
{{#if playground_copyable}} {{#if playground_copyable}}
<script type="text/javascript"> <script>
window.playground_copyable = true; window.playground_copyable = true;
</script> </script>
{{/if}} {{/if}}
{{#if playground_js}} {{#if playground_js}}
<script src="{{ path_to_root }}ace.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "ace.js" }}"></script>
<script src="{{ path_to_root }}editor.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "mode-rust.js" }}"></script>
<script src="{{ path_to_root }}mode-rust.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "editor.js" }}"></script>
<script src="{{ path_to_root }}theme-dawn.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "theme-dawn.js" }}"></script>
<script src="{{ path_to_root }}theme-tomorrow_night.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "theme-tomorrow_night.js" }}"></script>
{{/if}} {{/if}}
{{#if search_js}} {{#if search_js}}
<script src="{{ path_to_root }}elasticlunr.min.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "elasticlunr.min.js" }}"></script>
<script src="{{ path_to_root }}mark.min.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "mark.min.js" }}"></script>
<script src="{{ path_to_root }}searcher.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "searcher.js" }}"></script>
{{/if}} {{/if}}
<script src="{{ path_to_root }}clipboard.min.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "clipboard.min.js" }}"></script>
<script src="{{ path_to_root }}highlight.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "highlight.js" }}"></script>
<script src="{{ path_to_root }}book.js" type="text/javascript" charset="utf-8"></script> <script src="{{ resource "book.js" }}"></script>
<!-- Custom JS scripts --> <!-- Custom JS scripts -->
{{#each additional_js}} {{#each additional_js}}
<script type="text/javascript" src="{{ ../path_to_root }}{{this}}"></script> <script src="{{ resource this}}"></script>
{{/each}} {{/each}}
{{#if is_print}} {{#if is_print}}
{{#if mathjax_support}} {{#if mathjax_support}}
<script type="text/javascript"> <script>
window.addEventListener('load', function() { window.addEventListener('load', function() {
MathJax.Hub.Register.StartupHook('End', function() { MathJax.Hub.Register.StartupHook('End', function() {
window.setTimeout(window.print, 100); window.setTimeout(window.print, 100);
@ -312,7 +354,7 @@
}); });
</script> </script>
{{else}} {{else}}
<script type="text/javascript"> <script>
window.addEventListener('load', function() { window.addEventListener('load', function() {
window.setTimeout(window.print, 100); window.setTimeout(window.print, 100);
}); });
@ -320,5 +362,21 @@
{{/if}} {{/if}}
{{/if}} {{/if}}
{{#if fragment_map}}
<script>
document.addEventListener('DOMContentLoaded', function() {
const fragmentMap =
{{{fragment_map}}}
;
const target = fragmentMap[window.location.hash];
if (target) {
let url = new URL(target, window.location.href);
window.location.replace(url.href);
}
});
</script>
{{/if}}
</div>
</body> </body>
</html> </html>


@ -255,6 +255,8 @@ information.
^/_matrix/client/(api/v1|r0|v3|unstable)/directory/room/.*$
^/_matrix/client/(r0|v3|unstable)/capabilities$
^/_matrix/client/(r0|v3|unstable)/notifications$
# Admin API requests
^/_synapse/admin/v1/rooms/[^/]+$
# Encryption requests
@ -300,6 +302,9 @@ Additionally, the following REST endpoints can be handled for GET requests:
# Presence requests
^/_matrix/client/(api/v1|r0|v3|unstable)/presence/
# Admin API requests
^/_synapse/admin/v2/users/[^/]+$
Pagination requests can also be handled, but all requests for a given
room must be routed to the same instance. Additionally, care must be taken to
ensure that the purge history admin API is not used while pagination requests
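Because all pagination requests for a given room must hit the same instance, the reverse proxy needs a deterministic room-to-worker mapping. The sketch below (hypothetical worker names, and assuming the `/rooms/<room_id>/messages` pagination endpoint) shows one way to derive such a mapping:

```python
import hashlib
import re

# Hypothetical pool of client readers handling pagination.
WORKERS = ["client_reader1", "client_reader2", "client_reader3"]

PAGINATION_RE = re.compile(
    r"^/_matrix/client/(api/v1|r0|v3|unstable)/rooms/([^/]+)/messages"
)

def pick_worker(path: str) -> str | None:
    """Route all /messages requests for one room to one worker."""
    m = PAGINATION_RE.match(path)
    if m is None:
        return None  # not a pagination request
    room_id = m.group(2)
    digest = hashlib.sha256(room_id.encode()).digest()
    return WORKERS[digest[0] % len(WORKERS)]

# The same room always maps to the same worker:
assert pick_worker("/_matrix/client/v3/rooms/!abc:example.com/messages") == \
       pick_worker("/_matrix/client/v3/rooms/!abc:example.com/messages")
```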

poetry.lock (generated)

@ -26,15 +26,15 @@ files = [
[[package]] [[package]]
name = "authlib" name = "authlib"
version = "1.6.5" version = "1.6.6"
description = "The ultimate Python library in building OAuth and OpenID Connect servers and clients." description = "The ultimate Python library in building OAuth and OpenID Connect servers and clients."
optional = true optional = true
python-versions = ">=3.9" python-versions = ">=3.9"
groups = ["main"] groups = ["main"]
markers = "extra == \"all\" or extra == \"jwt\" or extra == \"oidc\"" markers = "extra == \"all\" or extra == \"jwt\" or extra == \"oidc\""
files = [ files = [
{file = "authlib-1.6.5-py2.py3-none-any.whl", hash = "sha256:3e0e0507807f842b02175507bdee8957a1d5707fd4afb17c32fb43fee90b6e3a"}, {file = "authlib-1.6.6-py2.py3-none-any.whl", hash = "sha256:7d9e9bc535c13974313a87f53e8430eb6ea3d1cf6ae4f6efcd793f2e949143fd"},
{file = "authlib-1.6.5.tar.gz", hash = "sha256:6aaf9c79b7cc96c900f0b284061691c5d4e61221640a948fe690b556a6d6d10b"}, {file = "authlib-1.6.6.tar.gz", hash = "sha256:45770e8e056d0f283451d9996fbb59b70d45722b45d854d58f32878d0a40c38e"},
] ]
[package.dependencies] [package.dependencies]
@ -134,14 +134,14 @@ typecheck = ["mypy"]
[[package]] [[package]]
name = "bleach" name = "bleach"
version = "6.2.0" version = "6.3.0"
description = "An easy safelist-based HTML-sanitizing tool." description = "An easy safelist-based HTML-sanitizing tool."
optional = false optional = false
python-versions = ">=3.9" python-versions = ">=3.10"
groups = ["main", "dev"] groups = ["main", "dev"]
files = [ files = [
{file = "bleach-6.2.0-py3-none-any.whl", hash = "sha256:117d9c6097a7c3d22fd578fcd8d35ff1e125df6736f554da4e432fdd63f31e5e"}, {file = "bleach-6.3.0-py3-none-any.whl", hash = "sha256:fe10ec77c93ddf3d13a73b035abaac7a9f5e436513864ccdad516693213c65d6"},
{file = "bleach-6.2.0.tar.gz", hash = "sha256:123e894118b8a599fd80d3ec1a6d4cc7ce4e5882b1317a7e1ba69b56e95f991f"}, {file = "bleach-6.3.0.tar.gz", hash = "sha256:6f3b91b1c0a02bb9a78b5a454c92506aa0fdf197e1d5e114d2e00c6f64306d22"},
] ]
[package.dependencies] [package.dependencies]
@ -176,83 +176,100 @@ files = [
[[package]] [[package]]
name = "cffi" name = "cffi"
version = "1.17.1" version = "2.0.0"
description = "Foreign Function Interface for Python calling C code." description = "Foreign Function Interface for Python calling C code."
optional = false optional = false
python-versions = ">=3.8" python-versions = ">=3.9"
groups = ["main", "dev"] groups = ["main", "dev"]
files = [ files = [
{file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"}, {file = "cffi-2.0.0-cp310-cp310-macosx_10_13_x86_64.whl", hash = "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44"},
{file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"}, {file = "cffi-2.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382"}, {file = "cffi-2.0.0-cp310-cp310-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702"}, {file = "cffi-2.0.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3"}, {file = "cffi-2.0.0-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6"}, {file = "cffi-2.0.0-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4"},
{file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17"}, {file = "cffi-2.0.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8"}, {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e"}, {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5"},
{file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be"}, {file = "cffi-2.0.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb"},
{file = "cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c"}, {file = "cffi-2.0.0-cp310-cp310-win32.whl", hash = "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a"},
{file = "cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15"}, {file = "cffi-2.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739"},
{file = "cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401"}, {file = "cffi-2.0.0-cp311-cp311-macosx_10_13_x86_64.whl", hash = "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe"},
{file = "cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf"}, {file = "cffi-2.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4"}, {file = "cffi-2.0.0-cp311-cp311-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41"}, {file = "cffi-2.0.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1"}, {file = "cffi-2.0.0-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6"}, {file = "cffi-2.0.0-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664"},
{file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d"}, {file = "cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6"}, {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f"}, {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414"},
{file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b"}, {file = "cffi-2.0.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743"},
{file = "cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655"}, {file = "cffi-2.0.0-cp311-cp311-win32.whl", hash = "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5"},
{file = "cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0"}, {file = "cffi-2.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5"},
{file = "cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4"}, {file = "cffi-2.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d"},
{file = "cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c"}, {file = "cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36"}, {file = "cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5"}, {file = "cffi-2.0.0-cp312-cp312-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff"}, {file = "cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99"}, {file = "cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e"},
{file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93"}, {file = "cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037"},
{file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3"}, {file = "cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba"},
{file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8"}, {file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94"},
{file = "cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65"}, {file = "cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187"},
{file = "cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903"}, {file = "cffi-2.0.0-cp312-cp312-win32.whl", hash = "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18"},
{file = "cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e"}, {file = "cffi-2.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5"},
{file = "cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2"}, {file = "cffi-2.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3"}, {file = "cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683"}, {file = "cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5"}, {file = "cffi-2.0.0-cp313-cp313-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4"}, {file = "cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b"},
{file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd"}, {file = "cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2"},
{file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed"}, {file = "cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3"},
{file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9"}, {file = "cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26"},
{file = "cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d"}, {file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c"},
{file = "cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a"}, {file = "cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b"},
{file = "cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b"}, {file = "cffi-2.0.0-cp313-cp313-win32.whl", hash = "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964"}, {file = "cffi-2.0.0-cp313-cp313-win_amd64.whl", hash = "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9"}, {file = "cffi-2.0.0-cp313-cp313-win_arm64.whl", hash = "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc"}, {file = "cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c"}, {file = "cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13"},
{file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1"}, {file = "cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b"},
{file = "cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8"}, {file = "cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c"},
{file = "cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1"}, {file = "cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef"},
{file = "cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16"}, {file = "cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775"},
{file = "cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36"}, {file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8"}, {file = "cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576"}, {file = "cffi-2.0.0-cp314-cp314-win32.whl", hash = "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87"}, {file = "cffi-2.0.0-cp314-cp314-win_amd64.whl", hash = "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0"}, {file = "cffi-2.0.0-cp314-cp314-win_arm64.whl", hash = "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad"},
{file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3"}, {file = "cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595"}, {file = "cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a"}, {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c"},
{file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e"}, {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8"},
{file = "cffi-1.17.1-cp39-cp39-win32.whl", hash = "sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7"}, {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc"},
{file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"}, {file = "cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592"},
{file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"}, {file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512"},
{file = "cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4"},
{file = "cffi-2.0.0-cp314-cp314t-win32.whl", hash = "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e"},
{file = "cffi-2.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6"},
{file = "cffi-2.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9"},
{file = "cffi-2.0.0-cp39-cp39-macosx_10_13_x86_64.whl", hash = "sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf"},
{file = "cffi-2.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7"},
{file = "cffi-2.0.0-cp39-cp39-manylinux1_i686.manylinux2014_i686.manylinux_2_17_i686.manylinux_2_5_i686.whl", hash = "sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f"},
{file = "cffi-2.0.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63"},
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2"},
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65"},
{file = "cffi-2.0.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322"},
{file = "cffi-2.0.0-cp39-cp39-win32.whl", hash = "sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a"},
{file = "cffi-2.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9"},
{file = "cffi-2.0.0.tar.gz", hash = "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529"},
] ]
[package.dependencies] [package.dependencies]
pycparser = "*" pycparser = {version = "*", markers = "implementation_name != \"PyPy\""}
[[package]] [[package]]
name = "charset-normalizer" name = "charset-normalizer"
@@ -381,62 +398,80 @@ files = [
[[package]]
name = "cryptography"
-version = "45.0.7"
+version = "46.0.3"
description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
optional = false
-python-versions = "!=3.9.0,!=3.9.1,>=3.7"
+python-versions = "!=3.9.0,!=3.9.1,>=3.8"
groups = ["main", "dev"]
files = [
-{file = "cryptography-45.0.7-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:3be4f21c6245930688bd9e162829480de027f8bf962ede33d4f8ba7d67a00cee"},
-{file = "cryptography-45.0.7-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:67285f8a611b0ebc0857ced2081e30302909f571a46bfa7a3cc0ad303fe015c6"},
-{file = "cryptography-45.0.7-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:577470e39e60a6cd7780793202e63536026d9b8641de011ed9d8174da9ca5339"},
-{file = "cryptography-45.0.7-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:4bd3e5c4b9682bc112d634f2c6ccc6736ed3635fc3319ac2bb11d768cc5a00d8"},
-{file = "cryptography-45.0.7-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:465ccac9d70115cd4de7186e60cfe989de73f7bb23e8a7aa45af18f7412e75bf"},
-{file = "cryptography-45.0.7-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:16ede8a4f7929b4b7ff3642eba2bf79aa1d71f24ab6ee443935c0d269b6bc513"},
-{file = "cryptography-45.0.7-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:8978132287a9d3ad6b54fcd1e08548033cc09dc6aacacb6c004c73c3eb5d3ac3"},
-{file = "cryptography-45.0.7-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:b6a0e535baec27b528cb07a119f321ac024592388c5681a5ced167ae98e9fff3"},
-{file = "cryptography-45.0.7-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:a24ee598d10befaec178efdff6054bc4d7e883f615bfbcd08126a0f4931c83a6"},
-{file = "cryptography-45.0.7-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:fa26fa54c0a9384c27fcdc905a2fb7d60ac6e47d14bc2692145f2b3b1e2cfdbd"},
-{file = "cryptography-45.0.7-cp311-abi3-win32.whl", hash = "sha256:bef32a5e327bd8e5af915d3416ffefdbe65ed975b646b3805be81b23580b57b8"},
-{file = "cryptography-45.0.7-cp311-abi3-win_amd64.whl", hash = "sha256:3808e6b2e5f0b46d981c24d79648e5c25c35e59902ea4391a0dcb3e667bf7443"},
-{file = "cryptography-45.0.7-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:bfb4c801f65dd61cedfc61a83732327fafbac55a47282e6f26f073ca7a41c3b2"},
-{file = "cryptography-45.0.7-cp37-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:81823935e2f8d476707e85a78a405953a03ef7b7b4f55f93f7c2d9680e5e0691"},
-{file = "cryptography-45.0.7-cp37-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3994c809c17fc570c2af12c9b840d7cea85a9fd3e5c0e0491f4fa3c029216d59"},
-{file = "cryptography-45.0.7-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:dad43797959a74103cb59c5dac71409f9c27d34c8a05921341fb64ea8ccb1dd4"},
-{file = "cryptography-45.0.7-cp37-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:ce7a453385e4c4693985b4a4a3533e041558851eae061a58a5405363b098fcd3"},
-{file = "cryptography-45.0.7-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:b04f85ac3a90c227b6e5890acb0edbaf3140938dbecf07bff618bf3638578cf1"},
-{file = "cryptography-45.0.7-cp37-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:48c41a44ef8b8c2e80ca4527ee81daa4c527df3ecbc9423c41a420a9559d0e27"},
-{file = "cryptography-45.0.7-cp37-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:f3df7b3d0f91b88b2106031fd995802a2e9ae13e02c36c1fc075b43f420f3a17"},
-{file = "cryptography-45.0.7-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:dd342f085542f6eb894ca00ef70236ea46070c8a13824c6bde0dfdcd36065b9b"},
-{file = "cryptography-45.0.7-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:1993a1bb7e4eccfb922b6cd414f072e08ff5816702a0bdb8941c247a6b1b287c"},
-{file = "cryptography-45.0.7-cp37-abi3-win32.whl", hash = "sha256:18fcf70f243fe07252dcb1b268a687f2358025ce32f9f88028ca5c364b123ef5"},
-{file = "cryptography-45.0.7-cp37-abi3-win_amd64.whl", hash = "sha256:7285a89df4900ed3bfaad5679b1e668cb4b38a8de1ccbfc84b05f34512da0a90"},
-{file = "cryptography-45.0.7-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:de58755d723e86175756f463f2f0bddd45cc36fbd62601228a3f8761c9f58252"},
-{file = "cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a20e442e917889d1a6b3c570c9e3fa2fdc398c20868abcea268ea33c024c4083"},
-{file = "cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:258e0dff86d1d891169b5af222d362468a9570e2532923088658aa866eb11130"},
-{file = "cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:d97cf502abe2ab9eff8bd5e4aca274da8d06dd3ef08b759a8d6143f4ad65d4b4"},
-{file = "cryptography-45.0.7-pp310-pypy310_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:c987dad82e8c65ebc985f5dae5e74a3beda9d0a2a4daf8a1115f3772b59e5141"},
-{file = "cryptography-45.0.7-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:c13b1e3afd29a5b3b2656257f14669ca8fa8d7956d509926f0b130b600b50ab7"},
-{file = "cryptography-45.0.7-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4a862753b36620af6fc54209264f92c716367f2f0ff4624952276a6bbd18cbde"},
-{file = "cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:06ce84dc14df0bf6ea84666f958e6080cdb6fe1231be2a51f3fc1267d9f3fb34"},
-{file = "cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d0c5c6bac22b177bf8da7435d9d27a6834ee130309749d162b26c3105c0795a9"},
-{file = "cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:2f641b64acc00811da98df63df7d59fd4706c0df449da71cb7ac39a0732b40ae"},
-{file = "cryptography-45.0.7-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:f5414a788ecc6ee6bc58560e85ca624258a55ca434884445440a810796ea0e0b"},
-{file = "cryptography-45.0.7-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:1f3d56f73595376f4244646dd5c5870c14c196949807be39e79e7bd9bac3da63"},
-{file = "cryptography-45.0.7.tar.gz", hash = "sha256:4b1654dfc64ea479c242508eb8c724044f1e964a47d1d1cacc5132292d851971"},
+{file = "cryptography-46.0.3-cp311-abi3-macosx_10_9_universal2.whl", hash = "sha256:109d4ddfadf17e8e7779c39f9b18111a09efb969a301a31e987416a0191ed93a"},
+{file = "cryptography-46.0.3-cp311-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:09859af8466b69bc3c27bdf4f5d84a665e0f7ab5088412e9e2ec49758eca5cbc"},
+{file = "cryptography-46.0.3-cp311-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:01ca9ff2885f3acc98c29f1860552e37f6d7c7d013d7334ff2a9de43a449315d"},
+{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:6eae65d4c3d33da080cff9c4ab1f711b15c1d9760809dad6ea763f3812d254cb"},
+{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5bf0ed4490068a2e72ac03d786693adeb909981cc596425d09032d372bcc849"},
+{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:5ecfccd2329e37e9b7112a888e76d9feca2347f12f37918facbb893d7bb88ee8"},
+{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:a2c0cd47381a3229c403062f764160d57d4d175e022c1df84e168c6251a22eec"},
+{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:549e234ff32571b1f4076ac269fcce7a808d3bf98b76c8dd560e42dbc66d7d91"},
+{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:c0a7bb1a68a5d3471880e264621346c48665b3bf1c3759d682fc0864c540bd9e"},
+{file = "cryptography-46.0.3-cp311-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:10b01676fc208c3e6feeb25a8b83d81767e8059e1fe86e1dc62d10a3018fa926"},
+{file = "cryptography-46.0.3-cp311-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:0abf1ffd6e57c67e92af68330d05760b7b7efb243aab8377e583284dbab72c71"},
+{file = "cryptography-46.0.3-cp311-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:a04bee9ab6a4da801eb9b51f1b708a1b5b5c9eb48c03f74198464c66f0d344ac"},
+{file = "cryptography-46.0.3-cp311-abi3-win32.whl", hash = "sha256:f260d0d41e9b4da1ed1e0f1ce571f97fe370b152ab18778e9e8f67d6af432018"},
+{file = "cryptography-46.0.3-cp311-abi3-win_amd64.whl", hash = "sha256:a9a3008438615669153eb86b26b61e09993921ebdd75385ddd748702c5adfddb"},
+{file = "cryptography-46.0.3-cp311-abi3-win_arm64.whl", hash = "sha256:5d7f93296ee28f68447397bf5198428c9aeeab45705a55d53a6343455dcb2c3c"},
+{file = "cryptography-46.0.3-cp314-cp314t-macosx_10_9_universal2.whl", hash = "sha256:00a5e7e87938e5ff9ff5447ab086a5706a957137e6e433841e9d24f38a065217"},
+{file = "cryptography-46.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c8daeb2d2174beb4575b77482320303f3d39b8e81153da4f0fb08eb5fe86a6c5"},
+{file = "cryptography-46.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:39b6755623145ad5eff1dab323f4eae2a32a77a7abef2c5089a04a3d04366715"},
+{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:db391fa7c66df6762ee3f00c95a89e6d428f4d60e7abc8328f4fe155b5ac6e54"},
+{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:78a97cf6a8839a48c49271cdcbd5cf37ca2c1d6b7fdd86cc864f302b5e9bf459"},
+{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_28_ppc64le.whl", hash = "sha256:dfb781ff7eaa91a6f7fd41776ec37c5853c795d3b358d4896fdbb5df168af422"},
+{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:6f61efb26e76c45c4a227835ddeae96d83624fb0d29eb5df5b96e14ed1a0afb7"},
+{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:23b1a8f26e43f47ceb6d6a43115f33a5a37d57df4ea0ca295b780ae8546e8044"},
+{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_34_ppc64le.whl", hash = "sha256:b419ae593c86b87014b9be7396b385491ad7f320bde96826d0dd174459e54665"},
+{file = "cryptography-46.0.3-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:50fc3343ac490c6b08c0cf0d704e881d0d660be923fd3076db3e932007e726e3"},
+{file = "cryptography-46.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:22d7e97932f511d6b0b04f2bfd818d73dcd5928db509460aaf48384778eb6d20"},
+{file = "cryptography-46.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:d55f3dffadd674514ad19451161118fd010988540cee43d8bc20675e775925de"},
+{file = "cryptography-46.0.3-cp314-cp314t-win32.whl", hash = "sha256:8a6e050cb6164d3f830453754094c086ff2d0b2f3a897a1d9820f6139a1f0914"},
+{file = "cryptography-46.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:760f83faa07f8b64e9c33fc963d790a2edb24efb479e3520c14a45741cd9b2db"},
+{file = "cryptography-46.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:516ea134e703e9fe26bcd1277a4b59ad30586ea90c365a87781d7887a646fe21"},
+{file = "cryptography-46.0.3-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:cb3d760a6117f621261d662bccc8ef5bc32ca673e037c83fbe565324f5c46936"},
+{file = "cryptography-46.0.3-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:4b7387121ac7d15e550f5cb4a43aef2559ed759c35df7336c402bb8275ac9683"},
+{file = "cryptography-46.0.3-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:15ab9b093e8f09daab0f2159bb7e47532596075139dd74365da52ecc9cb46c5d"},
+{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:46acf53b40ea38f9c6c229599a4a13f0d46a6c3fa9ef19fc1a124d62e338dfa0"},
+{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:10ca84c4668d066a9878890047f03546f3ae0a6b8b39b697457b7757aaf18dbc"},
+{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_28_ppc64le.whl", hash = "sha256:36e627112085bb3b81b19fed209c05ce2a52ee8b15d161b7c643a7d5a88491f3"},
+{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:1000713389b75c449a6e979ffc7dcc8ac90b437048766cef052d4d30b8220971"},
+{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:b02cf04496f6576afffef5ddd04a0cb7d49cf6be16a9059d793a30b035f6b6ac"},
+{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_34_ppc64le.whl", hash = "sha256:71e842ec9bc7abf543b47cf86b9a743baa95f4677d22baa4c7d5c69e49e9bc04"},
+{file = "cryptography-46.0.3-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:402b58fc32614f00980b66d6e56a5b4118e6cb362ae8f3fda141ba4689bd4506"},
+{file = "cryptography-46.0.3-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:ef639cb3372f69ec44915fafcd6698b6cc78fbe0c2ea41be867f6ed612811963"},
+{file = "cryptography-46.0.3-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3b51b8ca4f1c6453d8829e1eb7299499ca7f313900dd4d89a24b8b87c0a780d4"},
+{file = "cryptography-46.0.3-cp38-abi3-win32.whl", hash = "sha256:6276eb85ef938dc035d59b87c8a7dc559a232f954962520137529d77b18ff1df"},
+{file = "cryptography-46.0.3-cp38-abi3-win_amd64.whl", hash = "sha256:416260257577718c05135c55958b674000baef9a1c7d9e8f306ec60d71db850f"},
+{file = "cryptography-46.0.3-cp38-abi3-win_arm64.whl", hash = "sha256:d89c3468de4cdc4f08a57e214384d0471911a3830fcdaf7a8cc587e42a866372"},
+{file = "cryptography-46.0.3-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:a23582810fedb8c0bc47524558fb6c56aac3fc252cb306072fd2815da2a47c32"},
+{file = "cryptography-46.0.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:e7aec276d68421f9574040c26e2a7c3771060bc0cff408bae1dcb19d3ab1e63c"},
+{file = "cryptography-46.0.3-pp311-pypy311_pp73-macosx_10_9_x86_64.whl", hash = "sha256:7ce938a99998ed3c8aa7e7272dca1a610401ede816d36d0693907d863b10d9ea"},
+{file = "cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:191bb60a7be5e6f54e30ba16fdfae78ad3a342a0599eb4193ba88e3f3d6e185b"},
+{file = "cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c70cc23f12726be8f8bc72e41d5065d77e4515efae3690326764ea1b07845cfb"},
+{file = "cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_34_aarch64.whl", hash = "sha256:9394673a9f4de09e28b5356e7fff97d778f8abad85c9d5ac4a4b7e25a0de7717"},
+{file = "cryptography-46.0.3-pp311-pypy311_pp73-manylinux_2_34_x86_64.whl", hash = "sha256:94cd0549accc38d1494e1f8de71eca837d0509d0d44bf11d158524b0e12cebf9"},
+{file = "cryptography-46.0.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:6b5063083824e5509fdba180721d55909ffacccc8adbec85268b48439423d78c"},
+{file = "cryptography-46.0.3.tar.gz", hash = "sha256:a8b17438104fed022ce745b362294d9ce35b4c2e45c1d958ad4a4b019285f4a1"},
]

[package.dependencies]
-cffi = {version = ">=1.14", markers = "platform_python_implementation != \"PyPy\""}
+cffi = {version = ">=2.0.0", markers = "python_full_version >= \"3.9.0\" and platform_python_implementation != \"PyPy\""}
+typing-extensions = {version = ">=4.13.2", markers = "python_full_version < \"3.11.0\""}

[package.extras]
-docs = ["sphinx (>=5.3.0)", "sphinx-inline-tabs ; python_full_version >= \"3.8.0\"", "sphinx-rtd-theme (>=3.0.0) ; python_full_version >= \"3.8.0\""]
+docs = ["sphinx (>=5.3.0)", "sphinx-inline-tabs", "sphinx-rtd-theme (>=3.0.0)"]
docstest = ["pyenchant (>=3)", "readme-renderer (>=30.0)", "sphinxcontrib-spelling (>=7.3.1)"]
-nox = ["nox (>=2024.4.15)", "nox[uv] (>=2024.3.2) ; python_full_version >= \"3.8.0\""]
+nox = ["nox[uv] (>=2024.4.15)"]
-pep8test = ["check-sdist ; python_full_version >= \"3.8.0\"", "click (>=8.0.1)", "mypy (>=1.4)", "ruff (>=0.3.6)"]
+pep8test = ["check-sdist", "click (>=8.0.1)", "mypy (>=1.14)", "ruff (>=0.11.11)"]
sdist = ["build (>=1.0.0)"]
ssh = ["bcrypt (>=3.1.5)"]
-test = ["certifi (>=2024)", "cryptography-vectors (==45.0.7)", "pretend (>=0.7)", "pytest (>=7.4.0)", "pytest-benchmark (>=4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
+test = ["certifi (>=2024)", "cryptography-vectors (==46.0.3)", "pretend (>=0.7)", "pytest (>=7.4.0)", "pytest-benchmark (>=4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]
test-randomorder = ["pytest-randomly"]
[[package]]

@@ -991,6 +1026,92 @@ files = [
[package.dependencies]
pyasn1 = ">=0.4.6"

[[package]]
name = "librt"
version = "0.6.3"
description = "Mypyc runtime library"
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "librt-0.6.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:45660d26569cc22ed30adf583389d8a0d1b468f8b5e518fcf9bfe2cd298f9dd1"},
{file = "librt-0.6.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:54f3b2177fb892d47f8016f1087d21654b44f7fc4cf6571c1c6b3ea531ab0fcf"},
{file = "librt-0.6.3-cp310-cp310-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:c5b31bed2c2f2fa1fcb4815b75f931121ae210dc89a3d607fb1725f5907f1437"},
{file = "librt-0.6.3-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8f8ed5053ef9fb08d34f1fd80ff093ccbd1f67f147633a84cf4a7d9b09c0f089"},
{file = "librt-0.6.3-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3f0e4bd9bcb0ee34fa3dbedb05570da50b285f49e52c07a241da967840432513"},
{file = "librt-0.6.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d8f89c8d20dfa648a3f0a56861946eb00e5b00d6b00eea14bc5532b2fcfa8ef1"},
{file = "librt-0.6.3-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:ecc2c526547eacd20cb9fbba19a5268611dbc70c346499656d6cf30fae328977"},
{file = "librt-0.6.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:fbedeb9b48614d662822ee514567d2d49a8012037fc7b4cd63f282642c2f4b7d"},
{file = "librt-0.6.3-cp310-cp310-win32.whl", hash = "sha256:0765b0fe0927d189ee14b087cd595ae636bef04992e03fe6dfdaa383866c8a46"},
{file = "librt-0.6.3-cp310-cp310-win_amd64.whl", hash = "sha256:8c659f9fb8a2f16dc4131b803fa0144c1dadcb3ab24bb7914d01a6da58ae2457"},
{file = "librt-0.6.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:61348cc488b18d1b1ff9f3e5fcd5ac43ed22d3e13e862489d2267c2337285c08"},
{file = "librt-0.6.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:64645b757d617ad5f98c08e07620bc488d4bced9ced91c6279cec418f16056fa"},
{file = "librt-0.6.3-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:26b8026393920320bb9a811b691d73c5981385d537ffc5b6e22e53f7b65d4122"},
{file = "librt-0.6.3-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d998b432ed9ffccc49b820e913c8f327a82026349e9c34fa3690116f6b70770f"},
{file = "librt-0.6.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e18875e17ef69ba7dfa9623f2f95f3eda6f70b536079ee6d5763ecdfe6cc9040"},
{file = "librt-0.6.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:a218f85081fc3f70cddaed694323a1ad7db5ca028c379c214e3a7c11c0850523"},
{file = "librt-0.6.3-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:1ef42ff4edd369e84433ce9b188a64df0837f4f69e3d34d3b34d4955c599d03f"},
{file = "librt-0.6.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0e0f2b79993fec23a685b3e8107ba5f8675eeae286675a216da0b09574fa1e47"},
{file = "librt-0.6.3-cp311-cp311-win32.whl", hash = "sha256:fd98cacf4e0fabcd4005c452cb8a31750258a85cab9a59fb3559e8078da408d7"},
{file = "librt-0.6.3-cp311-cp311-win_amd64.whl", hash = "sha256:e17b5b42c8045867ca9d1f54af00cc2275198d38de18545edaa7833d7e9e4ac8"},
{file = "librt-0.6.3-cp311-cp311-win_arm64.whl", hash = "sha256:87597e3d57ec0120a3e1d857a708f80c02c42ea6b00227c728efbc860f067c45"},
{file = "librt-0.6.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:74418f718083009108dc9a42c21bf2e4802d49638a1249e13677585fcc9ca176"},
{file = "librt-0.6.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:514f3f363d1ebc423357d36222c37e5c8e6674b6eae8d7195ac9a64903722057"},
{file = "librt-0.6.3-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:cf1115207a5049d1f4b7b4b72de0e52f228d6c696803d94843907111cbf80610"},
{file = "librt-0.6.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ad8ba80cdcea04bea7b78fcd4925bfbf408961e9d8397d2ee5d3ec121e20c08c"},
{file = "librt-0.6.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4018904c83eab49c814e2494b4e22501a93cdb6c9f9425533fe693c3117126f9"},
{file = "librt-0.6.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8983c5c06ac9c990eac5eb97a9f03fe41dc7e9d7993df74d9e8682a1056f596c"},
{file = "librt-0.6.3-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:d7769c579663a6f8dbf34878969ac71befa42067ce6bf78e6370bf0d1194997c"},
{file = "librt-0.6.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:d3c9a07eafdc70556f8c220da4a538e715668c0c63cabcc436a026e4e89950bf"},
{file = "librt-0.6.3-cp312-cp312-win32.whl", hash = "sha256:38320386a48a15033da295df276aea93a92dfa94a862e06893f75ea1d8bbe89d"},
{file = "librt-0.6.3-cp312-cp312-win_amd64.whl", hash = "sha256:c0ecf4786ad0404b072196b5df774b1bb23c8aacdcacb6c10b4128bc7b00bd01"},
{file = "librt-0.6.3-cp312-cp312-win_arm64.whl", hash = "sha256:9f2a6623057989ebc469cd9cc8fe436c40117a0147627568d03f84aef7854c55"},
{file = "librt-0.6.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9e716f9012148a81f02f46a04fc4c663420c6fbfeacfac0b5e128cf43b4413d3"},
{file = "librt-0.6.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:669ff2495728009a96339c5ad2612569c6d8be4474e68f3f3ac85d7c3261f5f5"},
{file = "librt-0.6.3-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:349b6873ebccfc24c9efd244e49da9f8a5c10f60f07575e248921aae2123fc42"},
{file = "librt-0.6.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0c74c26736008481c9f6d0adf1aedb5a52aff7361fea98276d1f965c0256ee70"},
{file = "librt-0.6.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:408a36ddc75e91918cb15b03460bdc8a015885025d67e68c6f78f08c3a88f522"},
{file = "librt-0.6.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:e61ab234624c9ffca0248a707feffe6fac2343758a36725d8eb8a6efef0f8c30"},
{file = "librt-0.6.3-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:324462fe7e3896d592b967196512491ec60ca6e49c446fe59f40743d08c97917"},
{file = "librt-0.6.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:36b2ec8c15030002c7f688b4863e7be42820d7c62d9c6eece3db54a2400f0530"},
{file = "librt-0.6.3-cp313-cp313-win32.whl", hash = "sha256:25b1b60cb059471c0c0c803e07d0dfdc79e41a0a122f288b819219ed162672a3"},
{file = "librt-0.6.3-cp313-cp313-win_amd64.whl", hash = "sha256:10a95ad074e2a98c9e4abc7f5b7d40e5ecbfa84c04c6ab8a70fabf59bd429b88"},
{file = "librt-0.6.3-cp313-cp313-win_arm64.whl", hash = "sha256:17000df14f552e86877d67e4ab7966912224efc9368e998c96a6974a8d609bf9"},
{file = "librt-0.6.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8e695f25d1a425ad7a272902af8ab8c8d66c1998b177e4b5f5e7b4e215d0c88a"},
{file = "librt-0.6.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:3e84a4121a7ae360ca4da436548a9c1ca8ca134a5ced76c893cc5944426164bd"},
{file = "librt-0.6.3-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:05f385a414de3f950886ea0aad8f109650d4b712cf9cc14cc17f5f62a9ab240b"},
{file = "librt-0.6.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:36a8e337461150b05ca2c7bdedb9e591dfc262c5230422cea398e89d0c746cdc"},
{file = "librt-0.6.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:dcbe48f6a03979384f27086484dc2a14959be1613cb173458bd58f714f2c48f3"},
{file = "librt-0.6.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:4bca9e4c260233fba37b15c4ec2f78aa99c1a79fbf902d19dd4a763c5c3fb751"},
{file = "librt-0.6.3-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:760c25ed6ac968e24803eb5f7deb17ce026902d39865e83036bacbf5cf242aa8"},
{file = "librt-0.6.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:4aa4a93a353ccff20df6e34fa855ae8fd788832c88f40a9070e3ddd3356a9f0e"},
{file = "librt-0.6.3-cp314-cp314-win32.whl", hash = "sha256:cb92741c2b4ea63c09609b064b26f7f5d9032b61ae222558c55832ec3ad0bcaf"},
{file = "librt-0.6.3-cp314-cp314-win_amd64.whl", hash = "sha256:fdcd095b1b812d756fa5452aca93b962cf620694c0cadb192cec2bb77dcca9a2"},
{file = "librt-0.6.3-cp314-cp314-win_arm64.whl", hash = "sha256:822ca79e28720a76a935c228d37da6579edef048a17cd98d406a2484d10eda78"},
{file = "librt-0.6.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:078cd77064d1640cb7b0650871a772956066174d92c8aeda188a489b58495179"},
{file = "librt-0.6.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:5cc22f7f5c0cc50ed69f4b15b9c51d602aabc4500b433aaa2ddd29e578f452f7"},
{file = "librt-0.6.3-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:14b345eb7afb61b9fdcdfda6738946bd11b8e0f6be258666b0646af3b9bb5916"},
{file = "librt-0.6.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6d46aa46aa29b067f0b8b84f448fd9719aaf5f4c621cc279164d76a9dc9ab3e8"},
{file = "librt-0.6.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1b51ba7d9d5d9001494769eca8c0988adce25d0a970c3ba3f2eb9df9d08036fc"},
{file = "librt-0.6.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:ced0925a18fddcff289ef54386b2fc230c5af3c83b11558571124bfc485b8c07"},
{file = "librt-0.6.3-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:6bac97e51f66da2ca012adddbe9fd656b17f7368d439de30898f24b39512f40f"},
{file = "librt-0.6.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:b2922a0e8fa97395553c304edc3bd36168d8eeec26b92478e292e5d4445c1ef0"},
{file = "librt-0.6.3-cp314-cp314t-win32.whl", hash = "sha256:f33462b19503ba68d80dac8a1354402675849259fb3ebf53b67de86421735a3a"},
{file = "librt-0.6.3-cp314-cp314t-win_amd64.whl", hash = "sha256:04f8ce401d4f6380cfc42af0f4e67342bf34c820dae01343f58f472dbac75dcf"},
{file = "librt-0.6.3-cp314-cp314t-win_arm64.whl", hash = "sha256:afb39550205cc5e5c935762c6bf6a2bb34f7d21a68eadb25e2db7bf3593fecc0"},
{file = "librt-0.6.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:09262cb2445b6f15d09141af20b95bb7030c6f13b00e876ad8fdd1a9045d6aa5"},
{file = "librt-0.6.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:57705e8eec76c5b77130d729c0f70190a9773366c555c5457c51eace80afd873"},
{file = "librt-0.6.3-cp39-cp39-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:3ac2a7835434b31def8ed5355dd9b895bbf41642d61967522646d1d8b9681106"},
{file = "librt-0.6.3-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:71f0a5918aebbea1e7db2179a8fe87e8a8732340d9e8b8107401fb407eda446e"},
{file = "librt-0.6.3-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:aa346e202e6e1ebc01fe1c69509cffe486425884b96cb9ce155c99da1ecbe0e9"},
{file = "librt-0.6.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:92267f865c7bbd12327a0d394666948b9bf4b51308b52947c0cc453bfa812f5d"},
{file = "librt-0.6.3-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:86605d5bac340beb030cbc35859325982a79047ebdfba1e553719c7126a2389d"},
{file = "librt-0.6.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:98e4bbecbef8d2a60ecf731d735602feee5ac0b32117dbbc765e28b054bac912"},
{file = "librt-0.6.3-cp39-cp39-win32.whl", hash = "sha256:3caa0634c02d5ff0b2ae4a28052e0d8c5f20d497623dc13f629bd4a9e2a6efad"},
{file = "librt-0.6.3-cp39-cp39-win_amd64.whl", hash = "sha256:b47395091e7e0ece1e6ebac9b98bf0c9084d1e3d3b2739aa566be7e56e3f7bf2"},
{file = "librt-0.6.3.tar.gz", hash = "sha256:c724a884e642aa2bbad52bb0203ea40406ad742368a5f90da1b220e970384aae"},
]

[[package]]
name = "lxml"
version = "6.0.2"
@@ -1413,53 +1534,54 @@ docs = ["sphinx (>=8,<9)", "sphinx-autobuild"]
[[package]]
name = "mypy"
-version = "1.17.1"
+version = "1.19.0"
description = "Optional static typing for Python"
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
-{file = "mypy-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:3fbe6d5555bf608c47203baa3e72dbc6ec9965b3d7c318aa9a4ca76f465bd972"},
-{file = "mypy-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:80ef5c058b7bce08c83cac668158cb7edea692e458d21098c7d3bce35a5d43e7"},
-{file = "mypy-1.17.1-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c4a580f8a70c69e4a75587bd925d298434057fe2a428faaf927ffe6e4b9a98df"},
-{file = "mypy-1.17.1-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:dd86bb649299f09d987a2eebb4d52d10603224500792e1bee18303bbcc1ce390"},
-{file = "mypy-1.17.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:a76906f26bd8d51ea9504966a9c25419f2e668f012e0bdf3da4ea1526c534d94"},
-{file = "mypy-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:e79311f2d904ccb59787477b7bd5d26f3347789c06fcd7656fa500875290264b"},
-{file = "mypy-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:ad37544be07c5d7fba814eb370e006df58fed8ad1ef33ed1649cb1889ba6ff58"},
-{file = "mypy-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:064e2ff508e5464b4bd807a7c1625bc5047c5022b85c70f030680e18f37273a5"},
-{file = "mypy-1.17.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:70401bbabd2fa1aa7c43bb358f54037baf0586f41e83b0ae67dd0534fc64edfd"},
-{file = "mypy-1.17.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e92bdc656b7757c438660f775f872a669b8ff374edc4d18277d86b63edba6b8b"},
-{file = "mypy-1.17.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:c1fdf4abb29ed1cb091cf432979e162c208a5ac676ce35010373ff29247bcad5"},
-{file = "mypy-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:ff2933428516ab63f961644bc49bc4cbe42bbffb2cd3b71cc7277c07d16b1a8b"},
-{file = "mypy-1.17.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:69e83ea6553a3ba79c08c6e15dbd9bfa912ec1e493bf75489ef93beb65209aeb"},
-{file = "mypy-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1b16708a66d38abb1e6b5702f5c2c87e133289da36f6a1d15f6a5221085c6403"},
-{file = "mypy-1.17.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:89e972c0035e9e05823907ad5398c5a73b9f47a002b22359b177d40bdaee7056"},
-{file = "mypy-1.17.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:03b6d0ed2b188e35ee6d5c36b5580cffd6da23319991c49ab5556c023ccf1341"},
-{file = "mypy-1.17.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:c837b896b37cd103570d776bda106eabb8737aa6dd4f248451aecf53030cdbeb"},
-{file = "mypy-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:665afab0963a4b39dff7c1fa563cc8b11ecff7910206db4b2e64dd1ba25aed19"},
-{file = "mypy-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:93378d3203a5c0800c6b6d850ad2f19f7a3cdf1a3701d3416dbf128805c6a6a7"},
-{file = "mypy-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:15d54056f7fe7a826d897789f53dd6377ec2ea8ba6f776dc83c2902b899fee81"},
-{file = "mypy-1.17.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:209a58fed9987eccc20f2ca94afe7257a8f46eb5df1fb69958650973230f91e6"},
-{file = "mypy-1.17.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:099b9a5da47de9e2cb5165e581f158e854d9e19d2e96b6698c0d64de911dd849"},
-{file = "mypy-1.17.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa6ffadfbe6994d724c5a1bb6123a7d27dd68fc9c059561cd33b664a79578e14"},
-{file = "mypy-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:9a2b7d9180aed171f033c9f2fc6c204c1245cf60b0cb61cf2e7acc24eea78e0a"},
-{file = "mypy-1.17.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:15a83369400454c41ed3a118e0cc58bd8123921a602f385cb6d6ea5df050c733"},
-{file = "mypy-1.17.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:55b918670f692fc9fba55c3298d8a3beae295c5cded0a55dccdc5bbead814acd"},
-{file = "mypy-1.17.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:62761474061feef6f720149d7ba876122007ddc64adff5ba6f374fda35a018a0"},
-{file = "mypy-1.17.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c49562d3d908fd49ed0938e5423daed8d407774a479b595b143a3d7f87cdae6a"},
-{file = "mypy-1.17.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:397fba5d7616a5bc60b45c7ed204717eaddc38f826e3645402c426057ead9a91"},
-{file = "mypy-1.17.1-cp314-cp314-win_amd64.whl", hash = "sha256:9d6b20b97d373f41617bd0708fd46aa656059af57f2ef72aa8c7d6a2b73b74ed"},
-{file = "mypy-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5d1092694f166a7e56c805caaf794e0585cabdbf1df36911c414e4e9abb62ae9"},
-{file = "mypy-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:79d44f9bfb004941ebb0abe8eff6504223a9c1ac51ef967d1263c6572bbebc99"},
-{file = "mypy-1.17.1-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b01586eed696ec905e61bd2568f48740f7ac4a45b3a468e6423a03d3788a51a8"},
-{file = "mypy-1.17.1-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:43808d9476c36b927fbcd0b0255ce75efe1b68a080154a38ae68a7e62de8f0f8"},
-{file = "mypy-1.17.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:feb8cc32d319edd5859da2cc084493b3e2ce5e49a946377663cc90f6c15fb259"},
-{file = "mypy-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d7598cf74c3e16539d4e2f0b8d8c318e00041553d83d4861f87c7a72e95ac24d"},
-{file = "mypy-1.17.1-py3-none-any.whl", hash = "sha256:a9f52c0351c21fe24c21d8c0eb1f62967b262d6729393397b6f443c3b773c3b9"},
-{file = "mypy-1.17.1.tar.gz", hash = "sha256:25e01ec741ab5bb3eec8ba9cdb0f769230368a22c959c4937360efb89b7e9f01"},
+{file = "mypy-1.19.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6148ede033982a8c5ca1143de34c71836a09f105068aaa8b7d5edab2b053e6c8"},
+{file = "mypy-1.19.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a9ac09e52bb0f7fb912f5d2a783345c72441a08ef56ce3e17c1752af36340a39"},
+{file = "mypy-1.19.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:11f7254c15ab3f8ed68f8e8f5cbe88757848df793e31c36aaa4d4f9783fd08ab"},
+{file = "mypy-1.19.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:318ba74f75899b0e78b847d8c50821e4c9637c79d9a59680fc1259f29338cb3e"},
+{file = "mypy-1.19.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:cf7d84f497f78b682edd407f14a7b6e1a2212b433eedb054e2081380b7395aa3"},
+{file = "mypy-1.19.0-cp310-cp310-win_amd64.whl", hash = "sha256:c3385246593ac2b97f155a0e9639be906e73534630f663747c71908dfbf26134"},
+{file = "mypy-1.19.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a31e4c28e8ddb042c84c5e977e28a21195d086aaffaf08b016b78e19c9ef8106"},
+{file = "mypy-1.19.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:34ec1ac66d31644f194b7c163d7f8b8434f1b49719d403a5d26c87fff7e913f7"},
+{file = "mypy-1.19.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:cb64b0ba5980466a0f3f9990d1c582bcab8db12e29815ecb57f1408d99b4bff7"},
+{file = "mypy-1.19.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:120cffe120cca5c23c03c77f84abc0c14c5d2e03736f6c312480020082f1994b"},
+{file = "mypy-1.19.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:7a500ab5c444268a70565e374fc803972bfd1f09545b13418a5174e29883dab7"},
+{file = "mypy-1.19.0-cp311-cp311-win_amd64.whl", hash = "sha256:c14a98bc63fd867530e8ec82f217dae29d0550c86e70debc9667fff1ec83284e"},
+{file = "mypy-1.19.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:0fb3115cb8fa7c5f887c8a8d81ccdcb94cff334684980d847e5a62e926910e1d"},
+{file = "mypy-1.19.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f3e19e3b897562276bb331074d64c076dbdd3e79213f36eed4e592272dabd760"},
+{file = "mypy-1.19.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b9d491295825182fba01b6ffe2c6fe4e5a49dbf4e2bb4d1217b6ced3b4797bc6"},
+{file = "mypy-1.19.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6016c52ab209919b46169651b362068f632efcd5eb8ef9d1735f6f86da7853b2"},
+{file = "mypy-1.19.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:f188dcf16483b3e59f9278c4ed939ec0254aa8a60e8fc100648d9ab5ee95a431"},
+{file = "mypy-1.19.0-cp312-cp312-win_amd64.whl", hash = "sha256:0e3c3d1e1d62e678c339e7ade72746a9e0325de42cd2cccc51616c7b2ed1a018"},
+{file = "mypy-1.19.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:7686ed65dbabd24d20066f3115018d2dce030d8fa9db01aa9f0a59b6813e9f9e"},
+{file = "mypy-1.19.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:fd4a985b2e32f23bead72e2fb4bbe5d6aceee176be471243bd831d5b2644672d"},
+{file = "mypy-1.19.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fc51a5b864f73a3a182584b1ac75c404396a17eced54341629d8bdcb644a5bba"},
+{file = "mypy-1.19.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:37af5166f9475872034b56c5efdcf65ee25394e9e1d172907b84577120714364"},
+{file = "mypy-1.19.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:510c014b722308c9bd377993bcbf9a07d7e0692e5fa8fc70e639c1eb19fc6bee"},
+{file = "mypy-1.19.0-cp313-cp313-win_amd64.whl", hash = "sha256:cabbee74f29aa9cd3b444ec2f1e4fa5a9d0d746ce7567a6a609e224429781f53"},
+{file = "mypy-1.19.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:f2e36bed3c6d9b5f35d28b63ca4b727cb0228e480826ffc8953d1892ddc8999d"},
+{file = "mypy-1.19.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:a18d8abdda14035c5718acb748faec09571432811af129bf0d9e7b2d6699bf18"},
+{file = "mypy-1.19.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f75e60aca3723a23511948539b0d7ed514dda194bc3755eae0bfc7a6b4887aa7"},
+{file = "mypy-1.19.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8f44f2ae3c58421ee05fe609160343c25f70e3967f6e32792b5a78006a9d850f"},
+{file = "mypy-1.19.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:63ea6a00e4bd6822adbfc75b02ab3653a17c02c4347f5bb0cf1d5b9df3a05835"},
+{file = "mypy-1.19.0-cp314-cp314-win_amd64.whl", hash = "sha256:3ad925b14a0bb99821ff6f734553294aa6a3440a8cb082fe1f5b84dfb662afb1"},
+{file = "mypy-1.19.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:0dde5cb375cb94deff0d4b548b993bec52859d1651e073d63a1386d392a95495"},
+{file = "mypy-1.19.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:1cf9c59398db1c68a134b0b5354a09a1e124523f00bacd68e553b8bd16ff3299"},
+{file = "mypy-1.19.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3210d87b30e6af9c8faed61be2642fcbe60ef77cec64fa1ef810a630a4cf671c"},
+{file = "mypy-1.19.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e2c1101ab41d01303103ab6ef82cbbfedb81c1a060c868fa7cc013d573d37ab5"},
+{file = "mypy-1.19.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:0ea4fd21bb48f0da49e6d3b37ef6bd7e8228b9fe41bbf4d80d9364d11adbd43c"},
+{file = "mypy-1.19.0-cp39-cp39-win_amd64.whl", hash = "sha256:16f76ff3f3fd8137aadf593cb4607d82634fca675e8211ad75c43d86033ee6c6"},
+{file = "mypy-1.19.0-py3-none-any.whl", hash = "sha256:0c01c99d626380752e527d5ce8e69ffbba2046eb8a060db0329690849cf9b6f9"},
+{file = "mypy-1.19.0.tar.gz", hash = "sha256:f6b874ca77f733222641e5c46e4711648c4037ea13646fd0cdc814c2eaec2528"},
]

[package.dependencies]
+librt = ">=0.6.2"
mypy_extensions = ">=1.0.0"
pathspec = ">=0.9.0"
tomli = {version = ">=1.1.0", markers = "python_version < \"3.11\""}
@@ -1486,18 +1608,18 @@ files = [
[[package]]
name = "mypy-zope"
-version = "1.0.13"
+version = "1.0.14"
description = "Plugin for mypy to support zope interfaces"
optional = false
python-versions = "*"
groups = ["dev"]
files = [
-{file = "mypy_zope-1.0.13-py3-none-any.whl", hash = "sha256:13740c4cbc910cca2c143c6709e1c483c991abeeeb7b629ad6f73d8ac1edad15"},
-{file = "mypy_zope-1.0.13.tar.gz", hash = "sha256:63fb4d035ea874baf280dc69e714dcde4bd2a4a4837a0fd8d90ce91bea510f99"},
+{file = "mypy_zope-1.0.14-py3-none-any.whl", hash = "sha256:8842ade93630421dbec0c9906d6515f6e65c6407ef8b9b2eb7f4f73ae1e8a42a"},
+{file = "mypy_zope-1.0.14.tar.gz", hash = "sha256:42555ad4703f2e50c912de3ebe0c7197619c3f71864817fabc5385ecea0f8449"},
]

[package.dependencies]
-mypy = ">=1.0.0,<1.18.0"
+mypy = ">=1.0.0,<1.20.0"
"zope.interface" = "*"
"zope.schema" = "*"
@@ -1575,14 +1697,14 @@ files = [
[[package]]
name = "phonenumbers"
-version = "9.0.18"
+version = "9.0.19"
description = "Python version of Google's common library for parsing, formatting, storing and validating international phone numbers."
optional = false
python-versions = "*"
groups = ["main"]
files = [
-{file = "phonenumbers-9.0.18-py2.py3-none-any.whl", hash = "sha256:d3354454ac31c97f8a08121df97a7145b8dca641f734c6f1518a41c2f60c5764"},
-{file = "phonenumbers-9.0.18.tar.gz", hash = "sha256:5537c61ba95b11b992c95e804da6e49193cc06b1224f632ade64631518a48ed1"},
+{file = "phonenumbers-9.0.19-py2.py3-none-any.whl", hash = "sha256:004abdfe2010518c2383f148515664a742e8a5d5540e07c049735c139d7e8b09"},
+{file = "phonenumbers-9.0.19.tar.gz", hash = "sha256:e0674e31554362f4d95383558f7aefde738ef2e7bf96d28a10afd3e87d63a65c"},
]

[[package]]
@@ -1760,14 +1882,14 @@ psycopg2 = "*"
[[package]]
name = "pyasn1"
-version = "0.6.1"
+version = "0.6.2"
description = "Pure-Python implementation of ASN.1 types and DER/BER/CER codecs (X.208)"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
-{file = "pyasn1-0.6.1-py3-none-any.whl", hash = "sha256:0d632f46f2ba09143da3a8afe9e33fb6f92fa2320ab7e886e2d0f7672af84629"},
-{file = "pyasn1-0.6.1.tar.gz", hash = "sha256:6f580d2bdd84365380830acf45550f2511469f673cb4a5ae3857a3170128b034"},
+{file = "pyasn1-0.6.2-py3-none-any.whl", hash = "sha256:1eb26d860996a18e9b6ed05e7aae0e9fc21619fcee6af91cca9bad4fbea224bf"},
+{file = "pyasn1-0.6.2.tar.gz", hash = "sha256:9b59a2b25ba7e4f8197db7686c09fb33e658b98339fadb826e9512629017833b"},
]

[[package]]
@@ -1792,6 +1914,7 @@ description = "C parser in Python"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
groups = ["main", "dev"]
+markers = "implementation_name != \"PyPy\""
files = [
{file = "pycparser-2.21-py2.py3-none-any.whl", hash = "sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9"},
{file = "pycparser-2.21.tar.gz", hash = "sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206"},
@@ -2039,30 +2162,45 @@ files = [
[[package]]
name = "pynacl"
-version = "1.5.0"
+version = "1.6.2"
description = "Python binding to the Networking and Cryptography (NaCl) library"
optional = false
-python-versions = ">=3.6"
+python-versions = ">=3.8"
groups = ["main", "dev"]
files = [
-{file = "PyNaCl-1.5.0-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:401002a4aaa07c9414132aaed7f6836ff98f59277a234704ff66878c2ee4a0d1"},
-{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:52cb72a79269189d4e0dc537556f4740f7f0a9ec41c1322598799b0bdad4ef92"},
-{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a36d4a9dda1f19ce6e03c9a784a2921a4b726b02e1c736600ca9c22029474394"},
-{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl", hash = "sha256:0c84947a22519e013607c9be43706dd42513f9e6ae5d39d3613ca1e142fba44d"},
-{file = "PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:06b8f6fa7f5de8d5d2f7573fe8c863c051225a27b61e6860fd047b1775807858"},
-{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:a422368fc821589c228f4c49438a368831cb5bbc0eab5ebe1d7fac9dded6567b"},
-{file = "PyNaCl-1.5.0-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:61f642bf2378713e2c2e1de73444a3778e5f0a38be6fee0fe532fe30060282ff"},
-{file = "PyNaCl-1.5.0-cp36-abi3-win32.whl", hash = "sha256:e46dae94e34b085175f8abb3b0aaa7da40767865ac82c928eeb9e57e1ea8a543"},
-{file = "PyNaCl-1.5.0-cp36-abi3-win_amd64.whl", hash = "sha256:20f42270d27e1b6a29f54032090b972d97f0a1b0948cc52392041ef7831fee93"},
-{file = "PyNaCl-1.5.0.tar.gz", hash = "sha256:8ac7448f09ab85811607bdd21ec2464495ac8b7c66d146bf545b0f08fb9220ba"},
+{file = "pynacl-1.6.2-cp314-cp314t-macosx_10_10_universal2.whl", hash = "sha256:622d7b07cc5c02c666795792931b50c91f3ce3c2649762efb1ef0d5684c81594"},
+{file = "pynacl-1.6.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d071c6a9a4c94d79eb665db4ce5cedc537faf74f2355e4d502591d850d3913c0"},
+{file = "pynacl-1.6.2-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:fe9847ca47d287af41e82be1dd5e23023d3c31a951da134121ab02e42ac218c9"},
+{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:04316d1fc625d860b6c162fff704eb8426b1a8bcd3abacea11142cbd99a6b574"},
+{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44081faff368d6c5553ccf55322ef2819abb40e25afaec7e740f159f74813634"},
+{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:a9f9932d8d2811ce1a8ffa79dcbdf3970e7355b5c8eb0c1a881a57e7f7d96e88"},
+{file = "pynacl-1.6.2-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:bc4a36b28dd72fb4845e5d8f9760610588a96d5a51f01d84d8c6ff9849968c14"},
+{file = "pynacl-1.6.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:3bffb6d0f6becacb6526f8f42adfb5efb26337056ee0831fb9a7044d1a964444"},
+{file = "pynacl-1.6.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:2fef529ef3ee487ad8113d287a593fa26f48ee3620d92ecc6f1d09ea38e0709b"},
+{file = "pynacl-1.6.2-cp314-cp314t-win32.whl", hash = "sha256:a84bf1c20339d06dc0c85d9aea9637a24f718f375d861b2668b2f9f96fa51145"},
+{file = "pynacl-1.6.2-cp314-cp314t-win_amd64.whl", hash = "sha256:320ef68a41c87547c91a8b58903c9caa641ab01e8512ce291085b5fe2fcb7590"},
+{file = "pynacl-1.6.2-cp314-cp314t-win_arm64.whl", hash = "sha256:d29bfe37e20e015a7d8b23cfc8bd6aa7909c92a1b8f41ee416bbb3e79ef182b2"},
+{file = "pynacl-1.6.2-cp38-abi3-macosx_10_10_universal2.whl", hash = "sha256:c949ea47e4206af7c8f604b8278093b674f7c79ed0d4719cc836902bf4517465"},
+{file = "pynacl-1.6.2-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:8845c0631c0be43abdd865511c41eab235e0be69c81dc66a50911594198679b0"},
+{file = "pynacl-1.6.2-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:22de65bb9010a725b0dac248f353bb072969c94fa8d6b1f34b87d7953cf7bbe4"},
+{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:46065496ab748469cdd999246d17e301b2c24ae2fdf739132e580a0e94c94a87"},
+{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8a66d6fb6ae7661c58995f9c6435bda2b1e68b54b598a6a10247bfcdadac996c"},
+{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:26bfcd00dcf2cf160f122186af731ae30ab120c18e8375684ec2670dccd28130"},
+{file = "pynacl-1.6.2-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:c8a231e36ec2cab018c4ad4358c386e36eede0319a0c41fed24f840b1dac59f6"},
+{file = "pynacl-1.6.2-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:68be3a09455743ff9505491220b64440ced8973fe930f270c8e07ccfa25b1f9e"},
+{file = "pynacl-1.6.2-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:8b097553b380236d51ed11356c953bf8ce36a29a3e596e934ecabe76c985a577"},
+{file = "pynacl-1.6.2-cp38-abi3-win32.whl", hash = "sha256:5811c72b473b2f38f7e2a3dc4f8642e3a3e9b5e7317266e4ced1fba85cae41aa"},
+{file = "pynacl-1.6.2-cp38-abi3-win_amd64.whl", hash = "sha256:62985f233210dee6548c223301b6c25440852e13d59a8b81490203c3227c5ba0"},
+{file = "pynacl-1.6.2-cp38-abi3-win_arm64.whl", hash = "sha256:834a43af110f743a754448463e8fd61259cd4ab5bbedcf70f9dabad1d28a394c"},
+{file = "pynacl-1.6.2.tar.gz", hash = "sha256:018494d6d696ae03c7e656e5e74cdfd8ea1326962cc401bcf018f1ed8436811c"},
]

[package.dependencies]
-cffi = ">=1.4.1"
+cffi = {version = ">=2.0.0", markers = "platform_python_implementation != \"PyPy\" and python_version >= \"3.9\""}

[package.extras]
-docs = ["sphinx (>=1.6.5)", "sphinx-rtd-theme"]
+docs = ["sphinx (<7)", "sphinx_rtd_theme"]
-tests = ["hypothesis (>=3.27.0)", "pytest (>=3.2.1,!=3.3.0)"]
+tests = ["hypothesis (>=3.27.0)", "pytest (>=7.4.0)", "pytest-cov (>=2.10.1)", "pytest-xdist (>=3.5.0)"]

[[package]]
name = "pyopenssl"
@@ -2084,6 +2222,63 @@ typing-extensions = {version = ">=4.9", markers = "python_version < \"3.13\" and
docs = ["sphinx (!=5.2.0,!=5.2.0.post0,!=7.2.5)", "sphinx_rtd_theme"]
test = ["pretend", "pytest (>=3.0.1)", "pytest-rerunfailures"]

[[package]]
name = "pyparsing"
version = "3.2.5"
description = "pyparsing - Classes and methods to define and execute parsing grammars"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "pyparsing-3.2.5-py3-none-any.whl", hash = "sha256:e38a4f02064cf41fe6593d328d0512495ad1f3d8a91c4f73fc401b3079a59a5e"},
{file = "pyparsing-3.2.5.tar.gz", hash = "sha256:2df8d5b7b2802ef88e8d016a2eb9c7aeaa923529cd251ed0fe4608275d4105b6"},
]
[package.extras]
diagrams = ["jinja2", "railroad-diagrams"]
[[package]]
name = "pyrsistent"
version = "0.20.0"
description = "Persistent/Functional/Immutable data structures"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "pyrsistent-0.20.0-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:8c3aba3e01235221e5b229a6c05f585f344734bd1ad42a8ac51493d74722bbce"},
{file = "pyrsistent-0.20.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c1beb78af5423b879edaf23c5591ff292cf7c33979734c99aa66d5914ead880f"},
{file = "pyrsistent-0.20.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21cc459636983764e692b9eba7144cdd54fdec23ccdb1e8ba392a63666c60c34"},
{file = "pyrsistent-0.20.0-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f5ac696f02b3fc01a710427585c855f65cd9c640e14f52abe52020722bb4906b"},
{file = "pyrsistent-0.20.0-cp310-cp310-win32.whl", hash = "sha256:0724c506cd8b63c69c7f883cc233aac948c1ea946ea95996ad8b1380c25e1d3f"},
{file = "pyrsistent-0.20.0-cp310-cp310-win_amd64.whl", hash = "sha256:8441cf9616d642c475684d6cf2520dd24812e996ba9af15e606df5f6fd9d04a7"},
{file = "pyrsistent-0.20.0-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:0f3b1bcaa1f0629c978b355a7c37acd58907390149b7311b5db1b37648eb6958"},
{file = "pyrsistent-0.20.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5cdd7ef1ea7a491ae70d826b6cc64868de09a1d5ff9ef8d574250d0940e275b8"},
{file = "pyrsistent-0.20.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cae40a9e3ce178415040a0383f00e8d68b569e97f31928a3a8ad37e3fde6df6a"},
{file = "pyrsistent-0.20.0-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6288b3fa6622ad8a91e6eb759cfc48ff3089e7c17fb1d4c59a919769314af224"},
{file = "pyrsistent-0.20.0-cp311-cp311-win32.whl", hash = "sha256:7d29c23bdf6e5438c755b941cef867ec2a4a172ceb9f50553b6ed70d50dfd656"},
{file = "pyrsistent-0.20.0-cp311-cp311-win_amd64.whl", hash = "sha256:59a89bccd615551391f3237e00006a26bcf98a4d18623a19909a2c48b8e986ee"},
{file = "pyrsistent-0.20.0-cp312-cp312-macosx_10_9_universal2.whl", hash = "sha256:09848306523a3aba463c4b49493a760e7a6ca52e4826aa100ee99d8d39b7ad1e"},
{file = "pyrsistent-0.20.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a14798c3005ec892bbada26485c2eea3b54109cb2533713e355c806891f63c5e"},
{file = "pyrsistent-0.20.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b14decb628fac50db5e02ee5a35a9c0772d20277824cfe845c8a8b717c15daa3"},
{file = "pyrsistent-0.20.0-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2e2c116cc804d9b09ce9814d17df5edf1df0c624aba3b43bc1ad90411487036d"},
{file = "pyrsistent-0.20.0-cp312-cp312-win32.whl", hash = "sha256:e78d0c7c1e99a4a45c99143900ea0546025e41bb59ebc10182e947cf1ece9174"},
{file = "pyrsistent-0.20.0-cp312-cp312-win_amd64.whl", hash = "sha256:4021a7f963d88ccd15b523787d18ed5e5269ce57aa4037146a2377ff607ae87d"},
{file = "pyrsistent-0.20.0-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:79ed12ba79935adaac1664fd7e0e585a22caa539dfc9b7c7c6d5ebf91fb89054"},
{file = "pyrsistent-0.20.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f920385a11207dc372a028b3f1e1038bb244b3ec38d448e6d8e43c6b3ba20e98"},
{file = "pyrsistent-0.20.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f5c2d012671b7391803263419e31b5c7c21e7c95c8760d7fc35602353dee714"},
{file = "pyrsistent-0.20.0-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ef3992833fbd686ee783590639f4b8343a57f1f75de8633749d984dc0eb16c86"},
{file = "pyrsistent-0.20.0-cp38-cp38-win32.whl", hash = "sha256:881bbea27bbd32d37eb24dd320a5e745a2a5b092a17f6debc1349252fac85423"},
{file = "pyrsistent-0.20.0-cp38-cp38-win_amd64.whl", hash = "sha256:6d270ec9dd33cdb13f4d62c95c1a5a50e6b7cdd86302b494217137f760495b9d"},
{file = "pyrsistent-0.20.0-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:ca52d1ceae015859d16aded12584c59eb3825f7b50c6cfd621d4231a6cc624ce"},
{file = "pyrsistent-0.20.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b318ca24db0f0518630e8b6f3831e9cba78f099ed5c1d65ffe3e023003043ba0"},
{file = "pyrsistent-0.20.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fed2c3216a605dc9a6ea50c7e84c82906e3684c4e80d2908208f662a6cbf9022"},
{file = "pyrsistent-0.20.0-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2e14c95c16211d166f59c6611533d0dacce2e25de0f76e4c140fde250997b3ca"},
{file = "pyrsistent-0.20.0-cp39-cp39-win32.whl", hash = "sha256:f058a615031eea4ef94ead6456f5ec2026c19fb5bd6bfe86e9665c4158cf802f"},
{file = "pyrsistent-0.20.0-cp39-cp39-win_amd64.whl", hash = "sha256:58b8f6366e152092194ae68fefe18b9f0b4f89227dfd86a07770c3d86097aebf"},
{file = "pyrsistent-0.20.0-py3-none-any.whl", hash = "sha256:c55acc4733aad6560a7f5f818466631f07efc001fd023f34a6c203f8b6df0f0b"},
{file = "pyrsistent-0.20.0.tar.gz", hash = "sha256:4c48f78f62ab596c679086084d0dd13254ae4f3d6c72a83ffdf5ebdef8f265a4"},
]
[[package]] [[package]]
name = "pysaml2" name = "pysaml2"
version = "7.5.0" version = "7.5.0"
@ -2139,15 +2334,15 @@ files = [
[[package]] [[package]]
name = "pytz" name = "pytz"
version = "2022.7.1" version = "2025.2"
description = "World timezone definitions, modern and historical" description = "World timezone definitions, modern and historical"
optional = true optional = true
python-versions = "*" python-versions = "*"
groups = ["main"] groups = ["main"]
markers = "extra == \"all\" or extra == \"saml2\"" markers = "extra == \"all\" or extra == \"saml2\""
files = [ files = [
{file = "pytz-2022.7.1-py2.py3-none-any.whl", hash = "sha256:78f4f37d8198e0627c5f1143240bb0206b8691d8d7ac6d78fee88b78733f8c4a"}, {file = "pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00"},
{file = "pytz-2022.7.1.tar.gz", hash = "sha256:01a0681c4b9684a28304615eba55d1ab31ae00bf68ec157ec3708a8182dbbcd0"}, {file = "pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3"},
] ]
[[package]] [[package]]
@ -2481,31 +2676,31 @@ files = [
[[package]] [[package]]
name = "ruff" name = "ruff"
version = "0.14.5" version = "0.14.6"
description = "An extremely fast Python linter and code formatter, written in Rust." description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false optional = false
python-versions = ">=3.7" python-versions = ">=3.7"
groups = ["dev"] groups = ["dev"]
files = [ files = [
{file = "ruff-0.14.5-py3-none-linux_armv6l.whl", hash = "sha256:f3b8248123b586de44a8018bcc9fefe31d23dda57a34e6f0e1e53bd51fd63594"}, {file = "ruff-0.14.6-py3-none-linux_armv6l.whl", hash = "sha256:d724ac2f1c240dbd01a2ae98db5d1d9a5e1d9e96eba999d1c48e30062df578a3"},
{file = "ruff-0.14.5-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:f7a75236570318c7a30edd7f5491945f0169de738d945ca8784500b517163a72"}, {file = "ruff-0.14.6-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:9f7539ea257aa4d07b7ce87aed580e485c40143f2473ff2f2b75aee003186004"},
{file = "ruff-0.14.5-py3-none-macosx_11_0_arm64.whl", hash = "sha256:6d146132d1ee115f8802356a2dc9a634dbf58184c51bff21f313e8cd1c74899a"}, {file = "ruff-0.14.6-py3-none-macosx_11_0_arm64.whl", hash = "sha256:7f6007e55b90a2a7e93083ba48a9f23c3158c433591c33ee2e99a49b889c6332"},
{file = "ruff-0.14.5-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e2380596653dcd20b057794d55681571a257a42327da8894b93bbd6111aa801f"}, {file = "ruff-0.14.6-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a8e7b9d73d8728b68f632aa8e824ef041d068d231d8dbc7808532d3629a6bef"},
{file = "ruff-0.14.5-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2d1fa985a42b1f075a098fa1ab9d472b712bdb17ad87a8ec86e45e7fa6273e68"}, {file = "ruff-0.14.6-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d50d45d4553a3ebcbd33e7c5e0fe6ca4aafd9a9122492de357205c2c48f00775"},
{file = "ruff-0.14.5-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88f0770d42b7fa02bbefddde15d235ca3aa24e2f0137388cc15b2dcbb1f7c7a7"}, {file = "ruff-0.14.6-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:118548dd121f8a21bfa8ab2c5b80e5b4aed67ead4b7567790962554f38e598ce"},
{file = "ruff-0.14.5-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:3676cb02b9061fee7294661071c4709fa21419ea9176087cb77e64410926eb78"}, {file = "ruff-0.14.6-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:57256efafbfefcb8748df9d1d766062f62b20150691021f8ab79e2d919f7c11f"},
{file = "ruff-0.14.5-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b595bedf6bc9cab647c4a173a61acf4f1ac5f2b545203ba82f30fcb10b0318fb"}, {file = "ruff-0.14.6-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ff18134841e5c68f8e5df1999a64429a02d5549036b394fafbe410f886e1989d"},
{file = "ruff-0.14.5-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f55382725ad0bdb2e8ee2babcbbfb16f124f5a59496a2f6a46f1d9d99d93e6e2"}, {file = "ruff-0.14.6-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:29c4b7ec1e66a105d5c27bd57fa93203637d66a26d10ca9809dc7fc18ec58440"},
{file = "ruff-0.14.5-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7497d19dce23976bdaca24345ae131a1d38dcfe1b0850ad8e9e6e4fa321a6e19"}, {file = "ruff-0.14.6-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:167843a6f78680746d7e226f255d920aeed5e4ad9c03258094a2d49d3028b105"},
{file = "ruff-0.14.5-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:410e781f1122d6be4f446981dd479470af86537fb0b8857f27a6e872f65a38e4"}, {file = "ruff-0.14.6-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:16a33af621c9c523b1ae006b1b99b159bf5ac7e4b1f20b85b2572455018e0821"},
{file = "ruff-0.14.5-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:c01be527ef4c91a6d55e53b337bfe2c0f82af024cc1a33c44792d6844e2331e1"}, {file = "ruff-0.14.6-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:1432ab6e1ae2dc565a7eea707d3b03a0c234ef401482a6f1621bc1f427c2ff55"},
{file = "ruff-0.14.5-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:f66e9bb762e68d66e48550b59c74314168ebb46199886c5c5aa0b0fbcc81b151"}, {file = "ruff-0.14.6-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:4c55cfbbe7abb61eb914bfd20683d14cdfb38a6d56c6c66efa55ec6570ee4e71"},
{file = "ruff-0.14.5-py3-none-musllinux_1_2_i686.whl", hash = "sha256:d93be8f1fa01022337f1f8f3bcaa7ffee2d0b03f00922c45c2207954f351f465"}, {file = "ruff-0.14.6-py3-none-musllinux_1_2_i686.whl", hash = "sha256:efea3c0f21901a685fff4befda6d61a1bf4cb43de16da87e8226a281d614350b"},
{file = "ruff-0.14.5-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:c135d4b681f7401fe0e7312017e41aba9b3160861105726b76cfa14bc25aa367"}, {file = "ruff-0.14.6-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:344d97172576d75dc6afc0e9243376dbe1668559c72de1864439c4fc95f78185"},
{file = "ruff-0.14.5-py3-none-win32.whl", hash = "sha256:c83642e6fccfb6dea8b785eb9f456800dcd6a63f362238af5fc0c83d027dd08b"}, {file = "ruff-0.14.6-py3-none-win32.whl", hash = "sha256:00169c0c8b85396516fdd9ce3446c7ca20c2a8f90a77aa945ba6b8f2bfe99e85"},
{file = "ruff-0.14.5-py3-none-win_amd64.whl", hash = "sha256:9d55d7af7166f143c94eae1db3312f9ea8f95a4defef1979ed516dbb38c27621"}, {file = "ruff-0.14.6-py3-none-win_amd64.whl", hash = "sha256:390e6480c5e3659f8a4c8d6a0373027820419ac14fa0d2713bd8e6c3e125b8b9"},
{file = "ruff-0.14.5-py3-none-win_arm64.whl", hash = "sha256:4b700459d4649e2594b31f20a9de33bc7c19976d4746d8d0798ad959621d64a4"}, {file = "ruff-0.14.6-py3-none-win_arm64.whl", hash = "sha256:d43c81fbeae52cfa8728d8766bbf46ee4298c888072105815b392da70ca836b2"},
{file = "ruff-0.14.5.tar.gz", hash = "sha256:8d3b48d7d8aad423d3137af7ab6c8b1e38e4de104800f0d596990f6ada1a9fc1"}, {file = "ruff-0.14.6.tar.gz", hash = "sha256:6f0c742ca6a7783a736b867a263b9a7a80a45ce9bee391eeda296895f1b4e1cc"},
] ]
[[package]] [[package]]
@ -3192,21 +3387,21 @@ files = [
[[package]] [[package]]
name = "urllib3" name = "urllib3"
version = "2.5.0" version = "2.6.3"
description = "HTTP library with thread-safe connection pooling, file post, and more." description = "HTTP library with thread-safe connection pooling, file post, and more."
optional = false optional = false
python-versions = ">=3.9" python-versions = ">=3.9"
groups = ["main", "dev"] groups = ["main", "dev"]
files = [ files = [
{file = "urllib3-2.5.0-py3-none-any.whl", hash = "sha256:e6b01673c0fa6a13e374b50871808eb3bf7046c4b125b216f6bf1cc604cff0dc"}, {file = "urllib3-2.6.3-py3-none-any.whl", hash = "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4"},
{file = "urllib3-2.5.0.tar.gz", hash = "sha256:3fc47733c7e419d4bc3f6b3dc2b4f890bb743906a30d56ba4a5bfa4bbff92760"}, {file = "urllib3-2.6.3.tar.gz", hash = "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed"},
] ]
[package.extras] [package.extras]
brotli = ["brotli (>=1.0.9) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=0.8.0) ; platform_python_implementation != \"CPython\""] brotli = ["brotli (>=1.2.0) ; platform_python_implementation == \"CPython\"", "brotlicffi (>=1.2.0.0) ; platform_python_implementation != \"CPython\""]
h2 = ["h2 (>=4,<5)"] h2 = ["h2 (>=4,<5)"]
socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"]
zstd = ["zstandard (>=0.18.0)"] zstd = ["backports-zstd (>=1.0.0) ; python_version < \"3.14\""]
[[package]] [[package]]
name = "webencodings" name = "webencodings"
@ -3345,15 +3540,15 @@ docs = ["Sphinx", "repoze.sphinx.autointerface"]
test = ["zope.i18nmessageid", "zope.testing", "zope.testrunner"] test = ["zope.i18nmessageid", "zope.testing", "zope.testrunner"]
[extras] [extras]
all = ["authlib", "hiredis", "jaeger-client", "lxml", "matrix-synapse-ldap3", "opentracing", "psycopg2", "psycopg2cffi", "psycopg2cffi-compat", "pympler", "pysaml2", "sentry-sdk", "txredisapi"] all = ["authlib", "defusedxml", "hiredis", "jaeger-client", "lxml", "matrix-synapse-ldap3", "opentracing", "psycopg2", "psycopg2cffi", "psycopg2cffi-compat", "pympler", "pysaml2", "pytz", "sentry-sdk", "thrift", "tornado", "txredisapi"]
cache-memory = ["pympler"] cache-memory = ["pympler"]
jwt = ["authlib"] jwt = ["authlib"]
matrix-synapse-ldap3 = ["matrix-synapse-ldap3"] matrix-synapse-ldap3 = ["matrix-synapse-ldap3"]
oidc = ["authlib"] oidc = ["authlib"]
opentracing = ["jaeger-client", "opentracing"] opentracing = ["jaeger-client", "opentracing", "thrift", "tornado"]
postgres = ["psycopg2", "psycopg2cffi", "psycopg2cffi-compat"] postgres = ["psycopg2", "psycopg2cffi", "psycopg2cffi-compat"]
redis = ["hiredis", "txredisapi"] redis = ["hiredis", "txredisapi"]
saml2 = ["pysaml2"] saml2 = ["defusedxml", "pysaml2", "pytz"]
sentry = ["sentry-sdk"] sentry = ["sentry-sdk"]
systemd = ["systemd-python"] systemd = ["systemd-python"]
test = ["idna", "parameterized"] test = ["idna", "parameterized"]
@ -3362,4 +3557,4 @@ url-preview = ["lxml"]
[metadata] [metadata]
lock-version = "2.1" lock-version = "2.1"
python-versions = ">=3.10.0,<4.0.0" python-versions = ">=3.10.0,<4.0.0"
content-hash = "98b9062f48205a3bcc99b43ae665083d360a15d4a208927fa978df9c36fd5315" content-hash = "1caa5072f6304122c89377420f993a54f54587f3618ccc8094ec31642264592c"


@ -1,6 +1,6 @@
[project] [project]
name = "matrix-synapse" name = "matrix-synapse"
version = "1.144.0" version = "1.145.0"
description = "Homeserver for the Matrix decentralised comms protocol" description = "Homeserver for the Matrix decentralised comms protocol"
readme = "README.rst" readme = "README.rst"
authors = [ authors = [
@ -42,7 +42,8 @@ dependencies = [
"Twisted[tls]>=21.2.0", "Twisted[tls]>=21.2.0",
"treq>=21.5.0", "treq>=21.5.0",
# Twisted has required pyopenssl 16.0 since about Twisted 16.6. # Twisted has required pyopenssl 16.0 since about Twisted 16.6.
"pyOpenSSL>=16.0.0", # pyOpenSSL 16.2.0 fixes compatibility with OpenSSL 1.1.0.
"pyOpenSSL>=16.2.0",
"PyYAML>=5.3", "PyYAML>=5.3",
"pyasn1>=0.1.9", "pyasn1>=0.1.9",
"pyasn1-modules>=0.0.7", "pyasn1-modules>=0.0.7",
@ -95,6 +96,25 @@ dependencies = [
# This is used for parsing multipart responses # This is used for parsing multipart responses
"python-multipart>=0.0.9", "python-multipart>=0.0.9",
# Transitive dependency constraints
# These dependencies aren't directly required by Synapse.
# However, in order for Synapse to build, Synapse requires a higher minimum version
# for these dependencies than the minimum specified by the direct dependency.
# We should periodically check to see if these dependencies are still necessary and
# remove any that are no longer required.
"cffi>=1.15", # via cryptography
"pynacl>=1.3", # via signedjson
"pyparsing>=2.4", # via packaging
"pyrsistent>=0.18.0", # via jsonschema
"requests>=2.16.0", # 2.16.0+ no longer vendors urllib3, avoiding Python 3.10+ incompatibility
"urllib3>=1.26.5", # via treq; 1.26.5 fixes Python 3.10+ collections.abc compatibility
# 5.2 is the current version in Debian oldstable. If we don't care to support that, then 5.4 is
# the minimum version from Ubuntu 22.04 and RHEL 9. (as of 2025-12)
# When bumping this version to 6.2 or above, refer to https://github.com/element-hq/synapse/pull/19274
# for details of Synapse improvements that may be unlocked. Particularly around the use of `|`
# syntax with zope interface types.
"zope-interface>=5.2", # via twisted
] ]
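As a quick illustration of the transitive floors pinned above, here is a hypothetical checker (not part of Synapse; it assumes the `packaging` library is available) that compares installed versions against a few of those specifiers:

```python
# Hypothetical helper (not part of Synapse) that sanity-checks a few of the
# transitive floors pinned above against what is actually installed.
from importlib.metadata import version

from packaging.specifiers import SpecifierSet

FLOORS = {
    "cffi": ">=1.15",           # via cryptography
    "pynacl": ">=1.3",          # via signedjson
    "pyparsing": ">=2.4",       # via packaging
    "zope.interface": ">=5.2",  # via twisted
}

for name, spec in FLOORS.items():
    installed = version(name)  # raises PackageNotFoundError if missing
    status = "ok" if installed in SpecifierSet(spec) else f"violates {spec}"
    print(f"{name} {installed}: {status}")
```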
[project.optional-dependencies] [project.optional-dependencies]
@ -104,7 +124,16 @@ postgres = [
"psycopg2cffi>=2.8;platform_python_implementation == 'PyPy'", "psycopg2cffi>=2.8;platform_python_implementation == 'PyPy'",
"psycopg2cffi-compat==1.1;platform_python_implementation == 'PyPy'", "psycopg2cffi-compat==1.1;platform_python_implementation == 'PyPy'",
] ]
saml2 = ["pysaml2>=4.5.0"] saml2 = [
"pysaml2>=4.5.0",
# Transitive dependencies from pysaml2
# These dependencies aren't directly required by Synapse.
# However, in order for Synapse to build, Synapse requires a higher minimum version
# for these dependencies than the minimum specified by the direct dependency.
"defusedxml>=0.7.1", # via pysaml2
"pytz>=2018.3", # via pysaml2
]
oidc = ["authlib>=0.15.1"] oidc = ["authlib>=0.15.1"]
# systemd-python is necessary for logging to the systemd journal via # systemd-python is necessary for logging to the systemd journal via
# `systemd.journal.JournalHandler`, as is documented in # `systemd.journal.JournalHandler`, as is documented in
@ -112,15 +141,25 @@ oidc = ["authlib>=0.15.1"]
systemd = ["systemd-python>=231"] systemd = ["systemd-python>=231"]
url-preview = ["lxml>=4.6.3"] url-preview = ["lxml>=4.6.3"]
sentry = ["sentry-sdk>=0.7.2"] sentry = ["sentry-sdk>=0.7.2"]
opentracing = ["jaeger-client>=4.2.0", "opentracing>=2.2.0"] opentracing = [
"jaeger-client>=4.2.0",
"opentracing>=2.2.0",
# Transitive dependencies from jaeger-client
# These dependencies aren't directly required by Synapse.
# However, in order for Synapse to build, Synapse requires a higher minimum version
# for these dependencies than the minimum specified by the direct dependency.
"thrift>=0.10", # via jaeger-client
"tornado>=6.0", # via jaeger-client
]
jwt = ["authlib"] jwt = ["authlib"]
# hiredis is not a *strict* dependency, but it makes things much faster. # hiredis is not a *strict* dependency, but it makes things much faster.
# (if it is not installed, we fall back to slow code.) # (if it is not installed, we fall back to slow code.)
redis = ["txredisapi>=1.4.7", "hiredis"] redis = ["txredisapi>=1.4.7", "hiredis>=0.3"]
# Required to use experimental `caches.track_memory_usage` config option. # Required to use experimental `caches.track_memory_usage` config option.
cache-memory = ["pympler"] cache-memory = ["pympler>=1.0"]
# If this is updated, don't forget to update the equivalent lines in # If this is updated, don't forget to update the equivalent lines in
# tool.poetry.group.dev.dependencies. # `dependency-groups.dev` below.
test = ["parameterized>=0.9.0", "idna>=3.3"] test = ["parameterized>=0.9.0", "idna>=3.3"]
# The duplication here is awful. # The duplication here is awful.
@ -149,12 +188,22 @@ all = [
# opentracing # opentracing
"jaeger-client>=4.2.0", "opentracing>=2.2.0", "jaeger-client>=4.2.0", "opentracing>=2.2.0",
# redis # redis
"txredisapi>=1.4.7", "hiredis", "txredisapi>=1.4.7", "hiredis>=0.3",
# cache-memory # cache-memory
"pympler", # 1.0 added support for python 3.10, our current minimum supported python version
"pympler>=1.0",
# omitted: # omitted:
# - test: it's useful to have this separate from dev deps in the olddeps job # - test: it's useful to have this separate from dev deps in the olddeps job
# - systemd: this is a system-based requirement # - systemd: this is a system-based requirement
# Transitive dependencies
# These dependencies aren't directly required by Synapse.
# However, in order for Synapse to build, Synapse requires a higher minimum version
# for these dependencies than the minimum specified by the direct dependency.
"defusedxml>=0.7.1", # via pysaml2
"pytz>=2018.3", # via pysaml2
"thrift>=0.10", # via jaeger-client
"tornado>=6.0", # via jaeger-client
] ]
[project.urls] [project.urls]
@ -177,6 +226,85 @@ synapse_port_db = "synapse._scripts.synapse_port_db:main"
synapse_review_recent_signups = "synapse._scripts.review_recent_signups:main" synapse_review_recent_signups = "synapse._scripts.review_recent_signups:main"
update_synapse_database = "synapse._scripts.update_synapse_database:main" update_synapse_database = "synapse._scripts.update_synapse_database:main"
[tool.poetry]
packages = [{ include = "synapse" }]
[tool.poetry.build]
# Compile our rust module when using `poetry install`. This is still required
# while using `poetry` as the build frontend. Saves the developer from needing
# to run both:
#
# $ poetry install
# $ maturin develop
script = "build_rust.py"
# Create a `setup.py` file which will call the `build` method in our build
# script.
#
# Our build script currently uses the "old" build method, where we define a
# `build` method and `setup.py` calls it. Poetry developers have mentioned that
# this will eventually be removed:
# https://github.com/matrix-org/synapse/pull/14949#issuecomment-1418001859
#
# The new build method is defined here:
# https://python-poetry.org/docs/building-extension-modules/#maturin-build-script
# but is still marked as "unstable" at the time of writing. This would also
# bump our minimum `poetry-core` version to 1.5.0.
#
# We can just drop this work-around entirely if migrating away from
# Poetry, thus there's little motivation to update the build script.
generate-setup-file = true
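For context, the "old" build method works roughly as follows: the generated `setup.py` imports the build script and calls its `build` function, which registers the Rust extension. A minimal sketch, assuming a setuptools-rust based extension (the real `build_rust.py` may differ in detail):

```python
# Minimal sketch of the "old-style" build script shape described above: the
# generated setup.py imports this module and calls build(setup_kwargs).
# (Illustrative only; the real build_rust.py differs.)
from setuptools_rust import Binding, RustExtension


def build(setup_kwargs: dict) -> None:
    setup_kwargs.setdefault("rust_extensions", []).append(
        RustExtension(
            "synapse.synapse_rust",  # module name, matching [tool.maturin] above
            path="rust/Cargo.toml",
            binding=Binding.PyO3,
        )
    )
```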
# Dependencies used for developing Synapse itself.
#
# Hold off on migrating these to `dev-dependencies` (PEP 735) for now until
# Poetry 2.2.0+, pip 25.1+ are more widely available.
[tool.poetry.group.dev.dependencies]
# We pin development dependencies in poetry.lock so that our tests don't start
# failing on new releases. Keeping lower bounds loose here means that dependabot
# can bump versions without having to update the content-hash in the lockfile.
# This helps prevent merge conflicts when running a batch of dependabot updates.
ruff = "0.14.6"
# Typechecking
lxml-stubs = ">=0.4.0"
mypy = "*"
mypy-zope = "*"
types-bleach = ">=4.1.0"
types-jsonschema = ">=3.2.0"
types-netaddr = ">=0.8.0.6"
types-opentracing = ">=2.4.2"
types-Pillow = ">=8.3.4"
types-psycopg2 = ">=2.9.9"
types-pyOpenSSL = ">=20.0.7"
types-PyYAML = ">=5.4.10"
types-requests = ">=2.26.0"
types-setuptools = ">=57.4.0"
# Dependencies which are exclusively required by unit test code. This is
# NOT a list of all modules that are necessary to run the unit tests.
# Tests assume that all optional dependencies are installed.
#
# If this is updated, don't forget to update the equivalent lines in
# project.optional-dependencies.test.
parameterized = ">=0.9.0"
idna = ">=3.3"
# The following are used by the release script
click = ">=8.1.3"
# GitPython was == 3.1.14; bumped to 3.1.20, the first release with type hints.
GitPython = ">=3.1.20"
markdown-it-py = ">=3.0.0"
pygithub = ">=1.59"
# The following are executed as commands by the release script.
twine = "*"
# Towncrier min version comes from https://github.com/matrix-org/synapse/pull/3425. Rationale unclear.
towncrier = ">=18.6.0rc1"
# Used for checking the Poetry lockfile
tomli = ">=1.2.3"
# Used for checking the schema delta files
sqlglot = ">=28.0.0"
[tool.towncrier] [tool.towncrier]
package = "synapse" package = "synapse"
@ -260,15 +388,10 @@ select = [
"G", "G",
# pyupgrade # pyupgrade
"UP006", "UP006",
"UP007",
"UP045",
] ]
extend-safe-fixes = [ extend-safe-fixes = [
# pyupgrade rules compatible with Python >= 3.9 # pyupgrade rules compatible with Python >= 3.9
"UP006", "UP006",
"UP007",
# pyupgrade rules compatible with Python >= 3.10
"UP045",
# Allow ruff to automatically fix trailing spaces within a multi-line string/comment. # Allow ruff to automatically fix trailing spaces within a multi-line string/comment.
"W293" "W293"
] ]
@ -291,25 +414,23 @@ line-ending = "auto"
[tool.maturin] [tool.maturin]
manifest-path = "rust/Cargo.toml" manifest-path = "rust/Cargo.toml"
module-name = "synapse.synapse_rust" module-name = "synapse.synapse_rust"
python-source = "."
[tool.poetry]
packages = [
{ include = "synapse" },
]
include = [ include = [
{ path = "AUTHORS.rst", format = "sdist" }, { path = "AUTHORS.rst", format = "sdist" },
{ path = "book.toml", format = "sdist" }, { path = "book.toml", format = "sdist" },
{ path = "changelog.d", format = "sdist" }, { path = "changelog.d/**/*", format = "sdist" },
{ path = "CHANGES.md", format = "sdist" }, { path = "CHANGES.md", format = "sdist" },
{ path = "CONTRIBUTING.md", format = "sdist" }, { path = "CONTRIBUTING.md", format = "sdist" },
{ path = "demo", format = "sdist" }, { path = "demo/**/*", format = "sdist" },
{ path = "docs", format = "sdist" }, { path = "docs/**/*", format = "sdist" },
{ path = "INSTALL.md", format = "sdist" }, { path = "INSTALL.md", format = "sdist" },
{ path = "LICENSE-AGPL-3.0", format = "sdist" },
{ path = "LICENSE-COMMERCIAL", format = "sdist" },
{ path = "mypy.ini", format = "sdist" }, { path = "mypy.ini", format = "sdist" },
{ path = "scripts-dev", format = "sdist" }, { path = "scripts-dev/**/*", format = "sdist" },
{ path = "synmark", format="sdist" }, { path = "synmark/**/*", format = "sdist" },
{ path = "sytest-blacklist", format = "sdist" }, { path = "sytest-blacklist", format = "sdist" },
{ path = "tests", format = "sdist" }, { path = "tests/**/*", format = "sdist" },
{ path = "UPGRADE.rst", format = "sdist" }, { path = "UPGRADE.rst", format = "sdist" },
{ path = "Cargo.toml", format = "sdist" }, { path = "Cargo.toml", format = "sdist" },
{ path = "Cargo.lock", format = "sdist" }, { path = "Cargo.lock", format = "sdist" },
@ -318,62 +439,9 @@ include = [
{ path = "rust/src/**", format = "sdist" }, { path = "rust/src/**", format = "sdist" },
] ]
exclude = [ exclude = [
{ path = "synapse/*.so", format = "sdist"} { path = "synapse/*.so", format = "sdist" },
] ]
[tool.poetry.build]
script = "build_rust.py"
generate-setup-file = true
[tool.poetry.group.dev.dependencies]
# We pin development dependencies in poetry.lock so that our tests don't start
# failing on new releases. Keeping lower bounds loose here means that dependabot
# can bump versions without having to update the content-hash in the lockfile.
# This helps prevent merge conflicts when running a batch of dependabot updates.
ruff = "0.14.5"
# Typechecking
lxml-stubs = ">=0.4.0"
mypy = "*"
mypy-zope = "*"
types-bleach = ">=4.1.0"
types-jsonschema = ">=3.2.0"
types-netaddr = ">=0.8.0.6"
types-opentracing = ">=2.4.2"
types-Pillow = ">=8.3.4"
types-psycopg2 = ">=2.9.9"
types-pyOpenSSL = ">=20.0.7"
types-PyYAML = ">=5.4.10"
types-requests = ">=2.26.0"
types-setuptools = ">=57.4.0"
# Dependencies which are exclusively required by unit test code. This is
# NOT a list of all modules that are necessary to run the unit tests.
# Tests assume that all optional dependencies are installed.
#
# If this is updated, don't forget to update the equivalent lines in
# project.optional-dependencies.test.
parameterized = ">=0.9.0"
idna = ">=3.3"
# The following are used by the release script
click = ">=8.1.3"
# GitPython was == 3.1.14; bumped to 3.1.20, the first release with type hints.
GitPython = ">=3.1.20"
markdown-it-py = ">=3.0.0"
pygithub = ">=1.59"
# The following are executed as commands by the release script.
twine = "*"
# Towncrier min version comes from https://github.com/matrix-org/synapse/pull/3425. Rationale unclear.
towncrier = ">=18.6.0rc1"
# Used for checking the Poetry lockfile
tomli = ">=1.2.3"
# Used for checking the schema delta files
sqlglot = ">=28.0.0"
[build-system] [build-system]
# The upper bounds here are defensive, intended to prevent situations like # The upper bounds here are defensive, intended to prevent situations like
# https://github.com/matrix-org/synapse/issues/13849 and # https://github.com/matrix-org/synapse/issues/13849 and
@ -381,8 +449,8 @@ sqlglot = ">=28.0.0"
# runtime errors caused by build system changes. # runtime errors caused by build system changes.
# We are happy to raise these upper bounds upon request, # We are happy to raise these upper bounds upon request,
# provided we check that it's safe to do so (i.e. that CI passes). # provided we check that it's safe to do so (i.e. that CI passes).
requires = ["poetry-core>=2.0.0,<=2.1.3", "setuptools_rust>=1.3,<=1.11.1"] requires = ["maturin>=1.0,<2.0"]
build-backend = "poetry.core.masonry.api" build-backend = "maturin"
[tool.cibuildwheel] [tool.cibuildwheel]
@ -407,9 +475,6 @@ skip = "cp3??t-* *i686* *macosx*"
enable = "pypy" enable = "pypy"
# We need a rust compiler. # We need a rust compiler.
#
# We temporarily pin Rust to 1.82.0 to work around
# https://github.com/element-hq/synapse/issues/17988
before-all = "sh .ci/before_build_wheel.sh" before-all = "sh .ci/before_build_wheel.sh"
environment= { PATH = "$PATH:$HOME/.cargo/bin" } environment= { PATH = "$PATH:$HOME/.cargo/bin" }
@ -419,8 +484,3 @@ environment= { PATH = "$PATH:$HOME/.cargo/bin" }
before-build = "rm -rf {project}/build" before-build = "rm -rf {project}/build"
build-frontend = "build" build-frontend = "build"
test-command = "python -c 'from synapse.synapse_rust import sum_as_string; print(sum_as_string(1, 2))'" test-command = "python -c 'from synapse.synapse_rust import sum_as_string; print(sum_as_string(1, 2))'"
[tool.cibuildwheel.linux]
# Wrap the repair command to correctly rename the built cpython wheels as ABI3.
repair-wheel-command = "./.ci/scripts/auditwheel_wrapper.py -w {dest_dir} {wheel}"


@ -1,5 +1,5 @@
$schema: https://element-hq.github.io/synapse/latest/schema/v1/meta.schema.json $schema: https://element-hq.github.io/synapse/latest/schema/v1/meta.schema.json
$id: https://element-hq.github.io/synapse/schema/synapse/v1.144/synapse-config.schema.json $id: https://element-hq.github.io/synapse/schema/synapse/v1.145/synapse-config.schema.json
type: object type: object
properties: properties:
modules: modules:
@ -2274,6 +2274,16 @@ properties:
examples: examples:
- per_second: 1.0 - per_second: 1.0
burst_count: 5.0 burst_count: 5.0
rc_user_directory:
$ref: "#/$defs/rc"
description: >-
This option allows admins to ratelimit searches in the user directory.
_Added in Synapse 1.145.0._
default:
per_second: 0.016
burst_count: 200.0
federation_rr_transactions_per_room_per_second: federation_rr_transactions_per_room_per_second:
type: integer type: integer
description: >- description: >-
@ -2338,6 +2348,19 @@ properties:
default: true default: true
examples: examples:
- false - false
enable_local_media_storage:
type: boolean
description: >-
Enable the local on-disk media storage provider. When disabled, media is
stored only in configured `media_storage_providers` and temporary files are
used for processing.
**Warning:** If this option is set to `false` and no `media_storage_providers`
are configured, all media requests will return 404 errors as there will be
no storage backend available.
default: true
examples:
- false
media_store_path: media_store_path:
type: string type: string
description: Directory where uploaded images and attachments are stored. description: Directory where uploaded images and attachments are stored.


@ -31,7 +31,7 @@ DISTS = (
"debian:sid", # (rolling distro, no EOL) "debian:sid", # (rolling distro, no EOL)
"ubuntu:jammy", # 22.04 LTS (EOL 2027-04) (our EOL forced by Python 3.10 is 2026-10-04) "ubuntu:jammy", # 22.04 LTS (EOL 2027-04) (our EOL forced by Python 3.10 is 2026-10-04)
"ubuntu:noble", # 24.04 LTS (EOL 2029-06) "ubuntu:noble", # 24.04 LTS (EOL 2029-06)
"ubuntu:plucky", # 25.04 (EOL 2026-01) "ubuntu:questing", # 25.10 (EOL 2026-07)
"debian:trixie", # (EOL not specified yet) "debian:trixie", # (EOL not specified yet)
) )
@ -94,6 +94,7 @@ class Builder:
build_args = ( build_args = (
( (
"docker", "docker",
"buildx",
"build", "build",
"--tag", "--tag",
"dh-venv-builder:" + tag, "dh-venv-builder:" + tag,


@ -14,7 +14,6 @@ import sqlglot.expressions
SCHEMA_FILE_REGEX = re.compile(r"^synapse/storage/schema/(.*)/delta/(.*)/(.*)$") SCHEMA_FILE_REGEX = re.compile(r"^synapse/storage/schema/(.*)/delta/(.*)/(.*)$")
# The base branch we want to check against. We use the main development branch # The base branch we want to check against. We use the main development branch
# on the assumption that is what we are developing against. # on the assumption that is what we are developing against.
DEVELOP_BRANCH = "develop" DEVELOP_BRANCH = "develop"
@ -188,6 +187,14 @@ def check_schema_delta(delta_files: list[str], force_colors: bool) -> bool:
sql_lang = "postgres" sql_lang = "postgres"
if delta_file.endswith(".sqlite"): if delta_file.endswith(".sqlite"):
sql_lang = "sqlite" sql_lang = "sqlite"
elif delta_file.endswith(".py"):
click.secho(
f"Skipping Python delta file: '{delta_file}'",
fg="yellow",
bold=True,
color=force_colors,
)
return True
statements = sqlglot.parse(delta_contents, read=sql_lang) statements = sqlglot.parse(delta_contents, read=sql_lang)


@ -145,7 +145,7 @@ def request(
print("Requesting %s" % dest, file=sys.stderr) print("Requesting %s" % dest, file=sys.stderr)
s = requests.Session() s = requests.Session()
s.mount("matrix-federation://", MatrixConnectionAdapter()) s.mount("matrix-federation://", MatrixConnectionAdapter(verify_tls=verify_tls))
headers: dict[str, str] = { headers: dict[str, str] = {
"Authorization": authorization_headers[0], "Authorization": authorization_headers[0],
@ -267,6 +267,17 @@ def read_args_from_config(args: argparse.Namespace) -> None:
class MatrixConnectionAdapter(HTTPAdapter): class MatrixConnectionAdapter(HTTPAdapter):
"""
A Matrix federation-aware HTTP Adapter.
"""
verify_tls: bool
"""whether to verify the remote server's TLS certificate."""
def __init__(self, verify_tls: bool = True) -> None:
self.verify_tls = verify_tls
super().__init__()
def send( def send(
self, self,
request: PreparedRequest, request: PreparedRequest,
@ -280,7 +291,7 @@ class MatrixConnectionAdapter(HTTPAdapter):
assert isinstance(request.url, str) assert isinstance(request.url, str)
parsed = urlparse.urlsplit(request.url) parsed = urlparse.urlsplit(request.url)
server_name = parsed.netloc server_name = parsed.netloc
well_known = self._get_well_known(parsed.netloc) well_known = self._get_well_known(parsed.netloc, verify_tls=self.verify_tls)
if well_known: if well_known:
server_name = well_known server_name = well_known
@ -318,6 +329,21 @@ class MatrixConnectionAdapter(HTTPAdapter):
print( print(
f"Connecting to {host}:{port} with SNI {ssl_server_name}", file=sys.stderr f"Connecting to {host}:{port} with SNI {ssl_server_name}", file=sys.stderr
) )
if proxies:
scheme = parsed.scheme
if isinstance(scheme, bytes):
scheme = scheme.decode("utf-8")
proxy_for_scheme = proxies.get(scheme)
if proxy_for_scheme:
return self.proxy_manager_for(proxy_for_scheme).connection_from_host(
host,
port=port,
scheme="https",
pool_kwargs={"server_hostname": ssl_server_name},
)
return self.poolmanager.connection_from_host( return self.poolmanager.connection_from_host(
host, host,
port=port, port=port,
@ -368,7 +394,7 @@ class MatrixConnectionAdapter(HTTPAdapter):
return server_name, 8448, server_name return server_name, 8448, server_name
@staticmethod @staticmethod
def _get_well_known(server_name: str) -> str | None: def _get_well_known(server_name: str, verify_tls: bool = True) -> str | None:
if ":" in server_name: if ":" in server_name:
# explicit port, or ipv6 literal. Either way, no .well-known # explicit port, or ipv6 literal. Either way, no .well-known
return None return None
@ -379,7 +405,7 @@ class MatrixConnectionAdapter(HTTPAdapter):
print(f"fetching {uri}", file=sys.stderr) print(f"fetching {uri}", file=sys.stderr)
try: try:
resp = requests.get(uri) resp = requests.get(uri, verify=verify_tls)
if resp.status_code != 200: if resp.status_code != 200:
print("%s gave %i" % (uri, resp.status_code), file=sys.stderr) print("%s gave %i" % (uri, resp.status_code), file=sys.stderr)
return None return None
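A hedged usage sketch of the new `verify_tls` plumbing, assuming the `MatrixConnectionAdapter` class from this script is in scope; this mirrors the changed call site above and is handy against a test homeserver with a self-signed certificate:

```python
# Usage sketch: a session whose adapter skips TLS verification.
import requests

s = requests.Session()
s.mount("matrix-federation://", MatrixConnectionAdapter(verify_tls=False))
# GET /_matrix/federation/v1/version needs no auth headers.
resp = s.get("matrix-federation://example.test/_matrix/federation/v1/version")
print(resp.status_code, resp.text)
```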


@ -133,6 +133,7 @@ prometheus_metric_fullname_to_label_arg_map: Mapping[str, ArgLocation | None] =
"prometheus_client.metrics.Info": ArgLocation("labelnames", 2), "prometheus_client.metrics.Info": ArgLocation("labelnames", 2),
"prometheus_client.metrics.Enum": ArgLocation("labelnames", 2), "prometheus_client.metrics.Enum": ArgLocation("labelnames", 2),
"synapse.metrics.LaterGauge": ArgLocation("labelnames", 2), "synapse.metrics.LaterGauge": ArgLocation("labelnames", 2),
"synapse.metrics._InFlightGaugeRuntime": ArgLocation("labels", 2),
"synapse.metrics.InFlightGauge": ArgLocation("labels", 2), "synapse.metrics.InFlightGauge": ArgLocation("labels", 2),
"synapse.metrics.GaugeBucketCollector": ArgLocation("labelnames", 2), "synapse.metrics.GaugeBucketCollector": ArgLocation("labelnames", 2),
"prometheus_client.registry.Collector": None, "prometheus_client.registry.Collector": None,


@ -32,7 +32,7 @@ import time
import urllib.request import urllib.request
from os import path from os import path
from tempfile import TemporaryDirectory from tempfile import TemporaryDirectory
from typing import Any, Match from typing import Any
import attr import attr
import click import click
@ -455,19 +455,19 @@ def _publish(gh_token: str) -> None:
gh = Github(auth=github.Auth.Token(token=gh_token)) gh = Github(auth=github.Auth.Token(token=gh_token))
gh_repo = gh.get_repo("element-hq/synapse") gh_repo = gh.get_repo("element-hq/synapse")
for release in gh_repo.get_releases(): for release in gh_repo.get_releases():
if release.title == tag_name: if release.name == tag_name:
break break
else: else:
raise ClickException(f"Failed to find GitHub release for {tag_name}") raise ClickException(f"Failed to find GitHub release for {tag_name}")
assert release.title == tag_name assert release.name == tag_name
if not release.draft: if not release.draft:
click.echo("Release already published.") click.echo("Release already published.")
return return
release = release.update_release( release = release.update_release(
name=release.title, name=release.name,
message=release.body, message=release.body,
tag_name=release.tag_name, tag_name=release.tag_name,
prerelease=release.prerelease, prerelease=release.prerelease,
@ -968,10 +968,6 @@ def generate_and_write_changelog(
new_changes = new_changes.replace( new_changes = new_changes.replace(
"No significant changes.", f"No significant changes since {current_version}." "No significant changes.", f"No significant changes since {current_version}."
) )
new_changes += build_dependabot_changelog(
repo,
current_version,
)
# Prepend changes to changelog # Prepend changes to changelog
with open("CHANGES.md", "r+") as f: with open("CHANGES.md", "r+") as f:
@ -986,49 +982,5 @@ def generate_and_write_changelog(
os.remove(filename) os.remove(filename)
def build_dependabot_changelog(repo: Repo, current_version: version.Version) -> str:
"""Summarise dependabot commits between `current_version` and `release_branch`.
Returns an empty string if there have been no such commits; otherwise outputs a
third-level markdown header followed by an unordered list."""
last_release_commit = repo.tag("v" + str(current_version)).commit
rev_spec = f"{last_release_commit.hexsha}.."
commits = list(git.objects.Commit.iter_items(repo, rev_spec))
messages = []
for commit in reversed(commits):
if commit.author.name == "dependabot[bot]":
message: str | bytes = commit.message
if isinstance(message, bytes):
message = message.decode("utf-8")
messages.append(message.split("\n", maxsplit=1)[0])
if not messages:
print(f"No dependabot commits in range {rev_spec}", file=sys.stderr)
return ""
messages.sort()
def replacer(match: Match[str]) -> str:
desc = match.group(1)
number = match.group(2)
return f"* {desc}. ([\\#{number}](https://github.com/element-hq/synapse/issues/{number}))"
for i, message in enumerate(messages):
messages[i] = re.sub(r"(.*) \(#(\d+)\)$", replacer, message)
messages.insert(0, "### Updates to locked dependencies\n")
# Add an extra blank line to the bottom of the section
messages.append("")
return "\n".join(messages)
@cli.command()
@click.argument("since")
def test_dependabot_changelog(since: str) -> None:
"""Test building the dependabot changelog.
Summarises all dependabot commits between the SINCE tag and the current git HEAD."""
print(build_dependabot_changelog(git.Repo("."), version.Version(since)))
if __name__ == "__main__": if __name__ == "__main__":
cli() cli()


@ -172,7 +172,7 @@ if __name__ == "__main__":
# Expect JSON data on stdin. # Expect JSON data on stdin.
context, book = json.load(sys.stdin) context, book = json.load(sys.stdin)
for section in book["sections"]: for section in book["items"]:
if "Chapter" in section and section["Chapter"]["path"] == "upgrade.md": if "Chapter" in section and section["Chapter"]["path"] == "upgrade.md":
section["Chapter"]["content"] = section["Chapter"]["content"].replace( section["Chapter"]["content"] = section["Chapter"]["content"].replace(
"<!-- REPLACE_WITH_SCHEMA_VERSIONS -->", calculate_version_chart() "<!-- REPLACE_WITH_SCHEMA_VERSIONS -->", calculate_version_chart()


@ -29,6 +29,19 @@ from typing import Final
# the max size of a (canonical-json-encoded) event # the max size of a (canonical-json-encoded) event
MAX_PDU_SIZE = 65536 MAX_PDU_SIZE = 65536
# The maximum allowed size of an HTTP request.
# Other than media uploads, the biggest request we expect to see is a fully-loaded
# /federation/v1/send request.
#
# The main thing in such a request is up to 50 PDUs, and up to 100 EDUs. PDUs are
# limited to 65536 bytes (possibly slightly more if the sender didn't use canonical
# json encoding); there is no specced limit to EDUs (see
# https://github.com/matrix-org/matrix-doc/issues/3121).
#
# in short, we somewhat arbitrarily limit requests to 200 * 64K (about 12.5M)
#
MAX_REQUEST_SIZE = 200 * MAX_PDU_SIZE
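The arithmetic behind the comment checks out:

```python
# Quick arithmetic check of the comment above.
MAX_PDU_SIZE = 65536
MAX_REQUEST_SIZE = 200 * MAX_PDU_SIZE
print(MAX_REQUEST_SIZE)          # 13107200 bytes
print(MAX_REQUEST_SIZE / 2**20)  # 12.5 (MiB), hence "about 12.5M"
```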
# Max/min size of ints in canonical JSON # Max/min size of ints in canonical JSON
CANONICALJSON_MAX_INT = (2**53) - 1 CANONICALJSON_MAX_INT = (2**53) - 1
CANONICALJSON_MIN_INT = -CANONICALJSON_MAX_INT CANONICALJSON_MIN_INT = -CANONICALJSON_MAX_INT


@ -856,6 +856,12 @@ class HttpResponseException(CodeMessageException):
return ProxiedRequestError(self.code, errmsg, errcode, j) return ProxiedRequestError(self.code, errmsg, errcode, j)
class HomeServerNotSetupException(Exception):
"""
Raised when an operation is attempted on the HomeServer before setup() has been called.
"""
class ShadowBanError(Exception): class ShadowBanError(Exception):
""" """
Raised when a shadow-banned user attempts to perform an action. Raised when a shadow-banned user attempts to perform an action.


@ -54,7 +54,9 @@ def check_bind_error(
""" """
if address == "0.0.0.0" and "::" in bind_addresses: if address == "0.0.0.0" and "::" in bind_addresses:
logger.warning( logger.warning(
"Failed to listen on 0.0.0.0, continuing because listening on [::]" "Failed to listen on 0.0.0.0, continuing because listening on [::]. Original exception: %s: %s",
type(e).__name__,
str(e),
) )
else: else:
raise e raise e


@ -36,12 +36,13 @@ from typing import (
Awaitable, Awaitable,
Callable, Callable,
NoReturn, NoReturn,
Optional,
cast, cast,
) )
from wsgiref.simple_server import WSGIServer from wsgiref.simple_server import WSGIServer
from cryptography.utils import CryptographyDeprecationWarning from cryptography.utils import CryptographyDeprecationWarning
from typing_extensions import ParamSpec from typing_extensions import ParamSpec, assert_never
import twisted import twisted
from twisted.internet import defer, error, reactor as _reactor from twisted.internet import defer, error, reactor as _reactor
@ -59,12 +60,17 @@ from twisted.python.threadpool import ThreadPool
from twisted.web.resource import Resource from twisted.web.resource import Resource
import synapse.util.caches import synapse.util.caches
from synapse.api.constants import MAX_PDU_SIZE from synapse.api.constants import MAX_REQUEST_SIZE
from synapse.app import check_bind_error from synapse.app import check_bind_error
from synapse.config import ConfigError from synapse.config import ConfigError
from synapse.config._base import format_config_error from synapse.config._base import format_config_error
from synapse.config.homeserver import HomeServerConfig from synapse.config.homeserver import HomeServerConfig
from synapse.config.server import ListenerConfig, ManholeConfig, TCPListenerConfig from synapse.config.server import (
ListenerConfig,
ManholeConfig,
TCPListenerConfig,
UnixListenerConfig,
)
from synapse.crypto import context_factory from synapse.crypto import context_factory
from synapse.events.auto_accept_invites import InviteAutoAccepter from synapse.events.auto_accept_invites import InviteAutoAccepter
from synapse.events.presence_router import load_legacy_presence_router from synapse.events.presence_router import load_legacy_presence_router
@ -413,13 +419,44 @@ def listen_unix(
] ]
class ListenerException(RuntimeError):
"""
An exception raised when we fail to listen with the given `ListenerConfig`.
Attributes:
listener_config: The listener config that caused the exception.
"""
def __init__(
self,
listener_config: ListenerConfig,
):
listener_human_name = ""
port = ""
if isinstance(listener_config, TCPListenerConfig):
listener_human_name = "TCP port"
port = str(listener_config.port)
elif isinstance(listener_config, UnixListenerConfig):
listener_human_name = "unix socket"
port = listener_config.path
else:
assert_never(listener_config)
super().__init__(
"Failed to listen on %s (%s) with the given listener config: %s"
% (listener_human_name, port, listener_config)
)
self.listener_config = listener_config
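The `assert_never` idiom used here makes the `isinstance` chain exhaustively checked: if a new listener config type is added but not handled, mypy rejects the call (and at runtime `assert_never` raises). A self-contained illustration:

```python
# Minimal illustration of the assert_never exhaustiveness idiom used above.
from typing_extensions import assert_never


def describe(value: int | str) -> str:
    if isinstance(value, int):
        return "an int"
    elif isinstance(value, str):
        return "a str"
    else:
        # Unreachable unless the union grows; mypy flags any unhandled case.
        assert_never(value)
```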
def listen_http( def listen_http(
hs: "HomeServer", hs: "HomeServer",
listener_config: ListenerConfig, listener_config: ListenerConfig,
root_resource: Resource, root_resource: Resource,
version_string: str, version_string: str,
max_request_body_size: int, max_request_body_size: int,
context_factory: IOpenSSLContextFactory | None, context_factory: Optional[IOpenSSLContextFactory],
reactor: ISynapseReactor = reactor, reactor: ISynapseReactor = reactor,
) -> list[Port]: ) -> list[Port]:
""" """
@ -447,39 +484,55 @@ def listen_http(
hs=hs, hs=hs,
) )
if isinstance(listener_config, TCPListenerConfig): try:
if listener_config.is_tls(): if isinstance(listener_config, TCPListenerConfig):
# refresh_certificate should have been called before this. if listener_config.is_tls():
assert context_factory is not None # refresh_certificate should have been called before this.
ports = listen_ssl( assert context_factory is not None
listener_config.bind_addresses, ports = listen_ssl(
listener_config.port, listener_config.bind_addresses,
site, listener_config.port,
context_factory, site,
reactor=reactor, context_factory,
reactor=reactor,
)
logger.info(
"Synapse now listening on TCP port %d (TLS)", listener_config.port
)
else:
ports = listen_tcp(
listener_config.bind_addresses,
listener_config.port,
site,
reactor=reactor,
)
logger.info(
"Synapse now listening on TCP port %d", listener_config.port
)
elif isinstance(listener_config, UnixListenerConfig):
ports = listen_unix(
listener_config.path, listener_config.mode, site, reactor=reactor
) )
# getHost() returns a UNIXAddress which contains an instance variable of 'name'
# encoded as a byte string. Decode as utf-8 so pretty.
logger.info( logger.info(
"Synapse now listening on TCP port %d (TLS)", listener_config.port "Synapse now listening on Unix Socket at: %s",
ports[0].getHost().name.decode("utf-8"),
) )
else: else:
ports = listen_tcp( assert_never(listener_config)
listener_config.bind_addresses, except Exception as exc:
listener_config.port, # The Twisted interface says that "Users should not call this function
site, # themselves!" but this appears to be the correct/only way handle proper cleanup
reactor=reactor, # of the site when things go wrong. In the normal case, a `Port` is created
) # which we can call `Port.stopListening()` on to do the same thing (but no
logger.info("Synapse now listening on TCP port %d", listener_config.port) # `Port` is created when an error occurs).
#
else: # We use `site.stopFactory()` instead of `site.doStop()` as the latter assumes
ports = listen_unix( # that `site.doStart()` was called (which won't be the case if an error occurs).
listener_config.path, listener_config.mode, site, reactor=reactor site.stopFactory()
) raise ListenerException(listener_config) from exc
# getHost() returns a UNIXAddress which contains an instance variable of 'name'
# encoded as a byte string. Decode as utf-8 so pretty.
logger.info(
"Synapse now listening on Unix Socket at: %s",
ports[0].getHost().name.decode("utf-8"),
)
return ports return ports
@ -843,17 +896,8 @@ def sdnotify(state: bytes) -> None:
def max_request_body_size(config: HomeServerConfig) -> int: def max_request_body_size(config: HomeServerConfig) -> int:
"""Get a suitable maximum size for incoming HTTP requests""" """Get a suitable maximum size for incoming HTTP requests"""
# Other than media uploads, the biggest request we expect to see is a fully-loaded # Baseline default for any request that isn't configured in the homeserver config
# /federation/v1/send request. max_request_size = MAX_REQUEST_SIZE
#
# The main thing in such a request is up to 50 PDUs, and up to 100 EDUs. PDUs are
# limited to 65536 bytes (possibly slightly more if the sender didn't use canonical
# json encoding); there is no specced limit to EDUs (see
# https://github.com/matrix-org/matrix-doc/issues/3121).
#
# in short, we somewhat arbitrarily limit requests to 200 * 64K (about 12.5M)
#
max_request_size = 200 * MAX_PDU_SIZE
# if we have a media repo enabled, we may need to allow larger uploads than that # if we have a media repo enabled, we may need to allow larger uploads than that
if config.media.can_load_media_repo: if config.media.can_load_media_repo:


@ -24,7 +24,7 @@ import logging
import os import os
import sys import sys
import tempfile import tempfile
from typing import Mapping, Sequence from typing import Mapping, Optional, Sequence
from twisted.internet import defer, task from twisted.internet import defer, task
@ -291,7 +291,7 @@ def load_config(argv_options: list[str]) -> tuple[HomeServerConfig, argparse.Nam
def create_homeserver( def create_homeserver(
config: HomeServerConfig, config: HomeServerConfig,
reactor: ISynapseReactor | None = None, reactor: Optional[ISynapseReactor] = None,
) -> AdminCmdServer: ) -> AdminCmdServer:
""" """
Create a homeserver instance for the Synapse admin command process. Create a homeserver instance for the Synapse admin command process.


@ -21,6 +21,7 @@
# #
import logging import logging
import sys import sys
from typing import Optional
from twisted.web.resource import Resource from twisted.web.resource import Resource
@ -335,7 +336,7 @@ def load_config(argv_options: list[str]) -> HomeServerConfig:
def create_homeserver( def create_homeserver(
config: HomeServerConfig, config: HomeServerConfig,
reactor: ISynapseReactor | None = None, reactor: Optional[ISynapseReactor] = None,
) -> GenericWorkerServer: ) -> GenericWorkerServer:
""" """
Create a homeserver instance for the Synapse worker process. Create a homeserver instance for the Synapse worker process.


@ -22,7 +22,7 @@
import logging import logging
import os import os
import sys import sys
from typing import Iterable from typing import Iterable, Optional
from twisted.internet.tcp import Port from twisted.internet.tcp import Port
from twisted.web.resource import EncodingResourceWrapper, Resource from twisted.web.resource import EncodingResourceWrapper, Resource
@ -350,7 +350,7 @@ def load_or_generate_config(argv_options: list[str]) -> HomeServerConfig:
def create_homeserver( def create_homeserver(
config: HomeServerConfig, config: HomeServerConfig,
reactor: ISynapseReactor | None = None, reactor: Optional[ISynapseReactor] = None,
) -> SynapseHomeServer: ) -> SynapseHomeServer:
""" """
Create a homeserver instance for the Synapse main process. Create a homeserver instance for the Synapse main process.


@ -44,6 +44,7 @@ import jinja2
import yaml import yaml
from synapse.types import StrSequence from synapse.types import StrSequence
from synapse.util.stringutils import parse_and_validate_server_name
from synapse.util.templates import _create_mxc_to_http_filter, _format_ts_filter from synapse.util.templates import _create_mxc_to_http_filter, _format_ts_filter
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -465,6 +466,7 @@ class RootConfig:
generate_secrets: bool = False, generate_secrets: bool = False,
report_stats: bool | None = None, report_stats: bool | None = None,
open_private_ports: bool = False, open_private_ports: bool = False,
enable_metrics: bool = False,
listeners: list[dict] | None = None, listeners: list[dict] | None = None,
tls_certificate_path: str | None = None, tls_certificate_path: str | None = None,
tls_private_key_path: str | None = None, tls_private_key_path: str | None = None,
@ -495,9 +497,15 @@ class RootConfig:
open_private_ports: True to leave private ports (such as the non-TLS open_private_ports: True to leave private ports (such as the non-TLS
HTTP listener) open to the internet. HTTP listener) open to the internet.
enable_metrics: True to set `enable_metrics: true` and when using the
default set of listeners, will also add the metrics listener on port 19090.
listeners: A list of descriptions of the listeners synapse should listeners: A list of descriptions of the listeners synapse should
start with each of which specifies a port (int), a list of start with each of which specifies a port (int), a list of resources
resources (list(str)), tls (bool) and type (str). For example: (list(str)), tls (bool) and type (str). There is a default set of
listeners when `None`.
Example usage:
[{ [{
"port": 8448, "port": 8448,
"resources": [{"names": ["federation"]}], "resources": [{"names": ["federation"]}],
@ -518,6 +526,35 @@ class RootConfig:
Returns: Returns:
The yaml config file The yaml config file
""" """
_, bind_port = parse_and_validate_server_name(server_name)
if bind_port is not None:
unsecure_port = bind_port - 400
else:
bind_port = 8448
unsecure_port = 8008
# The default listeners
if listeners is None:
listeners = [
{
"port": unsecure_port,
"tls": False,
"type": "http",
"x_forwarded": True,
"resources": [
{"names": ["client", "federation"], "compress": False}
],
}
]
if enable_metrics:
listeners.append(
{
"port": 19090,
"tls": False,
"type": "metrics",
}
)
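A simplified sketch of this port-defaulting logic (the real code relies on `parse_and_validate_server_name`, which also handles IPv6 literals):

```python
# Simplified sketch of the defaulting above: an explicit port in server_name
# yields unsecure_port = port - 400; otherwise the 8448/8008 defaults apply.
def default_ports(server_name: str) -> tuple[int, int]:
    _, _, port = server_name.partition(":")
    if port:
        bind_port = int(port)
        unsecure_port = bind_port - 400
    else:
        bind_port, unsecure_port = 8448, 8008
    return bind_port, unsecure_port


assert default_ports("matrix.example.com") == (8448, 8008)
assert default_ports("matrix.example.com:8448") == (8448, 8048)
```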
conf = CONFIG_FILE_HEADER + "\n".join( conf = CONFIG_FILE_HEADER + "\n".join(
dedent(conf) dedent(conf)
@ -529,6 +566,7 @@ class RootConfig:
generate_secrets=generate_secrets, generate_secrets=generate_secrets,
report_stats=report_stats, report_stats=report_stats,
open_private_ports=open_private_ports, open_private_ports=open_private_ports,
enable_metrics=enable_metrics,
listeners=listeners, listeners=listeners,
tls_certificate_path=tls_certificate_path, tls_certificate_path=tls_certificate_path,
tls_private_key_path=tls_private_key_path, tls_private_key_path=tls_private_key_path,
@ -756,6 +794,14 @@ class RootConfig:
" internet. Do not use this unless you know what you are doing." " internet. Do not use this unless you know what you are doing."
), ),
) )
generate_group.add_argument(
"--enable-metrics",
action="store_true",
help=(
"Sets `enable_metrics: true` and when using the default set of listeners, "
"will also add the metrics listener on port 19090."
),
)
cls.invoke_all_static("add_arguments", parser) cls.invoke_all_static("add_arguments", parser)
config_args = parser.parse_args(argv_options) config_args = parser.parse_args(argv_options)
@ -812,6 +858,7 @@ class RootConfig:
report_stats=(config_args.report_stats == "yes"), report_stats=(config_args.report_stats == "yes"),
generate_secrets=True, generate_secrets=True,
open_private_ports=config_args.open_private_ports, open_private_ports=config_args.open_private_ports,
enable_metrics=config_args.enable_metrics,
) )
os.makedirs(config_dir_path, exist_ok=True) os.makedirs(config_dir_path, exist_ok=True)
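To make the effect of the new `--enable-metrics` flag easier to see, here is a minimal standalone sketch of the default-listener construction the hunks above add. `default_listeners` is a local helper for illustration only, not a Synapse API; the port arithmetic mirrors the diff (an explicit port on the server name yields that port minus 400 for the unsecure listener, otherwise 8008).

```python
# Sketch of the default listeners RootConfig now builds when the caller
# passes listeners=None; not Synapse code, just the same logic inlined.
import yaml


def default_listeners(server_port: int | None, enable_metrics: bool) -> list[dict]:
    unsecure_port = server_port - 400 if server_port is not None else 8008
    listeners = [
        {
            "port": unsecure_port,
            "tls": False,
            "type": "http",
            "x_forwarded": True,
            "resources": [{"names": ["client", "federation"], "compress": False}],
        }
    ]
    if enable_metrics:
        # Matches the hard-coded metrics port in the diff.
        listeners.append({"port": 19090, "tls": False, "type": "metrics"})
    return listeners


# e.g. generating a config for a server name without an explicit port,
# with --enable-metrics passed:
print(yaml.dump({"listeners": default_listeners(None, enable_metrics=True)}))
```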


```diff
@@ -146,6 +146,7 @@ class RootConfig:
         generate_secrets: bool = ...,
         report_stats: bool | None = ...,
         open_private_ports: bool = ...,
+        enable_metrics: bool = ...,
         listeners: Any | None = ...,
         tls_certificate_path: str | None = ...,
         tls_private_key_path: str | None = ...,
```


```diff
@@ -378,27 +378,9 @@ class ExperimentalConfig(Config):
         # MSC3026 (busy presence state)
         self.msc3026_enabled: bool = experimental.get("msc3026_enabled", False)

-        # MSC2697 (device dehydration)
-        # Enabled by default since this option was added after adding the feature.
-        # It is not recommended that both MSC2697 and MSC3814 both be enabled at
-        # once.
-        self.msc2697_enabled: bool = experimental.get("msc2697_enabled", True)
-
         # MSC3814 (dehydrated devices with SSSS)
-        # This is an alternative method to achieve the same goals as MSC2697.
-        # It is not recommended that both MSC2697 and MSC3814 both be enabled at
-        # once.
         self.msc3814_enabled: bool = experimental.get("msc3814_enabled", False)

-        if self.msc2697_enabled and self.msc3814_enabled:
-            raise ConfigError(
-                "MSC2697 and MSC3814 should not both be enabled.",
-                (
-                    "experimental_features",
-                    "msc3814_enabled",
-                ),
-            )
-
         # MSC3244 (room version capabilities)
         self.msc3244_enabled: bool = experimental.get("msc3244_enabled", True)
```


```diff
@@ -75,10 +75,19 @@ class MetricsConfig(Config):
         )

     def generate_config_section(
-        self, report_stats: bool | None = None, **kwargs: Any
+        self,
+        report_stats: bool | None = None,
+        enable_metrics: bool = False,
+        **kwargs: Any,
     ) -> str:
         if report_stats is not None:
             res = "report_stats: %s\n" % ("true" if report_stats else "false")
         else:
             res = "\n"
+
+        # We avoid adding anything if it's `False` since that's the default (less noise
+        # in the default generated config)
+        if enable_metrics:
+            res += "enable_metrics: true\n"
+
         return res
```
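A quick standalone sketch of what this section generator now emits; the function below is a local copy for illustration, not the Synapse class method itself.

```python
# Behaviour of the updated generate_config_section, inlined for clarity.
def generate_metrics_section(report_stats: bool | None, enable_metrics: bool) -> str:
    if report_stats is not None:
        res = "report_stats: %s\n" % ("true" if report_stats else "false")
    else:
        res = "\n"
    # Only emitted when True, since False is already the default.
    if enable_metrics:
        res += "enable_metrics: true\n"
    return res


assert generate_metrics_section(True, False) == "report_stats: true\n"
assert generate_metrics_section(None, True) == "\nenable_metrics: true\n"
```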


```diff
@@ -252,3 +252,9 @@ class RatelimitConfig(Config):
             "rc_reports",
             defaults={"per_second": 1, "burst_count": 5},
         )
+
+        self.rc_user_directory = RatelimitSettings.parse(
+            config,
+            "rc_user_directory",
+            defaults={"per_second": 0.016, "burst_count": 200},
+        )
```
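The defaults here amount to roughly one user-directory request per minute sustained (0.016/s ≈ 1/60) with bursts of up to 200. A generic token-bucket sketch of what those two numbers mean follows; this is illustrative only, not Synapse's `Ratelimiter` implementation.

```python
# Generic token-bucket check for per_second=0.016, burst_count=200.
import time


class TokenBucket:
    def __init__(self, per_second: float, burst_count: int) -> None:
        self.rate = per_second           # refill rate: ~1 token per minute here
        self.capacity = burst_count      # up to 200 requests in a burst
        self.tokens = float(burst_count)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the burst size.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


bucket = TokenBucket(per_second=0.016, burst_count=200)
print(bucket.allow())  # True until the initial burst of 200 is exhausted
```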


```diff
@@ -174,6 +174,11 @@ class ContentRepositoryConfig(Config):
             config.get("media_store_path", "media_store")
         )

+        # Whether to enable the local media storage provider. When disabled,
+        # media will only be stored in configured storage providers and temp
+        # files will be used for processing.
+        self.enable_local_media_storage = config.get("enable_local_media_storage", True)
+
         backup_media_store_path = config.get("backup_media_store_path")

         synchronous_backup_media_store = config.get(
```
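For intuition, a hypothetical sketch of the write path the comment above describes. The function and paths here are made up for illustration and are not Synapse's media repository API; only the flag name and its documented behaviour come from the diff.

```python
# Hypothetical illustration of enable_local_media_storage.
import tempfile


def store_media(content: bytes, enable_local_media_storage: bool) -> None:
    if enable_local_media_storage:
        # Default behaviour: persist into the local media store as before,
        # then replicate to any configured storage providers.
        with open("/media_store/local_content/example", "wb") as f:  # illustrative path
            f.write(content)
        return
    # Flag disabled: media lives only in the configured storage providers;
    # a temp file is used just for processing (thumbnails, uploads, etc.).
    with tempfile.NamedTemporaryFile() as f:
        f.write(content)
        f.flush()
        # ... hand f.name to each configured storage provider ...
```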


```diff
@@ -923,26 +923,21 @@ class ServerConfig(Config):
     def generate_config_section(
         self,
+        *,
         config_dir_path: str,
         data_dir_path: str,
         server_name: str,
-        open_private_ports: bool,
-        listeners: list[dict] | None,
+        open_private_ports: bool = False,
+        listeners: list[dict] | None = None,
         **kwargs: Any,
     ) -> str:
-        _, bind_port = parse_and_validate_server_name(server_name)
-        if bind_port is not None:
-            unsecure_port = bind_port - 400
-        else:
-            bind_port = 8448
-            unsecure_port = 8008
-
         pid_file = os.path.join(data_dir_path, "homeserver.pid")

-        secure_listeners = []
-        unsecure_listeners = []
+        http_bindings = "[]"
         private_addresses = ["::1", "127.0.0.1"]
         if listeners:
+            secure_listeners = []
+            unsecure_listeners = []
             for listener in listeners:
                 if listener["tls"]:
                     secure_listeners.append(listener)
@@ -957,43 +952,17 @@ class ServerConfig(Config):
                     unsecure_listeners.append(listener)

-            secure_http_bindings = indent(
-                yaml.dump(secure_listeners), " " * 10
-            ).lstrip()
-
-            unsecure_http_bindings = indent(
-                yaml.dump(unsecure_listeners), " " * 10
-            ).lstrip()
+            # `lstrip` is used because the first line already has whitespace in the
+            # template below
+            http_bindings = indent(
+                yaml.dump(secure_listeners + unsecure_listeners), " " * 10
+            ).lstrip()

-        if not unsecure_listeners:
-            unsecure_http_bindings = """- port: %(unsecure_port)s
-            tls: false
-            type: http
-            x_forwarded: true""" % locals()
-
-            if not open_private_ports:
-                unsecure_http_bindings += (
-                    "\n            bind_addresses: ['::1', '127.0.0.1']"
-                )
-
-            unsecure_http_bindings += """
-
-            resources:
-              - names: [client, federation]
-                compress: false"""
-
-            if listeners:
-                unsecure_http_bindings = ""
-
-        if not secure_listeners:
-            secure_http_bindings = ""
-
         return """\
         server_name: "%(server_name)s"
         pid_file: %(pid_file)s
         listeners:
-          %(secure_http_bindings)s
-          %(unsecure_http_bindings)s
+          %(http_bindings)s
        """ % locals()

     def read_arguments(self, args: argparse.Namespace) -> None:
```
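The port derivation removed here now lives in `RootConfig.generate_config` (first file in this diff). A worked example of that arithmetic, using the real `parse_and_validate_server_name` helper imported there; the sample server names are made up.

```python
# Worked example of the bind/unsecure port derivation.
from synapse.util.stringutils import parse_and_validate_server_name

for server_name in ("example.com", "example.com:9448"):
    _, bind_port = parse_and_validate_server_name(server_name)
    if bind_port is not None:
        unsecure_port = bind_port - 400
    else:
        bind_port = 8448
        unsecure_port = 8008
    print(server_name, "->", {"tls": bind_port, "http": unsecure_port})

# example.com      -> {'tls': 8448, 'http': 8008}
# example.com:9448 -> {'tls': 9448, 'http': 9048}
```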


```diff
@@ -21,6 +21,7 @@
 import abc
 import logging
+from contextlib import ExitStack
 from typing import TYPE_CHECKING, Callable, Iterable

 import attr
```
```diff
@@ -150,57 +151,81 @@ class Keyring:
     """

     def __init__(
-        self, hs: "HomeServer", key_fetchers: "Iterable[KeyFetcher] | None" = None
+        self,
+        hs: "HomeServer",
+        test_only_key_fetchers: "list[KeyFetcher] | None" = None,
     ):
-        self.server_name = hs.hostname
+        """
+        Args:
+            hs: The HomeServer instance
+            test_only_key_fetchers: Dependency injection for tests only. If provided,
+                these key fetchers will be used instead of the default ones.
+        """
+        # Clean-up to avoid partial initialization leaving behind references.
+        with ExitStack() as exit:
+            self.server_name = hs.hostname

-        if key_fetchers is None:
-            # Always fetch keys from the database.
-            mutable_key_fetchers: list[KeyFetcher] = [StoreKeyFetcher(hs)]
-            # Fetch keys from configured trusted key servers, if any exist.
-            key_servers = hs.config.key.key_servers
-            if key_servers:
-                mutable_key_fetchers.append(PerspectivesKeyFetcher(hs))
-            # Finally, fetch keys from the origin server directly.
-            mutable_key_fetchers.append(ServerKeyFetcher(hs))
-
-            self._key_fetchers: Iterable[KeyFetcher] = tuple(mutable_key_fetchers)
-        else:
-            self._key_fetchers = key_fetchers
+            self._key_fetchers: list[KeyFetcher] = []
+            if test_only_key_fetchers is None:
+                # Always fetch keys from the database.
+                store_key_fetcher = StoreKeyFetcher(hs)
+                exit.callback(store_key_fetcher.shutdown)
+                self._key_fetchers.append(store_key_fetcher)
+
+                # Fetch keys from configured trusted key servers, if any exist.
+                key_servers = hs.config.key.key_servers
+                if key_servers:
+                    perspectives_key_fetcher = PerspectivesKeyFetcher(hs)
+                    exit.callback(perspectives_key_fetcher.shutdown)
+                    self._key_fetchers.append(perspectives_key_fetcher)
+
+                # Finally, fetch keys from the origin server directly.
+                server_key_fetcher = ServerKeyFetcher(hs)
+                exit.callback(server_key_fetcher.shutdown)
+                self._key_fetchers.append(server_key_fetcher)
+            else:
+                self._key_fetchers = test_only_key_fetchers

-        self._fetch_keys_queue: BatchingQueue[
-            _FetchKeyRequest, dict[str, dict[str, FetchKeyResult]]
-        ] = BatchingQueue(
-            name="keyring_server",
-            hs=hs,
-            clock=hs.get_clock(),
-            # The method called to fetch each key
-            process_batch_callback=self._inner_fetch_key_requests,
-        )
+            self._fetch_keys_queue: BatchingQueue[
+                _FetchKeyRequest, dict[str, dict[str, FetchKeyResult]]
+            ] = BatchingQueue(
+                name="keyring_server",
+                hs=hs,
+                clock=hs.get_clock(),
+                # The method called to fetch each key
+                process_batch_callback=self._inner_fetch_key_requests,
+            )
+            exit.callback(self._fetch_keys_queue.shutdown)

-        self._is_mine_server_name = hs.is_mine_server_name
+            self._is_mine_server_name = hs.is_mine_server_name

-        # build a FetchKeyResult for each of our own keys, to shortcircuit the
-        # fetcher.
-        self._local_verify_keys: dict[str, FetchKeyResult] = {}
-        for key_id, key in hs.config.key.old_signing_keys.items():
-            self._local_verify_keys[key_id] = FetchKeyResult(
-                verify_key=key, valid_until_ts=key.expired
-            )
+            # build a FetchKeyResult for each of our own keys, to shortcircuit the
+            # fetcher.
+            self._local_verify_keys: dict[str, FetchKeyResult] = {}
+            for key_id, key in hs.config.key.old_signing_keys.items():
+                self._local_verify_keys[key_id] = FetchKeyResult(
+                    verify_key=key, valid_until_ts=key.expired
+                )

-        vk = get_verify_key(hs.signing_key)
-        self._local_verify_keys[f"{vk.alg}:{vk.version}"] = FetchKeyResult(
-            verify_key=vk,
-            valid_until_ts=2**63,  # fake future timestamp
-        )
+            vk = get_verify_key(hs.signing_key)
+            self._local_verify_keys[f"{vk.alg}:{vk.version}"] = FetchKeyResult(
+                verify_key=vk,
+                valid_until_ts=2**63,  # fake future timestamp
+            )
+
+            # We reached the end of the block which means everything was successful, so
+            # no exit handlers are needed (remove them all).
+            exit.pop_all()

     def shutdown(self) -> None:
         """
         Prepares the KeyRing for garbage collection by shutting down it's queues.
         """
         self._fetch_keys_queue.shutdown()
         for key_fetcher in self._key_fetchers:
             key_fetcher.shutdown()
+        self._key_fetchers.clear()

     async def verify_json_for_server(
         self,
```
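The `ExitStack` idiom used in these constructors is worth isolating: each resource registers its own cleanup as soon as it exists, so an exception anywhere later in `__init__` runs the callbacks in reverse order instead of leaking a half-built object, and `pop_all()` on success cancels them so the finished object keeps its resources. A minimal generic sketch (`Resource` and `Thing` are stand-ins, not Synapse classes):

```python
# Generic sketch of the "clean up on partial initialization" ExitStack idiom.
from contextlib import ExitStack


class Resource:
    def shutdown(self) -> None:
        print("cleaned up")


class Thing:
    def __init__(self, fail: bool = False) -> None:
        with ExitStack() as exit:
            self.first = Resource()
            exit.callback(self.first.shutdown)  # registered immediately

            if fail:
                # Leaving the with-block via an exception runs the callbacks,
                # so self.first is shut down before the error propagates.
                raise RuntimeError("partial initialization")

            self.second = Resource()
            exit.callback(self.second.shutdown)

            # Everything succeeded: keep the resources, drop the cleanups.
            exit.pop_all()
```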
```diff
@@ -521,9 +546,21 @@ class StoreKeyFetcher(KeyFetcher):
     """KeyFetcher impl which fetches keys from our data store"""

     def __init__(self, hs: "HomeServer"):
-        super().__init__(hs)
+        # Clean-up to avoid partial initialization leaving behind references.
+        with ExitStack() as exit:
+            super().__init__(hs)
+            # `KeyFetcher` keeps a reference to `hs` which we need to clean up if
+            # something goes wrong so we can cleanly shutdown the homeserver.
+            exit.callback(super().shutdown)

-        self.store = hs.get_datastores().main
+            # An error can be raised here if someone tried to create a `StoreKeyFetcher`
+            # before the homeserver is fully set up (`HomeServerNotSetupException:
+            # HomeServer.setup must be called before getting datastores`).
+            self.store = hs.get_datastores().main
+
+            # We reached the end of the block which means everything was successful, so
+            # no exit handlers are needed (remove them all).
+            exit.pop_all()

     async def _fetch_keys(
         self, keys_to_fetch: list[_FetchKeyRequest]
```
```diff
@@ -543,9 +580,21 @@ class StoreKeyFetcher(KeyFetcher):
 class BaseV2KeyFetcher(KeyFetcher):
     def __init__(self, hs: "HomeServer"):
-        super().__init__(hs)
+        # Clean-up to avoid partial initialization leaving behind references.
+        with ExitStack() as exit:
+            super().__init__(hs)
+            # `KeyFetcher` keeps a reference to `hs` which we need to clean up if
+            # something goes wrong so we can cleanly shutdown the homeserver.
+            exit.callback(super().shutdown)

-        self.store = hs.get_datastores().main
+            # An error can be raised here if someone tried to create a `StoreKeyFetcher`
+            # before the homeserver is fully set up (`HomeServerNotSetupException:
+            # HomeServer.setup must be called before getting datastores`).
+            self.store = hs.get_datastores().main
+
+            # We reached the end of the block which means everything was successful, so
+            # no exit handlers are needed (remove them all).
+            exit.pop_all()

     async def process_v2_response(
         self, from_server: str, response_json: JsonDict, time_added_ms: int
```


```diff
@@ -66,6 +66,7 @@ from synapse.state import CREATE_KEY
 from synapse.storage.databases.main.events_worker import EventRedactBehaviour
 from synapse.types import (
     MutableStateMap,
+    StateKey,
     StateMap,
     StrCollection,
     UserID,
@@ -1200,8 +1201,8 @@ def get_public_keys(invite_event: "EventBase") -> list[dict[str, Any]]:
 def auth_types_for_event(
     room_version: RoomVersion, event: Union["EventBase", "EventBuilder"]
-) -> set[tuple[str, str]]:
-    """Given an event, return a list of (EventType, StateKey) that may be
+) -> set[StateKey]:
+    """Given an event, return a list of (state event type, state key) that may be
     needed to auth the event. The returned list may be a superset of what
     would actually be required depending on the full state of the room.
```
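The annotation change appears cosmetic: assuming `StateKey` in `synapse.types` is the usual `(event type, state key)` tuple alias, the runtime type is unchanged and only the signature becomes self-describing. A minimal illustration under that assumption:

```python
# Assumed shape of the StateKey alias; the example values are made up.
StateKey = tuple[str, str]  # (state event type, state key)


def example_auth_types() -> set[StateKey]:
    return {("m.room.create", ""), ("m.room.member", "@alice:example.com")}
```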


```diff
@@ -44,7 +44,7 @@ from synapse.types import (
     UserInfo,
     create_requester,
 )
-from synapse.visibility import filter_events_for_client
+from synapse.visibility import filter_and_transform_events_for_client

 if TYPE_CHECKING:
     from synapse.server import HomeServer
@@ -251,7 +251,7 @@ class AdminHandler:
             topological=last_event.depth,
         )

-        events = await filter_events_for_client(
+        events = await filter_and_transform_events_for_client(
             self._storage_controllers,
             user_id,
             events,
```


```diff
@@ -13,7 +13,7 @@
 #
 import logging
-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, Optional

 from twisted.internet.interfaces import IDelayedCall
@@ -74,7 +74,7 @@ class DelayedEventsHandler:
             cfg=self._config.ratelimiting.rc_delayed_event_mgmt,
         )
-        self._next_delayed_event_call: IDelayedCall | None = None
+        self._next_delayed_event_call: Optional[IDelayedCall] = None

         # The current position in the current_state_delta stream
         self._event_pos: int | None = None
```

(Some files were not shown because too many files have changed in this diff.)