
Compare commits


29 Commits

Author SHA1 Message Date
Hugh Nimmo-Smith
5c08e04985 Make soft_limit optional 2025-10-13 11:50:50 +01:00
Hugh Nimmo-Smith
8ba5e4a055 Merge branch 'develop' into hughns/msc4335 2025-10-10 12:40:15 +01:00
Hugh Nimmo-Smith
459ede6966 Add soft_limit, increase_uri and rename info_url to info_uri 2025-10-10 11:51:12 +01:00
Andrew Morgan
8390138fa4 Add 'Fetch Event' Admin API page to the docs SUMMARY.md
Otherwise it won't appear on the documentation website's sidebar.
2025-10-10 11:20:48 +01:00
Eric Eastwood
47fb4b43ca Introduce RootConfig.validate_config() which can be subclassed in HomeServerConfig to do cross-config class validation (#19027)
This means we
can move the open registration config validation from `setup()` to
`HomeServerConfig.validate_config()` (much more sane).

Spawning from looking at this area of code in
https://github.com/element-hq/synapse/pull/19015
2025-10-09 14:56:22 -05:00
Eric Eastwood
715cc5ee37 Split homeserver creation and setup (#19015)
### Background

As part of Element's plan to support a light form of vhosting (virtual
hosting, i.e. multiple instances of Synapse in the same Python process),
we're currently diving into the details and implications of running
multiple Synapse instances side by side.

"Clean tenant provisioning" tracked internally by
https://github.com/element-hq/synapse-small-hosts/issues/221


### Partial startup problem

In the context of Synapse Pro for Small Hosts, since the Twisted reactor
is already running (from the `multi_synapse` shard process itself), when
provisioning a homeserver tenant, the `reactor.callWhenRunning(...)`
callbacks will be invoked immediately. This includes Synapse's
[`start`](0615b64bb4/synapse/app/homeserver.py (L418-L429))
callback which sets up everything (including listeners, background
tasks, etc.). If we encounter an error at this point, we are partially
set up, but the exception will [bubble back to
us](8be122186b/multi_synapse/app/shard.py (L114-L121))
without us having a handle to the homeserver yet, so we can't call
`hs.shutdown()` and clean everything up.


### What does this PR do?

Structures Synapse so we split creating the homeserver instance from
setting everything up. This way we have access to `hs` if anything goes
wrong during setup and can subsequently `hs.shutdown()` to clean
everything up.
2025-10-09 13:12:10 -05:00
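The creation/setup split described above can be sketched as follows. This is a hypothetical illustration, not the actual Synapse code: `FakeHomeServer` and `provision` are stand-ins, but the shape matches the PR's point that once creation is a separate, side-effect-free step, the caller always holds `hs` and can tear down a partial setup.

```python
from typing import Optional

class FakeHomeServer:
    """Stand-in for a homeserver with resources that need cleanup."""

    def __init__(self) -> None:
        self.resources_open = False

    def setup(self) -> None:
        self.resources_open = True
        # Simulate a failure partway through setup (e.g. a listener
        # failing to bind), leaving resources behind.
        raise RuntimeError("listener failed to bind")

    def shutdown(self) -> None:
        self.resources_open = False


def create_homeserver() -> FakeHomeServer:
    # Creation has no side effects, so it cannot leave anything behind.
    return FakeHomeServer()


def provision() -> Optional[FakeHomeServer]:
    hs = create_homeserver()
    try:
        hs.setup()
    except Exception:
        # Unlike the partial-startup problem above, we already hold a
        # handle to `hs` here, so we can clean everything up.
        hs.shutdown()
        return None
    return hs
```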
Andrew Morgan
d440cfc9e2 Allow any release script command to accept --gh-token (#19035) 2025-10-09 17:15:54 +01:00
fkwp
18f07fdc4c Add MatrixRTC backend/services discovery endpoint (#18967)
Co-authored-by: Andrew Morgan <andrew@amorgan.xyz>
2025-10-09 17:15:47 +01:00
Andrew Morgan
e3344dc0c3 Expose defer_to_threadpool in the module API (#19032) 2025-10-09 15:15:13 +01:00
Andrew Morgan
bcbbccca23 Swap macos-13 with macos-15-intel GHA runner in CI (#19025) 2025-10-08 12:58:42 +01:00
Shay
8f01eb8ee0 Add an Admin API to fetch an event by ID (#18963)
Adds an endpoint to allow server admins to fetch an event regardless of
their membership in the originating room.
2025-10-08 11:38:15 +01:00
Andrew Morgan
21d125e29a Merge branch 'master' into develop 2025-10-08 10:20:14 +01:00
Andrew Morgan
638fa0f33d Merge branch 'release-v1.139' 2025-10-08 10:19:59 +01:00
Andrew Morgan
38afd10823 Merge branch 'master' into develop 2025-10-08 10:16:17 +01:00
Andrew Morgan
87cfe56d14 Merge branch 'release-v1.138' 2025-10-08 10:16:04 +01:00
Eric Eastwood
631eed91f1 Fix bad merge with start_background_tasks (#19013)
This was originally removed in
https://github.com/element-hq/synapse/pull/18886 but it looks like it
snuck back in https://github.com/element-hq/synapse/pull/18828 during a
[bad
merge](4cd3d9172e).

Noticed while looking at Synapse setup and startup (just by
happenstance).

I don't think this has adverse effects on Synapse actually working, as
`start_background_tasks()` can be called multiple times.


### Is there a good way to audit all of these merges?

I would like to see the conflicts for each merge.

This works, but it's still hard to notice that anything is wrong:

```
git log --remerge-diff <commit-sha>
```

> shows the difference from mechanical merge result and the result that
is actually recorded in a merge commit

via
https://stackoverflow.com/questions/15277708/how-do-you-see-show-a-git-merge-conflict-resolution-that-was-done-given-a-mer/71181334#71181334

The following works better: specify the version range from the commit
right before the merge up to the merge itself. You can even specify
which file to look at to make it more obvious with the hindsight we have
now.

```
git log --remerge-diff <merge-commit-sha>~1..<merge-commit-sha> -- synapse/server.py
```

Example:
```
git log --remerge-diff 4cd3d9172ed7b87e509746851a376c861a27820e~1..4cd3d9172ed7b87e509746851a376c861a27820e -- synapse/server.py
```
2025-10-07 13:29:22 -05:00
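The audit workflow described in the commit above could be automated with a small helper like the following. This is a hypothetical sketch: it assumes a `git` binary new enough to support `--remerge-diff` (2.36+) on `PATH`, and that it runs inside the repository. The function names are made up for illustration.

```python
import subprocess
from typing import List, Optional

def build_remerge_cmd(merge_sha: str, path: Optional[str] = None) -> List[str]:
    # Mirrors the command from the commit message: range from the commit
    # right before the merge up to the merge itself, optionally scoped
    # to a single file.
    cmd = ["git", "log", "--remerge-diff", f"{merge_sha}~1..{merge_sha}"]
    if path is not None:
        cmd += ["--", path]
    return cmd

def merge_commits(rev_range: str) -> List[str]:
    # `--merges` restricts rev-list to merge commits only.
    out = subprocess.run(
        ["git", "rev-list", "--merges", rev_range],
        capture_output=True, text=True, check=True,
    ).stdout
    return out.split()

def audit(rev_range: str, path: Optional[str] = None) -> None:
    # Print the remerge-diff of every merge commit in the range.
    for sha in merge_commits(rev_range):
        diff = subprocess.run(
            build_remerge_cmd(sha, path),
            capture_output=True, text=True, check=True,
        ).stdout
        print(diff)
```

For example, `audit("v1.138.0..develop", "synapse/server.py")` would walk every merge in that range and show how each conflict in `synapse/server.py` was resolved.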
Eric Eastwood
7b8831310f No need to have version_string as an argument since it's always the same (#19012)
Assuming we're happy with
https://github.com/element-hq/synapse/pull/19011, this PR makes sense.
2025-10-07 13:27:24 -05:00
dependabot[bot]
fb12d516cd Bump authlib from 1.6.4 to 1.6.5 (#19019)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-07 18:00:46 +01:00
dependabot[bot]
dde4e0e83d Bump types-pyyaml from 6.0.12.20250809 to 6.0.12.20250915 (#19018)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-07 18:00:28 +01:00
dependabot[bot]
8696551e7f Bump pydantic from 2.11.9 to 2.11.10 (#19017)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-07 18:00:02 +01:00
dependabot[bot]
28bc486bff Bump prometheus-client from 0.22.1 to 0.23.1 (#19016)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-10-07 17:59:39 +01:00
Eric Eastwood
ca27938257 Align Synapse version string to use SYNAPSE_VERSION (#19011)
See https://github.com/matrix-org/synapse/pull/12973 where we previously
used `version_string="Synapse/" +
get_distribution_version_string("matrix-synapse")` everywhere, and then
updated to use `version_string=f"Synapse/{SYNAPSE_VERSION}"` everywhere
except `synapse/app/homeserver.py` (why?!?!?!). This seems more like an
oversight than something done on purpose, especially without any context
in the comments or PR. The whole point of that PR was to solve the
missing git info in version strings.

For reference, here is what both variables look like for me locally on
the latest `develop`:

 - `SYNAPSE_VERSION`: `1.139.0 (b=develop,1d2ddbc76e,dirty)`
 - `VERSION`: `1.139.0`

The only reason we might want to avoid this is to hide the branch name
(some sensitive name that exposes a security fix, etc.). But we don't
hide anything:

`https://matrix.org/_matrix/federation/v1/version`
```json
{
  "server": {
    "name": "Synapse",
    "version": "1.139.0rc3 (b=matrix-org-hotfixes-priv,f538ed5ac3)"
  }
}
```

On `matrix.org`, the `Server` response header is masked as `cloudflare`;
it would otherwise show `1.139.0rc3` for everything served from the main
process.

---

This spawned from looking at the way we set up and start Synapse for
homeserver tenant provisioning in the Synapse Pro for Small Hosts
project (https://github.com/element-hq/synapse-small-hosts/issues/221)
2025-10-07 10:44:56 -05:00
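The two version-string shapes compared above can be illustrated with a small sketch. This is for illustration only: the real `SYNAPSE_VERSION` is computed inside Synapse from the git checkout, while `annotated_version` here is a made-up helper showing the format.

```python
# Plain distribution version (the old homeserver.py behaviour).
VERSION = "1.139.0"

def annotated_version(base: str, branch: str, sha: str, dirty: bool) -> str:
    # Build a SYNAPSE_VERSION-style string: base version plus git info.
    suffix = ",dirty" if dirty else ""
    return f"{base} (b={branch},{sha}{suffix})"

# What the Server header looks like with git info included, matching the
# local `develop` example above.
server_header = f"Synapse/{annotated_version(VERSION, 'develop', '1d2ddbc76e', dirty=True)}"
```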
Andrew Morgan
036fb87584 1.139.2 2025-10-07 16:30:03 +01:00
Andrew Morgan
abe974cd2b 1.138.4 2025-10-07 16:28:59 +01:00
Andrew Morgan
5e3839e2af Update KeyUploadServlet to handle case where client sends device_keys: null (#19023) 2025-10-07 16:28:26 +01:00
Andrew Morgan
0ae1f105b2 Update KeyUploadServlet to handle case where client sends device_keys: null (#19023) 2025-10-07 16:27:58 +01:00
Hugh Nimmo-Smith
e0cbf0f44f Merge branch 'develop' into hughns/msc4335 2025-10-03 11:12:09 +01:00
Hugh Nimmo-Smith
9ad30bcaa1 Update to latest unstable codes 2025-10-03 11:09:12 +01:00
Hugh Nimmo-Smith
998463222b Support for experimental MSC4335
- Make it available behind experimental feature flag
- return it for media upload limits
2025-09-24 17:56:51 +01:00
51 changed files with 1193 additions and 139 deletions

View File

@@ -114,8 +114,8 @@ jobs:
os:
- ubuntu-24.04
- ubuntu-24.04-arm
- macos-13 # This uses x86-64
- macos-14 # This uses arm64
- macos-15-intel # This uses x86-64
# is_pr is a flag used to exclude certain jobs from the matrix on PRs.
# It is not read by the rest of the workflow.
is_pr:
@@ -124,7 +124,7 @@ jobs:
exclude:
# Don't build macos wheels on PR CI.
- is_pr: true
os: "macos-13"
os: "macos-15-intel"
- is_pr: true
os: "macos-14"
# Don't build aarch64 wheels on PR CI.

View File

@@ -1,3 +1,12 @@
# Synapse 1.139.2 (2025-10-07)
## Bugfixes
- Fix a bug introduced in 1.139.1 where a client could receive an Internal Server Error if they set `device_keys: null` in the request to [`POST /_matrix/client/v3/keys/upload`](https://spec.matrix.org/v1.16/client-server-api/#post_matrixclientv3keysupload). ([\#19023](https://github.com/element-hq/synapse/issues/19023))
# Synapse 1.139.1 (2025-10-07)
## Security Fixes
@@ -8,6 +17,17 @@
- Drop support for unstable field names from the long-accepted [MSC2732](https://github.com/matrix-org/matrix-spec-proposals/pull/2732) (Olm fallback keys) proposal. This change allows unit tests to pass following the security patch above. ([\#18996](https://github.com/element-hq/synapse/issues/18996))
# Synapse 1.138.4 (2025-10-07)
## Bugfixes
- Fix a bug introduced in 1.138.3 where a client could receive an Internal Server Error if they set `device_keys: null` in the request to [`POST /_matrix/client/v3/keys/upload`](https://spec.matrix.org/v1.16/client-server-api/#post_matrixclientv3keysupload). ([\#19023](https://github.com/element-hq/synapse/issues/19023))
# Synapse 1.138.3 (2025-10-07)
## Security Fixes

View File

@@ -0,0 +1 @@
Add support for experimental [MSC4335](https://github.com/matrix-org/matrix-spec-proposals/pull/4335) M_USER_LIMIT_EXCEEDED error code for media upload limits.

View File

@@ -0,0 +1 @@
Add an Admin API to fetch an event by ID.

View File

@@ -0,0 +1 @@
Add experimental implementation for the latest draft of [MSC4143](https://github.com/matrix-org/matrix-spec-proposals/pull/4143).

1
changelog.d/19011.bugfix Normal file
View File

@@ -0,0 +1 @@
Update Synapse main process version string to include git info.

1
changelog.d/19012.misc Normal file
View File

@@ -0,0 +1 @@
Remove `version_string` argument from `HomeServer` since it's always the same.

1
changelog.d/19013.misc Normal file
View File

@@ -0,0 +1 @@
Remove duplicate call to `hs.start_background_tasks()` introduced from a bad merge.

1
changelog.d/19015.misc Normal file
View File

@@ -0,0 +1 @@
Split homeserver creation (`create_homeserver`) and setup (`setup`).

1
changelog.d/19025.misc Normal file
View File

@@ -0,0 +1 @@
Swap near-end-of-life `macos-13` GitHub Actions runner for the `macos-15-intel` variant.

1
changelog.d/19027.misc Normal file
View File

@@ -0,0 +1 @@
Introduce `RootConfig.validate_config()` which can be subclassed in `HomeServerConfig` to do cross-config class validation.

View File

@@ -0,0 +1 @@
Expose a `defer_to_threadpool` function in the Synapse Module API that allows modules to run a function on a separate thread in a custom threadpool.

1
changelog.d/19035.misc Normal file
View File

@@ -0,0 +1 @@
Allow any command of the `release.py` to accept a `--gh-token` argument.

12
debian/changelog vendored
View File

@@ -1,9 +1,21 @@
matrix-synapse-py3 (1.139.2) stable; urgency=medium
* New Synapse release 1.139.2.
-- Synapse Packaging team <packages@matrix.org> Tue, 07 Oct 2025 16:29:47 +0100
matrix-synapse-py3 (1.139.1) stable; urgency=medium
* New Synapse release 1.139.1.
-- Synapse Packaging team <packages@matrix.org> Tue, 07 Oct 2025 11:46:51 +0100
matrix-synapse-py3 (1.138.4) stable; urgency=medium
* New Synapse release 1.138.4.
-- Synapse Packaging team <packages@matrix.org> Tue, 07 Oct 2025 16:28:38 +0100
matrix-synapse-py3 (1.138.3) stable; urgency=medium
* New Synapse release 1.138.3.

View File

@@ -60,6 +60,7 @@
- [Admin API](usage/administration/admin_api/README.md)
- [Account Validity](admin_api/account_validity.md)
- [Background Updates](usage/administration/admin_api/background_updates.md)
- [Fetch Event](admin_api/fetch_event.md)
- [Event Reports](admin_api/event_reports.md)
- [Experimental Features](admin_api/experimental_features.md)
- [Media](admin_api/media_admin_api.md)

View File

@@ -0,0 +1,53 @@
# Fetch Event API
The fetch event API allows admins to fetch an event regardless of their membership in the room it
originated in.
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api/).
Request:
```http
GET /_synapse/admin/v1/fetch_event/<event_id>
```
The API returns a JSON body like the following:
Response:
```json
{
"event": {
"auth_events": [
"$WhLChbYg6atHuFRP7cUd95naUtc8L0f7fqeizlsUVvc",
"$9Wj8dt02lrNEWweeq-KjRABUYKba0K9DL2liRvsAdtQ",
"$qJxBFxBt8_ODd9b3pgOL_jXP98S_igc1_kizuPSZFi4"
],
"content": {
"body": "Hey now",
"msgtype": "m.text"
},
"depth": 6,
"event_id": "$hJ_kcXbVMcI82JDrbqfUJIHu61tJD86uIFJ_8hNHi7s",
"hashes": {
"sha256": "LiNw8DtrRVf55EgAH8R42Wz7WCJUqGsPt2We6qZO5Rg"
},
"origin_server_ts": 799,
"prev_events": [
"$cnSUrNMnC3Ywh9_W7EquFxYQjC_sT3BAAVzcUVxZq1g"
],
"room_id": "!aIhKToCqgPTBloWMpf:test",
"sender": "@user:test",
"signatures": {
"test": {
"ed25519:a_lPym": "7mqSDwK1k7rnw34Dd8Fahu0rhPW7jPmcWPRtRDoEN9Yuv+BCM2+Rfdpv2MjxNKy3AYDEBwUwYEuaKMBaEMiKAQ"
}
},
"type": "m.room.message",
"unsigned": {
"age_ts": 799
}
}
}
```
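A minimal sketch of calling the Fetch Event Admin API documented above, using only the standard library. The base URL, event ID, and token are placeholders for your deployment; only the `/_synapse/admin/v1/fetch_event/<event_id>` path comes from the documentation.

```python
import json
import urllib.request
from urllib.parse import quote

def fetch_event_url(base_url: str, event_id: str) -> str:
    # Event IDs start with `$`, so percent-encode the path segment.
    return f"{base_url}/_synapse/admin/v1/fetch_event/{quote(event_id, safe='')}"

def fetch_event(base_url: str, event_id: str, access_token: str) -> dict:
    # Authenticate as a server admin with a Bearer token, per the
    # Admin API docs linked above.
    req = urllib.request.Request(
        fetch_event_url(base_url, event_id),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Usage: `fetch_event("https://hs.example.com", "$hJ_kcXbVMcI82JDrbqfUJIHu61tJD86uIFJ_8hNHi7s", token)` returns the JSON body shown above.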

View File

@@ -2174,6 +2174,18 @@ These settings can be overridden using the `get_media_upload_limits_for_user` mo
Defaults to `[]`.
Options for each entry include:
* `time_period` (duration): The time period over which the limit applies. Required.
* `max_size` (byte size): Amount of data that can be uploaded in the time period by the user. Required.
* `msc4335_info_uri` (string): Experimental MSC4335 URI to where the user can find information about the upload limit. Optional.
* `msc4335_soft_limit` (boolean): Experimental MSC4335 value to say if the limit can be increased. Optional.
* `msc4335_increase_uri` (string): Experimental MSC4335 URI to where the user can increase the upload limit. Required if msc4335_soft_limit is true.
Example configuration:
```yaml
media_upload_limits:
@@ -2181,6 +2193,9 @@ media_upload_limits:
max_size: 100M
- time_period: 1w
max_size: 500M
msc4335_info_uri: https://example.com/quota
msc4335_soft_limit: true
msc4335_increase_uri: https://example.com/increase-quota
```
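A sketch of the unstable-prefixed error body a client could receive when an MSC4335 limit is hit, assembled from the config fields above. The error text is a placeholder; the field names and the rule that a soft limit requires an increase URI mirror the experimental implementation in this branch.

```python
from typing import Optional

def msc4335_error_body(
    info_uri: str,
    soft_limit: bool = False,
    increase_uri: Optional[str] = None,
) -> dict:
    # A soft limit without a way to increase it makes no sense, so this
    # mirrors the validation in the error class.
    if soft_limit and increase_uri is None:
        raise ValueError("increase_uri must be provided if soft_limit is True")
    body = {
        "errcode": "ORG.MATRIX.MSC4335_USER_LIMIT_EXCEEDED",
        "error": "Media upload limit exceeded",
        "org.matrix.msc4335.info_uri": info_uri,
    }
    if soft_limit:
        body["org.matrix.msc4335.soft_limit"] = True
        body["org.matrix.msc4335.increase_uri"] = increase_uri
    return body
```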
---
### `max_image_pixels`
@@ -2573,6 +2588,28 @@ Example configuration:
turn_allow_guests: false
```
---
### `matrix_rtc`
*(object)* Options related to MatrixRTC. Defaults to `{}`.
This setting has the following sub-options:
* `transports` (array): A list of transport types and arguments to use for MatrixRTC connections. Defaults to `[]`.
Options for each entry include:
* `type` (string): The type of transport to use to connect to the selective forwarding unit (SFU).
* `livekit_service_url` (string): The base URL of the LiveKit service. Should only be used with LiveKit-based transports.
Example configuration:
```yaml
matrix_rtc:
transports:
- type: livekit
livekit_service_url: https://matrix-rtc.example.com/livekit/jwt
```
---
## Registration
Registration can be rate-limited using the parameters in the [Ratelimiting](#ratelimiting) section of this manual.

24
poetry.lock generated
View File

@@ -34,15 +34,15 @@ tests-mypy = ["mypy (>=1.11.1) ; platform_python_implementation == \"CPython\" a
[[package]]
name = "authlib"
version = "1.6.4"
version = "1.6.5"
description = "The ultimate Python library in building OAuth and OpenID Connect servers and clients."
optional = true
python-versions = ">=3.9"
groups = ["main"]
markers = "extra == \"all\" or extra == \"jwt\" or extra == \"oidc\""
files = [
{file = "authlib-1.6.4-py2.py3-none-any.whl", hash = "sha256:39313d2a2caac3ecf6d8f95fbebdfd30ae6ea6ae6a6db794d976405fdd9aa796"},
{file = "authlib-1.6.4.tar.gz", hash = "sha256:104b0442a43061dc8bc23b133d1d06a2b0a9c2e3e33f34c4338929e816287649"},
{file = "authlib-1.6.5-py2.py3-none-any.whl", hash = "sha256:3e0e0507807f842b02175507bdee8957a1d5707fd4afb17c32fb43fee90b6e3a"},
{file = "authlib-1.6.5.tar.gz", hash = "sha256:6aaf9c79b7cc96c900f0b284061691c5d4e61221640a948fe690b556a6d6d10b"},
]
[package.dependencies]
@@ -1726,14 +1726,14 @@ xmp = ["defusedxml"]
[[package]]
name = "prometheus-client"
version = "0.22.1"
version = "0.23.1"
description = "Python client for the Prometheus monitoring system."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "prometheus_client-0.22.1-py3-none-any.whl", hash = "sha256:cca895342e308174341b2cbf99a56bef291fbc0ef7b9e5412a0f26d653ba7094"},
{file = "prometheus_client-0.22.1.tar.gz", hash = "sha256:190f1331e783cf21eb60bca559354e0a4d4378facecf78f5428c39b675d20d28"},
{file = "prometheus_client-0.23.1-py3-none-any.whl", hash = "sha256:dd1913e6e76b59cfe44e7a4b83e01afc9873c1bdfd2ed8739f1e76aeca115f99"},
{file = "prometheus_client-0.23.1.tar.gz", hash = "sha256:6ae8f9081eaaaf153a2e959d2e6c4f4fb57b12ef76c8c7980202f1e57b48b2ce"},
]
[package.extras]
@@ -1832,14 +1832,14 @@ files = [
[[package]]
name = "pydantic"
version = "2.11.9"
version = "2.11.10"
description = "Data validation using Python type hints"
optional = false
python-versions = ">=3.9"
groups = ["main", "dev"]
files = [
{file = "pydantic-2.11.9-py3-none-any.whl", hash = "sha256:c42dd626f5cfc1c6950ce6205ea58c93efa406da65f479dcb4029d5934857da2"},
{file = "pydantic-2.11.9.tar.gz", hash = "sha256:6b8ffda597a14812a7975c90b82a8a2e777d9257aba3453f973acd3c032a18e2"},
{file = "pydantic-2.11.10-py3-none-any.whl", hash = "sha256:802a655709d49bd004c31e865ef37da30b540786a46bfce02333e0e24b5fe29a"},
{file = "pydantic-2.11.10.tar.gz", hash = "sha256:dc280f0982fbda6c38fada4e476dc0a4f3aeaf9c6ad4c28df68a666ec3c61423"},
]
[package.dependencies]
@@ -3057,14 +3057,14 @@ types-cffi = "*"
[[package]]
name = "types-pyyaml"
version = "6.0.12.20250809"
version = "6.0.12.20250915"
description = "Typing stubs for PyYAML"
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "types_pyyaml-6.0.12.20250809-py3-none-any.whl", hash = "sha256:032b6003b798e7de1a1ddfeefee32fac6486bdfe4845e0ae0e7fb3ee4512b52f"},
{file = "types_pyyaml-6.0.12.20250809.tar.gz", hash = "sha256:af4a1aca028f18e75297da2ee0da465f799627370d74073e96fee876524f61b5"},
{file = "types_pyyaml-6.0.12.20250915-py3-none-any.whl", hash = "sha256:e7d4d9e064e89a3b3cae120b4990cd370874d2bf12fa5f46c97018dd5d3c9ab6"},
{file = "types_pyyaml-6.0.12.20250915.tar.gz", hash = "sha256:0f8b54a528c303f0e6f7165687dd33fafa81c807fcac23f632b63aa624ced1d3"},
]
[[package]]

View File

@@ -101,7 +101,7 @@ module-name = "synapse.synapse_rust"
[tool.poetry]
name = "matrix-synapse"
version = "1.139.1"
version = "1.139.2"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later OR LicenseRef-Element-Commercial"

View File

@@ -2424,20 +2424,40 @@ properties:
module API [callback](../../modules/media_repository_callbacks.md#get_media_upload_limits_for_user).
default: []
items:
time_period:
type: "#/$defs/duration"
description: >-
The time period over which the limit applies. Required.
max_size:
type: "#/$defs/bytes"
description: >-
Amount of data that can be uploaded in the time period by the user.
Required.
type: object
required:
- time_period
- max_size
properties:
time_period:
$ref: "#/$defs/duration"
description: >-
The time period over which the limit applies. Required.
max_size:
$ref: "#/$defs/bytes"
description: >-
Amount of data that can be uploaded in the time period by the user.
Required.
msc4335_info_uri:
type: string
description: >-
Experimental MSC4335 URI to where the user can find information about the upload limit. Optional.
msc4335_soft_limit:
type: boolean
description: >-
Experimental MSC4335 value to say if the limit can be increased. Optional.
msc4335_increase_uri:
type: string
description: >-
Experimental MSC4335 URI to where the user can increase the upload limit. Required if msc4335_soft_limit is true.
examples:
- - time_period: 1h
max_size: 100M
- time_period: 1w
max_size: 500M
msc4335_info_uri: https://example.com/quota
msc4335_soft_limit: true
msc4335_increase_uri: https://example.com/increase-quota
max_image_pixels:
$ref: "#/$defs/bytes"
description: Maximum number of pixels that will be thumbnailed.
@@ -2884,6 +2904,35 @@ properties:
default: true
examples:
- false
matrix_rtc:
type: object
description: >-
Options related to MatrixRTC.
properties:
transports:
type: array
items:
type: object
required:
- type
properties:
type:
type: string
description: The type of transport to use to connect to the selective forwarding unit (SFU).
example: livekit
livekit_service_url:
type: string
description: >-
The base URL of the LiveKit service. Should only be used with LiveKit-based transports.
example: https://matrix-rtc.example.com/livekit/jwt
description:
A list of transport types and arguments to use for MatrixRTC connections.
default: []
default: {}
examples:
- transports:
- type: livekit
livekit_service_url: https://matrix-rtc.example.com/livekit/jwt
enable_registration:
type: boolean
description: >-

View File

@@ -639,7 +639,16 @@ def _notify(message: str) -> None:
@cli.command()
def merge_back() -> None:
# Although this option is not used, allow it anyways. Otherwise the user will
# receive an error when providing it, which is annoying as other commands accept
# it.
@click.option(
"--gh-token",
"_gh_token",
envvar=["GH_TOKEN", "GITHUB_TOKEN"],
required=False,
)
def merge_back(_gh_token: Optional[str]) -> None:
_merge_back()
@@ -687,7 +696,16 @@ def _merge_back() -> None:
@cli.command()
def announce() -> None:
# Although this option is not used, allow it anyways. Otherwise the user will
# receive an error when providing it, which is annoying as other commands accept
# it.
@click.option(
"--gh-token",
"_gh_token",
envvar=["GH_TOKEN", "GITHUB_TOKEN"],
required=False,
)
def announce(_gh_token: Optional[str]) -> None:
_announce()

View File

@@ -98,7 +98,6 @@ from synapse.storage.databases.state.bg_updates import StateBackgroundUpdateStor
from synapse.storage.engines import create_engine
from synapse.storage.prepare_database import prepare_database
from synapse.types import ISynapseReactor
from synapse.util import SYNAPSE_VERSION
# Cast safety: Twisted does some naughty magic which replaces the
# twisted.internet.reactor module with a Reactor instance at runtime.
@@ -325,7 +324,6 @@ class MockHomeserver(HomeServer):
hostname=config.server.server_name,
config=config,
reactor=reactor,
version_string=f"Synapse/{SYNAPSE_VERSION}",
)

View File

@@ -31,7 +31,6 @@ from synapse.config.homeserver import HomeServerConfig
from synapse.server import HomeServer
from synapse.storage import DataStore
from synapse.types import ISynapseReactor
from synapse.util import SYNAPSE_VERSION
# Cast safety: Twisted does some naughty magic which replaces the
# twisted.internet.reactor module with a Reactor instance at runtime.
@@ -47,7 +46,6 @@ class MockHomeserver(HomeServer):
hostname=config.server.server_name,
config=config,
reactor=reactor,
version_string=f"Synapse/{SYNAPSE_VERSION}",
)

View File

@@ -152,6 +152,8 @@ class Codes(str, Enum):
# Part of MSC4326
UNKNOWN_DEVICE = "ORG.MATRIX.MSC4326.M_UNKNOWN_DEVICE"
MSC4335_USER_LIMIT_EXCEEDED = "ORG.MATRIX.MSC4335_USER_LIMIT_EXCEEDED"
class CodeMessageException(RuntimeError):
"""An exception with integer code, a message string attributes and optional headers.
@@ -513,6 +515,40 @@ class ResourceLimitError(SynapseError):
)
class MSC4335UserLimitExceededError(SynapseError):
"""
Experimental implementation of MSC4335 M_USER_LIMIT_EXCEEDED error
"""
def __init__(
self,
code: int,
msg: str,
info_uri: str,
soft_limit: bool = False,
increase_uri: Optional[str] = None,
):
if soft_limit and increase_uri is None:
raise ValueError("increase_uri must be provided if soft_limit is True")
additional_fields: dict[str, Union[str, bool]] = {
"org.matrix.msc4335.info_uri": info_uri,
}
if soft_limit:
additional_fields["org.matrix.msc4335.soft_limit"] = soft_limit
if soft_limit and increase_uri is not None:
additional_fields["org.matrix.msc4335.increase_uri"] = increase_uri
super().__init__(
code,
msg,
Codes.MSC4335_USER_LIMIT_EXCEEDED,
additional_fields=additional_fields,
)
class EventSizeError(SynapseError):
"""An error raised when an event is too big."""

View File

@@ -421,17 +421,26 @@ def listen_http(
context_factory: Optional[IOpenSSLContextFactory],
reactor: ISynapseReactor = reactor,
) -> List[Port]:
"""
Args:
listener_config: TODO
root_resource: TODO
version_string: A string to present for the Server header
max_request_body_size: TODO
context_factory: TODO
reactor: TODO
"""
assert listener_config.http_options is not None
site_tag = listener_config.get_site_tag()
site = SynapseSite(
"synapse.access.%s.%s"
logger_name="synapse.access.%s.%s"
% ("https" if listener_config.is_tls() else "http", site_tag),
site_tag,
listener_config,
root_resource,
version_string,
site_tag=site_tag,
config=listener_config,
resource=root_resource,
server_version_string=version_string,
max_request_body_size=max_request_body_size,
reactor=reactor,
hs=hs,

View File

@@ -65,7 +65,6 @@ from synapse.storage.databases.main.stream import StreamWorkerStore
from synapse.storage.databases.main.tags import TagsWorkerStore
from synapse.storage.databases.main.user_erasure_store import UserErasureWorkerStore
from synapse.types import JsonMapping, StateMap
from synapse.util import SYNAPSE_VERSION
from synapse.util.logcontext import LoggingContext
logger = logging.getLogger("synapse.app.admin_cmd")
@@ -316,7 +315,6 @@ def start(config: HomeServerConfig, args: argparse.Namespace) -> None:
ss = AdminCmdServer(
config.server.server_name,
config=config,
version_string=f"Synapse/{SYNAPSE_VERSION}",
)
setup_logging(ss, config, use_worker_options=True)

View File

@@ -112,7 +112,6 @@ from synapse.storage.databases.main.transactions import TransactionWorkerStore
from synapse.storage.databases.main.ui_auth import UIAuthWorkerStore
from synapse.storage.databases.main.user_directory import UserDirectoryStore
from synapse.storage.databases.main.user_erasure_store import UserErasureWorkerStore
from synapse.util import SYNAPSE_VERSION
from synapse.util.httpresourcetree import create_resource_tree
logger = logging.getLogger("synapse.app.generic_worker")
@@ -359,7 +358,6 @@ def start(config: HomeServerConfig) -> None:
hs = GenericWorkerServer(
config.server.server_name,
config=config,
version_string=f"Synapse/{SYNAPSE_VERSION}",
)
setup_logging(hs, config, use_worker_options=True)

View File

@@ -71,7 +71,7 @@ from synapse.rest.well_known import well_known_resource
from synapse.server import HomeServer
from synapse.storage import DataStore
from synapse.types import ISynapseReactor
from synapse.util.check_dependencies import VERSION, check_requirements
from synapse.util.check_dependencies import check_requirements
from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.module_loader import load_module
@@ -83,6 +83,10 @@ def gz_wrap(r: Resource) -> Resource:
class SynapseHomeServer(HomeServer):
"""
Homeserver class for the main Synapse process.
"""
DATASTORE_CLASS = DataStore
def _listener_http(
@@ -345,18 +349,53 @@ def load_or_generate_config(argv_options: List[str]) -> HomeServerConfig:
return config
def setup(
def create_homeserver(
config: HomeServerConfig,
reactor: Optional[ISynapseReactor] = None,
freeze: bool = True,
) -> SynapseHomeServer:
"""
Create and setup a Synapse homeserver instance given a configuration.
Create a homeserver instance for the Synapse main process.
Args:
config: The configuration for the homeserver.
reactor: Optionally provide a reactor to use. Can be useful in different
scenarios that you want control over the reactor, such as tests.
Returns:
A homeserver instance.
"""
if config.worker.worker_app:
raise ConfigError(
"You have specified `worker_app` in the config but are attempting to setup a non-worker "
"instance. Please use `python -m synapse.app.generic_worker` instead (or remove the option if this is the main process)."
)
events.USE_FROZEN_DICTS = config.server.use_frozen_dicts
synapse.util.caches.TRACK_MEMORY_USAGE = config.caches.track_memory_usage
if config.server.gc_seconds:
synapse.metrics.MIN_TIME_BETWEEN_GCS = config.server.gc_seconds
hs = SynapseHomeServer(
hostname=config.server.server_name,
config=config,
reactor=reactor,
)
return hs
def setup(
hs: SynapseHomeServer,
*,
freeze: bool = True,
) -> None:
"""
Setup a Synapse homeserver instance given a configuration.
Args:
hs: The homeserver to setup.
freeze: whether to freeze the homeserver base objects in the garbage collector.
May improve garbage collection performance by marking objects with an effectively
static lifetime as frozen so they don't need to be considered for cleanup.
@@ -367,55 +406,22 @@ def setup(
A homeserver instance.
"""
if config.worker.worker_app:
raise ConfigError(
"You have specified `worker_app` in the config but are attempting to start a non-worker "
"instance. Please use `python -m synapse.app.generic_worker` instead (or remove the option if this is the main process)."
)
sys.exit(1)
setup_logging(hs, hs.config, use_worker_options=False)
events.USE_FROZEN_DICTS = config.server.use_frozen_dicts
synapse.util.caches.TRACK_MEMORY_USAGE = config.caches.track_memory_usage
if config.server.gc_seconds:
synapse.metrics.MIN_TIME_BETWEEN_GCS = config.server.gc_seconds
if (
config.registration.enable_registration
and not config.registration.enable_registration_without_verification
):
if (
not config.captcha.enable_registration_captcha
and not config.registration.registrations_require_3pid
and not config.registration.registration_requires_token
):
raise ConfigError(
"You have enabled open registration without any verification. This is a known vector for "
"spam and abuse. If you would like to allow public registration, please consider adding email, "
"captcha, or token-based verification. Otherwise this check can be removed by setting the "
"`enable_registration_without_verification` config option to `true`."
)
hs = SynapseHomeServer(
config.server.server_name,
config=config,
version_string=f"Synapse/{VERSION}",
reactor=reactor,
)
setup_logging(hs, config, use_worker_options=False)
# Log after we've configured logging.
logger.info("Setting up server")
# Start the tracer
init_tracer(hs) # noqa
logger.info("Setting up server")
try:
hs.setup()
except Exception as e:
handle_startup_exception(e)
async def start() -> None:
async def _start_when_reactor_running() -> None:
# TODO: Feels like this should be moved somewhere else.
#
# Load the OIDC provider metadatas, if OIDC is enabled.
if hs.config.oidc.oidc_enabled:
oidc = hs.get_oidc_handler()
@@ -424,21 +430,31 @@ def setup(
await _base.start(hs, freeze)
# TODO: This should be moved to `SynapseHomeServer.start_background_tasks` (not
# `HomeServer.start_background_tasks`) (this way it matches the behavior of only
# running on `main`)
hs.get_datastores().main.db_pool.updates.start_doing_background_updates()
register_start(hs, start)
return hs
# Register a callback to be invoked once the reactor is running
register_start(hs, _start_when_reactor_running)
def run(hs: HomeServer) -> None:
def start_reactor(
config: HomeServerConfig,
) -> None:
"""
Start the reactor (Twisted event-loop).
Args:
config: The configuration for the homeserver.
"""
_base.start_reactor(
"synapse-homeserver",
soft_file_limit=hs.config.server.soft_file_limit,
gc_thresholds=hs.config.server.gc_thresholds,
pid_file=hs.config.server.pid_file,
daemonize=hs.config.server.daemonize,
print_pidfile=hs.config.server.print_pidfile,
soft_file_limit=config.server.soft_file_limit,
gc_thresholds=config.server.gc_thresholds,
pid_file=config.server.pid_file,
daemonize=config.server.daemonize,
print_pidfile=config.server.print_pidfile,
logger=logger,
)
@@ -449,13 +465,14 @@ def main() -> None:
with LoggingContext(name="main", server_name=homeserver_config.server.server_name):
# check base requirements
check_requirements()
hs = setup(homeserver_config)
hs = create_homeserver(homeserver_config)
setup(hs)
# redirect stdio to the logs, if configured.
if not hs.config.logging.no_redirect_stdio:
redirect_stdio_to_logs()
run(hs)
start_reactor(homeserver_config)
if __name__ == "__main__":

View File

@@ -545,18 +545,22 @@ class RootConfig:
@classmethod
def load_config(
cls: Type[TRootConfig], description: str, argv: List[str]
cls: Type[TRootConfig], description: str, argv_options: List[str]
) -> TRootConfig:
"""Parse the commandline and config files
Doesn't support config-file-generation: used by the worker apps.
Args:
description: TODO
argv_options: The options passed to Synapse. Usually `sys.argv[1:]`.
Returns:
Config object.
"""
config_parser = argparse.ArgumentParser(description=description)
cls.add_arguments_to_parser(config_parser)
obj, _ = cls.load_config_with_parser(config_parser, argv)
obj, _ = cls.load_config_with_parser(config_parser, argv_options)
return obj
@@ -609,6 +613,10 @@ class RootConfig:
Used for workers where we want to add extra flags/subcommands.
Note: This is the common denominator for loading config and is also used by
`load_config` and `load_or_generate_config`. Which is why we call
`validate_config()` here.
Args:
parser
argv_options: The options passed to Synapse. Usually `sys.argv[1:]`.
@@ -642,6 +650,10 @@ class RootConfig:
obj.invoke_all("read_arguments", config_args)
# Now that we finally have the full config sections parsed, allow subclasses to
# do some extra validation across the entire config.
obj.validate_config()
return obj, config_args
@classmethod
@@ -842,15 +854,7 @@ class RootConfig:
):
return None
obj.parse_config_dict(
config_dict,
config_dir_path=config_dir_path,
data_dir_path=data_dir_path,
allow_secrets_in_config=config_args.secrets_in_config,
)
obj.invoke_all("read_arguments", config_args)
return obj
return cls.load_config(description, argv_options)
def parse_config_dict(
self,
@@ -911,6 +915,20 @@ class RootConfig:
existing_config.root = None
return existing_config
def validate_config(self) -> None:
"""
Additional config validation across all config sections.
Override this in subclasses to add extra validation. This is called once all
config option values have been populated.
XXX: This should only validate, not modify the configuration, as the final
config state is required for proper validation across all config sections.
Raises:
ConfigError: if the config is invalid.
"""
def read_config_files(config_files: Iterable[str]) -> Dict[str, Any]:
"""Read the config files and shallowly merge them into a dict.

View File

@@ -37,6 +37,7 @@ from synapse.config import ( # noqa: F401
key,
logger,
mas,
matrixrtc,
metrics,
modules,
oembed,
@@ -126,6 +127,7 @@ class RootConfig:
auto_accept_invites: auto_accept_invites.AutoAcceptInvitesConfig
user_types: user_types.UserTypesConfig
mas: mas.MasConfig
matrix_rtc: matrixrtc.MatrixRtcConfig
config_classes: List[Type["Config"]] = ...
config_files: List[str]
@@ -156,11 +158,11 @@ class RootConfig:
) -> str: ...
@classmethod
def load_or_generate_config(
cls: Type[TRootConfig], description: str, argv: List[str]
cls: Type[TRootConfig], description: str, argv_options: List[str]
) -> Optional[TRootConfig]: ...
@classmethod
def load_config(
cls: Type[TRootConfig], description: str, argv: List[str]
cls: Type[TRootConfig], description: str, argv_options: List[str]
) -> TRootConfig: ...
@classmethod
def add_arguments_to_parser(
@@ -168,7 +170,7 @@ class RootConfig:
) -> None: ...
@classmethod
def load_config_with_parser(
cls: Type[TRootConfig], parser: argparse.ArgumentParser, argv: List[str]
cls: Type[TRootConfig], parser: argparse.ArgumentParser, argv_options: List[str]
) -> Tuple[TRootConfig, argparse.Namespace]: ...
def generate_missing_files(
self, config_dict: dict, config_dir_path: str

View File

@@ -556,6 +556,9 @@ class ExperimentalConfig(Config):
# MSC4133: Custom profile fields
self.msc4133_enabled: bool = experimental.get("msc4133_enabled", False)
# MSC4143: Matrix RTC Transport using Livekit Backend
self.msc4143_enabled: bool = experimental.get("msc4143_enabled", False)
# MSC4169: Backwards-compatible redaction sending using `/send`
self.msc4169_enabled: bool = experimental.get("msc4169_enabled", False)
@@ -595,3 +598,6 @@ class ExperimentalConfig(Config):
# MSC4306: Thread Subscriptions
# (and MSC4308: Thread Subscriptions extension to Sliding Sync)
self.msc4306_enabled: bool = experimental.get("msc4306_enabled", False)
# MSC4335: M_USER_LIMIT_EXCEEDED error
self.msc4335_enabled: bool = experimental.get("msc4335_enabled", False)

View File

@@ -18,7 +18,8 @@
# [This file includes modifications made by New Vector Limited]
#
#
from ._base import RootConfig
from ._base import ConfigError, RootConfig
from .account_validity import AccountValidityConfig
from .api import ApiConfig
from .appservice import AppServiceConfig
@@ -37,6 +38,7 @@ from .jwt import JWTConfig
from .key import KeyConfig
from .logger import LoggingConfig
from .mas import MasConfig
from .matrixrtc import MatrixRtcConfig
from .metrics import MetricsConfig
from .modules import ModulesConfig
from .oembed import OembedConfig
@@ -66,6 +68,10 @@ from .workers import WorkerConfig
class HomeServerConfig(RootConfig):
"""
Top-level config object for Synapse homeserver (main process and workers).
"""
config_classes = [
ModulesConfig,
ServerConfig,
@@ -80,6 +86,7 @@ class HomeServerConfig(RootConfig):
OembedConfig,
CaptchaConfig,
VoipConfig,
MatrixRtcConfig,
RegistrationConfig,
AccountValidityConfig,
MetricsConfig,
@@ -113,3 +120,22 @@ class HomeServerConfig(RootConfig):
# This must be last, as it checks for conflicts with other config options.
MasConfig,
]
def validate_config(
self,
) -> None:
if (
self.registration.enable_registration
and not self.registration.enable_registration_without_verification
):
if (
not self.captcha.enable_registration_captcha
and not self.registration.registrations_require_3pid
and not self.registration.registration_requires_token
):
raise ConfigError(
"You have enabled open registration without any verification. This is a known vector for "
"spam and abuse. If you would like to allow public registration, please consider adding email, "
"captcha, or token-based verification. Otherwise this check can be removed by setting the "
"`enable_registration_without_verification` config option to `true`."
)

View File

@@ -0,0 +1,67 @@
#
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright (C) 2025 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
#
# [This file includes modifications made by New Vector Limited]
#
#
from typing import Any, Optional
from pydantic import ValidationError
from synapse._pydantic_compat import Field, StrictStr, validator
from synapse.types import JsonDict
from synapse.util.pydantic_models import ParseModel
from ._base import Config, ConfigError
class TransportConfigModel(ParseModel):
type: StrictStr
livekit_service_url: Optional[StrictStr] = Field(default=None)
"""An optional livekit service URL. Only required if type is "livekit"."""
@validator("livekit_service_url", always=True)
def validate_livekit_service_url(cls, v: Any, values: dict) -> Any:
if values.get("type") == "livekit" and not v:
raise ValueError(
"You must set a `livekit_service_url` when using the 'livekit' transport."
)
return v
class MatrixRtcConfigModel(ParseModel):
transports: list = []
class MatrixRtcConfig(Config):
section = "matrix_rtc"
def read_config(
self, config: JsonDict, allow_secrets_in_config: bool, **kwargs: Any
) -> None:
matrix_rtc = config.get("matrix_rtc", {})
if matrix_rtc is None:
matrix_rtc = {}
try:
parsed = MatrixRtcConfigModel(**matrix_rtc)
except ValidationError as e:
raise ConfigError(
"Could not validate matrix_rtc config",
("matrix_rtc",),
) from e
self.transports = parsed.transports
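The rule the `livekit_service_url` validator enforces can be expressed as a plain function (a sketch without pydantic; the function name is illustrative):

```python
from typing import Optional


def check_transport(transport_type: str, livekit_service_url: Optional[str]) -> None:
    # Mirror of the validator's rule: 'livekit' transports need a service URL;
    # other transport types have no such requirement.
    if transport_type == "livekit" and not livekit_service_url:
        raise ValueError(
            "You must set a `livekit_service_url` when using the 'livekit' transport."
        )
```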

View File

@@ -21,7 +21,7 @@
import logging
import os
from typing import Any, Dict, List, Tuple
from typing import Any, Dict, List, Optional, Tuple
import attr
@@ -134,6 +134,15 @@ class MediaUploadLimit:
time_period_ms: int
"""The time period in milliseconds."""
msc4335_info_uri: Optional[str] = None
"""Used for experimental MSC4335 error code feature"""
msc4335_soft_limit: Optional[bool] = None
"""Used for experimental MSC4335 error code feature"""
msc4335_increase_uri: Optional[str] = None
"""Used for experimental MSC4335 error code feature"""
class ContentRepositoryConfig(Config):
section = "media"
@@ -302,8 +311,34 @@ class ContentRepositoryConfig(Config):
for limit_config in config.get("media_upload_limits", []):
time_period_ms = self.parse_duration(limit_config["time_period"])
max_bytes = self.parse_size(limit_config["max_size"])
msc4335_info_uri = limit_config.get("msc4335_info_uri", None)
msc4335_soft_limit = limit_config.get("msc4335_soft_limit", None)
msc4335_increase_uri = limit_config.get("msc4335_increase_uri", None)
self.media_upload_limits.append(MediaUploadLimit(max_bytes, time_period_ms))
if (
msc4335_info_uri is not None
or msc4335_soft_limit is not None
or msc4335_increase_uri is not None
) and (not (msc4335_info_uri and msc4335_soft_limit is not None)):
raise ConfigError(
"If any of msc4335_info_uri, msc4335_soft_limit or "
"msc4335_increase_uri are set, then both msc4335_info_uri and "
"msc4335_soft_limit must be set."
)
if msc4335_soft_limit and not msc4335_increase_uri:
raise ConfigError(
"msc4335_increase_uri must be set if msc4335_soft_limit is true."
)
self.media_upload_limits.append(
MediaUploadLimit(
max_bytes,
time_period_ms,
msc4335_info_uri,
msc4335_soft_limit,
msc4335_increase_uri,
)
)
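Per these checks, a `media_upload_limits` entry that opts into MSC4335 needs both `msc4335_info_uri` and `msc4335_soft_limit`, plus `msc4335_increase_uri` whenever the limit is soft. A config fragment satisfying both rules might look like this (values illustrative):

```yaml
media_upload_limits:
  # Hard limit: info_uri and soft_limit are both required.
  - time_period: 1d
    max_size: 1K
    msc4335_info_uri: "https://example.com/limits"
    msc4335_soft_limit: false
  # Soft limit: increase_uri is additionally required.
  - time_period: 1w
    max_size: 3K
    msc4335_info_uri: "https://example.com/limits"
    msc4335_soft_limit: true
    msc4335_increase_uri: "https://example.com/increase"
```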
def generate_config_section(self, data_dir_path: str, **kwargs: Any) -> str:
assert data_dir_path is not None

View File

@@ -741,6 +741,7 @@ class SynapseSite(ProxySite):
def __init__(
self,
*,
logger_name: str,
site_tag: str,
config: ListenerConfig,

View File

@@ -37,6 +37,7 @@ from synapse.api.errors import (
Codes,
FederationDeniedError,
HttpResponseException,
MSC4335UserLimitExceededError,
NotFoundError,
RequestSendFailed,
SynapseError,
@@ -67,6 +68,7 @@ from synapse.media.media_storage import (
from synapse.media.storage_provider import StorageProviderWrapper
from synapse.media.thumbnailer import Thumbnailer, ThumbnailError
from synapse.media.url_previewer import UrlPreviewer
from synapse.rest.admin.experimental_features import ExperimentalFeature
from synapse.storage.databases.main.media_repository import LocalMedia, RemoteMedia
from synapse.types import UserID
from synapse.util.async_helpers import Linearizer
@@ -379,6 +381,25 @@ class MediaRepository:
sent_bytes=uploaded_media_size,
attempted_bytes=content_length,
)
# If the MSC4335 experimental feature is enabled and the media limit
# has the info_uri configured, then we raise the MSC4335 error
msc4335_enabled = await self.store.is_feature_enabled(
auth_user.to_string(), ExperimentalFeature.MSC4335
)
if (
msc4335_enabled
and limit.msc4335_info_uri
and limit.msc4335_soft_limit is not None
):
raise MSC4335UserLimitExceededError(
403,
"Media upload limit exceeded",
limit.msc4335_info_uri,
limit.msc4335_soft_limit,
limit.msc4335_increase_uri,
)
# Otherwise we use the current behaviour albeit not spec compliant
# See: https://github.com/element-hq/synapse/issues/18749
raise SynapseError(
400, "Media upload limit exceeded", Codes.RESOURCE_LIMIT_EXCEEDED
)
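The JSON body of that error, as exercised by the tests later in this diff, can be sketched as a small helper (the `org.matrix.msc4335.soft_limit` key name is an assumption not confirmed by this diff; the other field names appear in the tests below):

```python
from typing import Any, Dict, Optional


def msc4335_error_body(
    info_uri: str, soft_limit: bool, increase_uri: Optional[str] = None
) -> Dict[str, Any]:
    # Shape of the 403 error body; `increase_uri` is omitted when unset,
    # matching the assertion in the hard-limit test below.
    body: Dict[str, Any] = {
        "errcode": "ORG.MATRIX.MSC4335_USER_LIMIT_EXCEEDED",
        "error": "Media upload limit exceeded",
        "org.matrix.msc4335.info_uri": info_uri,
        "org.matrix.msc4335.soft_limit": soft_limit,
    }
    if increase_uri is not None:
        body["org.matrix.msc4335.increase_uri"] = increase_uri
    return body
```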

View File

@@ -43,6 +43,7 @@ from typing_extensions import Concatenate, ParamSpec
from twisted.internet import defer
from twisted.internet.interfaces import IDelayedCall
from twisted.python.threadpool import ThreadPool
from twisted.web.resource import Resource
from synapse.api import errors
@@ -79,6 +80,7 @@ from synapse.http.servlet import parse_json_object_from_request
from synapse.http.site import SynapseRequest
from synapse.logging.context import (
defer_to_thread,
defer_to_threadpool,
make_deferred_yieldable,
run_in_background,
)
@@ -1733,6 +1735,33 @@ class ModuleApi:
"""
return await defer_to_thread(self._hs.get_reactor(), f, *args, **kwargs)
async def defer_to_threadpool(
self,
threadpool: ThreadPool,
f: Callable[P, T],
*args: P.args,
**kwargs: P.kwargs,
) -> T:
"""Runs the given function in a separate thread from the given thread pool.
Allows specifying a custom thread pool instead of using the default Synapse
one. To use the default Synapse threadpool, use `defer_to_thread` instead.
Added in Synapse v1.140.0.
Args:
threadpool: The thread pool to use.
f: The function to run.
args: The function's arguments.
kwargs: The function's keyword arguments.
Returns:
The return value of the function once run in a thread.
"""
return await defer_to_threadpool(
self._hs.get_reactor(), threadpool, f, *args, **kwargs
)
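The semantics are analogous to handing work to a caller-supplied executor; a minimal standalone sketch using the stdlib (an analogy, not Synapse's Twisted-based implementation):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor
from typing import Callable, TypeVar

T = TypeVar("T")


async def run_in_pool(pool: ThreadPoolExecutor, f: Callable[..., T], *args) -> T:
    # Hand `f` to the caller-supplied pool rather than the default executor,
    # mirroring the "custom thread pool" idea behind `defer_to_threadpool`.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(pool, f, *args)


def double_in_pool(value: int) -> int:
    # Usage: run a blocking function on a dedicated two-worker pool.
    with ThreadPoolExecutor(max_workers=2) as pool:
        return asyncio.run(run_in_pool(pool, lambda x: x * 2, value))
```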
async def check_username(self, username: str) -> None:
"""Checks if the provided username uses the grammar defined in the Matrix
specification, and is already being used by an existing user.

View File

@@ -42,6 +42,7 @@ from synapse.rest.client import (
login,
login_token_request,
logout,
matrixrtc,
mutual_rooms,
notifications,
openid,
@@ -89,6 +90,7 @@ CLIENT_SERVLET_FUNCTIONS: Tuple[RegisterServletsFunc, ...] = (
presence.register_servlets,
directory.register_servlets,
voip.register_servlets,
matrixrtc.register_servlets,
pusher.register_servlets,
push_rule.register_servlets,
logout.register_servlets,

View File

@@ -57,6 +57,9 @@ from synapse.rest.admin.event_reports import (
EventReportDetailRestServlet,
EventReportsRestServlet,
)
from synapse.rest.admin.events import (
EventRestServlet,
)
from synapse.rest.admin.experimental_features import ExperimentalFeaturesRestServlet
from synapse.rest.admin.federation import (
DestinationMembershipRestServlet,
@@ -339,6 +342,7 @@ def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
ExperimentalFeaturesRestServlet(hs).register(http_server)
SuspendAccountRestServlet(hs).register(http_server)
ScheduledTasksRestServlet(hs).register(http_server)
EventRestServlet(hs).register(http_server)
def register_servlets_for_client_rest_resource(

View File

@@ -0,0 +1,69 @@
from http import HTTPStatus
from typing import TYPE_CHECKING, Tuple
from synapse.api.errors import NotFoundError
from synapse.events.utils import (
SerializeEventConfig,
format_event_raw,
serialize_event,
)
from synapse.http.servlet import RestServlet
from synapse.http.site import SynapseRequest
from synapse.rest.admin import admin_patterns
from synapse.rest.admin._base import assert_user_is_admin
from synapse.storage.databases.main.events_worker import EventRedactBehaviour
from synapse.types import JsonDict
if TYPE_CHECKING:
from synapse.server import HomeServer
class EventRestServlet(RestServlet):
"""
Get an event that is known to the homeserver.
The requester must have administrator access in Synapse.
GET /_synapse/admin/v1/fetch_event/<event_id>
returns:
200 OK with event json if the event is known to the homeserver. Otherwise raises
a NotFound error.
Args:
event_id: the id of the requested event.
Returns:
JSON blob of the event
"""
PATTERNS = admin_patterns("/fetch_event/(?P<event_id>[^/]*)$")
def __init__(self, hs: "HomeServer"):
self._auth = hs.get_auth()
self._store = hs.get_datastores().main
self._clock = hs.get_clock()
async def on_GET(
self, request: SynapseRequest, event_id: str
) -> Tuple[int, JsonDict]:
requester = await self._auth.get_user_by_req(request)
await assert_user_is_admin(self._auth, requester)
event = await self._store.get_event(
event_id,
EventRedactBehaviour.as_is,
allow_none=True,
)
if event is None:
raise NotFoundError("Event not found")
config = SerializeEventConfig(
as_client_event=False,
event_format=format_event_raw,
requester=requester,
only_event_fields=None,
include_stripped_room_state=True,
include_admin_metadata=True,
)
res = {"event": serialize_event(event, self._clock.time_msec(), config=config)}
return HTTPStatus.OK, res
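A client-side sketch for building a request URL to this Admin API (hypothetical base URL; only the path comes from the servlet's `PATTERNS` above):

```python
from urllib.parse import quote


def fetch_event_url(base_url: str, event_id: str) -> str:
    # Event IDs contain characters such as `$` and `:` that should be
    # percent-encoded when placed in a URL path segment.
    return f"{base_url}/_synapse/admin/v1/fetch_event/{quote(event_id, safe='')}"
```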

View File

@@ -44,6 +44,7 @@ class ExperimentalFeature(str, Enum):
MSC3881 = "msc3881"
MSC3575 = "msc3575"
MSC4222 = "msc4222"
MSC4335 = "msc4335"
def is_globally_enabled(self, config: "HomeServerConfig") -> bool:
if self is ExperimentalFeature.MSC3881:
@@ -52,6 +53,8 @@ class ExperimentalFeature(str, Enum):
return config.experimental.msc3575_enabled
if self is ExperimentalFeature.MSC4222:
return config.experimental.msc4222_enabled
if self is ExperimentalFeature.MSC4335:
return config.experimental.msc4335_enabled
assert_never(self)

View File

@@ -0,0 +1,52 @@
#
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright (C) 2025 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
#
# [This file includes modifications made by New Vector Limited]
#
#
from typing import TYPE_CHECKING, Tuple
from synapse.http.server import HttpServer
from synapse.http.servlet import RestServlet
from synapse.http.site import SynapseRequest
from synapse.rest.client._base import client_patterns
from synapse.types import JsonDict
if TYPE_CHECKING:
from synapse.server import HomeServer
class MatrixRTCRestServlet(RestServlet):
PATTERNS = client_patterns(r"/org\.matrix\.msc4143/rtc/transports$", releases=())
CATEGORY = "Client API requests"
def __init__(self, hs: "HomeServer"):
super().__init__()
self._hs = hs
self._auth = hs.get_auth()
self._transports = hs.config.matrix_rtc.transports
async def on_GET(self, request: SynapseRequest) -> Tuple[int, JsonDict]:
# Require authentication for this endpoint.
await self._auth.get_user_by_req(request)
if self._transports:
return 200, {"rtc_transports": self._transports}
return 200, {}
def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
if hs.config.experimental.msc4143_enabled:
MatrixRTCRestServlet(hs).register(http_server)
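The response shape of `on_GET` above reduces to a small pure function (a sketch of the same logic, not Synapse code):

```python
from typing import Any, Dict, List


def rtc_transports_response(transports: List[Dict[str, Any]]) -> Dict[str, Any]:
    # Mirrors the handler: an empty body when nothing is configured,
    # otherwise the configured list under the `rtc_transports` key.
    if transports:
        return {"rtc_transports": transports}
    return {}
```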

View File

@@ -175,6 +175,7 @@ from synapse.storage.controllers import StorageControllers
from synapse.streams.events import EventSources
from synapse.synapse_rust.rendezvous import RendezvousHandler
from synapse.types import DomainSpecificString, ISynapseReactor
from synapse.util import SYNAPSE_VERSION
from synapse.util.caches import CACHE_METRIC_REGISTRY
from synapse.util.clock import Clock
from synapse.util.distributor import Distributor
@@ -322,7 +323,6 @@ class HomeServer(metaclass=abc.ABCMeta):
hostname: str,
config: HomeServerConfig,
reactor: Optional[ISynapseReactor] = None,
version_string: str = "Synapse",
):
"""
Args:
@@ -347,7 +347,7 @@ class HomeServer(metaclass=abc.ABCMeta):
self._instance_id = random_string(5)
self._instance_name = config.worker.instance_name
self.version_string = version_string
self.version_string = f"Synapse/{SYNAPSE_VERSION}"
self.datastores: Optional[Databases] = None
@@ -613,12 +613,6 @@ class HomeServer(metaclass=abc.ABCMeta):
self.datastores = Databases(self.DATASTORE_CLASS, self)
logger.info("Finished setting up.")
# Register background tasks required by this server. This must be done
# somewhat manually due to the background tasks not being registered
# unless handlers are instantiated.
if self.config.worker.run_background_tasks:
self.start_background_tasks()
# def __del__(self) -> None:
# """
# Called when an the homeserver is garbage collected.

View File

@@ -37,7 +37,13 @@ class HomeserverAppStartTestCase(ConfigFileTestCase):
self.add_lines_to_config([" main:", " host: 127.0.0.1", " port: 1234"])
# Ensure that starting master process with worker config raises an exception
with self.assertRaises(ConfigError):
# Do a normal homeserver creation and setup
homeserver_config = synapse.app.homeserver.load_or_generate_config(
["-c", self.config_file]
)
synapse.app.homeserver.setup(homeserver_config)
# XXX: The error will be raised at this point
hs = synapse.app.homeserver.create_homeserver(homeserver_config)
# Continue with the setup. We don't expect this to run because we raised
# earlier, but in the future, the code could be refactored to raise the
# error in a different place.
synapse.app.homeserver.setup(hs)

View File

@@ -99,7 +99,14 @@ class ConfigLoadingFileTestCase(ConfigFileTestCase):
def test_disable_registration(self) -> None:
self.generate_config()
self.add_lines_to_config(
["enable_registration: true", "disable_registration: true"]
[
"enable_registration: true",
"disable_registration: true",
# We're not worried about open registration in this test. This test is
# focused on making sure that enable/disable_registration properly
# override each other.
"enable_registration_without_verification: true",
]
)
# Check that disable_registration clobbers enable_registration.
config = HomeServerConfig.load_config("", ["-c", self.config_file])

View File

@@ -19,6 +19,8 @@
#
#
import argparse
import synapse.app.homeserver
from synapse.config import ConfigError
from synapse.config.homeserver import HomeServerConfig
@@ -99,6 +101,39 @@ class RegistrationConfigTestCase(ConfigFileTestCase):
)
def test_refuse_to_start_if_open_registration_and_no_verification(self) -> None:
"""
Test that our utilities to start the main Synapse homeserver process refuse
to start if we detect open registration.
"""
self.generate_config()
self.add_lines_to_config(
[
" ",
"enable_registration: true",
"registrations_require_3pid: []",
"enable_registration_captcha: false",
"registration_requires_token: false",
]
)
# Test that allowing open registration without verification raises an error
with self.assertRaises(SystemExit):
# Do a normal homeserver creation and setup
homeserver_config = synapse.app.homeserver.load_or_generate_config(
["-c", self.config_file]
)
# XXX: The error will be raised at this point
hs = synapse.app.homeserver.create_homeserver(homeserver_config)
# Continue with the setup. We don't expect this to run because we raised
# earlier, but in the future, the code could be refactored to raise the
# error in a different place.
synapse.app.homeserver.setup(hs)
def test_load_config_error_if_open_registration_and_no_verification(self) -> None:
"""
Test that `HomeServerConfig.load_config(...)` raises an exception when we detect open
registration.
"""
self.generate_config()
self.add_lines_to_config(
[
@@ -112,7 +147,57 @@ class RegistrationConfigTestCase(ConfigFileTestCase):
# Test that allowing open registration without verification raises an error
with self.assertRaises(ConfigError):
homeserver_config = synapse.app.homeserver.load_or_generate_config(
["-c", self.config_file]
_homeserver_config = HomeServerConfig.load_config(
description="test", argv_options=["-c", self.config_file]
)
def test_load_or_generate_config_error_if_open_registration_and_no_verification(
self,
) -> None:
"""
Test that `HomeServerConfig.load_or_generate_config(...)` raises an exception when we detect open
registration.
"""
self.generate_config()
self.add_lines_to_config(
[
" ",
"enable_registration: true",
"registrations_require_3pid: []",
"enable_registration_captcha: false",
"registration_requires_token: false",
]
)
# Test that allowing open registration without verification raises an error
with self.assertRaises(ConfigError):
_homeserver_config = HomeServerConfig.load_or_generate_config(
description="test", argv_options=["-c", self.config_file]
)
def test_load_config_with_parser_error_if_open_registration_and_no_verification(
self,
) -> None:
"""
Test that `HomeServerConfig.load_config_with_parser(...)` raises an exception when we detect open
registration.
"""
self.generate_config()
self.add_lines_to_config(
[
" ",
"enable_registration: true",
"registrations_require_3pid: []",
"enable_registration_captcha: false",
"registration_requires_token: false",
]
)
# Test that allowing open registration without verification raises an error
with self.assertRaises(ConfigError):
config_parser = argparse.ArgumentParser(description="test")
HomeServerConfig.add_arguments_to_parser(config_parser)
_homeserver_config = HomeServerConfig.load_config_with_parser(
parser=config_parser, argv_options=["-c", self.config_file]
)
synapse.app.homeserver.setup(homeserver_config)

View File

@@ -0,0 +1,74 @@
from twisted.internet.testing import MemoryReactor
import synapse.rest.admin
from synapse.api.errors import Codes
from synapse.rest.client import login, room
from synapse.server import HomeServer
from synapse.util.clock import Clock
from tests import unittest
class FetchEventTestCase(unittest.HomeserverTestCase):
servlets = [
synapse.rest.admin.register_servlets,
login.register_servlets,
room.register_servlets,
]
def prepare(self, reactor: MemoryReactor, clock: Clock, hs: HomeServer) -> None:
self.admin_user = self.register_user("admin", "pass", admin=True)
self.admin_user_tok = self.login("admin", "pass")
self.other_user = self.register_user("user", "pass")
self.other_user_tok = self.login("user", "pass")
self.room_id1 = self.helper.create_room_as(
self.other_user, tok=self.other_user_tok, is_public=True
)
resp = self.helper.send(self.room_id1, body="Hey now", tok=self.other_user_tok)
self.event_id = resp["event_id"]
def test_no_auth(self) -> None:
"""
Try to get an event without authentication.
"""
channel = self.make_request(
"GET",
f"/_synapse/admin/v1/fetch_event/{self.event_id}",
)
self.assertEqual(401, channel.code, msg=channel.json_body)
self.assertEqual(Codes.MISSING_TOKEN, channel.json_body["errcode"])
def test_requester_is_not_admin(self) -> None:
"""
If the user is not a server admin, an error 403 is returned.
"""
channel = self.make_request(
"GET",
f"/_synapse/admin/v1/fetch_event/{self.event_id}",
access_token=self.other_user_tok,
)
self.assertEqual(403, channel.code, msg=channel.json_body)
self.assertEqual(Codes.FORBIDDEN, channel.json_body["errcode"])
def test_fetch_event(self) -> None:
"""
Test that we can successfully fetch an event
"""
channel = self.make_request(
"GET",
f"/_synapse/admin/v1/fetch_event/{self.event_id}",
access_token=self.admin_user_tok,
)
self.assertEqual(200, channel.code, msg=channel.json_body)
self.assertEqual(
channel.json_body["event"]["content"],
{"body": "Hey now", "msgtype": "m.text"},
)
self.assertEqual(channel.json_body["event"]["event_id"], self.event_id)
self.assertEqual(channel.json_body["event"]["type"], "m.room.message")
self.assertEqual(channel.json_body["event"]["sender"], self.other_user)

View File

@@ -0,0 +1,105 @@
#
# This file is licensed under the Affero General Public License (AGPL) version 3.
#
# Copyright (C) 2025 New Vector, Ltd
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# See the GNU Affero General Public License for more details:
# <https://www.gnu.org/licenses/agpl-3.0.html>.
#
# [This file includes modifications made by New Vector Limited]
#
#
"""Tests REST events for /rtc/transports path."""
from twisted.internet.testing import MemoryReactor
from synapse.rest import admin
from synapse.rest.client import login, matrixrtc, register, room
from synapse.server import HomeServer
from synapse.util.clock import Clock
from tests.unittest import HomeserverTestCase, override_config
PATH_PREFIX = "/_matrix/client/unstable/org.matrix.msc4143"
RTC_ENDPOINT = {"type": "focusA", "required_field": "theField"}
LIVEKIT_ENDPOINT = {
"type": "livekit",
"livekit_service_url": "https://livekit.example.com",
}
class MatrixRtcTestCase(HomeserverTestCase):
"""Tests /rtc/transports Client-Server REST API."""
servlets = [
admin.register_servlets,
room.register_servlets,
login.register_servlets,
register.register_servlets,
matrixrtc.register_servlets,
]
def prepare(
self, reactor: MemoryReactor, clock: Clock, homeserver: HomeServer
) -> None:
self.register_user("alice", "password")
self._alice_tok = self.login("alice", "password")
def test_matrixrtc_endpoint_not_enabled(self) -> None:
channel = self.make_request(
"GET", f"{PATH_PREFIX}/rtc/transports", access_token=self._alice_tok
)
self.assertEqual(404, channel.code, channel.json_body)
self.assertEqual(
"M_UNRECOGNIZED", channel.json_body["errcode"], channel.json_body
)
@override_config({"experimental_features": {"msc4143_enabled": True}})
def test_matrixrtc_endpoint_requires_authentication(self) -> None:
channel = self.make_request("GET", f"{PATH_PREFIX}/rtc/transports")
self.assertEqual(401, channel.code, channel.json_body)
@override_config(
{
"experimental_features": {"msc4143_enabled": True},
"matrix_rtc": {"transports": [RTC_ENDPOINT]},
}
)
def test_matrixrtc_endpoint_contains_expected_transport(self) -> None:
channel = self.make_request(
"GET", f"{PATH_PREFIX}/rtc/transports", access_token=self._alice_tok
)
self.assertEqual(200, channel.code, channel.json_body)
self.assert_dict({"rtc_transports": [RTC_ENDPOINT]}, channel.json_body)
@override_config(
{
"experimental_features": {"msc4143_enabled": True},
"matrix_rtc": {"transports": []},
}
)
def test_matrixrtc_endpoint_no_transports_configured(self) -> None:
channel = self.make_request(
"GET", f"{PATH_PREFIX}/rtc/transports", access_token=self._alice_tok
)
self.assertEqual(200, channel.code, channel.json_body)
self.assert_dict({}, channel.json_body)
@override_config(
{
"experimental_features": {"msc4143_enabled": True},
"matrix_rtc": {"transports": [LIVEKIT_ENDPOINT]},
}
)
def test_matrixrtc_endpoint_livekit_transport(self) -> None:
channel = self.make_request(
"GET", f"{PATH_PREFIX}/rtc/transports", access_token=self._alice_tok
)
self.assertEqual(200, channel.code, channel.json_body)
self.assert_dict({"rtc_transports": [LIVEKIT_ENDPOINT]}, channel.json_body)

View File

@@ -44,9 +44,11 @@ from twisted.web.http_headers import Headers
from twisted.web.iweb import UNKNOWN_LENGTH, IResponse
from twisted.web.resource import Resource
from synapse.api.errors import HttpResponseException
from synapse.api.errors import Codes, HttpResponseException
from synapse.api.ratelimiting import Ratelimiter
from synapse.config import ConfigError
from synapse.config._base import Config
from synapse.config.homeserver import HomeServerConfig
from synapse.config.oembed import OEmbedEndpointConfig
from synapse.http.client import MultipartResponse
from synapse.http.types import QueryParams
@@ -75,6 +77,7 @@ from tests.media.test_media_storage import (
from tests.server import FakeChannel, FakeTransport, ThreadedMemoryReactorClock
from tests.test_utils import SMALL_PNG
from tests.unittest import override_config
from tests.utils import default_config
try:
import lxml
@@ -2880,11 +2883,12 @@ class MediaUploadLimits(unittest.HomeserverTestCase):
config["media_storage_providers"] = [provider_config]
# These are the limits that we are testing
config["media_upload_limits"] = [
{"time_period": "1d", "max_size": "1K"},
{"time_period": "1w", "max_size": "3K"},
]
# These are the limits that we are testing unless overridden
if config.get("media_upload_limits") is None:
config["media_upload_limits"] = [
{"time_period": "1d", "max_size": "1K"},
{"time_period": "1w", "max_size": "3K"},
]
return self.setup_test_homeserver(config=config)
@@ -2970,6 +2974,173 @@ class MediaUploadLimits(unittest.HomeserverTestCase):
channel = self.upload_media(900)
self.assertEqual(channel.code, 200)
def test_msc4335_requires_config(self) -> None:
config_dict = default_config("test")
# msc4335_info_uri and msc4335_soft_limit are required
# msc4335_increase_uri is required if msc4335_soft_limit is true
with self.assertRaises(ConfigError):
HomeServerConfig().parse_config_dict(
{
"media_upload_limits": [
{
"time_period": "1d",
"max_size": "1K",
"msc4335_info_uri": "https://example.com",
}
],
**config_dict,
},
"",
"",
)
with self.assertRaises(ConfigError):
HomeServerConfig().parse_config_dict(
{
"media_upload_limits": [
{
"time_period": "1d",
"max_size": "1K",
"msc4335_info_uri": "https://example.com",
"msc4335_soft_limit": True,
}
],
**config_dict,
},
"",
"",
)
with self.assertRaises(ConfigError):
HomeServerConfig().parse_config_dict(
{
"media_upload_limits": [
{
"time_period": "1d",
"max_size": "1K",
"msc4335_soft_limit": False,
}
],
**config_dict,
},
"",
"",
)
with self.assertRaises(ConfigError):
HomeServerConfig().parse_config_dict(
{
"media_upload_limits": [
{
"time_period": "1d",
"max_size": "1K",
"msc4335_soft_limit": True,
}
],
**config_dict,
},
"",
"",
)
with self.assertRaises(ConfigError):
HomeServerConfig().parse_config_dict(
{
"media_upload_limits": [
{
"time_period": "1d",
"max_size": "1K",
"msc4335_increase_uri": "https://example.com/increase",
}
],
**config_dict,
},
"",
"",
)
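Taken together, the `assertRaises` cases above imply a small set of cross-field rules: any `msc4335_*` field requires both `msc4335_info_uri` and `msc4335_soft_limit` to be set, and a soft limit additionally requires `msc4335_increase_uri`. A minimal sketch of that validation (hypothetical helper, not Synapse's actual validator; the local `ConfigError` stands in for `synapse.config.ConfigError`):

```python
class ConfigError(Exception):
    """Stand-in for synapse.config.ConfigError."""


def validate_msc4335_fields(limit: dict) -> None:
    """Sketch of the cross-field rules exercised by the assertRaises cases."""
    info_uri = limit.get("msc4335_info_uri")
    soft_limit = limit.get("msc4335_soft_limit")
    increase_uri = limit.get("msc4335_increase_uri")

    uses_msc4335 = (
        info_uri is not None or soft_limit is not None or increase_uri is not None
    )
    # Any MSC4335 field implies both info_uri and soft_limit must be given.
    if uses_msc4335 and (info_uri is None or soft_limit is None):
        raise ConfigError("msc4335_info_uri and msc4335_soft_limit are required")
    # A soft limit must tell clients where the limit can be increased.
    if soft_limit and increase_uri is None:
        raise ConfigError(
            "msc4335_increase_uri is required when msc4335_soft_limit is true"
        )
```

Limits with no `msc4335_*` fields at all remain valid, matching the pre-MSC4335 configurations used elsewhere in these tests.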
@override_config(
{
"experimental_features": {"msc4335_enabled": True},
"media_upload_limits": [
{
"time_period": "1d",
"max_size": "1K",
"msc4335_info_uri": "https://example.com",
"msc4335_soft_limit": False,
}
],
}
)
def test_msc4335_returns_hard_user_limit_exceeded(self) -> None:
"""Test that the MSC4335 error is returned with soft_limit False when experimental feature is enabled."""
channel = self.upload_media(500)
self.assertEqual(channel.code, 200)
channel = self.upload_media(800)
self.assertEqual(channel.code, 403)
self.assertEqual(
channel.json_body["errcode"], "ORG.MATRIX.MSC4335_USER_LIMIT_EXCEEDED"
)
self.assertEqual(
channel.json_body["org.matrix.msc4335.info_uri"], "https://example.com"
)
self.assertNotIn("org.matrix.msc4335.increase_uri", channel.json_body)
@override_config(
{
"experimental_features": {"msc4335_enabled": True},
"media_upload_limits": [
{
"time_period": "1d",
"max_size": "1K",
"msc4335_info_uri": "https://example.com",
"msc4335_soft_limit": True,
"msc4335_increase_uri": "https://example.com/increase",
}
],
}
)
def test_msc4335_returns_soft_user_limit_exceeded(self) -> None:
"""Test that the MSC4335 error is returned with soft_limit True when experimental feature is enabled."""
channel = self.upload_media(500)
self.assertEqual(channel.code, 200)
channel = self.upload_media(800)
self.assertEqual(channel.code, 403)
self.assertEqual(
channel.json_body["errcode"], "ORG.MATRIX.MSC4335_USER_LIMIT_EXCEEDED"
)
self.assertEqual(
channel.json_body["org.matrix.msc4335.info_uri"], "https://example.com"
)
self.assertEqual(channel.json_body["org.matrix.msc4335.soft_limit"], True)
self.assertEqual(
channel.json_body["org.matrix.msc4335.increase_uri"],
"https://example.com/increase",
)
@override_config(
{
"experimental_features": {"msc4335_enabled": True},
"media_upload_limits": [
{
"time_period": "1d",
"max_size": "1K",
}
],
}
)
def test_msc4335_requires_info_uri(self) -> None:
"""Test that the MSC4335 error is not used if info_uri is not provided."""
channel = self.upload_media(500)
self.assertEqual(channel.code, 200)
channel = self.upload_media(800)
self.assertEqual(channel.code, 400)
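For reference, the response shape these tests assert can be illustrated with a hypothetical helper (field names follow MSC4335 as used above; this is not Synapse's code, and the `error` message is an assumption):

```python
from typing import Optional


def msc4335_error_body(
    info_uri: str, soft_limit: bool, increase_uri: Optional[str] = None
) -> dict:
    """Build an MSC4335-style 403 error body as asserted in the tests above."""
    body = {
        "errcode": "ORG.MATRIX.MSC4335_USER_LIMIT_EXCEEDED",
        "error": "Media upload limit exceeded",
        "org.matrix.msc4335.info_uri": info_uri,
        "org.matrix.msc4335.soft_limit": soft_limit,
    }
    if increase_uri is not None:
        # Only soft limits advertise a way to increase the limit.
        body["org.matrix.msc4335.increase_uri"] = increase_uri
    return body
```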
class MediaUploadLimitsModuleOverrides(unittest.HomeserverTestCase):
"""
@@ -3002,10 +3173,11 @@ class MediaUploadLimitsModuleOverrides(unittest.HomeserverTestCase):
 config["media_storage_providers"] = [provider_config]
 # default limits to use
-config["media_upload_limits"] = [
-    {"time_period": "1d", "max_size": "1K"},
-    {"time_period": "1w", "max_size": "3K"},
-]
+if config.get("media_upload_limits") is None:
+    config["media_upload_limits"] = [
+        {"time_period": "1d", "max_size": "1K"},
+        {"time_period": "1w", "max_size": "3K"},
+    ]
 return self.setup_test_homeserver(config=config)
@@ -3158,3 +3330,25 @@ class MediaUploadLimitsModuleOverrides(unittest.HomeserverTestCase):
)
self.assertEqual(self.last_media_upload_limit_exceeded["sent_bytes"], 500)
self.assertEqual(self.last_media_upload_limit_exceeded["attempted_bytes"], 800)
@override_config(
{
"media_upload_limits": [
{
"time_period": "1d",
"max_size": "1K",
"msc4335_info_uri": "https://example.com",
"msc4335_soft_limit": False,
},
]
}
)
def test_msc4335_defaults_disabled(self) -> None:
"""Test that the MSC4335 is not used unless experimental feature is enabled."""
channel = self.upload_media(500, self.tok3)
self.assertEqual(channel.code, 200)
channel = self.upload_media(800, self.tok3)
# n.b. this response is not spec compliant as described at: https://github.com/element-hq/synapse/issues/18749
self.assertEqual(channel.code, 400)
self.assertEqual(channel.json_body["errcode"], Codes.RESOURCE_LIMIT_EXCEEDED)

View File

@@ -1198,7 +1198,6 @@ def setup_test_homeserver(
 hs = homeserver_to_use(
     server_name,
     config=config,
-    version_string="Synapse/tests",
     reactor=reactor,
 )

View File

@@ -236,17 +236,17 @@ class OptionsResourceTests(unittest.TestCase):
 """Create a request from the method/path and return a channel with the response."""
 # Create a site and query for the resource.
 site = SynapseSite(
-    "test",
-    "site_tag",
-    parse_listener_def(
+    logger_name="test",
+    site_tag="site_tag",
+    config=parse_listener_def(
         0,
         {
             "type": "http",
             "port": 0,
         },
     ),
-    self.resource,
-    "1.0",
+    resource=self.resource,
+    server_version_string="1",
     max_request_body_size=4096,
     reactor=self.reactor,
     hs=self.homeserver,