
Compare commits


140 Commits

Author SHA1 Message Date
H. Shay
9cd13d0f26 temporarily move check_schema_delta behind lint gating, because this PR will always fail that check 2023-06-13 15:39:53 -07:00
H. Shay
b4cc31d906 lint 2023-06-05 19:21:59 -07:00
H. Shay
d69d109ad6 newsfragment 2023-06-05 13:55:57 -07:00
H. Shay
a317ccbc7a make adding indexes to event_stream_ordering columns bg job depend on add_event_stream_ordering bg job 2023-06-05 13:50:24 -07:00
H. Shay
7bc2aefe92 add triggers to events_stream_ordering columns in bg job and make this job depend on the add_stream_ordering bg job 2023-06-05 13:49:24 -07:00
H. Shay
67f152b476 add event_stream_ordering columns in background job and make it depend on replace_stream_ordering_column job 2023-06-05 13:38:18 -07:00
Shay
d0c4257f14 N + 3: Read from column full_user_id rather than user_id of tables profiles and user_filters (#15649) 2023-06-02 17:24:13 -07:00
Mathieu Velten
e0f2429d13 Add a catch-all * to the supported relation types when redacting (#15705)
This is an update to MSC3912 implementation
2023-06-02 13:13:50 +00:00
Eric Eastwood
30a5076da8 Log when events are (unexpectedly) filtered out of responses in tests (#14213)
See https://github.com/matrix-org/synapse/pull/14095#discussion_r990335492

This is useful because when we see that a relevant event is an `outlier` or `soft-failed`, that's a good indicator explaining why it's unexpectedly not showing up. `filter_events_for_client` is used in `/sync`, `/messages`, and `/context`, which are all common end-to-end assertion touch points (also notifications and relations).
2023-06-01 21:27:18 -05:00
H. Shay
8af29155ec Merge branch 'release-v1.85' into develop 2023-06-01 10:26:37 -07:00
H. Shay
4c0bffaca5 1.85.0rc2 2023-06-01 09:16:35 -07:00
Erik Johnston
5ed0e8c61f Cache requests for user's devices from federation (#15675)
This should mitigate the issue where lots of different servers request
the same user's devices all at once.
2023-06-01 13:25:20 +00:00
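As a sketch of the general idea behind this change (and not Synapse's actual implementation, whose details are in the PR), a short-lived per-user cache like the following would absorb a burst of identical device queries; all names here are hypothetical:

```python
import time
from typing import Any, Dict, Optional, Tuple

class DeviceQueryCache:
    """Hypothetical TTL cache for federation /user/devices responses."""

    def __init__(self, ttl_seconds: float = 30.0) -> None:
        self._ttl = ttl_seconds
        self._entries: Dict[str, Tuple[float, Any]] = {}

    def get(self, user_id: str) -> Optional[Any]:
        entry = self._entries.get(user_id)
        if entry is None:
            return None
        expires_at, devices = entry
        if time.monotonic() > expires_at:
            # Entry is stale; drop it so the next query refetches.
            del self._entries[user_id]
            return None
        return devices

    def set(self, user_id: str, devices: Any) -> None:
        self._entries[user_id] = (time.monotonic() + self._ttl, devices)
```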
Hugh Nimmo-Smith
d1693f0362 Implement stable support for MSC3882 to allow an existing device/session to generate a login token for use on a new device/session (#15388)
Implements stable support for MSC3882; this involves updating Synapse's support to
match what the MSC / spec says.

Continue to support the unstable version to allow clients to transition.
2023-06-01 08:52:51 -04:00
Patrick Cloke
a273561c22 Add a note about deprecating /register with a user property. (#15703)
The "user" property (instead of "username") that application services could
provide to the /register endpoint was never specified. Deprecate this very old
fallback.
2023-06-01 08:21:37 -04:00
Shay
6d9e2fd878 Speed up background jobs populate_full_user_id_user_filters and populate_full_user_id_profiles (#15700) 2023-05-31 15:13:48 -07:00
Eric Eastwood
0b5f64ff09 Add Synapse version deploy annotations to Grafana dashboard (#15674)
Fix https://github.com/matrix-org/synapse/issues/15662

This manifests as purple lines on all time series panels that you can
hover over to see which version was deployed.

Also added a new "Deployed Synapse versions over time" panel where the
color block changes with each version, and mixed this color block into
the "Up" time series panel.

To get the Grafana dashboard JSON to copy here: use the **Share** icon at the top -> **Export** -> check the **Export for sharing externally** option -> **View JSON** or **Save to file**
2023-05-31 14:35:49 -05:00
Patrick Cloke
6f18812bb0 Add stubs package for lxml. (#15697)
The stubs have some issues, so this has some generous casts
and ignores in it, but it is better than not having stubs.

Note that, confusingly, Element is a function which creates
_Element instances (and similarly for Comment).
2023-05-31 17:06:57 +00:00
Jason Little
874378c052 Docker fully qualified image names (#15689)
* Fully qualified docker image names for the main Dockerfile and Complement related.

* Fully qualified docker image names for Dockerfiles associated with building Debian release artifacts.

This one is harder and is separate from the other commit in case it wasn't correct or was unwanted. I decided to
do the expansion of the docker images in the Dockerfile itself, instead of in the various source places that determine
which distribution is selected, as that would have been more invasive, with the scripts breaking up the string
for tagging and such. This one is untested.

* Changelog

* Update docker/Dockerfile-workers

* Update docker/complement/Dockerfile

---------

Co-authored-by: reivilibre <olivier@librepush.net>
2023-05-31 15:13:31 +00:00
reivilibre
11e15d79b8 Fix a performance issue introduced in Synapse v1.83.0 which meant that purging rooms was very slow and database-intensive. (#15693)
* Add indices required to efficiently validate new foreign key constraints on stream_ordering

* Newsfile

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>

---------

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>
2023-05-31 14:59:56 +01:00
Gabriel Féron
daf3a67908 Add get_canonical_room_alias to module API (#15450)
Co-authored-by: Boxdot <d@zerovolt.org>
2023-05-31 09:18:37 -04:00
Patrick Cloke
c01343de43 Add stricter mypy options (#15694)
Enable warn_unused_configs, strict_concatenate, disallow_subclassing_any,
and disallow_incomplete_defs.
2023-05-31 07:18:29 -04:00
David Robertson
6fc3deb029 Merge branch 'release-v1.85' into develop 2023-05-30 16:08:33 +01:00
Quentin Gliech
ceb3dd77db Enforce that an admin token also has the basic Matrix API scope 2023-05-30 09:43:06 -04:00
Quentin Gliech
32a2f05004 Make the config tests spawn the homeserver only when needed 2023-05-30 09:43:06 -04:00
Quentin Gliech
f739bde962 Reject tokens with multiple device scopes 2023-05-30 09:43:06 -04:00
Quentin Gliech
98afc57d59 Make OIDC scope constants 2023-05-30 09:43:06 -04:00
Quentin Gliech
14a5be9c4d Handle errors when introspecting tokens
This returns a proper 503 when the introspection endpoint is not working
for some reason, which should avoid logging out clients in those cases.
2023-05-30 09:43:06 -04:00
Quentin Gliech
ec9379d7e2 Newsfile. 2023-05-30 09:43:06 -04:00
Quentin Gliech
e343125b38 Disable incompatible Admin API endpoints 2023-05-30 09:43:06 -04:00
Quentin Gliech
4d0231b364 Make AS tokens work & allow ASes to /register 2023-05-30 09:43:06 -04:00
Quentin Gliech
c008b44b4f Add an admin token for MAS -> Synapse calls 2023-05-30 09:43:06 -04:00
Hugh Nimmo-Smith
bad1f2cd35 Tests for JWKS endpoint 2023-05-30 09:43:06 -04:00
Hugh Nimmo-Smith
249f4a338d Refactor config to be an experimental feature
Also enforce that you can't combine it with incompatible config options
2023-05-30 09:43:06 -04:00
Hugh Nimmo-Smith
03920bdd4e Test MSC2965 implementation: well-known discovery document 2023-05-30 09:43:06 -04:00
Quentin Gliech
31691d6151 Disable account related endpoints when using OAuth delegation 2023-05-30 09:43:06 -04:00
Hugh Nimmo-Smith
5fe96082d0 Actually enforce guest + return www-authenticate header 2023-05-30 09:43:06 -04:00
Hugh Nimmo-Smith
28a9663bdf Initial tests for OAuth delegation 2023-05-30 09:43:06 -04:00
Hugh Nimmo-Smith
a1374b5c70 MSC2967: Check access token scope for use as user and add guest support 2023-05-30 09:43:06 -04:00
Hugh Nimmo-Smith
d20669971a Use name claim as display name when registering users on the fly.
This makes it so that the `name` claim obtained when introspecting the token
is used as the display name when registering a user on the fly.
2023-05-30 09:43:06 -04:00
Quentin Gliech
f9cd549f64 Record the sub claims as an external_id 2023-05-30 09:43:06 -04:00
Quentin Gliech
7628dbf4e9 Handle the Synapse admin scope 2023-05-30 09:43:06 -04:00
Quentin Gliech
c5cf1b421d Save the scopes in the requester 2023-05-30 09:43:06 -04:00
Quentin Gliech
e82ec6d008 MSC2965: OIDC Provider discovery via well-known document 2023-05-30 09:43:06 -04:00
Quentin Gliech
8f576aa462 Expose the public keys used for client authentication on an endpoint 2023-05-30 09:43:06 -04:00
Quentin Gliech
765244faee Initial MSC3964 support: delegation of auth to OIDC server 2023-05-30 09:43:06 -04:00
Quentin Gliech
e2c8458bba Make the api.auth.Auth a Protocol 2023-05-30 09:43:06 -04:00
Sean Quah
5d8c659373 Remove unused FederationServer.__str__ override (#15690)
Signed-off-by: Sean Quah <seanq@matrix.org>
2023-05-30 14:37:39 +01:00
David Robertson
7477810cc2 fixup changelog 2023-05-30 14:33:05 +01:00
David Robertson
3389653e15 Update changelog 2023-05-30 14:18:42 +01:00
David Robertson
cebff6f4d5 Tweak release script dependabot wording 2023-05-30 14:05:44 +01:00
David Robertson
a103b874dd 1.85.0rc1 2023-05-30 14:03:22 +01:00
David Robertson
42786d8a47 Create dependabot changelogs at release time (#15481)
* Ditch dependabot changelog workflow

* Summarise dependabot commits in release script

* Changelog

* Update scripts-dev/release.py
2023-05-30 13:54:50 +01:00
dependabot[bot]
626bd75f48 Bump types-bleach from 6.0.0.1 to 6.0.0.3 (#15686)
* Bump types-bleach from 6.0.0.1 to 6.0.0.3

Bumps [types-bleach](https://github.com/python/typeshed) from 6.0.0.1 to 6.0.0.3.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-bleach
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: Patrick Cloke <clokep@users.noreply.github.com>
Co-authored-by: David Robertson <davidr@element.io>
2023-05-30 11:13:04 +01:00
dependabot[bot]
2b6c9150dc Bump types-requests from 2.30.0.0 to 2.31.0.0 (#15684)
* Bump types-requests from 2.30.0.0 to 2.31.0.0

Bumps [types-requests](https://github.com/python/typeshed) from 2.30.0.0 to 2.31.0.0.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-requests
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-05-30 11:03:58 +01:00
dependabot[bot]
04798b710d Bump log from 0.4.17 to 0.4.18 (#15681) 2023-05-29 14:15:49 -04:00
dependabot[bot]
eb48b10f4f Bump pydantic from 1.10.7 to 1.10.8 (#15685) 2023-05-29 14:14:58 -04:00
dependabot[bot]
ea634a9f81 Bump prometheus-client from 0.16.0 to 0.17.0 (#15682) 2023-05-29 14:13:40 -04:00
dependabot[bot]
4f07c2a170 Bump types-pyyaml from 6.0.12.9 to 6.0.12.10 (#15683) 2023-05-29 14:07:25 -04:00
Jason Little
c835befd10 Add Unix socket support for Redis connections (#15644)
Adds a new configuration setting to connect to Redis via a Unix
socket instead of over TCP. Disabled by default.
2023-05-26 15:28:39 -04:00
Travis Ralston
50918c4940 Add MSC3820opt2 as a known room version (#15678) 2023-05-26 18:05:24 +00:00
Grant McLean
179f0f851e Documentation improvements to contributing guide (#15667) (#15668)
Fix #15667

 - Reiterate the importance of getting Rust installed and set up before attempting to install the Python dependencies.
 - Mention the importance of confirming that `poetry install` completed successfully and include a typical error that the user might see if it did not.
 - Expand on "Now edit homeserver.yaml" to give examples of things likely to need changing and to link to the relevant sections of the Synapse server documentation.
2023-05-26 12:28:04 -05:00
Patrick Cloke
2ad91ec628 Set thread_id column to non-null for event_push_{actions,actions_staging,summary} (#15597)
Updates the database schema to require a thread_id (by adding a
constraint that the column is non-null) for event_push_actions,
event_push_actions_staging, and event_push_summary.

For PostgreSQL we add the constraint as NOT VALID, then
VALIDATE the constraint in a background job to avoid locking
the table during an upgrade.

Each table is updated as a separate schema delta to avoid
deadlocks between them.

For SQLite we simply rebuild the table & copy the data.
2023-05-26 13:16:08 -04:00
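A minimal sketch of the PostgreSQL pattern described above, assuming a psycopg2 connection and a hypothetical constraint name (the real schema deltas live in Synapse's migration files):

```python
import psycopg2

conn = psycopg2.connect("dbname=synapse")  # placeholder connection string

# Step 1 (in the schema delta): NOT VALID skips the full-table scan, so
# only rows written from now on are checked and the lock is brief.
with conn, conn.cursor() as cur:
    cur.execute(
        "ALTER TABLE event_push_actions "
        "ADD CONSTRAINT event_push_actions_thread_id_check "  # hypothetical name
        "CHECK (thread_id IS NOT NULL) NOT VALID"
    )

# Step 2 (in a background job): VALIDATE scans existing rows but only
# takes a weak lock, so the table stays usable during the upgrade.
with conn, conn.cursor() as cur:
    cur.execute(
        "ALTER TABLE event_push_actions "
        "VALIDATE CONSTRAINT event_push_actions_thread_id_check"
    )
```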
Olivier Wilkinson (reivilibre)
a1154dfc20 Merge branch 'master' into develop 2023-05-26 17:16:15 +01:00
Olivier Wilkinson (reivilibre)
cb6f4a84a6 Fix a typographical error in changelog 2023-05-26 16:18:35 +01:00
Olivier Wilkinson (reivilibre)
65bf5f3649 1.84.1 2023-05-26 16:17:50 +01:00
reivilibre
c775d80b73 Fix a bug introduced in Synapse v1.84.0 where workers do not start up when no instance_map was provided. (#15672)
* Fix #15669: always populate instance map even if it was empty

* Fix some tests

* Fix more tests

* Newsfile

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>

* CI fix: don't forget to update apt repository sources before installing olddeps deps

* Add test testing the backwards compatibility

---------

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>
2023-05-26 14:28:55 +00:00
Travis Ralston
4e013093a8 Add MSC3820 (room version 11) option 2 unstable room version. (#15666) 2023-05-26 07:46:13 -04:00
reivilibre
2d8a2ca374 Add dch and notify-send to the development Nix flake so that the release script can be used. (#15673)
* Add dch and notify-send to the Nix dev flake

* Newsfile

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>

---------

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>
2023-05-26 11:53:10 +01:00
Eric Eastwood
77156a4bc1 Process previously failed backfill events in the background (#15585)
Process previously failed backfill events in the background, because they are bound to fail again and we don't need to waste time holding up the request for them.

Fix https://github.com/matrix-org/synapse/issues/13623

Follow-up to https://github.com/matrix-org/synapse/issues/13621 and https://github.com/matrix-org/synapse/issues/13622

Part of making `/messages` faster: https://github.com/matrix-org/synapse/issues/13356
2023-05-24 23:22:24 -05:00
Shay
8839b6c2f8 Add requesting user id parameter to key claim methods in TransportLayerClient (#15663) 2023-05-24 13:23:26 -07:00
Patrick Cloke
ca5c4be921 Add type hints to test_descriptors. (#15659)
Require type hints in test_descriptors and add missing ones.
2023-05-24 14:18:52 +00:00
Erik Johnston
c7e9c1d5ae Speed up user directory rebuild for users some more... (#15665) 2023-05-24 14:13:28 +00:00
Patrick Cloke
1f55c04cbc Improve type hints for cached decorator. (#15658)
The cached decorators always return a Deferred, which was not
properly reflected in the type hints. The hints were close enough when
wrapping coroutines, but wrong if a bare function was wrapped.
2023-05-24 12:59:31 +00:00
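A rough sketch of the behaviour in question, assuming Twisted and a much-simplified cache (Synapse's real decorator is considerably more involved): whether the wrapped callable is a coroutine function or a bare function, the caller always gets a `Deferred` back, and the type hints need to say so.

```python
import inspect
from typing import Any, Dict, Tuple

from twisted.internet import defer

def cached(f):
    """Toy cache wrapper: always returns a Deferred."""
    results: Dict[Tuple[Any, ...], Any] = {}

    def wrapper(*args: Any) -> "defer.Deferred":
        if args in results:
            # Cache hit: wrap the stored plain value in a fresh Deferred.
            return defer.succeed(results[args])

        result = f(*args)
        if inspect.iscoroutine(result):
            d = defer.ensureDeferred(result)  # coroutine -> Deferred
        elif isinstance(result, defer.Deferred):
            d = result
        else:
            d = defer.succeed(result)  # bare function -> fired Deferred

        def _store(value: Any) -> Any:
            results[args] = value
            return value

        d.addCallback(_store)
        return d

    return wrapper
```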
Eric Eastwood
379eb2d7ab Fix @trace not wrapping some state methods that return coroutines correctly (#15647)
```
2023-05-21 09:30:09,288 - synapse.logging.opentracing - 940 - ERROR - POST-1 - @trace may not have wrapped StateStorageController.get_state_for_groups correctly! The function is not async but returned a coroutine
```

Tracing instrumentation for these functions originally introduced in https://github.com/matrix-org/synapse/pull/15610
2023-05-23 12:26:25 -05:00
Patrick Cloke
7c9b91790c Consolidate logic to check for deactivated users. (#15634)
This moves the deactivated user check to the method which
all login types call.

Additionally updates the application service tests to be more
realistic by removing invalid tests and fixing server names.
2023-05-23 10:35:43 -04:00
Jason Little
1df0221bda Use a custom scheme & the worker name for replication requests. (#15578)
All the information needed is already in the `instance_map`, so
use that instead of passing the hostname / IP & port manually
for each replication request.

This consolidates logic for future improvements of using e.g.
UNIX sockets for workers.
2023-05-23 09:05:30 -04:00
Olivier Wilkinson (reivilibre)
5b18a217ca Merge branch 'master' into develop 2023-05-23 13:27:31 +01:00
dependabot[bot]
03042e435b Bump requests from 2.28.2 to 2.31.0 (#15651) 2023-05-23 07:28:51 -04:00
Olivier Wilkinson (reivilibre)
5cae9158e6 Tweak changelog and upgrade notes 2023-05-23 11:13:38 +01:00
Olivier Wilkinson (reivilibre)
ea6fcda98d Tweak changelog 2023-05-23 11:03:06 +01:00
Olivier Wilkinson (reivilibre)
11ff4884e7 1.84.0 2023-05-23 10:57:39 +01:00
Eric Eastwood
1903c7e5ed Remove duplicate timestamp from test logs (_trial_temp/test.log) (#15636)
Fix https://github.com/matrix-org/synapse/issues/15618

### Before

```
2023-05-17 22:51:36-0500 [-] 2023-05-17 22:51:36,889 - synapse.server - 338 - INFO - sentinel - Finished setting up.
```

### After

```
2023-05-19 18:16:20-0500 [-] synapse.server - 338 - INFO - sentinel - Finished setting up.
```


### Dev notes

The `Twisted.Logger` controls the `2023-05-19 18:16:20-0500 [-]` prefix, see: [`twisted/twisted` -> `src/twisted/logger/_format.py#L362-L374`](34b161e66b/src/twisted/logger/_format.py (L362-L374))

And we delegate our logs to the Twisted Logger for the tests, which puts them in `_trial_temp/test.log`
2023-05-22 13:49:01 -05:00
Andrew Morgan
737f7ddf58 Remove outdated comment in log config (#15648) 2023-05-22 17:58:58 +00:00
Patrick Cloke
c5d1e6d414 Properly parse event_fields in filters (#15607)
The event_fields property in filters should use the proper
escape rules, namely that backslashes can be escaped with
an additional backslash.

This adds tests (adapted from matrix-js-sdk) and implements
the logic to properly split the event_fields strings.
2023-05-22 11:31:22 -04:00
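A hedged sketch of the escaping rule being described, not the code from the PR: each `event_fields` entry is a dot-separated path in which a backslash escapes the next character, so `content.m\.relates_to` names the two-part path `["content", "m.relates_to"]`.

```python
from typing import List

def split_field(field: str) -> List[str]:
    """Split a dotted field path, honouring backslash escapes."""
    parts: List[str] = []
    current = ""
    escaped = False
    for ch in field:
        if escaped:
            current += ch      # escaped char is taken literally
            escaped = False
        elif ch == "\\":
            escaped = True     # next char is escaped
        elif ch == ".":
            parts.append(current)
            current = ""
        else:
            current += ch
    parts.append(current)
    return parts

assert split_field(r"content.m\.relates_to") == ["content", "m.relates_to"]
assert split_field(r"a\\.b") == ["a\\", "b"]  # escaped backslash survives
```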
dependabot[bot]
201597fc86 Bump pygithub from 1.58.1 to 1.58.2 (#15643)
* Bump pygithub from 1.58.1 to 1.58.2

Bumps [pygithub](https://github.com/pygithub/pygithub) from 1.58.1 to 1.58.2.
- [Release notes](https://github.com/pygithub/pygithub/releases)
- [Changelog](https://github.com/PyGithub/PyGithub/blob/v1.58.2/doc/changes.rst)
- [Commits](https://github.com/pygithub/pygithub/compare/v1.58.1...v1.58.2)

---
updated-dependencies:
- dependency-name: pygithub
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-05-22 15:39:19 +01:00
Sean Quah
cc53c96bf8 Limit the size of the HomeServerConfig cache in trial test runs (#15646)
...to try to control memory usage. `HomeServerConfig`s hold on to
many Jinja2 objects, which come out to over 0.5 MiB per config.

Over the course of a full test run, the cache grows to ~360 entries.
Limit it to 8 entries.

Part of #15622.

Signed-off-by: Sean Quah <seanq@matrix.org>
2023-05-22 13:25:39 +01:00
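Illustration only (the real cache lives in Synapse's test harness): bounding a cache of expensive-to-build objects, here with `functools.lru_cache` standing in for the same idea.

```python
from functools import lru_cache

@lru_cache(maxsize=8)  # evict least-recently-used entries beyond 8
def load_config(config_path: str) -> dict:
    # Stand-in for the expensive HomeServerConfig construction; each
    # real instance holds over 0.5 MiB of Jinja2 objects.
    return {"path": config_path}
```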
dependabot[bot]
a47b2065f0 Bump furo from 2023.3.27 to 2023.5.20 (#15642)
* Bump furo from 2023.3.27 to 2023.5.20

Bumps [furo](https://github.com/pradyunsg/furo) from 2023.3.27 to 2023.5.20.
- [Release notes](https://github.com/pradyunsg/furo/releases)
- [Changelog](https://github.com/pradyunsg/furo/blob/main/docs/changelog.md)
- [Commits](https://github.com/pradyunsg/furo/compare/2023.03.27...2023.05.20)

---
updated-dependencies:
- dependency-name: furo
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-05-22 12:12:59 +01:00
dependabot[bot]
875015d512 Bump sphinx from 6.1.3 to 6.2.1 (#15641)
* Bump sphinx from 6.1.3 to 6.2.1

Bumps [sphinx](https://github.com/sphinx-doc/sphinx) from 6.1.3 to 6.2.1.
- [Release notes](https://github.com/sphinx-doc/sphinx/releases)
- [Changelog](https://github.com/sphinx-doc/sphinx/blob/master/CHANGES)
- [Commits](https://github.com/sphinx-doc/sphinx/compare/v6.1.3...v6.2.1)

---
updated-dependencies:
- dependency-name: sphinx
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-05-22 10:38:08 +01:00
dependabot[bot]
8516001566 Bump types-pillow from 9.5.0.2 to 9.5.0.4 (#15640)
* Bump types-pillow from 9.5.0.2 to 9.5.0.4

Bumps [types-pillow](https://github.com/python/typeshed) from 9.5.0.2 to 9.5.0.4.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-pillow
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-05-22 10:38:01 +01:00
dependabot[bot]
adae1cfc8c Bump types-setuptools from 67.7.0.2 to 67.8.0.0 (#15639)
* Bump types-setuptools from 67.7.0.2 to 67.8.0.0

Bumps [types-setuptools](https://github.com/python/typeshed) from 67.7.0.2 to 67.8.0.0.
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-setuptools
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-05-22 10:37:50 +01:00
Eric Eastwood
703a8f9c67 Instrument state and state_group storage related things (tracing) (#15610)
Instrument `state` and `state_group` storage related things (tracing) so it's a little more clear where these database transactions are coming from, as there are a lot of wires crossing in these functions.

Part of `/messages` performance investigation: https://github.com/matrix-org/synapse/issues/13356
2023-05-19 12:26:58 -05:00
Eric Eastwood
ca3c07e833 Trace how many new events from the backfill response we need to process (#15633)
You can kinda derive this information from how many `_process_pulled_event` spans there are, but it would be nice to see it at a quick glance.
2023-05-19 11:18:45 -05:00
reivilibre
736199b763 Remove old R30 because R30v2 supersedes it (#10428)
R30v2 has been out since 2021-07-19 (https://github.com/matrix-org/synapse/pull/10332)
and we started collecting stats on 2021-08-16. Since it's been over a year now
(almost two years), that is enough of a grace period for us to rip it out.
2023-05-19 11:13:44 -05:00
Patrick Cloke
1e89976b26 Rename blacklist/whitelist internally. (#15620)
Avoid renaming configuration settings for now and rename internal code
to use blocklist and allowlist instead.
2023-05-19 12:25:25 +00:00
Patrick Cloke
89a23c9406 Do not allow deactivated users to login with JWT. (#15624)
To improve the organization of this code it moves the JWT login
checks to a separate handler and then fixes the bug (and a
deprecation warning).
2023-05-19 08:06:54 -04:00
Patrick Cloke
07771fa487 Remove experimental configuration flags & unstable values for faster joins (#15625)
Synapse will no longer send (or respond to) the unstable flags
for faster joins. These were only available behind a configuration
flag and handled in parallel with the stable flags.
2023-05-19 07:23:09 -04:00
Sean Quah
d0de452d12 Fix HomeServers leaking during trial test runs (#15630)
This change fixes two memory leaks during `trial` test runs.

Garbage collection is disabled during each test case and a gen-0 GC is
run at the end of each test. However, when the gen-0 GC is run, the
`TestCase` object usually still holds references to the `HomeServer`
used during the test. As a result, the `HomeServer` gets promoted to
gen-1 and then never garbage collected.

Fix this by periodically running full GCs.

Additionally, fix `HomeServer`s leaking after tests that touch inbound
federation due to `FederationRateLimiter`s adding themselves to a global
set, by turning the set into a `WeakSet`.

Resolves #15622.

Signed-off-by: Sean Quah <seanq@matrix.org>
2023-05-19 11:17:12 +01:00
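A simplified sketch of the two fixes described above (not the actual harness code): membership in a `WeakSet` does not keep an object alive, and a periodic full collection catches objects that a gen-0 pass has already promoted. The every-100-tests cadence below is a made-up placeholder.

```python
import gc
import weakref

# Fix 2: hold rate limiters weakly, so that registration in the global
# set alone cannot keep a HomeServer alive.
class FederationRateLimiter:
    _instances: "weakref.WeakSet[FederationRateLimiter]" = weakref.WeakSet()

    def __init__(self) -> None:
        # Membership in a WeakSet does not count as a reference.
        FederationRateLimiter._instances.add(self)

# Fix 1: a gen-0 collection misses objects promoted to older
# generations, so run a full collection every N tests.
TEST_COUNTER = 0

def after_each_test() -> None:
    global TEST_COUNTER
    TEST_COUNTER += 1
    if TEST_COUNTER % 100 == 0:
        gc.collect()   # full collection, all generations
    else:
        gc.collect(0)  # cheap gen-0 collection
```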
Nick Mills-Barrett
ad50510a06 Handle missing previous read marker event. (#15464)
If the previous read marker is pointing to an event that no longer exists
(e.g. due to retention) then assume that the newly given read marker
is newer.
2023-05-18 14:37:31 -04:00
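The fallback logic, sketched with hypothetical storage helpers (`get_event` with `allow_none` and `is_event_after` are stand-ins for whatever the store actually exposes):

```python
async def should_update_read_marker(
    store, room_id: str, old_event_id: str, new_event_id: str
) -> bool:
    old_event = await store.get_event(old_event_id, allow_none=True)
    if old_event is None:
        # The existing marker points at an event we no longer have
        # (e.g. removed by retention): assume the new marker is newer.
        return True
    # Otherwise compare the two events' positions in the room as usual.
    return await store.is_event_after(new_event_id, old_event_id)
```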
Jonathan de Jong
e5b4d93770 Update Mutual Rooms (MSC2666) implementation (#15621)
To track changes in MSC2666:

- The change from `/mutual_rooms/{user_id}` to `/mutual_rooms?user_id={user_id}`.
- The addition of `next_batch_token` (and logic).
- Unstable flag now being `uk.half-shot.msc2666.query_mutual_rooms`.
- The error code when your own user is requested.
2023-05-18 12:49:12 -04:00
Patrick Cloke
5dc1f25c53 Fix olddeps build (#15626)
Do an `apt update` before install packages.
2023-05-18 10:53:57 -04:00
axel simon
4ec40b16ac flake.nix: start synapse automatically, add space usage warning (#15613)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2023-05-18 15:44:28 +01:00
Sean Quah
68dcd2cbcb Re-type config paths in ConfigErrors to be StrSequences (#15615)
Part of #14809.

Signed-off-by: Sean Quah <seanq@matrix.org>
2023-05-18 11:11:30 +01:00
Sean Quah
e15aa00bc0 Fix error message when app_service_config_files validation fails (#15614)
The second argument of `ConfigError` is a path, passed as an optional
`Iterable[str]` and not a `str`. If a string is passed directly,
Synapse unhelpfully emits "Error in configuration at
a.p.p._.s.e.r.v.i.c.e._.c.o.n.f.i.g._.f.i.l.e.s'" when the config
option has the wrong data type.

Signed-off-by: Sean Quah <seanq@matrix.org>
2023-05-18 10:58:13 +01:00
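The garbling happens because a `str` is itself an `Iterable[str]` of its characters, so joining the "path" element-wise puts a dot between every letter; a two-line demonstration:

```python
path = "app_service_config_files"              # wrong: a bare string
print(".".join(path))                           # a.p.p._.s.e.r.v.i.c.e...
print(".".join(["app_service_config_files"]))   # right: a one-element sequence
```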
Quentin Gliech
41b9def9f2 Add a new admin API to create a new device for a user. (#15611)
This allows an external service (e.g. the matrix-authentication-service)
to create devices for users.
2023-05-17 14:39:06 +00:00
Patrick Cloke
4ee82c0576 Apply url_preview_url_blacklist to oEmbed and pre-cached images (#15601)
There are two situations which were previously not properly checked:

1. If the requested URL was replaced with an oEmbed URL, then the
   oEmbed URL was not checked against url_preview_url_blacklist.
2. Follow-up URLs (either via autodiscovery of oEmbed or to pre-cache
   images) were not checked against url_preview_url_blacklist.
2023-05-16 16:25:01 -04:00
Patrick Cloke
375b0a8a11 Update code to refer to "workers". (#15606)
A bunch of comments and variables are out of date and use
obsolete terms.
2023-05-16 15:56:38 -04:00
Eric Eastwood
7148c2a0d6 Run mypy type checking with the minimum supported Python version (#15602)
We use the oldest Python version because later Python versions can include some overloads which don't work in the older versions we still support.

We're using Python 3.8 instead of 3.7, our actual minimum supported version, because 3.7's EOL is a matter of weeks away, so we can avoid the extra effort. And in any case, minimum Python 3.8 support is better than winging it on Python 3.11.
2023-05-16 13:27:47 -05:00
Shay
9f6ff6a0eb Add not null constraint to column full_user_id of tables profiles and user_filters (#15537) 2023-05-16 10:57:39 -07:00
Eric Eastwood
77cda342be traceback.format_exception(...) usage that is compatible with Python 3.7 and 3.11 (#15599)
* Usage that is compatible with Python 3.8 and 3.11

> Since Python 3.10, instead of passing value and tb, an exception object can
  be passed as the first argument. If value and tb are provided, the first
  argument is ignored in order to provide backwards compatibility.
>
> -- https://docs.python.org/3/library/traceback.html

* Add changelog
2023-05-16 12:33:18 -05:00
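For reference, the call form that stays compatible across the signature change is the explicit three-argument one, since Python 3.10+ ignores the first argument when `value` and `tb` are supplied:

```python
import traceback

try:
    1 / 0
except ZeroDivisionError as e:
    # Works before and after the 3.10 signature change.
    lines = traceback.format_exception(type(e), e, e.__traceback__)
    print("".join(lines))
```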
Eric Eastwood
c51d2e6199 Fix subscriptable type usage in Python <3.9 (#15604)
Fix the following `mypy` errors when running `mypy` with Python 3.7:
```
synapse/storage/controllers/stats.py:58: error: "Counter" is not subscriptable, use "typing.Counter" instead  [misc]

tests/test_state.py:267: error: "dict" is not subscriptable, use "typing.Dict" instead  [misc]
```

Part of https://github.com/matrix-org/synapse/issues/15603

As of Python 3.9, the `typing` aliases are deprecated and the builtin types are subscriptable (generic) by default, https://peps.python.org/pep-0585/#implementation
2023-05-16 12:19:46 -05:00
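A small demonstration of the incompatibility, assuming nothing beyond the standard library: builtin generics only became subscriptable in Python 3.9 (PEP 585), so older interpreters need the `typing` aliases.

```python
from typing import Counter, Dict

word_counts: Counter[str] = Counter()  # works on Python 3.7+
mapping: Dict[str, int] = {}           # works on Python 3.7+

# The builtin forms below raise "TypeError: ... is not subscriptable"
# at import time on Python < 3.9:
#   counts: collections.Counter[str] = collections.Counter()
#   mapping: dict[str, int] = {}
```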
Eric Eastwood
b6a7d49b6f traceback.format_exception(...) usage that is compatible with Python 3.7 and 3.11 (#15599)
* Usage that is compatible with Python 3.8 and 3.11

> Since Python 3.10, instead of passing value and tb, an exception object can
  be passed as the first argument. If value and tb are provided, the first
  argument is ignored in order to provide backwards compatibility.
>
> -- https://docs.python.org/3/library/traceback.html

* Add changelog
2023-05-16 14:56:42 +01:00
Olivier Wilkinson (reivilibre)
0ccfb9318c Tweak changelog 2023-05-16 11:57:29 +01:00
Olivier Wilkinson (reivilibre)
3ec9f3b0cc 1.84.0rc1 2023-05-16 11:23:05 +01:00
Eric Eastwood
c97198ee14 Revert "Fix subscriptable dict type"
This reverts commit 55b08534a4.
2023-05-15 17:44:26 -05:00
Eric Eastwood
55b08534a4 Fix subscriptable dict type
Fix:
```
tests/test_state.py:267: error: "dict" is not subscriptable, use "typing.Dict" instead  [misc]
```

As of Python 3.9, the `typing` aliases are deprecated and the builtin types are subscriptable (generic) by default,
https://peps.python.org/pep-0585/#implementation
2023-05-15 17:40:10 -05:00
Shay
ba572647b2 Export run_as_background_process from the module API (#15577) 2023-05-15 13:11:21 -07:00
Patrick Cloke
f2905d827f Implement MSC3821 to update redaction rules (third_party_invite.signed) (#15563)
Updates the redaction rules to protect enough information that the
event can still be properly verified.
2023-05-15 15:02:24 -04:00
Patrick Cloke
eb3c1823d8 Reject instead of erroring on invalid membership events. (#15564)
Instead of resulting in an internal server error for invalid events,
return that the event is invalid.
2023-05-15 15:01:29 -04:00
Patrick Cloke
ba6b21c81e Implement MSC3389 to protect relations from redaction. (#15565)
MSC3389 proposes protecting the relation type & parent event ID
from redaction. This keeps the relation information intact after
redaction which helps with some UX flaws (e.g. deleting an
event causes it to no longer be in a thread, which is confusing).
2023-05-15 12:58:09 +00:00
Mathieu Velten
8583346335 Revert "Bump pillow from 9.4.0 to 9.5.0 (#15593)"
This reverts commit 34ab801379.
2023-05-15 14:22:07 +02:00
icp
b3ada9bfb4 Allow poetry-core 1.6.0 (#15588) 2023-05-15 11:19:11 +02:00
villepeh
aa5c0592e7 Update Mastodon SSO instructions (#15587) 2023-05-15 11:17:24 +02:00
Michael Weimann
3690d5bd89 Add an unstable feature flag for MSC3981 to the /versions endpoint (#15558)
Signed-off-by: Michael Weimann <michaelw@matrix.org>
Co-authored-by: Patrick Cloke <clokep@users.noreply.github.com>
2023-05-15 10:54:49 +02:00
dependabot[bot]
7b6c9f4c04 Bump phonenumbers from 8.13.7 to 8.13.11 (#15590) 2023-05-15 10:45:34 +02:00
dependabot[bot]
2e8a2bda52 Bump types-psycopg2 from 2.9.21.9 to 2.9.21.10 (#15591) 2023-05-15 10:45:15 +02:00
dependabot[bot]
3fd8eb81de Bump types-commonmark from 0.9.2.2 to 0.9.2.3 (#15592) 2023-05-15 10:44:47 +02:00
dependabot[bot]
1b4782a37d Bump types-setuptools from 67.7.0.1 to 67.7.0.2 (#15594) 2023-05-15 10:44:31 +02:00
dependabot[bot]
34ab801379 Bump pillow from 9.4.0 to 9.5.0 (#15593) 2023-05-15 10:44:06 +02:00
dependabot[bot]
bcd2495469 Bump serde from 1.0.162 to 1.0.163 (#15589) 2023-05-15 10:42:51 +02:00
Patrick Cloke
def480442d Declare support for Matrix 1.6 (#15559)
Adds logging for key server requests which include a key ID.
This is technically in violation of the 1.6 spec, but is the only
way to remain backwards compatible with earlier versions of
Synapse (and possibly other homeservers) which *did* include
the key ID.
2023-05-12 07:31:50 -04:00
Erik Johnston
808105bd31 Revert "Set thread_id column to non-null for event_push_{actions,actions_staging,summary} (#15437)" (#15580)
This reverts commit a7b3e9ce65.
2023-05-12 11:38:16 +01:00
David Robertson
c96a1d2a27 Relax poetry-core lower bound to 1.1.0 (#15571)
See https://github.com/matrix-org/synapse/pull/15566#issuecomment-1543844104

Also check you can `pip install` in the old-deps CI job
2023-05-12 11:21:11 +01:00
helix-loop
08297f2f18 Add pkg-config package to Stage 0 (#15567) 2023-05-12 11:32:09 +02:00
David Robertson
7c76514f1e Deal with more GHA deprecations (#15576)
* Bump netlify PR

* Manually cache mypy cache dir

cache cache cache cache cache cache cache cache cache cache

* Changelog
2023-05-11 18:24:32 +00:00
Eric Eastwood
d19d1edbcf Print full startup/initialization error (#15569)
I found the error in the **Before** really vague and obtuse, and didn't realize port `5432` corresponded to the Postgres port until searching the codebase. It says to check the logs, but that wasn't my first instinct. It's just more obvious if we print the full thing, which gives the context of the error type and the traceback to the relevant area of code.

#### Before

```
$ poetry run python -m synapse.app.homeserver -c homeserver.yaml
**********************************************************************************
 Error during initialisation:
    connection to server at "localhost" (::1), port 5432 failed: Connection refused
 	Is the server running on that host and accepting TCP/IP connections?
 connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused
 	Is the server running on that host and accepting TCP/IP connections?
 
 There may be more information in the logs.
**********************************************************************************
```

#### After

```sh
$ poetry run python -m synapse.app.homeserver -c homeserver.yaml
**********************************************************************************
 Error during initialisation:
     Traceback (most recent call last):
       File "/home/eric/Documents/github/element/synapse/synapse/app/homeserver.py", line 352, in setup
         hs.setup()
       File "/home/eric/Documents/github/element/synapse/synapse/server.py", line 337, in setup
         self.datastores = Databases(self.DATASTORE_CLASS, self)
       File "/home/eric/Documents/github/element/synapse/synapse/storage/databases/__init__.py", line 65, in __init__
         with make_conn(database_config, engine, "startup") as db_conn:
       File "/home/eric/Documents/github/element/synapse/synapse/storage/database.py", line 161, in make_conn
         native_db_conn = engine.module.connect(**db_params)
       File "/home/eric/.cache/pypoetry/virtualenvs/matrix-synapse-xCtC9ulO-py3.10/lib/python3.10/site-packages/psycopg2/__init__.py", line 122, in connect
         conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
     psycopg2.OperationalError: connection to server at "localhost" (::1), port 5432 failed: Connection refused
     	Is the server running on that host and accepting TCP/IP connections?
     connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused
     	Is the server running on that host and accepting TCP/IP connections?
 
 
 There may be more information in the logs.
**********************************************************************************
```
2023-05-11 11:50:46 -05:00
David Robertson
5a7742a833 Allow pip install to use setuptools_rust 1.6.0 (#15570)
* Allow `pip install` to use setuptools_rust 1.6.0

This was bumped by dependabot in #15512, but we didn't also raise
the version guard here. I don't know how we can avoid this happening in
the future.

Closes #15461.

Spotted in [1] by @landryb.

[1]: https://github.com/matrix-org/synapse/issues/15461#issuecomment-1543513934

* Changelog
2023-05-11 16:22:47 +00:00
Roel ter Maat
2611433b70 Add redis SSL configuration options (#15312)
* Add SSL options to redis config

* fix lint issues

* Add documentation and changelog file

* add missing . at the end of the changelog

* Move client context factory to new file

* Rename ssl to tls and fix typo

* fix lint issues

* Added when redis attributes were added
2023-05-11 13:02:51 +01:00
V02460
5bf9ec9e3e Require at least poetry-core v1.2.0 (#15566)
Signed-off-by: Kai A. Hiller <V02460@gmail.com>
2023-05-11 12:40:55 +01:00
Jason Little
e4f545c452 Remove worker_replication_* settings (#15491)
* Add master to the instance_map as part of Complement, have ReplicationEndpoint look at instance_map for master.

* Fix typo in drive by.

* Remove unnecessary worker_replication_* bits from unit tests and add master to instance_map (hopefully in the right place)

* Several updates:

1. Switch from master to main for naming the main process in the instance_map. Add useful constants for easier adjustment of names in the future.
2. Add backwards compatibility for worker_replication_* to allow time to transition to new style. Make sure to prioritize declaring main directly on the instance_map.
3. Clean up old comments/commented out code.
4. Adjust unit tests to match with new code.
5. Adjust Complement setup infrastructure to only add main to the instance_map if workers are used and remove now unused options from the worker.yaml template.

* Initial Docs upload

* Changelog

* Missed some commented out code that can go now

* Remove TODO comment that no longer holds true.

* Fix links in docs

* More docs

* Remove debug logging

* Apply suggestions from code review

Co-authored-by: reivilibre <olivier@librepush.net>

* Apply suggestions from code review

Co-authored-by: reivilibre <olivier@librepush.net>

* Update version to latest, include completeish before/after examples in upgrade notes.

* Fix up and docs too

---------

Co-authored-by: reivilibre <olivier@librepush.net>
2023-05-11 11:30:56 +01:00
Andrew Morgan
722ccc30b5 Add an unstable feature flag for MSC3391 to the /versions endpoint (#15562) 2023-05-11 10:38:32 +01:00
277 changed files with 8315 additions and 2552 deletions

View File

@@ -31,35 +31,6 @@ sed -i \
-e '/systemd/d' \
pyproject.toml
# Use poetry to do the installation. This ensures that the versions are all mutually
# compatible (as far the package metadata declares, anyway); pip's package resolver
# is more lax.
#
# Rather than `poetry install --no-dev`, we drop all dev dependencies and the dev-docs
# group from the toml file. This means we don't have to ensure compatibility between
# old deps and dev tools.
pip install toml wheel
REMOVE_DEV_DEPENDENCIES="
import toml
with open('pyproject.toml', 'r') as f:
data = toml.loads(f.read())
del data['tool']['poetry']['dev-dependencies']
del data['tool']['poetry']['group']['dev-docs']
with open('pyproject.toml', 'w') as f:
toml.dump(data, f)
"
python3 -c "$REMOVE_DEV_DEPENDENCIES"
pip install poetry==1.3.2
poetry lock
echo "::group::Patched pyproject.toml"
cat pyproject.toml
echo "::endgroup::"
echo "::group::Lockfile after patch"
cat poetry.lock
echo "::endgroup::"

View File

@@ -1,49 +0,0 @@
name: Write changelog for dependabot PR
on:
pull_request:
types:
- opened
- reopened # For debugging!
permissions:
# Needed to be able to push the commit. See
# https://docs.github.com/en/code-security/dependabot/working-with-dependabot/automating-dependabot-with-github-actions#enable-auto-merge-on-a-pull-request
# for a similar example
contents: write
jobs:
add-changelog:
runs-on: 'ubuntu-latest'
if: ${{ github.actor == 'dependabot[bot]' }}
steps:
- uses: actions/checkout@v3
with:
ref: ${{ github.event.pull_request.head.ref }}
- name: Write, commit and push changelog
env:
PR_TITLE: ${{ github.event.pull_request.title }}
PR_NUMBER: ${{ github.event.pull_request.number }}
run: |
echo "${PR_TITLE}." > "changelog.d/${PR_NUMBER}".misc
git add changelog.d
git config user.email "github-actions[bot]@users.noreply.github.com"
git config user.name "GitHub Actions"
git commit -m "Changelog"
git push
shell: bash
# The `git push` above does not trigger CI on the dependabot PR.
#
# By default, workflows can't trigger other workflows when they're just using the
# default `GITHUB_TOKEN` access token. (This is intended to stop you from writing
# recursive workflow loops by accident, because that'll get very expensive very
# quickly.) Instead, you have to manually call out to another workflow, or else
# make your changes (i.e. the `git push` above) using a personal access token.
# See
# https://docs.github.com/en/actions/using-workflows/triggering-a-workflow#triggering-a-workflow-from-a-workflow
#
# I have tried and failed to find a way to trigger CI on the "merge ref" of the PR.
# See git commit history for previous attempts. If anyone desperately wants to try
# again in the future, make a matrix-bot account and use its access token to git push.
# THIS WORKFLOW HAS WRITE PERMISSIONS---do not add other jobs here unless they
# are sufficiently locked down to dependabot only as above.

View File

@@ -22,7 +22,7 @@ jobs:
path: book
- name: 📤 Deploy to Netlify
- uses: matrix-org/netlify-pr-preview@v1
+ uses: matrix-org/netlify-pr-preview@v2
with:
path: book
owner: ${{ github.event.workflow_run.head_repository.owner.login }}

View File

@@ -34,6 +34,7 @@ jobs:
- id: set-distros
run: |
# if we're running from a tag, get the full list of distros; otherwise just use debian:sid
# NOTE: inside the actual Dockerfile-dhvirtualenv, the image name is expanded into its full image path
dists='["debian:sid"]'
if [[ $GITHUB_REF == refs/tags/* ]]; then
dists=$(scripts-dev/build_debian_packages.py --show-dists-json)

View File

@@ -45,16 +45,6 @@ jobs:
- run: poetry run scripts-dev/generate_sample_config.sh --check
- run: poetry run scripts-dev/config-lint.sh
check-schema-delta:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: "3.x"
- run: "pip install 'click==8.1.1' 'GitPython>=3.1.20'"
- run: scripts-dev/check_schema_delta.py --force-colors
check-lockfile:
runs-on: ubuntu-latest
steps:
@@ -107,14 +97,15 @@ jobs:
uses: dtolnay/rust-toolchain@1.58.1
- uses: Swatinem/rust-cache@v2
# NB: I have two concerns with this action:
# 1. We occasionally see odd mypy problems that aren't reproducible
# locally with clean caches. I suspect some dodgy caching behaviour.
# 2. The action uses GHA machinery that's deprecated
# (https://github.com/AustinScola/mypy-cache-github-action/issues/277)
# It may be simpler to use actions/cache ourselves to restore .mypy_cache.
# Cribbed from
# https://github.com/AustinScola/mypy-cache-github-action/blob/85ea4f2972abed39b33bd02c36e341b28ca59213/src/restore.ts#L10-L17
- name: Restore/persist mypy's cache
- uses: AustinScola/mypy-cache-github-action@df56268388422ee282636ee2c7a9cc55ec644a41
+ uses: actions/cache@v3
with:
path: |
.mypy_cache
key: mypy-cache-${{ github.context.sha }}
restore-keys: mypy-cache-
- name: Run mypy
run: poetry run mypy
@@ -220,7 +211,6 @@ jobs:
- lint-newsfile
- lint-pydantic
- check-sampleconfig
- check-schema-delta
- check-lockfile
- lint-clippy
- lint-rustfmt
@@ -313,41 +303,33 @@ jobs:
# There aren't wheels for some of the older deps, so we need to install
# their build dependencies
- run: |
sudo apt-get -qq update
sudo apt-get -qq install build-essential libffi-dev python-dev \
libxml2-dev libxslt-dev xmlsec1 zlib1g-dev libjpeg-dev libwebp-dev
libxml2-dev libxslt-dev xmlsec1 zlib1g-dev libjpeg-dev libwebp-dev
- uses: actions/setup-python@v4
with:
python-version: '3.7'
# Calculating the old-deps actually takes a bunch of time, so we cache the
# pyproject.toml / poetry.lock. We need to cache pyproject.toml as
# otherwise the `poetry install` step will error due to the poetry.lock
# file being outdated.
#
# This caches the output of `Prepare old deps`, which should generate the
# same `pyproject.toml` and `poetry.lock` for a given `pyproject.toml` input.
- uses: actions/cache@v3
id: cache-poetry-old-deps
name: Cache poetry.lock
with:
path: |
poetry.lock
pyproject.toml
key: poetry-old-deps2-${{ hashFiles('pyproject.toml') }}
- name: Prepare old deps
if: steps.cache-poetry-old-deps.outputs.cache-hit != 'true'
run: .ci/scripts/prepare_old_deps.sh
# We only now install poetry so that `setup-python-poetry` caches the
# right poetry.lock's dependencies.
- uses: matrix-org/setup-python-poetry@v1
with:
python-version: '3.7'
poetry-version: "1.3.2"
extras: "all test"
# Note: we install using `pip` here, not poetry. `poetry install` ignores the
# build-system section (https://github.com/python-poetry/poetry/issues/6154), but
# we explicitly want to test that you can `pip install` using the oldest version
# of poetry-core and setuptools-rust.
- run: pip install .[all,test]
- run: poetry run trial -j6 tests
# We nuke the local copy, as we've installed synapse into the virtualenv
# (rather than use an editable install, which we no longer support). If we
# don't do this then python can't find the native lib.
- run: rm -rf synapse/
# Sanity check we can import/run Synapse
- run: python -m synapse.app.homeserver --help
- run: python -m twisted.trial -j6 tests
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
@@ -616,6 +598,16 @@ jobs:
- run: cargo bench --no-run
check-schema-delta:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: "3.x"
- run: "pip install 'click==8.1.1' 'GitPython>=3.1.20'"
- run: scripts-dev/check_schema_delta.py --force-colors
# a job which marks all the other jobs as complete, thus allowing PRs to be merged.
tests-done:
if: ${{ always() }}

View File

@@ -1,3 +1,211 @@
Synapse 1.85.0rc2 (2023-06-01)
==============================
Bugfixes
--------
- Fix a performance issue introduced in Synapse v1.83.0 which meant that purging rooms was very slow and database-intensive. ([\#15693](https://github.com/matrix-org/synapse/issues/15693))
Deprecations and Removals
-------------------------
- Deprecate calling the `/register` endpoint with an unspecced `user` property for application services. ([\#15703](https://github.com/matrix-org/synapse/issues/15703))
Internal Changes
----------------
- Speed up background jobs `populate_full_user_id_user_filters` and `populate_full_user_id_profiles`. ([\#15700](https://github.com/matrix-org/synapse/issues/15700))
Synapse 1.85.0rc1 (2023-05-30)
==============================
Features
--------
- Improve performance of backfill requests by performing backfill of previously failed requests in the background. ([\#15585](https://github.com/matrix-org/synapse/issues/15585))
- Add a new [admin API](https://matrix-org.github.io/synapse/v1.85/usage/administration/admin_api/index.html) to [create a new device for a user](https://matrix-org.github.io/synapse/v1.85/admin_api/user_admin_api.html#create-a-device). ([\#15611](https://github.com/matrix-org/synapse/issues/15611))
- Add Unix socket support for Redis connections. Contributed by Jason Little. ([\#15644](https://github.com/matrix-org/synapse/issues/15644))
Bugfixes
--------
- Fix a long-standing bug where setting the read marker could fail when using message retention. Contributed by Nick @ Beeper (@fizzadar). ([\#15464](https://github.com/matrix-org/synapse/issues/15464))
- Fix a long-standing bug where the `url_preview_url_blacklist` configuration setting was not applied to oEmbed or image URLs found while previewing a URL. ([\#15601](https://github.com/matrix-org/synapse/issues/15601))
- Fix a long-standing bug where filters with multiple backslashes were rejected. ([\#15607](https://github.com/matrix-org/synapse/issues/15607))
- Fix a bug introduced in Synapse 1.82.0 where the error message displayed when validation of the `app_service_config_files` config option fails would be incorrectly formatted. ([\#15614](https://github.com/matrix-org/synapse/issues/15614))
- Fix a long-standing bug where deactivated users were still able to login using the custom `org.matrix.login.jwt` login type (if enabled). ([\#15624](https://github.com/matrix-org/synapse/issues/15624))
- Fix a long-standing bug where deactivated users were able to login in uncommon situations. ([\#15634](https://github.com/matrix-org/synapse/issues/15634))
Improved Documentation
----------------------
- Warn users that at least 3.75GB of space is needed for the nix Synapse development environment. ([\#15613](https://github.com/matrix-org/synapse/issues/15613))
- Remove outdated comment from the generated and sample homeserver log configs. ([\#15648](https://github.com/matrix-org/synapse/issues/15648))
- Improve contributor docs to make it more clear that Rust is a necessary prerequisite. Contributed by @grantm. ([\#15668](https://github.com/matrix-org/synapse/issues/15668))
Deprecations and Removals
-------------------------
- Remove the old version of the R30 (30-day retained users) phone-home metric. ([\#10428](https://github.com/matrix-org/synapse/issues/10428))
Internal Changes
----------------
- Create dependabot changelogs at release time. ([\#15481](https://github.com/matrix-org/synapse/issues/15481))
- Add not null constraint to column `full_user_id` of tables `profiles` and `user_filters`. ([\#15537](https://github.com/matrix-org/synapse/issues/15537))
- Allow connecting to HTTP Replication Endpoints by using `worker_name` when constructing the request. ([\#15578](https://github.com/matrix-org/synapse/issues/15578))
- Make the `thread_id` column on `event_push_actions`, `event_push_actions_staging`, and `event_push_summary` non-null. ([\#15597](https://github.com/matrix-org/synapse/issues/15597))
- Run mypy type checking with the minimum supported Python version to catch new usage that isn't backwards-compatible. ([\#15602](https://github.com/matrix-org/synapse/issues/15602))
- Fix subscriptable type usage in Python <3.9. ([\#15604](https://github.com/matrix-org/synapse/issues/15604))
- Update internal terminology. ([\#15606](https://github.com/matrix-org/synapse/issues/15606), [\#15620](https://github.com/matrix-org/synapse/issues/15620))
- Instrument `state` and `state_group` storage-related operations to better picture what's happening when tracing. ([\#15610](https://github.com/matrix-org/synapse/issues/15610), [\#15647](https://github.com/matrix-org/synapse/issues/15647))
- Trace how many new events from the backfill response we need to process. ([\#15633](https://github.com/matrix-org/synapse/issues/15633))
- Re-type config paths in `ConfigError`s to be `StrSequence`s instead of `Iterable[str]`s. ([\#15615](https://github.com/matrix-org/synapse/issues/15615))
- Update Mutual Rooms ([MSC2666](https://github.com/matrix-org/matrix-spec-proposals/pull/2666)) implementation to match new proposal text. ([\#15621](https://github.com/matrix-org/synapse/issues/15621))
- Remove the unstable identifiers from faster joins ([MSC3706](https://github.com/matrix-org/matrix-spec-proposals/pull/3706)). ([\#15625](https://github.com/matrix-org/synapse/issues/15625))
- Fix the olddeps CI. ([\#15626](https://github.com/matrix-org/synapse/issues/15626))
- Remove duplicate timestamp from test logs (`_trial_temp/test.log`). ([\#15636](https://github.com/matrix-org/synapse/issues/15636))
- Fix two memory leaks in `trial` test runs. ([\#15630](https://github.com/matrix-org/synapse/issues/15630))
- Limit the size of the `HomeServerConfig` cache in trial test runs. ([\#15646](https://github.com/matrix-org/synapse/issues/15646))
- Improve type hints. ([\#15658](https://github.com/matrix-org/synapse/issues/15658), [\#15659](https://github.com/matrix-org/synapse/issues/15659))
- Add requesting user id parameter to key claim methods in `TransportLayerClient`. ([\#15663](https://github.com/matrix-org/synapse/issues/15663))
- Speed up rebuilding of the user directory for local users. ([\#15665](https://github.com/matrix-org/synapse/issues/15665))
- Implement "option 2" for [MSC3820](https://github.com/matrix-org/matrix-spec-proposals/pull/3820): Room version 11. ([\#15666](https://github.com/matrix-org/synapse/issues/15666), [\#15678](https://github.com/matrix-org/synapse/issues/15678))
### Updates to locked dependencies
* Bump furo from 2023.3.27 to 2023.5.20. ([\#15642](https://github.com/matrix-org/synapse/issues/15642))
* Bump log from 0.4.17 to 0.4.18. ([\#15681](https://github.com/matrix-org/synapse/issues/15681))
* Bump prometheus-client from 0.16.0 to 0.17.0. ([\#15682](https://github.com/matrix-org/synapse/issues/15682))
* Bump pydantic from 1.10.7 to 1.10.8. ([\#15685](https://github.com/matrix-org/synapse/issues/15685))
* Bump pygithub from 1.58.1 to 1.58.2. ([\#15643](https://github.com/matrix-org/synapse/issues/15643))
* Bump requests from 2.28.2 to 2.31.0. ([\#15651](https://github.com/matrix-org/synapse/issues/15651))
* Bump sphinx from 6.1.3 to 6.2.1. ([\#15641](https://github.com/matrix-org/synapse/issues/15641))
* Bump types-bleach from 6.0.0.1 to 6.0.0.3. ([\#15686](https://github.com/matrix-org/synapse/issues/15686))
* Bump types-pillow from 9.5.0.2 to 9.5.0.4. ([\#15640](https://github.com/matrix-org/synapse/issues/15640))
* Bump types-pyyaml from 6.0.12.9 to 6.0.12.10. ([\#15683](https://github.com/matrix-org/synapse/issues/15683))
* Bump types-requests from 2.30.0.0 to 2.31.0.0. ([\#15684](https://github.com/matrix-org/synapse/issues/15684))
* Bump types-setuptools from 67.7.0.2 to 67.8.0.0. ([\#15639](https://github.com/matrix-org/synapse/issues/15639))
Synapse 1.84.1 (2023-05-26)
===========================
This patch release fixes a major issue with homeservers that do not have an `instance_map` defined but which do use workers.
If you have already upgraded to Synapse 1.84.0 and your homeserver is working normally, then there is no need to update to this patch release.
Bugfixes
--------
- Fix a bug introduced in Synapse v1.84.0 where workers do not start up when no `instance_map` was provided. ([\#15672](https://github.com/matrix-org/synapse/issues/15672))
Internal Changes
----------------
- Add `dch` and `notify-send` to the development Nix flake so that the release script can be used. ([\#15673](https://github.com/matrix-org/synapse/issues/15673))
Synapse 1.84.0 (2023-05-23)
===========================
The `worker_replication_*` configuration settings have been deprecated in favour of configuring the main process consistently with other instances in the `instance_map`. The deprecated settings will be removed in Synapse v1.88.0, but changing your configuration in advance is recommended. See the [upgrade notes](https://github.com/matrix-org/synapse/blob/release-v1.84/docs/upgrade.md#upgrading-to-v1840) for more information.
Bugfixes
--------
- Fix a bug introduced in Synapse 1.84.0rc1 where errors during startup were not reported correctly on Python < 3.10. ([\#15599](https://github.com/matrix-org/synapse/issues/15599))
Synapse 1.84.0rc1 (2023-05-16)
==============================
Features
--------
- Add an option to prevent media downloads from configured domains. ([\#15197](https://github.com/matrix-org/synapse/issues/15197))
- Add `forget_rooms_on_leave` config option to automatically forget rooms when users leave them or are removed from them. ([\#15224](https://github.com/matrix-org/synapse/issues/15224))
- Add redis TLS configuration options. ([\#15312](https://github.com/matrix-org/synapse/issues/15312))
- Add a config option to delay push notifications by a random amount, to discourage time-based profiling. ([\#15516](https://github.com/matrix-org/synapse/issues/15516))
- Stabilize support for [MSC2659](https://github.com/matrix-org/matrix-spec-proposals/pull/2659): application service ping endpoint. Contributed by Tulir @ Beeper. ([\#15528](https://github.com/matrix-org/synapse/issues/15528))
- Implement [MSC4009](https://github.com/matrix-org/matrix-spec-proposals/pull/4009) to expand the supported characters in Matrix IDs. ([\#15536](https://github.com/matrix-org/synapse/issues/15536))
- Advertise support for Matrix 1.6 on `/_matrix/client/versions`. ([\#15559](https://github.com/matrix-org/synapse/issues/15559))
- Print full error and stack-trace of any exception that occurs during startup/initialization. ([\#15569](https://github.com/matrix-org/synapse/issues/15569))
Bugfixes
--------
- Don't fail on federation over TOR where SRV queries are not supported. Contributed by Zdzichu. ([\#15523](https://github.com/matrix-org/synapse/issues/15523))
- Experimental support for [MSC4010](https://github.com/matrix-org/matrix-spec-proposals/pull/4010), which rejects setting `m.push_rules` via account data. ([\#15554](https://github.com/matrix-org/synapse/issues/15554), [\#15555](https://github.com/matrix-org/synapse/issues/15555))
- Fix a long-standing bug where an invalid membership event could cause an internal server error. ([\#15564](https://github.com/matrix-org/synapse/issues/15564))
- Require at least poetry-core v1.1.0. ([\#15566](https://github.com/matrix-org/synapse/issues/15566), [\#15571](https://github.com/matrix-org/synapse/issues/15571))
Deprecations and Removals
-------------------------
- Remove need for `worker_replication_*` based settings in worker configuration yaml by placing this data directly on the `instance_map` instead. ([\#15491](https://github.com/matrix-org/synapse/issues/15491))
Updates to the Docker image
---------------------------
- Add pkg-config package to Stage 0 to be able to build Dockerfile on ppc64le architecture. ([\#15567](https://github.com/matrix-org/synapse/issues/15567))
Improved Documentation
----------------------
- Clarify documentation of the "Create or modify account" Admin API. ([\#15544](https://github.com/matrix-org/synapse/issues/15544))
- Fix path to the `statistics/database/rooms` admin API in documentation. ([\#15560](https://github.com/matrix-org/synapse/issues/15560))
- Update and improve Mastodon Single Sign-On documentation. ([\#15587](https://github.com/matrix-org/synapse/issues/15587))
Internal Changes
----------------
- Use oEmbed to generate URL previews for YouTube Shorts. ([\#15025](https://github.com/matrix-org/synapse/issues/15025))
- Create new `Client` for use with HTTP Replication between workers. Contributed by Jason Little. ([\#15470](https://github.com/matrix-org/synapse/issues/15470))
- Bump pyicu from 2.10.2 to 2.11. ([\#15509](https://github.com/matrix-org/synapse/issues/15509))
- Remove references to supporting per-user flag for [MSC2654](https://github.com/matrix-org/matrix-spec-proposals/pull/2654). ([\#15522](https://github.com/matrix-org/synapse/issues/15522))
- Don't use a trusted key server when running the demo scripts. ([\#15527](https://github.com/matrix-org/synapse/issues/15527))
- Speed up rebuilding of the user directory for local users. ([\#15529](https://github.com/matrix-org/synapse/issues/15529))
- Speed up deleting of old rows in `event_push_actions`. ([\#15531](https://github.com/matrix-org/synapse/issues/15531))
- Install the `xmlsec` and `mdbook` packages and switch back to the upstream [cachix/devenv](https://github.com/cachix/devenv) repo in the nix development environment. ([\#15532](https://github.com/matrix-org/synapse/issues/15532), [\#15533](https://github.com/matrix-org/synapse/issues/15533), [\#15545](https://github.com/matrix-org/synapse/issues/15545))
- Implement [MSC3987](https://github.com/matrix-org/matrix-spec-proposals/pull/3987) by removing `"dont_notify"` from the list of actions in default push rules. ([\#15534](https://github.com/matrix-org/synapse/issues/15534))
- Move various module API callback registration methods to a dedicated class. ([\#15535](https://github.com/matrix-org/synapse/issues/15535))
- Proxy `/user/devices` federation queries to application services for [MSC3984](https://github.com/matrix-org/matrix-spec-proposals/pull/3984). ([\#15539](https://github.com/matrix-org/synapse/issues/15539))
- Factor out an `is_mine_server_name` method. ([\#15542](https://github.com/matrix-org/synapse/issues/15542))
- Allow running Complement tests using [podman](https://podman.io/) by adding a `PODMAN` environment variable to `scripts-dev/complement.sh`. ([\#15543](https://github.com/matrix-org/synapse/issues/15543))
- Bump serde from 1.0.160 to 1.0.162. ([\#15548](https://github.com/matrix-org/synapse/issues/15548))
- Bump types-setuptools from 67.6.0.5 to 67.7.0.1. ([\#15549](https://github.com/matrix-org/synapse/issues/15549))
- Bump sentry-sdk from 1.19.1 to 1.22.1. ([\#15550](https://github.com/matrix-org/synapse/issues/15550))
- Bump ruff from 0.0.259 to 0.0.265. ([\#15551](https://github.com/matrix-org/synapse/issues/15551))
- Bump hiredis from 2.2.2 to 2.2.3. ([\#15552](https://github.com/matrix-org/synapse/issues/15552))
- Bump types-requests from 2.29.0.0 to 2.30.0.0. ([\#15553](https://github.com/matrix-org/synapse/issues/15553))
- Add `org.matrix.msc3981` info to `/_matrix/client/versions`. ([\#15558](https://github.com/matrix-org/synapse/issues/15558))
- Declare unstable support for [MSC3391](https://github.com/matrix-org/matrix-spec-proposals/pull/3391) under `/_matrix/client/versions` if the experimental implementation is enabled. ([\#15562](https://github.com/matrix-org/synapse/issues/15562))
- Implement [MSC3821](https://github.com/matrix-org/matrix-spec-proposals/pull/3821) to update the redaction rules. ([\#15563](https://github.com/matrix-org/synapse/issues/15563))
- Implement updated redaction rules from [MSC3389](https://github.com/matrix-org/matrix-spec-proposals/pull/3389). ([\#15565](https://github.com/matrix-org/synapse/issues/15565))
- Allow `pip install` to use setuptools_rust 1.6.0 when building Synapse. ([\#15570](https://github.com/matrix-org/synapse/issues/15570))
- Deal with upcoming Github Actions deprecations. ([\#15576](https://github.com/matrix-org/synapse/issues/15576))
- Export `run_as_background_process` from the module API. ([\#15577](https://github.com/matrix-org/synapse/issues/15577))
- Update build system requirements to allow building with poetry-core==1.6.0. ([\#15588](https://github.com/matrix-org/synapse/issues/15588))
- Bump serde from 1.0.162 to 1.0.163. ([\#15589](https://github.com/matrix-org/synapse/issues/15589))
- Bump phonenumbers from 8.13.7 to 8.13.11. ([\#15590](https://github.com/matrix-org/synapse/issues/15590))
- Bump types-psycopg2 from 2.9.21.9 to 2.9.21.10. ([\#15591](https://github.com/matrix-org/synapse/issues/15591))
- Bump types-commonmark from 0.9.2.2 to 0.9.2.3. ([\#15592](https://github.com/matrix-org/synapse/issues/15592))
- Bump types-setuptools from 67.7.0.1 to 67.7.0.2. ([\#15594](https://github.com/matrix-org/synapse/issues/15594))
Synapse 1.83.0 (2023-05-09)
===========================

Cargo.lock generated

@@ -132,12 +132,9 @@ dependencies = [
[[package]]
name = "log"
version = "0.4.17"
version = "0.4.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "abb12e687cfb44aa40f41fc3978ef76448f9b6038cad6aef4259d3c095a2382e"
dependencies = [
"cfg-if",
]
checksum = "518ef76f2f87365916b142844c16d8fefd85039bc5699050210a7778ee1cd1de"
[[package]]
name = "memchr"
@@ -323,18 +320,18 @@ checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd"
[[package]]
name = "serde"
version = "1.0.162"
version = "1.0.163"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "71b2f6e1ab5c2b98c05f0f35b236b22e8df7ead6ffbf51d7808da7f8817e7ab6"
checksum = "2113ab51b87a539ae008b5c6c02dc020ffa39afd2d83cffcb3f4eb2722cebec2"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.162"
version = "1.0.163"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a2a0814352fd64b58489904a44ea8d90cb1a91dcb6b4f5ebabc32c8318e93cb6"
checksum = "8c805777e3930c8883389c602315a24224bcc738b63905ef87cd1420353ea93e"
dependencies = [
"proc-macro2",
"quote",

changelog.d/14213.misc Normal file

@@ -0,0 +1 @@
Log when events are (maybe unexpectedly) filtered out of responses in tests.


@@ -1 +0,0 @@
Use oEmbed to generate URL previews for YouTube Shorts.


@@ -1 +0,0 @@
Add an option to prevent media downloads from configured domains.


@@ -1 +0,0 @@
Add `forget_rooms_on_leave` config option to automatically forget rooms when users leave them or are removed from them.


@@ -0,0 +1 @@
Stable support for [MSC3882](https://github.com/matrix-org/matrix-spec-proposals/pull/3882) to allow an existing device/session to generate a login token for use on a new device/session.


@@ -1 +0,0 @@
Make the `thread_id` column on `event_push_actions`, `event_push_actions_staging`, and `event_push_summary` non-null.


@@ -0,0 +1 @@
Support resolving a room's [canonical alias](https://spec.matrix.org/v1.7/client-server-api/#mroomcanonical_alias) via the module API.


@@ -1 +0,0 @@
Create new `Client` for use with HTTP Replication between workers. Contributed by Jason Little.


@@ -1 +0,0 @@
Bump pyicu from 2.10.2 to 2.11.


@@ -1 +0,0 @@
Add a config option to delay push notifications by a random amount, to discourage time-based profiling.


@@ -1 +0,0 @@
Remove references to supporting per-user flag for [MSC2654](https://github.com/matrix-org/matrix-spec-proposals/pull/2654) (#15522).


@@ -1 +0,0 @@
Don't fail on federation over TOR where SRV queries are not supported. Contributed by Zdzichu.


@@ -1 +0,0 @@
Don't use a trusted key server when running the demo scripts.


@@ -1 +0,0 @@
Stabilize support for [MSC2659](https://github.com/matrix-org/matrix-spec-proposals/pull/2659): application service ping endpoint. Contributed by Tulir @ Beeper.


@@ -1 +0,0 @@
Speed up rebuilding of the user directory for local users.


@@ -1 +0,0 @@
Speed up deleting of old rows in `event_push_actions`.


@@ -1 +0,0 @@
Install the `xmlsec` and `mdbook` packages and switch back to the upstream [cachix/devenv](https://github.com/cachix/devenv) repo in the nix development environment.


@@ -1 +0,0 @@
Install the `xmlsec` and `mdbook` packages and switch back to the upstream [cachix/devenv](https://github.com/cachix/devenv) repo in the nix development environment.


@@ -1 +0,0 @@
Implement [MSC3987](https://github.com/matrix-org/matrix-spec-proposals/pull/3987) by removing `"dont_notify"` from the list of actions in default push rules.


@@ -1 +0,0 @@
Move various module API callback registration methods to a dedicated class.


@@ -1 +0,0 @@
Implement [MSC4009](https://github.com/matrix-org/matrix-spec-proposals/pull/4009) to expand the supported characters in Matrix IDs.


@@ -1 +0,0 @@
Proxy `/user/devices` federation queries to application services for [MSC3984](https://github.com/matrix-org/matrix-spec-proposals/pull/3984).


@@ -1 +0,0 @@
Factor out an `is_mine_server_name` method.


@@ -1 +0,0 @@
Allow running Complement tests using [podman](https://podman.io/) by adding a `PODMAN` environment variable to `scripts-dev/complement.sh`.


@@ -1 +0,0 @@
Clarify documentation of the "Create or modify account" Admin API.


@@ -1 +0,0 @@
Install the `xmlsec` and `mdbook` packages and switch back to the upstream [cachix/devenv](https://github.com/cachix/devenv) repo in the nix development environment.


@@ -1 +0,0 @@
Bump serde from 1.0.160 to 1.0.162.


@@ -1 +0,0 @@
Bump types-setuptools from 67.6.0.5 to 67.7.0.1.


@@ -1 +0,0 @@
Bump sentry-sdk from 1.19.1 to 1.22.1.


@@ -1 +0,0 @@
Bump ruff from 0.0.259 to 0.0.265.


@@ -1 +0,0 @@
Bump hiredis from 2.2.2 to 2.2.3.


@@ -1 +0,0 @@
Bump types-requests from 2.29.0.0 to 2.30.0.0.


@@ -1 +0,0 @@
Experimental support for [MSC4010](https://github.com/matrix-org/matrix-spec-proposals/pull/4010) which rejects setting the `"m.push_rules"` via account data.


@@ -1 +0,0 @@
Experimental support for [MSC4010](https://github.com/matrix-org/matrix-spec-proposals/pull/4010) which rejects setting the `"m.push_rules"` via account data.


@@ -1 +0,0 @@
Fix path to the `statistics/database/rooms` admin API in documentation.


@@ -0,0 +1 @@
Experimental [MSC3861](https://github.com/matrix-org/matrix-spec-proposals/pull/3861) support: delegate auth to an OIDC provider.

changelog.d/15649.misc Normal file

@@ -0,0 +1 @@
Read from column `full_user_id` rather than `user_id` of tables `profiles` and `user_filters`.


@@ -0,0 +1 @@
Add Synapse version deploy annotations to the Grafana dashboard, enabling easy correlation between behavior changes witnessed in a graph and a certain Synapse version, making it easier to nail down regressions.

changelog.d/15675.misc Normal file

@@ -0,0 +1 @@
Cache requests for user's devices over federation.

changelog.d/15689.misc Normal file

@@ -0,0 +1 @@
Add fully qualified docker image names to Dockerfiles.

changelog.d/15690.misc Normal file

@@ -0,0 +1 @@
Remove some unused code.

changelog.d/15694.misc Normal file

@@ -0,0 +1 @@
Improve type hints.

changelog.d/15697.misc Normal file

@@ -0,0 +1 @@
Improve type hints.


@@ -0,0 +1 @@
Add a catch-all * to the supported relation types when redacting an event and its related events. This is an update to the [MSC3912](https://github.com/matrix-org/matrix-spec-proposals/pull/3912) implementation.

changelog.d/15724.bugfix Normal file

@@ -0,0 +1 @@
Fix missing dependencies in background jobs.


@@ -70,6 +70,10 @@ redis:
port: 6379
# dbid: <redis_logical_db_id>
# password: <secret_password>
# use_tls: True
# certificate_file: <path_to_certificate>
# private_key_file: <path_to_private_key>
# ca_file: <path_to_ca_certificate>
```
This assumes that your Redis service is called `redis` in your Docker Compose file.
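For illustration, a matching Compose service might look like the following sketch; the TLS flags and mounted certificate paths are assumptions about your deployment, not part of Synapse's configuration:

```yaml
services:
  redis:
    image: docker.io/library/redis:6-bullseye
    # Hypothetical TLS setup; adjust the certificate paths to your deployment.
    command: >
      redis-server
      --tls-port 6379 --port 0
      --tls-cert-file /tls/redis.crt
      --tls-key-file /tls/redis.key
      --tls-ca-cert-file /tls/ca.crt
    volumes:
      - ./tls:/tls:ro
```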

File diff suppressed because it is too large

debian/changelog vendored

@@ -1,3 +1,33 @@
matrix-synapse-py3 (1.85.0~rc2) stable; urgency=medium
* New Synapse release 1.85.0rc2.
-- Synapse Packaging team <packages@matrix.org> Thu, 01 Jun 2023 09:16:18 -0700
matrix-synapse-py3 (1.85.0~rc1) stable; urgency=medium
* New Synapse release 1.85.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 30 May 2023 13:56:54 +0100
matrix-synapse-py3 (1.84.1) stable; urgency=medium
* New Synapse release 1.84.1.
-- Synapse Packaging team <packages@matrix.org> Fri, 26 May 2023 16:15:30 +0100
matrix-synapse-py3 (1.84.0) stable; urgency=medium
* New Synapse release 1.84.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 23 May 2023 10:57:22 +0100
matrix-synapse-py3 (1.84.0~rc1) stable; urgency=medium
* New Synapse release 1.84.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 16 May 2023 11:12:02 +0100
matrix-synapse-py3 (1.83.0) stable; urgency=medium
* New Synapse release 1.83.0.


@@ -27,7 +27,7 @@ ARG PYTHON_VERSION=3.11
###
# We hardcode the use of Debian bullseye here because this could change upstream
# and other Dockerfiles used for testing are expecting bullseye.
FROM docker.io/python:${PYTHON_VERSION}-slim-bullseye as requirements
FROM docker.io/library/python:${PYTHON_VERSION}-slim-bullseye as requirements
# RUN --mount is specific to buildkit and is documented at
# https://github.com/moby/buildkit/blob/master/frontend/dockerfile/docs/syntax.md#build-mounts-run---mount.
@@ -37,7 +37,7 @@ RUN \
--mount=type=cache,target=/var/cache/apt,sharing=locked \
--mount=type=cache,target=/var/lib/apt,sharing=locked \
apt-get update -qq && apt-get install -yqq \
build-essential curl git libffi-dev libssl-dev \
build-essential curl git libffi-dev libssl-dev pkg-config \
&& rm -rf /var/lib/apt/lists/*
# Install rust and ensure its in the PATH.
@@ -87,7 +87,7 @@ RUN if [ -z "$TEST_ONLY_IGNORE_POETRY_LOCKFILE" ]; then \
###
### Stage 1: builder
###
FROM docker.io/python:${PYTHON_VERSION}-slim-bullseye as builder
FROM docker.io/library/python:${PYTHON_VERSION}-slim-bullseye as builder
# install the OS build deps
RUN \
@@ -158,7 +158,7 @@ RUN --mount=type=cache,target=/synapse/target,sharing=locked \
### Stage 2: runtime
###
FROM docker.io/python:${PYTHON_VERSION}-slim-bullseye
FROM docker.io/library/python:${PYTHON_VERSION}-slim-bullseye
LABEL org.opencontainers.image.url='https://matrix.org/docs/projects/server/synapse'
LABEL org.opencontainers.image.documentation='https://github.com/matrix-org/synapse/blob/master/docker/README.md'


@@ -24,7 +24,7 @@ ARG distro=""
# https://launchpad.net/~jyrki-pulliainen/+archive/ubuntu/dh-virtualenv, but
# it's not obviously easier to use that than to build our own.)
FROM ${distro} as builder
FROM docker.io/library/${distro} as builder
RUN apt-get update -qq -o Acquire::Languages=none
RUN env DEBIAN_FRONTEND=noninteractive apt-get install \
@@ -55,7 +55,7 @@ RUN cd /dh-virtualenv && DEB_BUILD_OPTIONS=nodoc dpkg-buildpackage -us -uc -b
###
### Stage 1
###
FROM ${distro}
FROM docker.io/library/${distro}
# Get the distro we want to pull from as a dynamic build variable
# (We need to define it in each build stage)


@@ -7,7 +7,7 @@ ARG FROM=matrixdotorg/synapse:$SYNAPSE_VERSION
# target image. For repeated rebuilds, this is much faster than apt installing
# each time.
FROM debian:bullseye-slim AS deps_base
FROM docker.io/library/debian:bullseye-slim AS deps_base
RUN \
--mount=type=cache,target=/var/cache/apt,sharing=locked \
--mount=type=cache,target=/var/lib/apt,sharing=locked \
@@ -21,7 +21,7 @@ FROM debian:bullseye-slim AS deps_base
# which makes it much easier to copy (but we need to make sure we use an image
# based on the same debian version as the synapse image, to make sure we get
# the expected version of libc.
FROM redis:6-bullseye AS redis_base
FROM docker.io/library/redis:6-bullseye AS redis_base
# now build the final image, based on the the regular Synapse docker image
FROM $FROM


@@ -73,7 +73,8 @@ The following environment variables are supported in `generate` mode:
will log sensitive information such as access tokens.
This should not be needed unless you are a developer attempting to debug something
particularly tricky.
* `SYNAPSE_LOG_TESTING`: if set, Synapse will log additional information useful
for testing.
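For example, the variable can be passed through when generating a config with the Docker image; in this sketch the server name, stats choice, and volume path are placeholders:

```sh
docker run -it --rm \
    -v /path/to/data:/data \
    -e SYNAPSE_SERVER_NAME=my.matrix.host \
    -e SYNAPSE_REPORT_STATS=no \
    -e SYNAPSE_LOG_TESTING=1 \
    matrixdotorg/synapse:latest generate
```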
## Postgres


@@ -7,6 +7,7 @@
# https://github.com/matrix-org/synapse/blob/develop/docker/README-testing.md#testing-with-postgresql-and-single-or-multi-process-synapse
ARG SYNAPSE_VERSION=latest
# This is an intermediate image, to be built locally (not pulled from a registry).
ARG FROM=matrixdotorg/synapse-workers:$SYNAPSE_VERSION
FROM $FROM
@@ -19,8 +20,8 @@ FROM $FROM
# the same debian version as Synapse's docker image (so the versions of the
# shared libraries match).
RUN adduser --system --uid 999 postgres --home /var/lib/postgresql
COPY --from=postgres:13-bullseye /usr/lib/postgresql /usr/lib/postgresql
COPY --from=postgres:13-bullseye /usr/share/postgresql /usr/share/postgresql
COPY --from=docker.io/library/postgres:13-bullseye /usr/lib/postgresql /usr/lib/postgresql
COPY --from=docker.io/library/postgres:13-bullseye /usr/share/postgresql /usr/share/postgresql
RUN mkdir /var/run/postgresql && chown postgres /var/run/postgresql
ENV PATH="${PATH}:/usr/lib/postgresql/13/bin"
ENV PGDATA=/var/lib/postgresql/data


@@ -6,10 +6,6 @@
worker_app: "{{ app }}"
worker_name: "{{ name }}"
# The replication listener on the main synapse process.
worker_replication_host: 127.0.0.1
worker_replication_http_port: 9093
worker_listeners:
- type: http
port: {{ port }}


@@ -49,17 +49,35 @@ handlers:
class: logging.StreamHandler
formatter: precise
{% if not SYNAPSE_LOG_SENSITIVE %}
{#
If SYNAPSE_LOG_SENSITIVE is unset, then override synapse.storage.SQL to INFO
so that DEBUG entries (containing sensitive information) are not emitted.
#}
loggers:
# This is just here so we can leave `loggers` in the config regardless of whether
# we configure other loggers below (avoid empty yaml dict error).
_placeholder:
level: "INFO"
{% if not SYNAPSE_LOG_SENSITIVE %}
{#
If SYNAPSE_LOG_SENSITIVE is unset, then override synapse.storage.SQL to INFO
so that DEBUG entries (containing sensitive information) are not emitted.
#}
synapse.storage.SQL:
# beware: increasing this to DEBUG will make synapse log sensitive
# information such as access tokens.
level: INFO
{% endif %}
{% endif %}
{% if SYNAPSE_LOG_TESTING %}
{#
If Synapse is under test, log a few more useful things for a developer
attempting to debug something particularly tricky.
With `synapse.visibility.filtered_event_debug`, it logs when events are (maybe
unexpectedly) filtered out of responses in tests. It's just nice to be able to
look at the CI log and figure out why an event isn't being returned.
#}
synapse.visibility.filtered_event_debug:
level: DEBUG
{% endif %}
root:
level: {{ SYNAPSE_LOG_LEVEL or "INFO" }}


@@ -40,6 +40,8 @@
# log level. INFO is the default.
# * SYNAPSE_LOG_SENSITIVE: If unset, SQL and SQL values won't be logged,
# regardless of the SYNAPSE_LOG_LEVEL setting.
# * SYNAPSE_LOG_TESTING: if set, Synapse will log additional information useful
# for testing.
#
# NOTE: According to Complement's ENTRYPOINT expectations for a homeserver image (as defined
# in the project's README), this script may be run multiple times, and functionality should
@@ -69,6 +71,9 @@ import yaml
from jinja2 import Environment, FileSystemLoader
MAIN_PROCESS_HTTP_LISTENER_PORT = 8080
MAIN_PROCESS_INSTANCE_NAME = "main"
MAIN_PROCESS_LOCALHOST_ADDRESS = "127.0.0.1"
MAIN_PROCESS_REPLICATION_PORT = 9093
# A simple name used as a placeholder in the WORKERS_CONFIG below. This will be replaced
# during processing with the name of the worker.
@@ -719,8 +724,8 @@ def generate_worker_files(
# shared config file.
listeners = [
{
"port": 9093,
"bind_address": "127.0.0.1",
"port": MAIN_PROCESS_REPLICATION_PORT,
"bind_address": MAIN_PROCESS_LOCALHOST_ADDRESS,
"type": "http",
"resources": [{"names": ["replication"]}],
}
@@ -870,6 +875,14 @@ def generate_worker_files(
workers_in_use = len(requested_worker_types) > 0
# If there are workers, add the main process to the instance_map too.
if workers_in_use:
instance_map = shared_config.setdefault("instance_map", {})
instance_map[MAIN_PROCESS_INSTANCE_NAME] = {
"host": MAIN_PROCESS_LOCALHOST_ADDRESS,
"port": MAIN_PROCESS_REPLICATION_PORT,
}
# Shared homeserver config
convert(
"/conf/shared.yaml.j2",
@@ -936,6 +949,7 @@ def generate_worker_log_config(
extra_log_template_args["SYNAPSE_LOG_SENSITIVE"] = environ.get(
"SYNAPSE_LOG_SENSITIVE"
)
extra_log_template_args["SYNAPSE_LOG_TESTING"] = environ.get("SYNAPSE_LOG_TESTING")
# Render and write the file
log_config_filepath = f"/conf/workers/{worker_name}.log.config"


@@ -10,7 +10,7 @@ ARG PYTHON_VERSION=3.9
###
# We hardcode the use of Debian bullseye here because this could change upstream
# and other Dockerfiles used for testing are expecting bullseye.
FROM docker.io/python:${PYTHON_VERSION}-slim-bullseye
FROM docker.io/library/python:${PYTHON_VERSION}-slim-bullseye
# Install Rust and other dependencies (stolen from normal Dockerfile)
# install the OS build deps


@@ -813,6 +813,33 @@ The following fields are returned in the JSON response body:
- `total` - Total number of user's devices.
### Create a device
Creates a new device for a specific `user_id` and `device_id`. Does nothing if the `device_id`
exists already.
The API is:
```
POST /_synapse/admin/v2/users/<user_id>/devices
{
"device_id": "QBUAZIFURK"
}
```
An empty JSON dict is returned.
**Parameters**
The following parameters should be set in the URL:
- `user_id` - fully qualified: for example, `@user:server.com`.
The following fields are required in the JSON request body:
- `device_id` - The device ID to create.
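For example, assuming an admin access token in `$TOKEN` and a homeserver reachable at `localhost:8008` (both placeholders):

```sh
curl --header "Authorization: Bearer $TOKEN" \
    --data '{"device_id": "QBUAZIFURK"}' \
    "http://localhost:8008/_synapse/admin/v2/users/@user:server.com/devices"
```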
### Delete multiple devices
Deletes the given devices for a specific `user_id`, and invalidates
any access token associated with them.


@@ -22,6 +22,9 @@ on Windows is not officially supported.
The code of Synapse is written in Python 3. To do pretty much anything, you'll need [a recent version of Python 3](https://www.python.org/downloads/). Your Python also needs support for [virtual environments](https://docs.python.org/3/library/venv.html). This is usually built-in, but some Linux distributions like Debian and Ubuntu split it out into its own package. Running `sudo apt install python3-venv` should be enough.
A recent version of the Rust compiler is needed to build the native modules. The
easiest way of installing the latest version is to use [rustup](https://rustup.rs/).
Synapse can connect to PostgreSQL via the [psycopg2](https://pypi.org/project/psycopg2/) Python library. Building this library from source requires access to PostgreSQL's C header files. On Debian or Ubuntu Linux, these can be installed with `sudo apt install libpq-dev`.
Synapse has an optional, improved user search with better Unicode support. For that you need the development package of `libicu`. On Debian or Ubuntu Linux, this can be installed with `sudo apt install libicu-dev`.
@@ -30,9 +33,6 @@ The source code of Synapse is hosted on GitHub. You will also need [a recent ver
For some tests, you will need [a recent version of Docker](https://docs.docker.com/get-docker/).
A recent version of the Rust compiler is needed to build the native modules. The
easiest way of installing the latest version is to use [rustup](https://rustup.rs/).
# 3. Get the source.
@@ -53,6 +53,11 @@ can find many good git tutorials on the web.
# 4. Install the dependencies
Before installing the Python dependencies, make sure you have installed a recent version
of Rust (see the "What do I need?" section above). The easiest way of installing the
latest version is to use [rustup](https://rustup.rs/).
Synapse uses the [poetry](https://python-poetry.org/) project to manage its dependencies
and development environment. Once you have installed Python 3 and added the
source, you should install `poetry`.
@@ -76,7 +81,8 @@ cd path/where/you/have/cloned/the/repository
poetry install --extras all
```
This will install the runtime and developer dependencies for the project.
This will install the runtime and developer dependencies for the project. Be sure to check
that the `poetry install` step completed cleanly.
## Running Synapse via poetry
@@ -84,14 +90,31 @@ To start a local instance of Synapse in the locked poetry environment, create a
```sh
cp docs/sample_config.yaml homeserver.yaml
cp docs/sample_log_config.yaml log_config.yaml
```
Now edit homeserver.yaml, and run Synapse with:
Now edit `homeserver.yaml`; things you might want to change include (a rough sketch of the result follows this list):
- Setting a `server_name`
- Adjusting paths to be correct for your system, such as pointing `log_config` at the log config you just copied
- Using a [PostgreSQL database instead of SQLite](https://matrix-org.github.io/synapse/latest/usage/configuration/config_documentation.html#database)
- Adding a [`registration_shared_secret`](https://matrix-org.github.io/synapse/latest/usage/configuration/config_documentation.html#registration_shared_secret) so you can use the [`register_new_matrix_user` command](https://matrix-org.github.io/synapse/latest/setup/installation.html#registering-a-user).
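As a rough illustration only (every value below is a placeholder, not a recommendation), the edited `homeserver.yaml` might gain entries such as:

```yaml
# Hypothetical excerpt of an edited homeserver.yaml; all values are placeholders.
server_name: "localhost"
log_config: "log_config.yaml"
registration_shared_secret: "<a long random string>"
```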
And then run Synapse with the following command:
```sh
poetry run python -m synapse.app.homeserver -c homeserver.yaml
```
If you get an error like the following:
```
importlib.metadata.PackageNotFoundError: matrix-synapse
```
this probably indicates that the `poetry install` step did not complete cleanly; go back,
resolve any issues, and re-run it until it succeeds.
# 5. Get in touch.
Join our developer community on Matrix: [#synapse-dev:matrix.org](https://matrix.to/#/#synapse-dev:matrix.org)!


@@ -260,15 +260,17 @@ doesn't require poetry. (It's what we use in CI too). However, you could try
## ...handle a Dependabot pull request?
Synapse uses Dependabot to keep the `poetry.lock` file up-to-date. When it
creates a pull request a GitHub Action will run to automatically create a changelog
file. Ensure that:
Synapse uses Dependabot to keep the `poetry.lock` and `Cargo.lock` files
up-to-date with the latest releases of our dependencies. The changelog check is
omitted for Dependabot PRs; the release script will include them in the
changelog.
When reviewing a dependabot PR, ensure that:
* the lockfile changes look reasonable;
* the upstream changelog file (linked in the description) doesn't include any
breaking changes;
* continuous integration passes (due to permissions, the GitHub Actions run on
the changelog commit will fail, look at the initial commit of the pull request);
* continuous integration passes.
In particular, any updates to the type hints (usually packages which start with `types-`)
should be safe to merge if linting passes.


@@ -46,6 +46,9 @@ instead.
If the authentication is unsuccessful, the module must return `None`.
Note that the user is not automatically registered, the `register_user(..)` method of
the [module API](writing_a_module.html) can be used to lazily create users.
If multiple modules register an auth checker for the same login type but with different
fields, Synapse will refuse to start.
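As a minimal sketch of how such a clash can arise (the login type, the `pin` field, and the module class are hypothetical), two modules registering checkers like the one below, but with different field tuples, would prevent startup:

```python
from typing import Optional, Tuple


class MyAuthChecker:
    def __init__(self, config: dict, api):
        # Checkers are keyed on (login type, tuple of fields the client must send).
        api.register_password_auth_provider_callbacks(
            auth_checkers={
                ("com.example.my_login", ("pin",)): self.check_my_login,
            }
        )

    async def check_my_login(
        self, username: str, login_type: str, login_dict: dict
    ) -> Optional[Tuple[str, None]]:
        # Placeholder check; return the matched (fully qualified) user ID on success.
        if login_dict.get("pin") == "1234":
            return (username, None)
        # Authentication unsuccessful.
        return None
```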


@@ -569,7 +569,7 @@ You should receive a response similar to the following. Make sure to save it.
{"client_id":"someclientid_123","client_secret":"someclientsecret_123","id":"12345","name":"my_synapse_app","redirect_uri":"https://[synapse_public_baseurl]/_synapse/client/oidc/callback","website":null,"vapid_key":"somerandomvapidkey_123"}
```
As the Synapse login mechanism needs an attribute to uniquely identify users, and Mastodon's endpoint does not return a `sub` property, an alternative `subject_claim` has to be set. Your Synapse configuration should include the following:
As the Synapse login mechanism needs an attribute to uniquely identify users, and Mastodon's endpoint does not return a `sub` property, an alternative `subject_template` has to be set. Your Synapse configuration should include the following:
```yaml
oidc_providers:
@@ -585,7 +585,9 @@ oidc_providers:
scopes: ["read"]
user_mapping_provider:
config:
subject_claim: "id"
subject_template: "{{ user.id }}"
localpart_template: "{{ user.username }}"
display_name_template: "{{ user.display_name }}"
```
Note that the fields `client_id` and `client_secret` are taken from the CURL response above.


@@ -30,12 +30,6 @@ minimal.
See [the TCP replication documentation](tcp_replication.md).
### The Slaved DataStore
There are read-only version of the synapse storage layer in
`synapse/replication/slave/storage` that use the response of the
replication API to invalidate their caches.
### The TCP Replication Module
Information about how the tcp replication module is structured, including how
the classes interact, can be found in


@@ -68,9 +68,7 @@ root:
# Write logs to the `buffer` handler, which will buffer them together in memory,
# then write them to a file.
#
# Replace "buffer" with "console" to log to stderr instead. (Note that you'll
# also need to update the configuration for the `twisted` logger above, in
# this case.)
# Replace "buffer" with "console" to log to stderr instead.
#
handlers: [buffer]


@@ -1,10 +1,6 @@
worker_app: synapse.app.generic_worker
worker_name: generic_worker1
# The replication listener on the main synapse process.
worker_replication_host: 127.0.0.1
worker_replication_http_port: 9093
worker_listeners:
- type: http
port: 8083


@@ -88,6 +88,106 @@ process, for example:
dpkg -i matrix-synapse-py3_1.3.0+stretch1_amd64.deb
```
# Upgrading to v1.85.0
## Application service registration with "user" property deprecation
Application services should ensure they call the `/register` endpoint with a
`username` property. The legacy `user` property is considered deprecated and
should no longer be included.
A future version of Synapse (v1.88.0 or later) will remove support for legacy
application service login.
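For example, an application service registration request should now look like the following sketch, using `username` rather than the legacy `user` key (the username is a placeholder, and the request must be authenticated with the application service's `as_token` as usual):

```
POST /_matrix/client/v3/register

{
    "type": "m.login.application_service",
    "username": "_example_bridge_alice"
}
```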
# Upgrading to v1.84.0
## Deprecation of `worker_replication_*` configuration settings
When using workers,
* `worker_replication_host`
* `worker_replication_http_port`
* `worker_replication_http_tls`
should now be removed from individual worker YAML configurations and the main process should instead be added to the `instance_map`
in the shared YAML configuration, using the name `main`.
The old `worker_replication_*` settings are now considered deprecated and are expected to be removed in Synapse v1.88.0.
### Example change
#### Before:
Shared YAML
```yaml
instance_map:
generic_worker1:
host: localhost
port: 5678
tls: false
```
Worker YAML
```yaml
worker_app: synapse.app.generic_worker
worker_name: generic_worker1
worker_replication_host: localhost
worker_replication_http_port: 3456
worker_replication_http_tls: false
worker_listeners:
- type: http
port: 1234
resources:
- names: [client, federation]
- type: http
port: 5678
resources:
- names: [replication]
worker_log_config: /etc/matrix-synapse/generic-worker-log.yaml
```
#### After:
Shared YAML
```yaml
instance_map:
main:
host: localhost
port: 3456
tls: false
generic_worker1:
host: localhost
port: 5678
tls: false
```
Worker YAML
```yaml
worker_app: synapse.app.generic_worker
worker_name: generic_worker1
worker_listeners:
- type: http
port: 1234
resources:
- names: [client, federation]
- type: http
port: 5678
resources:
- names: [replication]
worker_log_config: /etc/matrix-synapse/generic-worker-log.yaml
```
Notes:
* `tls` is optional but mirrors the functionality of `worker_replication_http_tls`
# Upgrading to v1.81.0
## Application service path & authentication deprecations


@@ -42,11 +42,6 @@ The following statistics are sent to the configured reporting endpoint:
| `daily_e2ee_messages` | int | The number of (state) events with the type `m.room.encrypted` seen in the last 24 hours. |
| `daily_sent_messages` | int | The number of (state) events sent by a local user with the type `m.room.message` seen in the last 24 hours. |
| `daily_sent_e2ee_messages` | int | The number of (state) events sent by a local user with the type `m.room.encrypted` seen in the last 24 hours. |
| `r30_users_all` | int | The number of 30 day retained users, defined as users who have created their accounts more than 30 days ago, where they were last seen at most 30 days ago and where those two timestamps are over 30 days apart. Includes clients that do not fit into the below r30 client types. |
| `r30_users_android` | int | The number of 30 day retained users, as defined above. Filtered only to clients with "Android" in the user agent string. |
| `r30_users_ios` | int | The number of 30 day retained users, as defined above. Filtered only to clients with "iOS" in the user agent string. |
| `r30_users_electron` | int | The number of 30 day retained users, as defined above. Filtered only to clients with "Electron" in the user agent string. |
| `r30_users_web` | int | The number of 30 day retained users, as defined above. Filtered only to clients with "Mozilla" or "Gecko" in the user agent string. |
| `r30v2_users_all` | int | The number of 30 day retained users, with a revised algorithm. Defined as users that appear more than once in the past 60 days, and have more than 30 days between the most and least recent appearances in the past 60 days. Includes clients that do not fit into the below r30 client types. |
| `r30v2_users_android` | int | The number of 30 day retained users, as defined above. Filtered only to clients with ("riot" or "element") and "android" (case-insensitive) in the user agent string. |
| `r30v2_users_ios` | int | The number of 30 day retained users, as defined above. Filtered only to clients with ("riot" or "element") and "ios" (case-insensitive) in the user agent string. |


@@ -2570,7 +2570,50 @@ Example configuration:
```yaml
nonrefreshable_access_token_lifetime: 24h
```
---
### `ui_auth`
The amount of time to allow a user-interactive authentication session to be active.
This defaults to 0, meaning the user is queried for their credentials
before every action, but this can be overridden to allow a single
validation to be re-used. This weakens the protections afforded by
the user-interactive authentication process, by allowing for multiple
(and potentially different) operations to use the same validation session.
This is ignored for potentially "dangerous" operations (including
deactivating an account, modifying an account password, adding a 3PID,
and minting additional login tokens).
Use the `session_timeout` sub-option here to change the time allowed for credential validation.
Example configuration:
```yaml
ui_auth:
session_timeout: "15s"
```
---
### `login_via_existing_session`
Matrix supports the ability of an existing session to mint a login token for
another client.
Synapse disables this by default as it has security ramifications -- a malicious
client could use the mechanism to spawn more than one session.
The duration of time the generated token is valid for can be configured with the
`token_timeout` sub-option.
User-interactive authentication is required when this is enabled unless the
`require_ui_auth` sub-option is set to `False`.
Example configuration:
```yaml
login_via_existing_session:
enabled: true
require_ui_auth: false
token_timeout: "5m"
```
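Once enabled, a client holding an existing access token can request a short-lived login token via the stable [MSC3882](https://github.com/matrix-org/matrix-spec-proposals/pull/3882) endpoint. A sketch of the exchange, with placeholder token values:

```
POST /_matrix/client/v1/login/get_token
{}

200 OK
{
    "login_token": "<opaque login token>",
    "expires_in_ms": 300000
}
```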
---
## Metrics
Config options related to metrics.
@@ -3415,28 +3458,6 @@ password_config:
require_uppercase: true
```
---
### `ui_auth`
The amount of time to allow a user-interactive authentication session to be active.
This defaults to 0, meaning the user is queried for their credentials
before every action, but this can be overridden to allow a single
validation to be re-used. This weakens the protections afforded by
the user-interactive authentication process, by allowing for multiple
(and potentially different) operations to use the same validation session.
This is ignored for potentially "dangerous" operations (including
deactivating an account, modifying an account password, and
adding a 3PID).
Use the `session_timeout` sub-option here to change the time allowed for credential validation.
Example configuration:
```yaml
ui_auth:
session_timeout: "15s"
```
---
## Push
Configuration settings related to push notifications
@@ -3884,15 +3905,20 @@ federation_sender_instances:
### `instance_map`
When using workers this should be a map from [`worker_name`](#worker_name) to the
HTTP replication listener of the worker, if configured.
HTTP replication listener of the worker, if configured, and to the main process.
Each worker declared under [`stream_writers`](../../workers.md#stream-writers) needs
a HTTP replication listener, and that listener should be included in the `instance_map`.
(The main process also needs an HTTP replication listener, but it should not be
listed in the `instance_map`.)
The main process also needs an entry in the `instance_map`, and it should be listed under
`main` **if even one other worker exists**. Ensure the port matches what is declared
inside the `listener` block for a `replication` listener.
Example configuration:
```yaml
instance_map:
main:
host: localhost
port: 8030
worker1:
host: localhost
port: 8034
@@ -3974,11 +4000,22 @@ This setting has the following sub-options:
* `enabled`: whether to use Redis support. Defaults to false.
* `host` and `port`: Optional host and port to use to connect to redis. Defaults to
localhost and 6379
* `path`: The full path to a local Unix socket file. **If this is used, `host` and
`port` are ignored.** Defaults to `/tmp/redis.sock`
* `password`: Optional password if configured on the Redis instance.
* `dbid`: Optional redis dbid if needs to connect to specific redis logical db.
* `use_tls`: Whether to use tls connection. Defaults to false.
* `certificate_file`: Optional path to the certificate file
* `private_key_file`: Optional path to the private key file
* `ca_file`: Optional path to the CA certificate file. Use either this option or `ca_path` below:
* `ca_path`: Optional path to the folder containing the CA certificate file
_Added in Synapse 1.78.0._
_Changed in Synapse 1.84.0: Added use\_tls, certificate\_file, private\_key\_file, ca\_file and ca\_path attributes_
_Changed in Synapse 1.85.0: Added path option to use a local Unix socket_
Example configuration:
```yaml
redis:
@@ -3987,6 +4024,10 @@ redis:
port: 6379
password: <secret_password>
dbid: <dbid>
#use_tls: True
#certificate_file: <path_to_the_certificate_file>
#private_key_file: <path_to_the_private_key_file>
#ca_file: <path_to_the_ca_certificate_file>
```
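When connecting via a local Unix socket instead (added in Synapse 1.85.0), the configuration would look like the sketch below; the socket path is an assumption, and `host` and `port` are ignored when `path` is set:

```yaml
redis:
  enabled: true
  path: /run/redis/redis.sock
  password: <secret_password>
```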
---
## Individual worker configuration
@@ -4024,6 +4065,7 @@ worker_name: generic_worker1
```
---
### `worker_replication_host`
*Deprecated as of version 1.84.0. Place `host` under `main` entry on the [`instance_map`](#instance_map) in your shared yaml configuration instead.*
The HTTP replication endpoint that it should talk to on the main Synapse process.
The main Synapse process defines this with a `replication` resource in
@@ -4035,6 +4077,7 @@ worker_replication_host: 127.0.0.1
```
---
### `worker_replication_http_port`
*Deprecated as of version 1.84.0. Place `port` under `main` entry on the [`instance_map`](#instance_map) in your shared yaml configuration instead.*
The HTTP replication port that it should talk to on the main Synapse process.
The main Synapse process defines this with a `replication` resource in
@@ -4046,6 +4089,7 @@ worker_replication_http_port: 9093
```
---
### `worker_replication_http_tls`
*Deprecated as of version 1.84.0. Place `tls` under `main` entry on the [`instance_map`](#instance_map) in your shared yaml configuration instead.*
Whether TLS should be used for talking to the HTTP replication port on the main
Synapse process.
@@ -4071,9 +4115,9 @@ A worker can handle HTTP requests. To do so, a `worker_listeners` option
must be declared, in the same way as the [`listeners` option](#listeners)
in the shared config.
Workers declared in [`stream_writers`](#stream_writers) will need to include a
`replication` listener here, in order to accept internal HTTP requests from
other workers.
Workers declared in [`stream_writers`](#stream_writers) and [`instance_map`](#instance_map)
will need to include a `replication` listener here, in order to accept internal HTTP
requests from other workers.
Example configuration:
```yaml


@@ -87,12 +87,18 @@ shared configuration file.
### Shared configuration
Normally, only a couple of changes are needed to make an existing configuration
file suitable for use with workers. First, you need to enable an
Normally, only a few changes are needed to make an existing configuration
file suitable for use with workers:
* First, you need to enable an
["HTTP replication listener"](usage/configuration/config_documentation.md#listeners)
for the main process; and secondly, you need to enable
[redis-based replication](usage/configuration/config_documentation.md#redis).
Optionally, a [shared secret](usage/configuration/config_documentation.md#worker_replication_secret)
for the main process
* Secondly, you need to enable
[redis-based replication](usage/configuration/config_documentation.md#redis)
* You will need to add an [`instance_map`](usage/configuration/config_documentation.md#instance_map)
with the `main` process defined, as well as the relevant connection information from
its HTTP `replication` listener (defined in step 1 above). Note that the `host` defined
is the address at which workers can reach the `main` process, which is not necessarily the same as the address it binds to.
* Optionally, a [shared secret](usage/configuration/config_documentation.md#worker_replication_secret)
can be used to authenticate HTTP traffic between workers. For example:
```yaml
@@ -111,6 +117,11 @@ worker_replication_secret: ""
redis:
enabled: true
instance_map:
main:
host: 'localhost'
port: 9093
```
See the [configuration manual](usage/configuration/config_documentation.md)
@@ -130,13 +141,13 @@ In the config file for each worker, you must specify:
* The type of worker ([`worker_app`](usage/configuration/config_documentation.md#worker_app)).
The currently available worker applications are listed [below](#available-worker-applications).
* A unique name for the worker ([`worker_name`](usage/configuration/config_documentation.md#worker_name)).
* The HTTP replication endpoint that it should talk to on the main synapse process
([`worker_replication_host`](usage/configuration/config_documentation.md#worker_replication_host) and
[`worker_replication_http_port`](usage/configuration/config_documentation.md#worker_replication_http_port)).
* If handling HTTP requests, a [`worker_listeners`](usage/configuration/config_documentation.md#worker_listeners) option
with an `http` listener.
* **Synapse 1.72 and older:** if handling the `^/_matrix/client/v3/keys/upload` endpoint, the HTTP URI for
the main process (`worker_main_http_uri`). This config option is no longer required and is ignored when running Synapse 1.73 and newer.
* **Synapse 1.83 and older:** The HTTP replication endpoint that the worker should talk to on the main Synapse process
([`worker_replication_host`](usage/configuration/config_documentation.md#worker_replication_host) and
[`worker_replication_http_port`](usage/configuration/config_documentation.md#worker_replication_http_port)). If using Synapse 1.84 and newer, these are not needed if `main` is defined on the [shared configuration](#shared-configuration) `instance_map`.
For example:
@@ -417,11 +428,14 @@ effects of bursts of events from that bridge on events sent by normal users.
Additionally, the writing of specific streams (such as events) can be moved off
of the main process to a particular worker.
To enable this, the worker must have a
[HTTP `replication` listener](usage/configuration/config_documentation.md#listeners) configured,
have a [`worker_name`](usage/configuration/config_documentation.md#worker_name)
To enable this, the worker must have:
* An [HTTP `replication` listener](usage/configuration/config_documentation.md#listeners) configured,
* Have a [`worker_name`](usage/configuration/config_documentation.md#worker_name)
and be listed in the [`instance_map`](usage/configuration/config_documentation.md#instance_map)
config. The same worker can handle multiple streams, but unless otherwise documented,
config.
* Have the main process declared in the [`instance_map`](usage/configuration/config_documentation.md#instance_map) as well.
Note: The same worker can handle multiple streams, but unless otherwise documented,
each stream can only have a single writer.
For example, to move event persistence off to a dedicated worker, the shared
@@ -429,6 +443,9 @@ configuration would include:
```yaml
instance_map:
main:
host: localhost
port: 8030
event_persister1:
host: localhost
port: 8034


@@ -1,35 +1,30 @@
# A nix flake that sets up a complete Synapse development environment. Dependencies
# A Nix flake that sets up a complete Synapse development environment. Dependencies
# for the SyTest (https://github.com/matrix-org/sytest) and Complement
# (https://github.com/matrix-org/complement) Matrix homeserver test suites are also
# installed automatically.
#
# You must have already installed nix (https://nixos.org) on your system to use this.
# nix can be installed on Linux or MacOS; NixOS is not required. Windows is not
# directly supported, but nix can be installed inside of WSL2 or even Docker
# You must have already installed Nix (https://nixos.org) on your system to use this.
# Nix can be installed on Linux or MacOS; NixOS is not required. Windows is not
# directly supported, but Nix can be installed inside of WSL2 or even Docker
# containers. Please refer to https://nixos.org/download for details.
#
# You must also enable support for flakes in Nix. See the following for how to
# do so permanently: https://nixos.wiki/wiki/Flakes#Enable_flakes
#
# Be warned: you'll need over 3.75 GB of free space to download all the dependencies.
#
# Usage:
#
# With nix installed, navigate to the directory containing this flake and run
# With Nix installed, navigate to the directory containing this flake and run
# `nix develop --impure`. The `--impure` is necessary in order to store state
# locally from "services", such as PostgreSQL and Redis.
#
# You should now be dropped into a new shell with all programs and dependencies
# available to you!
#
# You can start up pre-configured, local PostgreSQL and Redis instances by
# You can start up pre-configured local Synapse, PostgreSQL and Redis instances by
# running: `devenv up`. To stop them, use Ctrl-C.
#
# A PostgreSQL database called 'synapse' will be set up for you, along with
# a PostgreSQL user named 'synapse_user'.
# The 'host' can be found by running `echo $PGHOST` with the development
# shell activated. Use these values to configure your Synapse to connect
# to the local PostgreSQL database. You do not need to specify a password.
# https://matrix-org.github.io/synapse/latest/postgres
#
# All state (the venv, postgres and redis data and config) are stored in
# .devenv/state. Deleting a file from here and then re-entering the shell
# will recreate these files from scratch.
@@ -66,7 +61,7 @@
let
pkgs = nixpkgs.legacyPackages.${system};
in {
# Everything is configured via devenv - a nix module for creating declarative
# Everything is configured via devenv - a Nix module for creating declarative
# developer environments. See https://devenv.sh/reference/options/ for a list
# of all possible options.
default = devenv.lib.mkShell {
@@ -100,6 +95,10 @@
# For building the Synapse documentation website.
mdbook
# For releasing Synapse
debian-devscripts # (`dch` for manipulating the Debian changelog)
libnotify # (the release script uses `notify-send` to tell you when CI jobs are done)
];
# Install Python and manage a virtualenv with Poetry.
@@ -153,11 +152,39 @@
# Redis is needed in order to run Synapse in worker mode.
services.redis.enable = true;
# Configure and start Synapse. Before starting Synapse, this shell code:
# * generates a default homeserver.yaml config file if one does not exist, and
# * ensures a directory containing two additional homeserver config files exists;
# one to configure using the development environment's PostgreSQL as the
# database backend and another for enabling Redis support.
process.before = ''
python -m synapse.app.homeserver -c homeserver.yaml --generate-config --server-name=synapse.dev --report-stats=no
mkdir -p homeserver-config-overrides.d
cat > homeserver-config-overrides.d/database.yaml << EOF
## Do not edit this file. This file is generated by flake.nix
database:
name: psycopg2
args:
user: synapse_user
database: synapse
host: $PGHOST
cp_min: 5
cp_max: 10
EOF
cat > homeserver-config-overrides.d/redis.yaml << EOF
## Do not edit this file. This file is generated by flake.nix
redis:
enabled: true
EOF
'';
# Start synapse when `devenv up` is run.
processes.synapse.exec = "poetry run python -m synapse.app.homeserver -c homeserver.yaml --config-directory homeserver-config-overrides.d";
# Define the perl modules we require to run SyTest.
#
# This list was compiled by cross-referencing https://metacpan.org/
# with the modules defined in './cpanfile' and then finding the
# corresponding nix packages on https://search.nixos.org/packages.
# corresponding Nix packages on https://search.nixos.org/packages.
#
# This was done until `./install-deps.pl --dryrun` produced no output.
env.PERL5LIB = "${with pkgs.perl536Packages; makePerlPath [


@@ -2,17 +2,32 @@
namespace_packages = True
plugins = pydantic.mypy, mypy_zope:plugin, scripts-dev/mypy_synapse_plugin.py
follow_imports = normal
check_untyped_defs = True
show_error_codes = True
show_traceback = True
mypy_path = stubs
warn_unreachable = True
warn_unused_ignores = True
local_partial_types = True
no_implicit_optional = True
# Strict checks, see mypy --help
warn_unused_configs = True
# disallow_any_generics = True
disallow_subclassing_any = True
# disallow_untyped_calls = True
disallow_untyped_defs = True
strict_equality = True
disallow_incomplete_defs = True
# check_untyped_defs = True
# disallow_untyped_decorators = True
warn_redundant_casts = True
warn_unused_ignores = True
# warn_return_any = True
# no_implicit_reexport = True
strict_equality = True
strict_concatenate = True
# Run mypy type checking with the minimum supported Python version to catch new usage
# that isn't backwards-compatible (types, overloads, etc).
python_version = 3.8
files =
docker/,
@@ -28,9 +43,7 @@ warn_unused_ignores = False
[mypy-synapse.util.caches.treecache]
disallow_untyped_defs = False
[mypy-tests.util.caches.test_descriptors]
disallow_untyped_defs = False
disallow_incomplete_defs = False
;; Dependencies without annotations
;; Before ignoring a module, check to see if type stubs are available.
@@ -40,18 +53,18 @@ disallow_untyped_defs = False
;; which we can pull in as a dev dependency by adding to `pyproject.toml`'s
;; `[tool.poetry.dev-dependencies]` list.
# https://github.com/lepture/authlib/issues/460
[mypy-authlib.*]
ignore_missing_imports = True
[mypy-ijson.*]
ignore_missing_imports = True
[mypy-lxml]
ignore_missing_imports = True
# https://github.com/msgpack/msgpack-python/issues/448
[mypy-msgpack]
ignore_missing_imports = True
# https://github.com/wolever/parameterized/issues/143
[mypy-parameterized.*]
ignore_missing_imports = True
@@ -73,6 +86,7 @@ ignore_missing_imports = True
[mypy-srvlookup.*]
ignore_missing_imports = True
# https://github.com/twisted/treq/pull/366
[mypy-treq.*]
ignore_missing_imports = True

poetry.lock generated

@@ -1,4 +1,4 @@
# This file is automatically @generated by Poetry and should not be changed by hand.
# This file is automatically @generated by Poetry 1.4.2 and should not be changed by hand.
[[package]]
name = "alabaster"
@@ -580,20 +580,20 @@ dev = ["Sphinx", "coverage", "flake8", "lxml", "lxml-stubs", "memory-profiler",
[[package]]
name = "furo"
version = "2023.3.27"
version = "2023.5.20"
description = "A clean customisable Sphinx documentation theme."
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
{file = "furo-2023.3.27-py3-none-any.whl", hash = "sha256:4ab2be254a2d5e52792d0ca793a12c35582dd09897228a6dd47885dabd5c9521"},
{file = "furo-2023.3.27.tar.gz", hash = "sha256:b99e7867a5cc833b2b34d7230631dd6558c7a29f93071fdbb5709634bb33c5a5"},
{file = "furo-2023.5.20-py3-none-any.whl", hash = "sha256:594a8436ddfe0c071f3a9e9a209c314a219d8341f3f1af33fdf7c69544fab9e6"},
{file = "furo-2023.5.20.tar.gz", hash = "sha256:40e09fa17c6f4b22419d122e933089226dcdb59747b5b6c79363089827dea16f"},
]
[package.dependencies]
beautifulsoup4 = "*"
pygments = ">=2.7"
sphinx = ">=5.0,<7.0"
sphinx = ">=6.0,<8.0"
sphinx-basic-ng = "*"
[[package]]
@@ -1215,6 +1215,21 @@ html5 = ["html5lib"]
htmlsoup = ["BeautifulSoup4"]
source = ["Cython (>=0.29.7)"]
[[package]]
name = "lxml-stubs"
version = "0.4.0"
description = "Type annotations for the lxml package"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "lxml-stubs-0.4.0.tar.gz", hash = "sha256:184877b42127256abc2b932ba8bd0ab5ea80bd0b0fee618d16daa40e0b71abee"},
{file = "lxml_stubs-0.4.0-py3-none-any.whl", hash = "sha256:3b381e9e82397c64ea3cc4d6f79d1255d015f7b114806d4826218805c10ec003"},
]
[package.extras]
test = ["coverage[toml] (==5.2)", "pytest (>=6.0.0)", "pytest-mypy-plugins (==1.9.3)"]
[[package]]
name = "markdown-it-py"
version = "2.2.0"
@@ -1632,14 +1647,14 @@ files = [
[[package]]
name = "phonenumbers"
version = "8.13.7"
version = "8.13.11"
description = "Python version of Google's common library for parsing, formatting, storing and validating international phone numbers."
category = "main"
optional = false
python-versions = "*"
files = [
{file = "phonenumbers-8.13.7-py2.py3-none-any.whl", hash = "sha256:d3e3555b38c89b121f5b2e917847003bdd07027569d758d5f40156c01aeac089"},
{file = "phonenumbers-8.13.7.tar.gz", hash = "sha256:253bb0e01250d21a11f2b42b3e6e161b7f6cb2ac440e2e2a95c1da71d221ee1a"},
{file = "phonenumbers-8.13.11-py2.py3-none-any.whl", hash = "sha256:107469114fd297258a485bdf8238d0522cb392db1257faf2bf23384ecbdb0e8a"},
{file = "phonenumbers-8.13.11.tar.gz", hash = "sha256:3e3274d88cab3609b55ff5b93417075dbca2d13064f103fbf562e0ea1dda0f9a"},
]
[[package]]
@@ -1781,14 +1796,14 @@ test = ["appdirs (==1.4.4)", "covdefaults (>=2.2.2)", "pytest (>=7.2.1)", "pytes
[[package]]
name = "prometheus-client"
version = "0.16.0"
version = "0.17.0"
description = "Python client for the Prometheus monitoring system."
category = "main"
optional = false
python-versions = ">=3.6"
files = [
{file = "prometheus_client-0.16.0-py3-none-any.whl", hash = "sha256:0836af6eb2c8f4fed712b2f279f6c0a8bbab29f9f4aa15276b91c7cb0d1616ab"},
{file = "prometheus_client-0.16.0.tar.gz", hash = "sha256:a03e35b359f14dd1630898543e2120addfdeacd1a6069c1367ae90fd93ad3f48"},
{file = "prometheus_client-0.17.0-py3-none-any.whl", hash = "sha256:a77b708cf083f4d1a3fb3ce5c95b4afa32b9c521ae363354a4a910204ea095ce"},
{file = "prometheus_client-0.17.0.tar.gz", hash = "sha256:9c3b26f1535945e85b8934fb374678d263137b78ef85f305b1156c7c881cd11b"},
]
[package.extras]
@@ -1887,48 +1902,48 @@ files = [
[[package]]
name = "pydantic"
version = "1.10.7"
version = "1.10.8"
description = "Data validation and settings management using python type hints"
category = "main"
optional = false
python-versions = ">=3.7"
files = [
{file = "pydantic-1.10.7-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e79e999e539872e903767c417c897e729e015872040e56b96e67968c3b918b2d"},
{file = "pydantic-1.10.7-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:01aea3a42c13f2602b7ecbbea484a98169fb568ebd9e247593ea05f01b884b2e"},
{file = "pydantic-1.10.7-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:516f1ed9bc2406a0467dd777afc636c7091d71f214d5e413d64fef45174cfc7a"},
{file = "pydantic-1.10.7-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ae150a63564929c675d7f2303008d88426a0add46efd76c3fc797cd71cb1b46f"},
{file = "pydantic-1.10.7-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:ecbbc51391248116c0a055899e6c3e7ffbb11fb5e2a4cd6f2d0b93272118a209"},
{file = "pydantic-1.10.7-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:f4a2b50e2b03d5776e7f21af73e2070e1b5c0d0df255a827e7c632962f8315af"},
{file = "pydantic-1.10.7-cp310-cp310-win_amd64.whl", hash = "sha256:a7cd2251439988b413cb0a985c4ed82b6c6aac382dbaff53ae03c4b23a70e80a"},
{file = "pydantic-1.10.7-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:68792151e174a4aa9e9fc1b4e653e65a354a2fa0fed169f7b3d09902ad2cb6f1"},
{file = "pydantic-1.10.7-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:dfe2507b8ef209da71b6fb5f4e597b50c5a34b78d7e857c4f8f3115effaef5fe"},
{file = "pydantic-1.10.7-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10a86d8c8db68086f1e30a530f7d5f83eb0685e632e411dbbcf2d5c0150e8dcd"},
{file = "pydantic-1.10.7-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d75ae19d2a3dbb146b6f324031c24f8a3f52ff5d6a9f22f0683694b3afcb16fb"},
{file = "pydantic-1.10.7-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:464855a7ff7f2cc2cf537ecc421291b9132aa9c79aef44e917ad711b4a93163b"},
{file = "pydantic-1.10.7-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:193924c563fae6ddcb71d3f06fa153866423ac1b793a47936656e806b64e24ca"},
{file = "pydantic-1.10.7-cp311-cp311-win_amd64.whl", hash = "sha256:b4a849d10f211389502059c33332e91327bc154acc1845f375a99eca3afa802d"},
{file = "pydantic-1.10.7-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:cc1dde4e50a5fc1336ee0581c1612215bc64ed6d28d2c7c6f25d2fe3e7c3e918"},
{file = "pydantic-1.10.7-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e0cfe895a504c060e5d36b287ee696e2fdad02d89e0d895f83037245218a87fe"},
{file = "pydantic-1.10.7-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:670bb4683ad1e48b0ecb06f0cfe2178dcf74ff27921cdf1606e527d2617a81ee"},
{file = "pydantic-1.10.7-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:950ce33857841f9a337ce07ddf46bc84e1c4946d2a3bba18f8280297157a3fd1"},
{file = "pydantic-1.10.7-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:c15582f9055fbc1bfe50266a19771bbbef33dd28c45e78afbe1996fd70966c2a"},
{file = "pydantic-1.10.7-cp37-cp37m-win_amd64.whl", hash = "sha256:82dffb306dd20bd5268fd6379bc4bfe75242a9c2b79fec58e1041fbbdb1f7914"},
{file = "pydantic-1.10.7-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:8c7f51861d73e8b9ddcb9916ae7ac39fb52761d9ea0df41128e81e2ba42886cd"},
{file = "pydantic-1.10.7-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:6434b49c0b03a51021ade5c4daa7d70c98f7a79e95b551201fff682fc1661245"},
{file = "pydantic-1.10.7-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:64d34ab766fa056df49013bb6e79921a0265204c071984e75a09cbceacbbdd5d"},
{file = "pydantic-1.10.7-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:701daea9ffe9d26f97b52f1d157e0d4121644f0fcf80b443248434958fd03dc3"},
{file = "pydantic-1.10.7-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:cf135c46099ff3f919d2150a948ce94b9ce545598ef2c6c7bf55dca98a304b52"},
{file = "pydantic-1.10.7-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:b0f85904f73161817b80781cc150f8b906d521fa11e3cdabae19a581c3606209"},
{file = "pydantic-1.10.7-cp38-cp38-win_amd64.whl", hash = "sha256:9f6f0fd68d73257ad6685419478c5aece46432f4bdd8d32c7345f1986496171e"},
{file = "pydantic-1.10.7-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c230c0d8a322276d6e7b88c3f7ce885f9ed16e0910354510e0bae84d54991143"},
{file = "pydantic-1.10.7-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:976cae77ba6a49d80f461fd8bba183ff7ba79f44aa5cfa82f1346b5626542f8e"},
{file = "pydantic-1.10.7-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d45fc99d64af9aaf7e308054a0067fdcd87ffe974f2442312372dfa66e1001d"},
{file = "pydantic-1.10.7-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d2a5ebb48958754d386195fe9e9c5106f11275867051bf017a8059410e9abf1f"},
{file = "pydantic-1.10.7-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:abfb7d4a7cd5cc4e1d1887c43503a7c5dd608eadf8bc615413fc498d3e4645cd"},
{file = "pydantic-1.10.7-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:80b1fab4deb08a8292d15e43a6edccdffa5377a36a4597bb545b93e79c5ff0a5"},
{file = "pydantic-1.10.7-cp39-cp39-win_amd64.whl", hash = "sha256:d71e69699498b020ea198468e2480a2f1e7433e32a3a99760058c6520e2bea7e"},
{file = "pydantic-1.10.7-py3-none-any.whl", hash = "sha256:0cd181f1d0b1d00e2b705f1bf1ac7799a2d938cce3376b8007df62b29be3c2c6"},
{file = "pydantic-1.10.7.tar.gz", hash = "sha256:cfc83c0678b6ba51b0532bea66860617c4cd4251ecf76e9846fa5a9f3454e97e"},
{file = "pydantic-1.10.8-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:1243d28e9b05003a89d72e7915fdb26ffd1d39bdd39b00b7dbe4afae4b557f9d"},
{file = "pydantic-1.10.8-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c0ab53b609c11dfc0c060d94335993cc2b95b2150e25583bec37a49b2d6c6c3f"},
{file = "pydantic-1.10.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f9613fadad06b4f3bc5db2653ce2f22e0de84a7c6c293909b48f6ed37b83c61f"},
{file = "pydantic-1.10.8-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:df7800cb1984d8f6e249351139667a8c50a379009271ee6236138a22a0c0f319"},
{file = "pydantic-1.10.8-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:0c6fafa0965b539d7aab0a673a046466d23b86e4b0e8019d25fd53f4df62c277"},
{file = "pydantic-1.10.8-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:e82d4566fcd527eae8b244fa952d99f2ca3172b7e97add0b43e2d97ee77f81ab"},
{file = "pydantic-1.10.8-cp310-cp310-win_amd64.whl", hash = "sha256:ab523c31e22943713d80d8d342d23b6f6ac4b792a1e54064a8d0cf78fd64e800"},
{file = "pydantic-1.10.8-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:666bdf6066bf6dbc107b30d034615d2627e2121506c555f73f90b54a463d1f33"},
{file = "pydantic-1.10.8-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:35db5301b82e8661fa9c505c800d0990bc14e9f36f98932bb1d248c0ac5cada5"},
{file = "pydantic-1.10.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f90c1e29f447557e9e26afb1c4dbf8768a10cc676e3781b6a577841ade126b85"},
{file = "pydantic-1.10.8-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:93e766b4a8226e0708ef243e843105bf124e21331694367f95f4e3b4a92bbb3f"},
{file = "pydantic-1.10.8-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:88f195f582851e8db960b4a94c3e3ad25692c1c1539e2552f3df7a9e972ef60e"},
{file = "pydantic-1.10.8-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:34d327c81e68a1ecb52fe9c8d50c8a9b3e90d3c8ad991bfc8f953fb477d42fb4"},
{file = "pydantic-1.10.8-cp311-cp311-win_amd64.whl", hash = "sha256:d532bf00f381bd6bc62cabc7d1372096b75a33bc197a312b03f5838b4fb84edd"},
{file = "pydantic-1.10.8-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:7d5b8641c24886d764a74ec541d2fc2c7fb19f6da2a4001e6d580ba4a38f7878"},
{file = "pydantic-1.10.8-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7b1f6cb446470b7ddf86c2e57cd119a24959af2b01e552f60705910663af09a4"},
{file = "pydantic-1.10.8-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c33b60054b2136aef8cf190cd4c52a3daa20b2263917c49adad20eaf381e823b"},
{file = "pydantic-1.10.8-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:1952526ba40b220b912cdc43c1c32bcf4a58e3f192fa313ee665916b26befb68"},
{file = "pydantic-1.10.8-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:bb14388ec45a7a0dc429e87def6396f9e73c8c77818c927b6a60706603d5f2ea"},
{file = "pydantic-1.10.8-cp37-cp37m-win_amd64.whl", hash = "sha256:16f8c3e33af1e9bb16c7a91fc7d5fa9fe27298e9f299cff6cb744d89d573d62c"},
{file = "pydantic-1.10.8-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1ced8375969673929809d7f36ad322934c35de4af3b5e5b09ec967c21f9f7887"},
{file = "pydantic-1.10.8-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:93e6bcfccbd831894a6a434b0aeb1947f9e70b7468f274154d03d71fabb1d7c6"},
{file = "pydantic-1.10.8-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:191ba419b605f897ede9892f6c56fb182f40a15d309ef0142212200a10af4c18"},
{file = "pydantic-1.10.8-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:052d8654cb65174d6f9490cc9b9a200083a82cf5c3c5d3985db765757eb3b375"},
{file = "pydantic-1.10.8-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:ceb6a23bf1ba4b837d0cfe378329ad3f351b5897c8d4914ce95b85fba96da5a1"},
{file = "pydantic-1.10.8-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:6f2e754d5566f050954727c77f094e01793bcb5725b663bf628fa6743a5a9108"},
{file = "pydantic-1.10.8-cp38-cp38-win_amd64.whl", hash = "sha256:6a82d6cda82258efca32b40040228ecf43a548671cb174a1e81477195ed3ed56"},
{file = "pydantic-1.10.8-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3e59417ba8a17265e632af99cc5f35ec309de5980c440c255ab1ca3ae96a3e0e"},
{file = "pydantic-1.10.8-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:84d80219c3f8d4cad44575e18404099c76851bc924ce5ab1c4c8bb5e2a2227d0"},
{file = "pydantic-1.10.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2e4148e635994d57d834be1182a44bdb07dd867fa3c2d1b37002000646cc5459"},
{file = "pydantic-1.10.8-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:12f7b0bf8553e310e530e9f3a2f5734c68699f42218bf3568ef49cd9b0e44df4"},
{file = "pydantic-1.10.8-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:42aa0c4b5c3025483240a25b09f3c09a189481ddda2ea3a831a9d25f444e03c1"},
{file = "pydantic-1.10.8-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:17aef11cc1b997f9d574b91909fed40761e13fac438d72b81f902226a69dac01"},
{file = "pydantic-1.10.8-cp39-cp39-win_amd64.whl", hash = "sha256:66a703d1983c675a6e0fed8953b0971c44dba48a929a2000a493c3772eb61a5a"},
{file = "pydantic-1.10.8-py3-none-any.whl", hash = "sha256:7456eb22ed9aaa24ff3e7b4757da20d9e5ce2a81018c1b3ebd81a0b88a18f3b2"},
{file = "pydantic-1.10.8.tar.gz", hash = "sha256:1410275520dfa70effadf4c21811d755e7ef9bb1f1d077a21958153a92c8d9ca"},
]
[package.dependencies]
@@ -1940,14 +1955,14 @@ email = ["email-validator (>=1.0.3)"]
[[package]]
name = "pygithub"
version = "1.58.1"
version = "1.58.2"
description = "Use the full Github API v3"
category = "dev"
optional = false
python-versions = ">=3.7"
files = [
{file = "PyGithub-1.58.1-py3-none-any.whl", hash = "sha256:4e7fe9c3ec30d5fde5b4fbb97f18821c9dbf372bf6df337fe66f6689a65e0a83"},
{file = "PyGithub-1.58.1.tar.gz", hash = "sha256:7d528b4ad92bc13122129fafd444ce3d04c47d2d801f6446b6e6ee2d410235b3"},
{file = "PyGithub-1.58.2-py3-none-any.whl", hash = "sha256:f435884af617c6debaa76cbc355372d1027445a56fbc39972a3b9ed4968badc8"},
{file = "PyGithub-1.58.2.tar.gz", hash = "sha256:1e6b1b7afe31f75151fb81f7ab6b984a7188a852bdb123dbb9ae90023c3ce60f"},
]
[package.dependencies]
@@ -2251,21 +2266,21 @@ md = ["cmarkgfm (>=0.8.0)"]
[[package]]
name = "requests"
version = "2.28.2"
version = "2.31.0"
description = "Python HTTP for Humans."
category = "main"
optional = false
python-versions = ">=3.7, <4"
python-versions = ">=3.7"
files = [
{file = "requests-2.28.2-py3-none-any.whl", hash = "sha256:64299f4909223da747622c030b781c0d7811e359c37124b4bd368fb8c6518baa"},
{file = "requests-2.28.2.tar.gz", hash = "sha256:98b1b2782e3c6c4904938b84c0eb932721069dfdb9134313beff7c83c2df24bf"},
{file = "requests-2.31.0-py3-none-any.whl", hash = "sha256:58cd2187c01e70e6e26505bca751777aa9f2ee0b7f4300988b709f44e013003f"},
{file = "requests-2.31.0.tar.gz", hash = "sha256:942c5a758f98d790eaed1a29cb6eefc7ffb0d1cf7af05c3d2791656dbd6ad1e1"},
]
[package.dependencies]
certifi = ">=2017.4.17"
charset-normalizer = ">=2,<4"
idna = ">=2.5,<4"
urllib3 = ">=1.21.1,<1.27"
urllib3 = ">=1.21.1,<3"
[package.extras]
socks = ["PySocks (>=1.5.6,!=1.5.7)"]
@@ -2565,21 +2580,21 @@ files = [
[[package]]
name = "sphinx"
version = "6.1.3"
version = "6.2.1"
description = "Python documentation generator"
category = "dev"
optional = false
python-versions = ">=3.8"
files = [
{file = "Sphinx-6.1.3.tar.gz", hash = "sha256:0dac3b698538ffef41716cf97ba26c1c7788dba73ce6f150c1ff5b4720786dd2"},
{file = "sphinx-6.1.3-py3-none-any.whl", hash = "sha256:807d1cb3d6be87eb78a381c3e70ebd8d346b9a25f3753e9947e866b2786865fc"},
{file = "Sphinx-6.2.1.tar.gz", hash = "sha256:6d56a34697bb749ffa0152feafc4b19836c755d90a7c59b72bc7dfd371b9cc6b"},
{file = "sphinx-6.2.1-py3-none-any.whl", hash = "sha256:97787ff1fa3256a3eef9eda523a63dbf299f7b47e053cfcf684a1c2a8380c912"},
]
[package.dependencies]
alabaster = ">=0.7,<0.8"
babel = ">=2.9"
colorama = {version = ">=0.4.5", markers = "sys_platform == \"win32\""}
docutils = ">=0.18,<0.20"
docutils = ">=0.18.1,<0.20"
imagesize = ">=1.3"
importlib-metadata = {version = ">=4.8", markers = "python_version < \"3.10\""}
Jinja2 = ">=3.0"
@@ -2597,7 +2612,7 @@ sphinxcontrib-serializinghtml = ">=1.1.5"
[package.extras]
docs = ["sphinxcontrib-websupport"]
lint = ["docutils-stubs", "flake8 (>=3.5.0)", "flake8-simplify", "isort", "mypy (>=0.990)", "ruff", "sphinx-lint", "types-requests"]
test = ["cython", "html5lib", "pytest (>=4.6)"]
test = ["cython", "filelock", "html5lib", "pytest (>=4.6)"]
[[package]]
name = "sphinx-autodoc2"
@@ -2998,26 +3013,26 @@ files = [
[[package]]
name = "types-bleach"
version = "6.0.0.1"
version = "6.0.0.3"
description = "Typing stubs for bleach"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "types-bleach-6.0.0.1.tar.gz", hash = "sha256:43d9129deb9e82918747437edf78f09ff440f2973f4702625b61994f3e698518"},
{file = "types_bleach-6.0.0.1-py3-none-any.whl", hash = "sha256:440df967254007be80bb0f4d851f026c29c709cc48359bf4935d2b2f3a6f9f90"},
{file = "types-bleach-6.0.0.3.tar.gz", hash = "sha256:8ce7896d4f658c562768674ffcf07492c7730e128018f03edd163ff912bfadee"},
{file = "types_bleach-6.0.0.3-py3-none-any.whl", hash = "sha256:d43eaf30a643ca824e16e2dcdb0c87ef9226237e2fa3ac4732a50cb3f32e145f"},
]
[[package]]
name = "types-commonmark"
version = "0.9.2.2"
version = "0.9.2.3"
description = "Typing stubs for commonmark"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "types-commonmark-0.9.2.2.tar.gz", hash = "sha256:f3259350634c2ce68ae503398430482f7cf44e5cae3d344995e916fbf453b4be"},
{file = "types_commonmark-0.9.2.2-py3-none-any.whl", hash = "sha256:d3d878692615e7fbe47bf19ba67497837b135812d665012a3d42219c1f2c3a61"},
{file = "types-commonmark-0.9.2.3.tar.gz", hash = "sha256:42769a2c194fd5b49fd9eedfd4a83cd1d2514c6d0a36f00f5c5ffe0b6a2d2fcf"},
{file = "types_commonmark-0.9.2.3-py3-none-any.whl", hash = "sha256:b575156e1b8a292d43acb36f861110b85c4bc7aa53bbfb5ac64addec15d18cfa"},
]
[[package]]
@@ -3058,26 +3073,26 @@ files = [
[[package]]
name = "types-pillow"
version = "9.5.0.2"
version = "9.5.0.4"
description = "Typing stubs for Pillow"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "types-Pillow-9.5.0.2.tar.gz", hash = "sha256:b3f9f621f259566c19c1deca21901017c8b1e3e200ed2e49e0a2d83c0a5175db"},
{file = "types_Pillow-9.5.0.2-py3-none-any.whl", hash = "sha256:58fdebd0ffa2353ecccdd622adde23bce89da5c0c8b96c34f2d1eca7b7e42d0e"},
{file = "types-Pillow-9.5.0.4.tar.gz", hash = "sha256:f1b6af47abd151847ee25911ffeba784899bc7dc7f9eba8ca6a5aac522b012ef"},
{file = "types_Pillow-9.5.0.4-py3-none-any.whl", hash = "sha256:69427d9fa4320ff6e30f00fb9c0dd71185dc0a16de4757774220104759483466"},
]
[[package]]
name = "types-psycopg2"
version = "2.9.21.9"
version = "2.9.21.10"
description = "Typing stubs for psycopg2"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "types-psycopg2-2.9.21.9.tar.gz", hash = "sha256:388dc36a04551632289c4aaf1fc5b91e147654b165db896d094844e216f22bf5"},
{file = "types_psycopg2-2.9.21.9-py3-none-any.whl", hash = "sha256:0332525fb9d3031d3da46f091e7d40b2c4d4958e9c00d2b4c1eaaa9f8ef9de4e"},
{file = "types-psycopg2-2.9.21.10.tar.gz", hash = "sha256:c2600892312ae1c34e12f145749795d93dc4eac3ef7dbf8a9c1bfd45385e80d7"},
{file = "types_psycopg2-2.9.21.10-py3-none-any.whl", hash = "sha256:918224a0731a3650832e46633e720703b5beef7693a064e777d9748654fcf5e5"},
]
[[package]]
@@ -3097,26 +3112,26 @@ cryptography = ">=35.0.0"
[[package]]
name = "types-pyyaml"
version = "6.0.12.9"
version = "6.0.12.10"
description = "Typing stubs for PyYAML"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "types-PyYAML-6.0.12.9.tar.gz", hash = "sha256:c51b1bd6d99ddf0aa2884a7a328810ebf70a4262c292195d3f4f9a0005f9eeb6"},
{file = "types_PyYAML-6.0.12.9-py3-none-any.whl", hash = "sha256:5aed5aa66bd2d2e158f75dda22b059570ede988559f030cf294871d3b647e3e8"},
{file = "types-PyYAML-6.0.12.10.tar.gz", hash = "sha256:ebab3d0700b946553724ae6ca636ea932c1b0868701d4af121630e78d695fc97"},
{file = "types_PyYAML-6.0.12.10-py3-none-any.whl", hash = "sha256:662fa444963eff9b68120d70cda1af5a5f2aa57900003c2006d7626450eaae5f"},
]
[[package]]
name = "types-requests"
version = "2.30.0.0"
version = "2.31.0.0"
description = "Typing stubs for requests"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "types-requests-2.30.0.0.tar.gz", hash = "sha256:dec781054324a70ba64430ae9e62e7e9c8e4618c185a5cb3f87a6738251b5a31"},
{file = "types_requests-2.30.0.0-py3-none-any.whl", hash = "sha256:c6cf08e120ca9f0dc4fa4e32c3f953c3fba222bcc1db6b97695bce8da1ba9864"},
{file = "types-requests-2.31.0.0.tar.gz", hash = "sha256:c1c29d20ab8d84dff468d7febfe8e0cb0b4664543221b386605e14672b44ea25"},
{file = "types_requests-2.31.0.0-py3-none-any.whl", hash = "sha256:7c5cea7940f8e92ec560bbc468f65bf684aa3dcf0554a6f8c4710f5f708dc598"},
]
[package.dependencies]
@@ -3124,14 +3139,14 @@ types-urllib3 = "*"
[[package]]
name = "types-setuptools"
version = "67.7.0.1"
version = "67.8.0.0"
description = "Typing stubs for setuptools"
category = "dev"
optional = false
python-versions = "*"
files = [
{file = "types-setuptools-67.7.0.1.tar.gz", hash = "sha256:980a2651b2b019809817e1585071596b87fbafcb54433ff3b12445461db23790"},
{file = "types_setuptools-67.7.0.1-py3-none-any.whl", hash = "sha256:471a4ecf6984ffada63ffcfa884bfcb62718bd2d1a1acf8ee5513ec99789ed5e"},
{file = "types-setuptools-67.8.0.0.tar.gz", hash = "sha256:95c9ed61871d6c0e258433373a4e1753c0a7c3627a46f4d4058c7b5a08ab844f"},
{file = "types_setuptools-67.8.0.0-py3-none-any.whl", hash = "sha256:6df73340d96b238a4188b7b7668814b37e8018168aef1eef94a3b1872e3f60ff"},
]
[[package]]
@@ -3409,22 +3424,22 @@ docs = ["Sphinx", "repoze.sphinx.autointerface"]
test = ["zope.i18nmessageid", "zope.testing", "zope.testrunner"]
[extras]
all = ["matrix-synapse-ldap3", "psycopg2", "psycopg2cffi", "psycopg2cffi-compat", "pysaml2", "authlib", "lxml", "sentry-sdk", "jaeger-client", "opentracing", "txredisapi", "hiredis", "Pympler", "pyicu"]
all = ["Pympler", "authlib", "hiredis", "jaeger-client", "lxml", "matrix-synapse-ldap3", "opentracing", "psycopg2", "psycopg2cffi", "psycopg2cffi-compat", "pyicu", "pysaml2", "sentry-sdk", "txredisapi"]
cache-memory = ["Pympler"]
jwt = ["authlib"]
matrix-synapse-ldap3 = ["matrix-synapse-ldap3"]
oidc = ["authlib"]
opentracing = ["jaeger-client", "opentracing"]
postgres = ["psycopg2", "psycopg2cffi", "psycopg2cffi-compat"]
redis = ["txredisapi", "hiredis"]
redis = ["hiredis", "txredisapi"]
saml2 = ["pysaml2"]
sentry = ["sentry-sdk"]
systemd = ["systemd-python"]
test = ["parameterized", "idna"]
test = ["idna", "parameterized"]
url-preview = ["lxml"]
user-search = ["pyicu"]
[metadata]
lock-version = "2.0"
python-versions = "^3.7.1"
content-hash = "ef3a16dd66177f7141239e1a2d3e07cc14c08f1e4e0c5127184d022bc062da52"
content-hash = "7ad11e62a675e09444cf33ca2de3216fc4efc5874a2575e54d95d577a52439d3"


@@ -89,7 +89,7 @@ manifest-path = "rust/Cargo.toml"
[tool.poetry]
name = "matrix-synapse"
version = "1.83.0"
version = "1.85.0rc2"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "Apache-2.0"
@@ -314,6 +314,7 @@ black = ">=22.3.0"
ruff = "0.0.265"
# Typechecking
lxml-stubs = ">=0.4.0"
mypy = "*"
mypy-zope = "*"
types-bleach = ">=4.1.0"
@@ -368,7 +369,7 @@ furo = ">=2022.12.7,<2024.0.0"
# system changes.
# We are happy to raise these upper bounds upon request,
# provided we check that it's safe to do so (i.e. that CI passes).
requires = ["poetry-core>=1.0.0,<=1.5.0", "setuptools_rust>=1.3,<=1.5.2"]
requires = ["poetry-core>=1.1.0,<=1.6.0", "setuptools_rust>=1.3,<=1.6.0"]
build-backend = "poetry.core.masonry.api"


@@ -20,6 +20,8 @@ from concurrent.futures import ThreadPoolExecutor
from types import FrameType
from typing import Collection, Optional, Sequence, Set
# These are expanded inside the dockerfile to be a fully qualified image name.
# e.g. docker.io/library/debian:bullseye
DISTS = (
"debian:buster", # oldstable: EOL 2022-08
"debian:bullseye",


@@ -269,6 +269,10 @@ if [[ -n "$SYNAPSE_TEST_LOG_LEVEL" ]]; then
export PASS_SYNAPSE_LOG_SENSITIVE=1
fi
# Log a few more useful things for a developer attempting to debug something
# particularly tricky.
export PASS_SYNAPSE_LOG_TESTING=1
# Run the tests!
echo "Images built; running complement"
cd "$COMPLEMENT_DIR"


@@ -18,10 +18,11 @@ can crop up, e.g the cache descriptors.
from typing import Callable, Optional, Type
from mypy.erasetype import remove_instance_last_known_values
from mypy.nodes import ARG_NAMED_OPT
from mypy.plugin import MethodSigContext, Plugin
from mypy.typeops import bind_self
from mypy.types import CallableType, NoneType, UnionType
from mypy.types import CallableType, Instance, NoneType, UnionType
class SynapsePlugin(Plugin):
@@ -92,10 +93,41 @@ def cached_function_method_signature(ctx: MethodSigContext) -> CallableType:
arg_names.append("on_invalidate")
arg_kinds.append(ARG_NAMED_OPT) # Arg is an optional kwarg.
# Finally we ensure the return type is a Deferred.
if (
isinstance(signature.ret_type, Instance)
and signature.ret_type.type.fullname == "twisted.internet.defer.Deferred"
):
# If it is already a Deferred, nothing to do.
ret_type = signature.ret_type
else:
ret_arg = None
if isinstance(signature.ret_type, Instance):
# If a coroutine, wrap the coroutine's return type in a Deferred.
if signature.ret_type.type.fullname == "typing.Coroutine":
ret_arg = signature.ret_type.args[2]
# If an awaitable, wrap the awaitable's final value in a Deferred.
elif signature.ret_type.type.fullname == "typing.Awaitable":
ret_arg = signature.ret_type.args[0]
# Otherwise, wrap the return value in a Deferred.
if ret_arg is None:
ret_arg = signature.ret_type
# This should be able to use ctx.api.named_generic_type, but that doesn't seem
# to find the correct symbol for anything more than 1 module deep.
#
# modules is not part of CheckerPluginInterface. The following is a combination
# of TypeChecker.named_generic_type and TypeChecker.lookup_typeinfo.
sym = ctx.api.modules["twisted.internet.defer"].names.get("Deferred") # type: ignore[attr-defined]
ret_type = Instance(sym.node, [remove_instance_last_known_values(ret_arg)])
signature = signature.copy_modified(
arg_types=arg_types,
arg_names=arg_names,
arg_kinds=arg_kinds,
ret_type=ret_type,
)
return signature
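As an aside, the Coroutine/Awaitable/plain-value unwrapping rule above can be checked with ordinary typing introspection; the sketch below is illustrative only and is not part of the plugin (it mirrors the rule using typing.get_origin rather than mypy's internal types):

import collections.abc
import typing
from typing import Any, Awaitable, Coroutine

def deferred_result_arg(ret_type: Any) -> Any:
    # Coroutine[Y, S, R]: the Deferred should carry R, the third argument.
    if typing.get_origin(ret_type) is collections.abc.Coroutine:
        return typing.get_args(ret_type)[2]
    # Awaitable[R]: the Deferred should carry R, the only argument.
    if typing.get_origin(ret_type) is collections.abc.Awaitable:
        return typing.get_args(ret_type)[0]
    # Anything else is wrapped in a Deferred as-is.
    return ret_type

assert deferred_result_arg(Coroutine[Any, Any, int]) is int
assert deferred_result_arg(Awaitable[str]) is str
assert deferred_result_arg(bytes) is bytes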


@@ -27,7 +27,7 @@ import time
import urllib.request
from os import path
from tempfile import TemporaryDirectory
from typing import Any, List, Optional
from typing import Any, List, Match, Optional, Union
import attr
import click
@@ -233,7 +233,7 @@ def _prepare() -> None:
subprocess.check_output(["poetry", "version", new_version])
# Generate changelogs.
generate_and_write_changelog(current_version, new_version)
generate_and_write_changelog(synapse_repo, current_version, new_version)
# Generate debian changelogs
if parsed_new_version.pre is not None:
@@ -814,7 +814,7 @@ def get_changes_for_version(wanted_version: version.Version) -> str:
def generate_and_write_changelog(
current_version: version.Version, new_version: str
repo: Repo, current_version: version.Version, new_version: str
) -> None:
# We do this by getting a draft so that we can edit it before writing to the
# changelog.
@@ -827,6 +827,10 @@ def generate_and_write_changelog(
new_changes = new_changes.replace(
"No significant changes.", f"No significant changes since {current_version}."
)
new_changes += build_dependabot_changelog(
repo,
current_version,
)
# Prepend changes to changelog
with open("CHANGES.md", "r+") as f:
@@ -841,5 +845,49 @@ def generate_and_write_changelog(
os.remove(filename)
def build_dependabot_changelog(repo: Repo, current_version: version.Version) -> str:
"""Summarise dependabot commits between `current_version` and `release_branch`.
Returns an empty string if there have been no such commits; otherwise outputs a
third-level markdown header followed by an unordered list."""
last_release_commit = repo.tag("v" + str(current_version)).commit
rev_spec = f"{last_release_commit.hexsha}.."
commits = list(git.objects.Commit.iter_items(repo, rev_spec))
messages = []
for commit in reversed(commits):
if commit.author.name == "dependabot[bot]":
message: Union[str, bytes] = commit.message
if isinstance(message, bytes):
message = message.decode("utf-8")
messages.append(message.split("\n", maxsplit=1)[0])
if not messages:
print(f"No dependabot commits in range {rev_spec}", file=sys.stderr)
return ""
messages.sort()
def replacer(match: Match[str]) -> str:
desc = match.group(1)
number = match.group(2)
return f"* {desc}. ([\\#{number}](https://github.com/matrix-org/synapse/issues/{number}))"
for i, message in enumerate(messages):
messages[i] = re.sub(r"(.*) \(#(\d+)\)$", replacer, message)
messages.insert(0, "### Updates to locked dependencies\n")
# Add an extra blank line to the bottom of the section
messages.append("")
return "\n".join(messages)
@cli.command()
@click.argument("since")
def test_dependabot_changelog(since: str) -> None:
"""Test building the dependabot changelog.
Summarises all dependabot commits between the SINCE tag and the current git HEAD."""
print(build_dependabot_changelog(git.Repo("."), version.Version(since)))
if __name__ == "__main__":
cli()
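For reference, the rewrite that replacer applies to a dependabot commit subject can be exercised standalone; the subject line below is invented, but the pattern and issue-link format match the code above:

import re
from typing import Match

def replacer(match: Match[str]) -> str:
    desc = match.group(1)
    number = match.group(2)
    return f"* {desc}. ([\\#{number}](https://github.com/matrix-org/synapse/issues/{number}))"

subject = "Bump example-package from 1.0.0 to 1.1.0 (#12345)"  # invented subject
print(re.sub(r"(.*) \(#(\d+)\)$", replacer, subject))
# * Bump example-package from 1.0.0 to 1.1.0. ([\#12345](https://github.com/matrix-org/synapse/issues/12345))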


@@ -61,6 +61,9 @@ def lazyConnection(
# most methods to it via ConnectionHandler.__getattr__.
class ConnectionHandler(RedisProtocol):
def disconnect(self) -> "Deferred[None]": ...
def __repr__(self) -> str: ...
class UnixConnectionHandler(ConnectionHandler): ...
class RedisFactory(protocol.ReconnectingClientFactory):
continueTrying: bool


@@ -0,0 +1,175 @@
# Copyright 2023 The Matrix.org Foundation.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from typing import Optional, Tuple
from typing_extensions import Protocol
from twisted.web.server import Request
from synapse.appservice import ApplicationService
from synapse.http.site import SynapseRequest
from synapse.types import Requester
# guests always get this device id.
GUEST_DEVICE_ID = "guest_device"
class Auth(Protocol):
"""The interface that an auth provider must implement."""
async def check_user_in_room(
self,
room_id: str,
requester: Requester,
allow_departed_users: bool = False,
) -> Tuple[str, Optional[str]]:
"""Check if the user is in the room, or was at some point.
Args:
room_id: The room to check.
requester: The user making the request, according to the access token.
allow_departed_users: if True, accept users that were previously
members but have now departed.
Raises:
AuthError if the user is/was not in the room.
Returns:
The current membership of the user in the room and the
membership event ID of the user.
"""
async def get_user_by_req(
self,
request: SynapseRequest,
allow_guest: bool = False,
allow_expired: bool = False,
) -> Requester:
"""Get a registered user's ID.
Args:
request: An HTTP request with an access_token query parameter.
allow_guest: If False, will raise an AuthError if the user making the
request is a guest.
allow_expired: If True, allow the request through even if the account
is expired, or session token lifetime has ended. Note that
/login will deliver access tokens regardless of expiration.
Returns:
Resolves to the requester
Raises:
InvalidClientCredentialsError if no user by that token exists or the token
is invalid.
AuthError if access is denied for the user in the access token
"""
async def validate_appservice_can_control_user_id(
self, app_service: ApplicationService, user_id: str
) -> None:
"""Validates that the app service is allowed to control
the given user.
Args:
app_service: The app service that controls the user
user_id: The author MXID that the app service is controlling
Raises:
AuthError: If the application service is not allowed to control the user
(user namespace regex does not match, wrong homeserver, etc)
or if the user has not been registered yet.
"""
async def get_user_by_access_token(
self,
token: str,
allow_expired: bool = False,
) -> Requester:
"""Validate access token and get user_id from it
Args:
token: The access token to get the user by
allow_expired: If False, raises an InvalidClientTokenError
if the token is expired
Raises:
InvalidClientTokenError if a user by that token exists, but the token is
expired
InvalidClientCredentialsError if no user by that token exists or the token
is invalid
"""
async def is_server_admin(self, requester: Requester) -> bool:
"""Check if the given user is a local server admin.
Args:
requester: user to check
Returns:
True if the user is an admin
"""
async def check_can_change_room_list(
self, room_id: str, requester: Requester
) -> bool:
"""Determine whether the user is allowed to edit the room's entry in the
published room list.
Args:
room_id: The room to check.
requester: The user making the request, according to the access token.
"""
@staticmethod
def has_access_token(request: Request) -> bool:
"""Checks if the request has an access_token.
Returns:
False if no access_token was given, True otherwise.
"""
@staticmethod
def get_access_token_from_request(request: Request) -> str:
"""Extracts the access_token from the request.
Args:
request: The http request.
Returns:
The access_token
Raises:
MissingClientTokenError: If there isn't a single access_token in the
request
"""
async def check_user_in_room_or_world_readable(
self, room_id: str, requester: Requester, allow_departed_users: bool = False
) -> Tuple[str, Optional[str]]:
"""Checks that the user is or was in the room or the room is world
readable. If it isn't then an exception is raised.
Args:
room_id: room to check
requester: user to check
allow_departed_users: if True, accept users that were previously
members but have now departed
Returns:
Resolves to the current membership of the user in the room and the
membership event ID of the user. If the user is not in the room and
never has been, then `(Membership.JOIN, None)` is returned.
"""

View File

@@ -1,4 +1,4 @@
# Copyright 2014 - 2016 OpenMarket Ltd
# Copyright 2023 The Matrix.org Foundation.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -14,7 +14,6 @@
import logging
from typing import TYPE_CHECKING, Optional, Tuple
import pymacaroons
from netaddr import IPAddress
from twisted.web.server import Request
@@ -24,19 +23,11 @@ from synapse.api.constants import EventTypes, HistoryVisibility, Membership
from synapse.api.errors import (
AuthError,
Codes,
InvalidClientTokenError,
MissingClientTokenError,
UnstableSpecAuthError,
)
from synapse.appservice import ApplicationService
from synapse.http import get_request_user_agent
from synapse.http.site import SynapseRequest
from synapse.logging.opentracing import (
active_span,
force_tracing,
start_active_span,
trace,
)
from synapse.logging.opentracing import trace
from synapse.types import Requester, create_requester
from synapse.util.cancellation import cancellable
@@ -46,26 +37,13 @@ if TYPE_CHECKING:
logger = logging.getLogger(__name__)
# guests always get this device id.
GUEST_DEVICE_ID = "guest_device"
class Auth:
"""
This class contains functions for authenticating users of our client-server API.
"""
class BaseAuth:
"""Common base class for all auth implementations."""
def __init__(self, hs: "HomeServer"):
self.hs = hs
self.clock = hs.get_clock()
self.store = hs.get_datastores().main
self._account_validity_handler = hs.get_account_validity_handler()
self._storage_controllers = hs.get_storage_controllers()
self._macaroon_generator = hs.get_macaroon_generator()
self._track_appservice_user_ips = hs.config.appservice.track_appservice_user_ips
self._track_puppeted_user_ips = hs.config.api.track_puppeted_user_ips
self._force_tracing_for_users = hs.config.tracing.force_tracing_for_users
async def check_user_in_room(
self,
@@ -119,139 +97,49 @@ class Auth:
errcode=Codes.NOT_JOINED,
)
@cancellable
async def get_user_by_req(
self,
request: SynapseRequest,
allow_guest: bool = False,
allow_expired: bool = False,
) -> Requester:
"""Get a registered user's ID.
@trace
async def check_user_in_room_or_world_readable(
self, room_id: str, requester: Requester, allow_departed_users: bool = False
) -> Tuple[str, Optional[str]]:
"""Checks that the user is or was in the room or the room is world
readable. If it isn't then an exception is raised.
Args:
request: An HTTP request with an access_token query parameter.
allow_guest: If False, will raise an AuthError if the user making the
request is a guest.
allow_expired: If True, allow the request through even if the account
is expired, or session token lifetime has ended. Note that
/login will deliver access tokens regardless of expiration.
room_id: room to check
requester: user to check
allow_departed_users: if True, accept users that were previously
members but have now departed
Returns:
Resolves to the requester
Raises:
InvalidClientCredentialsError if no user by that token exists or the token
is invalid.
AuthError if access is denied for the user in the access token
Resolves to the current membership of the user in the room and the
membership event ID of the user. If the user is not in the room and
never has been, then `(Membership.JOIN, None)` is returned.
"""
parent_span = active_span()
with start_active_span("get_user_by_req"):
requester = await self._wrapped_get_user_by_req(
request, allow_guest, allow_expired
)
if parent_span:
if requester.authenticated_entity in self._force_tracing_for_users:
# request tracing is enabled for this user, so we need to force
# tracing on for the parent span (which will be the servlet span).
#
# It's too late for the get_user_by_req span to inherit the setting,
# so we also force it on for that.
force_tracing()
force_tracing(parent_span)
parent_span.set_tag(
"authenticated_entity", requester.authenticated_entity
)
parent_span.set_tag("user_id", requester.user.to_string())
if requester.device_id is not None:
parent_span.set_tag("device_id", requester.device_id)
if requester.app_service is not None:
parent_span.set_tag("appservice_id", requester.app_service.id)
return requester
@cancellable
async def _wrapped_get_user_by_req(
self,
request: SynapseRequest,
allow_guest: bool,
allow_expired: bool,
) -> Requester:
"""Helper for get_user_by_req
Once get_user_by_req has set up the opentracing span, this does the actual work.
"""
try:
ip_addr = request.getClientAddress().host
user_agent = get_request_user_agent(request)
access_token = self.get_access_token_from_request(request)
# First check if it could be a request from an appservice
requester = await self._get_appservice_user(request)
if not requester:
# If not, it should be from a regular user
requester = await self.get_user_by_access_token(
access_token, allow_expired=allow_expired
)
# Deny the request if the user account has expired.
# This check is only done for regular users, not appservice ones.
if not allow_expired:
if await self._account_validity_handler.is_user_expired(
requester.user.to_string()
):
# Raise the error if either an account validity module has determined
# the account has expired, or the legacy account validity
# implementation is enabled and determined the account has expired
raise AuthError(
403,
"User account has expired",
errcode=Codes.EXPIRED_ACCOUNT,
)
if ip_addr and (
not requester.app_service or self._track_appservice_user_ips
# check_user_in_room will return the most recent membership
# event for the user if:
# * The user is a non-guest user, and was ever in the room
# * The user is a guest user, and has joined the room
# else it will throw.
return await self.check_user_in_room(
room_id, requester, allow_departed_users=allow_departed_users
)
except AuthError:
visibility = await self._storage_controllers.state.get_current_state_event(
room_id, EventTypes.RoomHistoryVisibility, ""
)
if (
visibility
and visibility.content.get("history_visibility")
== HistoryVisibility.WORLD_READABLE
):
# XXX(quenting): I'm 95% confident that we could skip setting the
# device_id to "dummy-device" for appservices, and that the only impact
# would be some rows which would not deduplicate in the 'user_ips'
# table during the transition
recorded_device_id = (
"dummy-device"
if requester.device_id is None and requester.app_service is not None
else requester.device_id
)
await self.store.insert_client_ip(
user_id=requester.authenticated_entity,
access_token=access_token,
ip=ip_addr,
user_agent=user_agent,
device_id=recorded_device_id,
)
# Also track the puppeted user's client IP if enabled and the user is puppeting
if (
requester.user.to_string() != requester.authenticated_entity
and self._track_puppeted_user_ips
):
await self.store.insert_client_ip(
user_id=requester.user.to_string(),
access_token=access_token,
ip=ip_addr,
user_agent=user_agent,
device_id=requester.device_id,
)
if requester.is_guest and not allow_guest:
raise AuthError(
403,
"Guest access not allowed",
errcode=Codes.GUEST_ACCESS_FORBIDDEN,
)
request.requester = requester
return requester
except KeyError:
raise MissingClientTokenError()
return Membership.JOIN, None
raise AuthError(
403,
"User %r not in room %s, and room previews are disabled"
% (requester.user, room_id),
)
async def validate_appservice_can_control_user_id(
self, app_service: ApplicationService, user_id: str
@@ -284,184 +172,16 @@ class Auth:
403, "Application service has not registered this user (%s)" % user_id
)
@cancellable
async def _get_appservice_user(self, request: Request) -> Optional[Requester]:
"""
Given a request, reads the request parameters to determine:
- whether it's an application service that's making this request
- what user the application service should be treated as controlling
(the user_id URI parameter allows an application service to masquerade
any applicable user in its namespace)
- what device the application service should be treated as controlling
(the device_id[^1] URI parameter allows an application service to masquerade
as any device that exists for the relevant user)
[^1] Unstable and provided by MSC3202.
Must use `org.matrix.msc3202.device_id` in place of `device_id` for now.
Returns:
the application service `Requester` of that request
Postconditions:
- The `app_service` field in the returned `Requester` is set
- The `user_id` field in the returned `Requester` is either the application
service sender or the controlled user set by the `user_id` URI parameter
- The returned application service is permitted to control the returned user ID.
- The returned device ID, if present, has been checked to be a valid device ID
for the returned user ID.
"""
DEVICE_ID_ARG_NAME = b"org.matrix.msc3202.device_id"
app_service = self.store.get_app_service_by_token(
self.get_access_token_from_request(request)
)
if app_service is None:
return None
if app_service.ip_range_whitelist:
ip_address = IPAddress(request.getClientAddress().host)
if ip_address not in app_service.ip_range_whitelist:
return None
# This will always be set by the time Twisted calls us.
assert request.args is not None
if b"user_id" in request.args:
effective_user_id = request.args[b"user_id"][0].decode("utf8")
await self.validate_appservice_can_control_user_id(
app_service, effective_user_id
)
else:
effective_user_id = app_service.sender
effective_device_id: Optional[str] = None
if (
self.hs.config.experimental.msc3202_device_masquerading_enabled
and DEVICE_ID_ARG_NAME in request.args
):
effective_device_id = request.args[DEVICE_ID_ARG_NAME][0].decode("utf8")
# We only just set this so it can't be None!
assert effective_device_id is not None
device_opt = await self.store.get_device(
effective_user_id, effective_device_id
)
if device_opt is None:
# For now, use 400 M_EXCLUSIVE if the device doesn't exist.
# This is an open thread of discussion on MSC3202 as of 2021-12-09.
raise AuthError(
400,
f"Application service trying to use a device that doesn't exist ('{effective_device_id}' for {effective_user_id})",
Codes.EXCLUSIVE,
)
return create_requester(
effective_user_id, app_service=app_service, device_id=effective_device_id
)
async def get_user_by_access_token(
self,
token: str,
allow_expired: bool = False,
) -> Requester:
"""Validate access token and get user_id from it
Args:
token: The access token to get the user by
allow_expired: If False, raises an InvalidClientTokenError
if the token is expired
Raises:
InvalidClientTokenError if a user by that token exists, but the token is
expired
InvalidClientCredentialsError if no user by that token exists or the token
is invalid
"""
# First look in the database to see if the access token is present
# as an opaque token.
user_info = await self.store.get_user_by_access_token(token)
if user_info:
valid_until_ms = user_info.valid_until_ms
if (
not allow_expired
and valid_until_ms is not None
and valid_until_ms < self.clock.time_msec()
):
# there was a valid access token, but it has expired.
# soft-logout the user.
raise InvalidClientTokenError(
msg="Access token has expired", soft_logout=True
)
# Mark the token as used. This is used to invalidate old refresh
# tokens after some time.
await self.store.mark_access_token_as_used(user_info.token_id)
requester = create_requester(
user_id=user_info.user_id,
access_token_id=user_info.token_id,
is_guest=user_info.is_guest,
shadow_banned=user_info.shadow_banned,
device_id=user_info.device_id,
authenticated_entity=user_info.token_owner,
)
return requester
# If the token isn't found in the database, then it could still be a
# macaroon for a guest, so we check that here.
try:
user_id = self._macaroon_generator.verify_guest_token(token)
# Guest access tokens are not stored in the database (there can
# only be one access token per guest, anyway).
#
# In order to prevent guest access tokens being used as regular
# user access tokens (and hence getting around the invalidation
# process), we look up the user id and check that it is indeed
# a guest user.
#
# It would of course be much easier to store guest access
# tokens in the database as well, but that would break existing
# guest tokens.
stored_user = await self.store.get_user_by_id(user_id)
if not stored_user:
raise InvalidClientTokenError("Unknown user_id %s" % user_id)
if not stored_user["is_guest"]:
raise InvalidClientTokenError(
"Guest access token used for regular user"
)
return create_requester(
user_id=user_id,
is_guest=True,
# all guests get the same device id
device_id=GUEST_DEVICE_ID,
authenticated_entity=user_id,
)
except (
pymacaroons.exceptions.MacaroonException,
TypeError,
ValueError,
) as e:
logger.warning(
"Invalid access token in auth: %s %s.",
type(e),
e,
)
raise InvalidClientTokenError("Invalid access token passed.")
async def is_server_admin(self, requester: Requester) -> bool:
"""Check if the given user is a local server admin.
Args:
requester: The user making the request, according to the access token.
requester: user to check
Returns:
True if the user is an admin
"""
return await self.store.is_server_admin(requester.user)
raise NotImplementedError()
async def check_can_change_room_list(
self, room_id: str, requester: Requester
@@ -470,8 +190,8 @@ class Auth:
published room list.
Args:
room_id: The room to check.
requester: The user making the request, according to the access token.
room_id
requester
"""
is_admin = await self.is_server_admin(requester)
@@ -518,7 +238,6 @@ class Auth:
return bool(query_params) or bool(auth_headers)
@staticmethod
@cancellable
def get_access_token_from_request(request: Request) -> str:
"""Extracts the access_token from the request.
@@ -556,47 +275,77 @@ class Auth:
return query_params[0].decode("ascii")
@trace
async def check_user_in_room_or_world_readable(
self, room_id: str, requester: Requester, allow_departed_users: bool = False
) -> Tuple[str, Optional[str]]:
"""Checks that the user is or was in the room or the room is world
readable. If it isn't then an exception is raised.
@cancellable
async def get_appservice_user(
self, request: Request, access_token: str
) -> Optional[Requester]:
"""
Given a request, reads the request parameters to determine:
- whether it's an application service that's making this request
- what user the application service should be treated as controlling
(the user_id URI parameter allows an application service to masquerade
any applicable user in its namespace)
- what device the application service should be treated as controlling
(the device_id[^1] URI parameter allows an application service to masquerade
as any device that exists for the relevant user)
Args:
room_id: The room to check.
requester: The user making the request, according to the access token.
allow_departed_users: If True, accept users that were previously
members but have now departed.
[^1] Unstable and provided by MSC3202.
Must use `org.matrix.msc3202.device_id` in place of `device_id` for now.
Returns:
Resolves to the current membership of the user in the room and the
membership event ID of the user. If the user is not in the room and
never has been, then `(Membership.JOIN, None)` is returned.
"""
the application service `Requester` of that request
try:
# check_user_in_room will return the most recent membership
# event for the user if:
# * The user is a non-guest user, and was ever in the room
# * The user is a guest user, and has joined the room
# else it will throw.
return await self.check_user_in_room(
room_id, requester, allow_departed_users=allow_departed_users
Postconditions:
- The `app_service` field in the returned `Requester` is set
- The `user_id` field in the returned `Requester` is either the application
service sender or the controlled user set by the `user_id` URI parameter
- The returned application service is permitted to control the returned user ID.
- The returned device ID, if present, has been checked to be a valid device ID
for the returned user ID.
"""
DEVICE_ID_ARG_NAME = b"org.matrix.msc3202.device_id"
app_service = self.store.get_app_service_by_token(access_token)
if app_service is None:
return None
if app_service.ip_range_whitelist:
ip_address = IPAddress(request.getClientAddress().host)
if ip_address not in app_service.ip_range_whitelist:
return None
# This will always be set by the time Twisted calls us.
assert request.args is not None
if b"user_id" in request.args:
effective_user_id = request.args[b"user_id"][0].decode("utf8")
await self.validate_appservice_can_control_user_id(
app_service, effective_user_id
)
except AuthError:
visibility = await self._storage_controllers.state.get_current_state_event(
room_id, EventTypes.RoomHistoryVisibility, ""
)
if (
visibility
and visibility.content.get("history_visibility")
== HistoryVisibility.WORLD_READABLE
):
return Membership.JOIN, None
raise UnstableSpecAuthError(
403,
"User %s not in room %s, and room previews are disabled"
% (requester.user, room_id),
errcode=Codes.NOT_JOINED,
else:
effective_user_id = app_service.sender
effective_device_id: Optional[str] = None
if (
self.hs.config.experimental.msc3202_device_masquerading_enabled
and DEVICE_ID_ARG_NAME in request.args
):
effective_device_id = request.args[DEVICE_ID_ARG_NAME][0].decode("utf8")
# We only just set this so it can't be None!
assert effective_device_id is not None
device_opt = await self.store.get_device(
effective_user_id, effective_device_id
)
if device_opt is None:
# For now, use 400 M_EXCLUSIVE if the device doesn't exist.
# This is an open thread of discussion on MSC3202 as of 2021-12-09.
raise AuthError(
400,
f"Application service trying to use a device that doesn't exist ('{effective_device_id}' for {effective_user_id})",
Codes.EXCLUSIVE,
)
return create_requester(
effective_user_id, app_service=app_service, device_id=effective_device_id
)
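To illustrate the parameters read above: an application service masquerading as a user (and, with MSC3202 enabled, as a device) appends them to its request as query parameters; the user ID, device ID and endpoint below are invented for the example:

from urllib.parse import urlencode

params = urlencode(
    {
        "user_id": "@_bridge_alice:example.org",  # masqueraded user (invented)
        "org.matrix.msc3202.device_id": "BRIDGEDEV01",  # masqueraded device (invented)
    }
)
# The appservice access token itself is sent in the Authorization header.
print(f"/_matrix/client/v3/account/whoami?{params}")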


@@ -0,0 +1,291 @@
# Copyright 2023 The Matrix.org Foundation.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import logging
from typing import TYPE_CHECKING
import pymacaroons
from synapse.api.errors import (
AuthError,
Codes,
InvalidClientTokenError,
MissingClientTokenError,
)
from synapse.http import get_request_user_agent
from synapse.http.site import SynapseRequest
from synapse.logging.opentracing import active_span, force_tracing, start_active_span
from synapse.types import Requester, create_requester
from synapse.util.cancellation import cancellable
from . import GUEST_DEVICE_ID
from .base import BaseAuth
if TYPE_CHECKING:
from synapse.server import HomeServer
logger = logging.getLogger(__name__)
class InternalAuth(BaseAuth):
"""
This class contains functions for authenticating users of our client-server API.
"""
def __init__(self, hs: "HomeServer"):
super().__init__(hs)
self.clock = hs.get_clock()
self._account_validity_handler = hs.get_account_validity_handler()
self._macaroon_generator = hs.get_macaroon_generator()
self._track_appservice_user_ips = hs.config.appservice.track_appservice_user_ips
self._track_puppeted_user_ips = hs.config.api.track_puppeted_user_ips
self._force_tracing_for_users = hs.config.tracing.force_tracing_for_users
@cancellable
async def get_user_by_req(
self,
request: SynapseRequest,
allow_guest: bool = False,
allow_expired: bool = False,
) -> Requester:
"""Get a registered user's ID.
Args:
request: An HTTP request with an access_token query parameter.
allow_guest: If False, will raise an AuthError if the user making the
request is a guest.
allow_expired: If True, allow the request through even if the account
is expired, or session token lifetime has ended. Note that
/login will deliver access tokens regardless of expiration.
Returns:
Resolves to the requester
Raises:
InvalidClientCredentialsError if no user by that token exists or the token
is invalid.
AuthError if access is denied for the user in the access token
"""
parent_span = active_span()
with start_active_span("get_user_by_req"):
requester = await self._wrapped_get_user_by_req(
request, allow_guest, allow_expired
)
if parent_span:
if requester.authenticated_entity in self._force_tracing_for_users:
# request tracing is enabled for this user, so we need to force
# tracing on for the parent span (which will be the servlet span).
#
# It's too late for the get_user_by_req span to inherit the setting,
# so we also force it on for that.
force_tracing()
force_tracing(parent_span)
parent_span.set_tag(
"authenticated_entity", requester.authenticated_entity
)
parent_span.set_tag("user_id", requester.user.to_string())
if requester.device_id is not None:
parent_span.set_tag("device_id", requester.device_id)
if requester.app_service is not None:
parent_span.set_tag("appservice_id", requester.app_service.id)
return requester
@cancellable
async def _wrapped_get_user_by_req(
self,
request: SynapseRequest,
allow_guest: bool,
allow_expired: bool,
) -> Requester:
"""Helper for get_user_by_req
Once get_user_by_req has set up the opentracing span, this does the actual work.
"""
try:
ip_addr = request.getClientAddress().host
user_agent = get_request_user_agent(request)
access_token = self.get_access_token_from_request(request)
# First check if it could be a request from an appservice
requester = await self.get_appservice_user(request, access_token)
if not requester:
# If not, it should be from a regular user
requester = await self.get_user_by_access_token(
access_token, allow_expired=allow_expired
)
# Deny the request if the user account has expired.
# This check is only done for regular users, not appservice ones.
if not allow_expired:
if await self._account_validity_handler.is_user_expired(
requester.user.to_string()
):
# Raise the error if either an account validity module has determined
# the account has expired, or the legacy account validity
# implementation is enabled and determined the account has expired
raise AuthError(
403,
"User account has expired",
errcode=Codes.EXPIRED_ACCOUNT,
)
if ip_addr and (
not requester.app_service or self._track_appservice_user_ips
):
# XXX(quenting): I'm 95% confident that we could skip setting the
# device_id to "dummy-device" for appservices, and that the only impact
# would be some rows which would not deduplicate in the 'user_ips'
# table during the transition
recorded_device_id = (
"dummy-device"
if requester.device_id is None and requester.app_service is not None
else requester.device_id
)
await self.store.insert_client_ip(
user_id=requester.authenticated_entity,
access_token=access_token,
ip=ip_addr,
user_agent=user_agent,
device_id=recorded_device_id,
)
# Also track the puppeted user's client IP if enabled and the user is puppeting
if (
requester.user.to_string() != requester.authenticated_entity
and self._track_puppeted_user_ips
):
await self.store.insert_client_ip(
user_id=requester.user.to_string(),
access_token=access_token,
ip=ip_addr,
user_agent=user_agent,
device_id=requester.device_id,
)
if requester.is_guest and not allow_guest:
raise AuthError(
403,
"Guest access not allowed",
errcode=Codes.GUEST_ACCESS_FORBIDDEN,
)
request.requester = requester
return requester
except KeyError:
raise MissingClientTokenError()
async def get_user_by_access_token(
self,
token: str,
allow_expired: bool = False,
) -> Requester:
"""Validate access token and get user_id from it
Args:
token: The access token to get the user by
allow_expired: If False, raises an InvalidClientTokenError
if the token is expired
Raises:
InvalidClientTokenError if a user by that token exists, but the token is
expired
InvalidClientCredentialsError if no user by that token exists or the token
is invalid
"""
# First look in the database to see if the access token is present
# as an opaque token.
user_info = await self.store.get_user_by_access_token(token)
if user_info:
valid_until_ms = user_info.valid_until_ms
if (
not allow_expired
and valid_until_ms is not None
and valid_until_ms < self.clock.time_msec()
):
# there was a valid access token, but it has expired.
# soft-logout the user.
raise InvalidClientTokenError(
msg="Access token has expired", soft_logout=True
)
# Mark the token as used. This is used to invalidate old refresh
# tokens after some time.
await self.store.mark_access_token_as_used(user_info.token_id)
requester = create_requester(
user_id=user_info.user_id,
access_token_id=user_info.token_id,
is_guest=user_info.is_guest,
shadow_banned=user_info.shadow_banned,
device_id=user_info.device_id,
authenticated_entity=user_info.token_owner,
)
return requester
# If the token isn't found in the database, then it could still be a
# macaroon for a guest, so we check that here.
try:
user_id = self._macaroon_generator.verify_guest_token(token)
# Guest access tokens are not stored in the database (there can
# only be one access token per guest, anyway).
#
# In order to prevent guest access tokens being used as regular
# user access tokens (and hence getting around the invalidation
# process), we look up the user id and check that it is indeed
# a guest user.
#
# It would of course be much easier to store guest access
# tokens in the database as well, but that would break existing
# guest tokens.
stored_user = await self.store.get_user_by_id(user_id)
if not stored_user:
raise InvalidClientTokenError("Unknown user_id %s" % user_id)
if not stored_user["is_guest"]:
raise InvalidClientTokenError(
"Guest access token used for regular user"
)
return create_requester(
user_id=user_id,
is_guest=True,
# all guests get the same device id
device_id=GUEST_DEVICE_ID,
authenticated_entity=user_id,
)
except (
pymacaroons.exceptions.MacaroonException,
TypeError,
ValueError,
) as e:
logger.warning(
"Invalid access token in auth: %s %s.",
type(e),
e,
)
raise InvalidClientTokenError("Invalid access token passed.")
async def is_server_admin(self, requester: Requester) -> bool:
"""Check if the given user is a local server admin.
Args:
requester: The user making the request, according to the access token.
Returns:
True if the user is an admin
"""
return await self.store.is_server_admin(requester.user)
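For context (not part of this diff), a minimal sketch of how a REST servlet consumes this auth flow; the servlet class and path are illustrative, while the real whoami endpoint lives in synapse.rest.client.account:
from typing import TYPE_CHECKING, Tuple
from synapse.http.servlet import RestServlet
from synapse.http.site import SynapseRequest
from synapse.rest.client._base import client_patterns
from synapse.types import JsonDict
if TYPE_CHECKING:
    from synapse.server import HomeServer
class ExampleWhoAmIServlet(RestServlet):
    PATTERNS = client_patterns("/examplewhoami$")
    def __init__(self, hs: "HomeServer"):
        super().__init__()
        self.auth = hs.get_auth()
    async def on_GET(self, request: SynapseRequest) -> Tuple[int, JsonDict]:
        # get_user_by_req raises MissingClientTokenError /
        # InvalidClientTokenError on bad credentials, so the happy path
        # below can assume a valid requester.
        requester = await self.auth.get_user_by_req(request, allow_guest=True)
        return 200, {"user_id": requester.user.to_string()}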

View File

@@ -0,0 +1,352 @@
# Copyright 2023 The Matrix.org Foundation.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import logging
from typing import TYPE_CHECKING, Any, Dict, List, Optional
from urllib.parse import urlencode
from authlib.oauth2 import ClientAuth
from authlib.oauth2.auth import encode_client_secret_basic, encode_client_secret_post
from authlib.oauth2.rfc7523 import ClientSecretJWT, PrivateKeyJWT, private_key_jwt_sign
from authlib.oauth2.rfc7662 import IntrospectionToken
from authlib.oidc.discovery import OpenIDProviderMetadata, get_well_known_url
from twisted.web.client import readBody
from twisted.web.http_headers import Headers
from synapse.api.auth.base import BaseAuth
from synapse.api.errors import (
AuthError,
HttpResponseException,
InvalidClientTokenError,
OAuthInsufficientScopeError,
StoreError,
SynapseError,
)
from synapse.http.site import SynapseRequest
from synapse.logging.context import make_deferred_yieldable
from synapse.types import Requester, UserID, create_requester
from synapse.util import json_decoder
from synapse.util.caches.cached_call import RetryOnExceptionCachedCall
if TYPE_CHECKING:
from synapse.server import HomeServer
logger = logging.getLogger(__name__)
# Scope as defined by MSC2967
# https://github.com/matrix-org/matrix-spec-proposals/pull/2967
SCOPE_MATRIX_API = "urn:matrix:org.matrix.msc2967.client:api:*"
SCOPE_MATRIX_GUEST = "urn:matrix:org.matrix.msc2967.client:api:guest"
SCOPE_MATRIX_DEVICE_PREFIX = "urn:matrix:org.matrix.msc2967.client:device:"
# Scope which allows access to the Synapse admin API
SCOPE_SYNAPSE_ADMIN = "urn:synapse:admin:*"
def scope_to_list(scope: str) -> List[str]:
"""Convert a scope string to a list of scope tokens"""
return scope.strip().split(" ")
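For example, a space-separated OAuth2 scope string splits into its component tokens:
assert scope_to_list(
    "urn:matrix:org.matrix.msc2967.client:api:* urn:synapse:admin:*"
) == ["urn:matrix:org.matrix.msc2967.client:api:*", "urn:synapse:admin:*"]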
class PrivateKeyJWTWithKid(PrivateKeyJWT): # type: ignore[misc]
"""An implementation of the private_key_jwt client auth method that includes a kid header.
This is needed because some providers (Keycloak) require the kid header to figure
out which key to use to verify the signature.
"""
def sign(self, auth: Any, token_endpoint: str) -> bytes:
return private_key_jwt_sign(
auth.client_secret,
client_id=auth.client_id,
token_endpoint=token_endpoint,
claims=self.claims,
header={"kid": auth.client_secret["kid"]},
)
class MSC3861DelegatedAuth(BaseAuth):
AUTH_METHODS = {
"client_secret_post": encode_client_secret_post,
"client_secret_basic": encode_client_secret_basic,
"client_secret_jwt": ClientSecretJWT(),
"private_key_jwt": PrivateKeyJWTWithKid(),
}
EXTERNAL_ID_PROVIDER = "oauth-delegated"
def __init__(self, hs: "HomeServer"):
super().__init__(hs)
self._config = hs.config.experimental.msc3861
auth_method = MSC3861DelegatedAuth.AUTH_METHODS.get(
self._config.client_auth_method.value, None
)
# Those assertions are already checked when parsing the config
assert self._config.enabled, "OAuth delegation is not enabled"
assert self._config.issuer, "No issuer provided"
assert self._config.client_id, "No client_id provided"
assert auth_method is not None, "Invalid client_auth_method provided"
self._http_client = hs.get_proxied_http_client()
self._hostname = hs.hostname
self._admin_token = self._config.admin_token
self._issuer_metadata = RetryOnExceptionCachedCall(self._load_metadata)
if isinstance(auth_method, PrivateKeyJWTWithKid):
# Use the JWK as the client secret when using the private_key_jwt method
assert self._config.jwk, "No JWK provided"
self._client_auth = ClientAuth(
self._config.client_id, self._config.jwk, auth_method
)
else:
# Else use the client secret
assert self._config.client_secret, "No client_secret provided"
self._client_auth = ClientAuth(
self._config.client_id, self._config.client_secret, auth_method
)
async def _load_metadata(self) -> OpenIDProviderMetadata:
if self._config.issuer_metadata is not None:
return OpenIDProviderMetadata(**self._config.issuer_metadata)
url = get_well_known_url(self._config.issuer, external=True)
response = await self._http_client.get_json(url)
metadata = OpenIDProviderMetadata(**response)
# metadata.validate_introspection_endpoint()
return metadata
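For reference, the discovery URL resolves to the issuer's /.well-known/openid-configuration document, and the only field _introspect_token below strictly needs from it is the introspection endpoint. A minimal illustrative metadata document (values are assumptions, not from this diff):
# Illustrative only: the smallest issuer metadata that would satisfy
# _introspect_token; a real document carries many more fields.
example_metadata = {
    "issuer": "https://auth.example.com/",
    "introspection_endpoint": "https://auth.example.com/oauth2/introspect",
}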
async def _introspect_token(self, token: str) -> IntrospectionToken:
"""
Send a token to the introspection endpoint and return the introspection response
Args:
token: The token to introspect
Raises:
HttpResponseException: If the introspection endpoint returns a non-2xx response
ValueError: If the introspection endpoint returns an invalid JSON response
JSONDecodeError: If the introspection endpoint returns a non-JSON response
Exception: If the HTTP request fails
Returns:
The introspection response
"""
metadata = await self._issuer_metadata.get()
introspection_endpoint = metadata.get("introspection_endpoint")
raw_headers: Dict[str, str] = {
"Content-Type": "application/x-www-form-urlencoded",
"User-Agent": str(self._http_client.user_agent, "utf-8"),
"Accept": "application/json",
}
args = {"token": token, "token_type_hint": "access_token"}
body = urlencode(args, True)
# Fill the body/headers with credentials
uri, raw_headers, body = self._client_auth.prepare(
method="POST", uri=introspection_endpoint, headers=raw_headers, body=body
)
headers = Headers({k: [v] for (k, v) in raw_headers.items()})
# Do the actual request
# We're not using the SimpleHttpClient util methods as we don't want to
# check the HTTP status code, and we do the body encoding ourselves.
response = await self._http_client.request(
method="POST",
uri=uri,
data=body.encode("utf-8"),
headers=headers,
)
resp_body = await make_deferred_yieldable(readBody(response))
if response.code < 200 or response.code >= 300:
raise HttpResponseException(
response.code,
response.phrase.decode("ascii", errors="replace"),
resp_body,
)
resp = json_decoder.decode(resp_body.decode("utf-8"))
if not isinstance(resp, dict):
raise ValueError(
"The introspection endpoint returned an invalid JSON response."
)
return IntrospectionToken(**resp)
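For reference, an RFC 7662 introspection response for an active token might look like the dict below (values illustrative); get_user_by_access_token further down keys off active, scope, sub and username:
example_introspection_response = {
    "active": True,
    "scope": (
        "urn:matrix:org.matrix.msc2967.client:api:* "
        "urn:matrix:org.matrix.msc2967.client:device:ABCDEFGH"
    ),
    "sub": "01H6YB2K8Q0FYXE1F6AGY3TQWD",
    "username": "alice",
}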
async def is_server_admin(self, requester: Requester) -> bool:
return "urn:synapse:admin:*" in requester.scope
async def get_user_by_req(
self,
request: SynapseRequest,
allow_guest: bool = False,
allow_expired: bool = False,
) -> Requester:
access_token = self.get_access_token_from_request(request)
requester = await self.get_appservice_user(request, access_token)
if not requester:
# TODO: we probably want to assert the allow_guest inside this call
# so that we don't provision the user if they don't have enough permission:
requester = await self.get_user_by_access_token(access_token, allow_expired)
if not allow_guest and requester.is_guest:
raise OAuthInsufficientScopeError([SCOPE_MATRIX_API])
request.requester = requester
return requester
async def get_user_by_access_token(
self,
token: str,
allow_expired: bool = False,
) -> Requester:
if self._admin_token is not None and token == self._admin_token:
# XXX: This is a temporary solution so that the admin API can be called by
# the OIDC provider. This will be removed once we have OIDC client
# credentials grant support in matrix-authentication-service.
logger.info("Admin token used")
# XXX: that user doesn't exist and won't be provisioned.
# This is mostly fine for admin calls, but we should also think about
# supporting requesters without a user_id.
admin_user = UserID("__oidc_admin", self._hostname)
return create_requester(
user_id=admin_user,
scope=["urn:synapse:admin:*"],
)
try:
introspection_result = await self._introspect_token(token)
except Exception:
logger.exception("Failed to introspect token")
raise SynapseError(503, "Unable to introspect the access token")
logger.info("Introspection result: %r", introspection_result)
# TODO: introspection verification should be more extensive, especially:
# - verify the audience
if not introspection_result.get("active"):
raise InvalidClientTokenError("Token is not active")
# Let's look at the scope
scope: List[str] = scope_to_list(introspection_result.get("scope", ""))
# Determine type of user based on presence of particular scopes
has_user_scope = SCOPE_MATRIX_API in scope
has_guest_scope = SCOPE_MATRIX_GUEST in scope
if not has_user_scope and not has_guest_scope:
raise InvalidClientTokenError("No scope in token granting user rights")
# Match via the sub claim
sub: Optional[str] = introspection_result.get("sub")
if sub is None:
raise InvalidClientTokenError(
"Invalid sub claim in the introspection result"
)
user_id_str = await self.store.get_user_by_external_id(
MSC3861DelegatedAuth.EXTERNAL_ID_PROVIDER, sub
)
if user_id_str is None:
# If we could not find a user via the external_id, it either does not exist,
# or the external_id was never recorded
# TODO: claim mapping should be configurable
username: Optional[str] = introspection_result.get("username")
if username is None or not isinstance(username, str):
raise AuthError(
500,
"Invalid username claim in the introspection result",
)
user_id = UserID(username, self._hostname)
# First try to find a user from the username claim
user_info = await self.store.get_userinfo_by_id(user_id=user_id.to_string())
if user_info is None:
# If the user does not exist, we should create it on the fly
# TODO: we could use SCIM to provision users ahead of time and listen
# for SCIM SET events if those ever become standard:
# https://datatracker.ietf.org/doc/html/draft-hunt-scim-notify-00
# TODO: claim mapping should be configurable
# If present, use the name claim as the displayname
name: Optional[str] = introspection_result.get("name")
await self.store.register_user(
user_id=user_id.to_string(), create_profile_with_displayname=name
)
# And record the sub as external_id
await self.store.record_user_external_id(
MSC3861DelegatedAuth.EXTERNAL_ID_PROVIDER, sub, user_id.to_string()
)
else:
user_id = UserID.from_string(user_id_str)
# Find device_ids in scope
# We only allow a single device_id in the scope, so we find them all in the
# scope list, and raise if there are more than one. The OIDC server should be
# the one enforcing valid scopes, so we raise a 500 if we find an invalid scope.
device_ids = [
tok[len(SCOPE_MATRIX_DEVICE_PREFIX) :]
for tok in scope
if tok.startswith(SCOPE_MATRIX_DEVICE_PREFIX)
]
if len(device_ids) > 1:
raise AuthError(
500,
"Multiple device IDs in scope",
)
device_id = device_ids[0] if device_ids else None
if device_id is not None:
# Sanity check the device_id
if len(device_id) > 255 or len(device_id) < 1:
raise AuthError(
500,
"Invalid device ID in scope",
)
# Create the device on the fly if it does not exist
try:
await self.store.get_device(
user_id=user_id.to_string(), device_id=device_id
)
except StoreError:
await self.store.store_device(
user_id=user_id.to_string(),
device_id=device_id,
initial_device_display_name="OIDC-native client",
)
# TODO: there are a few things missing in the requester here, which still need
# to be figured out, like:
# - impersonation, with the `authenticated_entity`, which is used for
# rate-limiting, MAU limits, etc.
# - shadow-banning, with the `shadow_banned` flag
# - a proper solution for appservices, which still needs to be figured out in
# the context of MSC3861
return create_requester(
user_id=user_id,
device_id=device_id,
scope=scope,
is_guest=(has_guest_scope and not has_user_scope),
)
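Putting the scope handling together: a token whose introspection result carries the two MSC2967 scopes from the example above, with sub/username resolving to @alice:example.com, ends up roughly as (illustrative outcome, not part of this diff):
create_requester(
    user_id=UserID("alice", "example.com"),
    device_id="ABCDEFGH",  # the device scope token, prefix stripped
    scope=[
        "urn:matrix:org.matrix.msc2967.client:api:*",
        "urn:matrix:org.matrix.msc2967.client:device:ABCDEFGH",
    ],
    is_guest=False,  # api:* is present, the guest scope is not
)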

View File

@@ -119,14 +119,20 @@ class Codes(str, Enum):
class CodeMessageException(RuntimeError):
"""An exception with integer code and message string attributes.
"""An exception with integer code, a message string attributes and optional headers.
Attributes:
code: HTTP error code
msg: string describing the error
headers: optional response headers to send
"""
def __init__(self, code: Union[int, HTTPStatus], msg: str):
def __init__(
self,
code: Union[int, HTTPStatus],
msg: str,
headers: Optional[Dict[str, str]] = None,
):
super().__init__("%d: %s" % (code, msg))
# Some calls to this method pass instances of http.HTTPStatus for `code`.
@@ -137,6 +143,7 @@ class CodeMessageException(RuntimeError):
# To eliminate this behaviour, we convert them to their integer equivalents here.
self.code = int(code)
self.msg = msg
self.headers = headers
class RedirectException(CodeMessageException):
@@ -182,6 +189,7 @@ class SynapseError(CodeMessageException):
msg: str,
errcode: str = Codes.UNKNOWN,
additional_fields: Optional[Dict] = None,
headers: Optional[Dict[str, str]] = None,
):
"""Constructs a synapse error.
@@ -190,7 +198,7 @@ class SynapseError(CodeMessageException):
msg: The human-readable error message.
errcode: The Matrix error code, e.g. 'M_FORBIDDEN'
"""
super().__init__(code, msg)
super().__init__(code, msg, headers)
self.errcode = errcode
if additional_fields is None:
self._additional_fields: Dict = {}
@@ -335,6 +343,20 @@ class AuthError(SynapseError):
super().__init__(code, msg, errcode, additional_fields)
class OAuthInsufficientScopeError(SynapseError):
"""An error raised when the caller does not have sufficient scope to perform the requested action"""
def __init__(
self,
required_scopes: List[str],
):
headers = {
"WWW-Authenticate": 'Bearer error="insufficient_scope", scope="%s"'
% (" ".join(required_scopes))
}
super().__init__(401, "Insufficient scope", Codes.FORBIDDEN, None, headers)
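Concretely, raising OAuthInsufficientScopeError([SCOPE_MATRIX_API]) produces a 401 response which, assuming the HTTP layer forwards CodeMessageException.headers (wiring not shown in this hunk), carries:
WWW-Authenticate: Bearer error="insufficient_scope", scope="urn:matrix:org.matrix.msc2967.client:api:*"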
class UnstableSpecAuthError(AuthError):
"""An error raised when a new error code is being proposed to replace a previous one.
This error will return a "org.matrix.unstable.errcode" property with the new error code,

View File

@@ -128,20 +128,7 @@ USER_FILTER_SCHEMA = {
"account_data": {"$ref": "#/definitions/filter"},
"room": {"$ref": "#/definitions/room_filter"},
"event_format": {"type": "string", "enum": ["client", "federation"]},
"event_fields": {
"type": "array",
"items": {
"type": "string",
# Don't allow '\\' in event field filters. This makes matching
# events a lot easier as we can then use a negative lookbehind
# assertion to split '\.' If we allowed \\ then it would
# incorrectly split '\\.' See synapse.events.utils.serialize_event
#
# Note that because this is a regular expression, we have to escape
# each backslash in the pattern.
"pattern": r"^((?!\\\\).)*$",
},
},
"event_fields": {"type": "array", "items": {"type": "string"}},
},
"additionalProperties": True, # Allow new fields for forward compatibility
}
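For context, event_fields trims served events down to the listed dot-separated paths; an illustrative user filter under the now-relaxed schema (values are examples, not from this diff):
example_user_filter = {
    "event_format": "client",
    "event_fields": ["type", "content.body"],
}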
@@ -165,9 +152,9 @@ class Filtering:
self.DEFAULT_FILTER_COLLECTION = FilterCollection(hs, {})
async def get_user_filter(
self, user_localpart: str, filter_id: Union[int, str]
self, user_id: UserID, filter_id: Union[int, str]
) -> "FilterCollection":
result = await self.store.get_user_filter(user_localpart, filter_id)
result = await self.store.get_user_filter(user_id, filter_id)
return FilterCollection(self._hs, result)
def add_user_filter(self, user_id: UserID, user_filter: JsonDict) -> Awaitable[int]:

View File

@@ -96,11 +96,15 @@ class RoomVersion:
msc2716_historical: bool
# MSC2716: Adds support for redacting "insertion", "chunk", and "marker" events
msc2716_redactions: bool
# MSC3389: Protect relation information from redaction.
msc3389_relation_redactions: bool
# MSC3787: Adds support for a `knock_restricted` join rule, mixing concepts of
# knocks and restricted join rules into the same join condition.
msc3787_knock_restricted_join_rule: bool
# MSC3667: Enforce integer power levels
msc3667_int_only_power_levels: bool
# MSC3821: Do not redact the third_party_invite content field for membership events.
msc3821_redaction_rules: bool
# MSC3931: Adds a push rule condition for "room version feature flags", making
# some push rules room version dependent. Note that adding a flag to this list
# is not enough to mark it "supported": the push rule evaluator also needs to
@@ -128,8 +132,10 @@ class RoomVersions:
msc2403_knocking=False,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -149,8 +155,10 @@ class RoomVersions:
msc2403_knocking=False,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -170,8 +178,10 @@ class RoomVersions:
msc2403_knocking=False,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -191,8 +201,10 @@ class RoomVersions:
msc2403_knocking=False,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -212,8 +224,10 @@ class RoomVersions:
msc2403_knocking=False,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -233,8 +247,10 @@ class RoomVersions:
msc2403_knocking=False,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -254,8 +270,10 @@ class RoomVersions:
msc2403_knocking=False,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -275,8 +293,10 @@ class RoomVersions:
msc2403_knocking=True,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -296,8 +316,10 @@ class RoomVersions:
msc2403_knocking=True,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -317,8 +339,10 @@ class RoomVersions:
msc2403_knocking=True,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -338,8 +362,33 @@ class RoomVersions:
msc2403_knocking=True,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=True,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
MSC3821 = RoomVersion(
"org.matrix.msc3821.opt1",
RoomDisposition.UNSTABLE,
EventFormatVersions.ROOM_V4_PLUS,
StateResolutionVersions.V2,
enforce_key_validity=True,
special_case_aliases_auth=False,
strict_canonicaljson=True,
limit_notifications_power_levels=True,
msc2175_implicit_room_creator=False,
msc2176_redaction_rules=False,
msc3083_join_rules=True,
msc3375_redaction_rules=True,
msc2403_knocking=True,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=True,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -359,8 +408,10 @@ class RoomVersions:
msc2403_knocking=True,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=True,
msc3667_int_only_power_levels=True,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -380,8 +431,10 @@ class RoomVersions:
msc2403_knocking=True,
msc2716_historical=True,
msc2716_redactions=True,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=False,
msc3667_int_only_power_levels=False,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=False,
)
@@ -402,8 +455,10 @@ class RoomVersions:
msc2403_knocking=True,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=True,
msc3667_int_only_power_levels=True,
msc3821_redaction_rules=False,
msc3931_push_features=(PushRuleRoomFlag.EXTENSIBLE_EVENTS,),
msc3989_redaction_rules=False,
)
@@ -423,11 +478,37 @@ class RoomVersions:
msc2403_knocking=True,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=True,
msc3667_int_only_power_levels=True,
msc3821_redaction_rules=False,
msc3931_push_features=(),
msc3989_redaction_rules=True,
)
MSC3820opt2 = RoomVersion(
# Based upon v10
"org.matrix.msc3820.opt2",
RoomDisposition.UNSTABLE,
EventFormatVersions.ROOM_V4_PLUS,
StateResolutionVersions.V2,
enforce_key_validity=True,
special_case_aliases_auth=False,
strict_canonicaljson=True,
limit_notifications_power_levels=True,
msc2175_implicit_room_creator=True, # Used by MSC3820
msc2176_redaction_rules=True, # Used by MSC3820
msc3083_join_rules=True,
msc3375_redaction_rules=True,
msc2403_knocking=True,
msc2716_historical=False,
msc2716_redactions=False,
msc3389_relation_redactions=False,
msc3787_knock_restricted_join_rule=True,
msc3667_int_only_power_levels=True,
msc3821_redaction_rules=True, # Used by MSC3820
msc3931_push_features=(),
msc3989_redaction_rules=True, # Used by MSC3820
)
KNOWN_ROOM_VERSIONS: Dict[str, RoomVersion] = {
@@ -447,6 +528,7 @@ KNOWN_ROOM_VERSIONS: Dict[str, RoomVersion] = {
RoomVersions.V10,
RoomVersions.MSC2716v4,
RoomVersions.MSC3989,
RoomVersions.MSC3820opt2,
)
}
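As a usage sketch (not part of this diff), the feature flags added here are read as plain attributes on a RoomVersion:
from synapse.api.room_versions import RoomVersions
# The MSC3820 opt2 experiment switches on the MSC3821 redaction rules,
# per the block added above; stable versions leave the flag False.
assert RoomVersions.MSC3820opt2.msc3821_redaction_rules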

View File

@@ -21,6 +21,7 @@ import socket
import sys
import traceback
import warnings
from textwrap import indent
from typing import (
TYPE_CHECKING,
Any,
@@ -212,8 +213,12 @@ def handle_startup_exception(e: Exception) -> NoReturn:
# Exceptions that occur between setting up the logging and forking or starting
# the reactor are written to the logs, followed by a summary to stderr.
logger.exception("Exception during startup")
error_string = "".join(traceback.format_exception(type(e), e, e.__traceback__))
indented_error_string = indent(error_string, " ")
quit_with_error(
f"Error during initialisation:\n {e}\nThere may be more information in the logs."
f"Error during initialisation:\n{indented_error_string}\nThere may be more information in the logs."
)
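For reference, textwrap.indent prefixes every line of the string, so a multi-line traceback stays visually grouped under the error banner; a small illustration (the error text is made up):
from textwrap import indent
error_string = "ValueError: bad config\ncaused by: missing key"
# Each line gains the same single-space prefix used in the change above.
print(indent(error_string, " "))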

View File

@@ -64,7 +64,7 @@ from synapse.util.logcontext import LoggingContext
logger = logging.getLogger("synapse.app.admin_cmd")
class AdminCmdSlavedStore(
class AdminCmdStore(
FilteringWorkerStore,
ClientIpWorkerStore,
DeviceWorkerStore,
@@ -103,7 +103,7 @@ class AdminCmdSlavedStore(
class AdminCmdServer(HomeServer):
DATASTORE_CLASS = AdminCmdSlavedStore # type: ignore
DATASTORE_CLASS = AdminCmdStore # type: ignore
async def export_data_command(hs: HomeServer, args: argparse.Namespace) -> None:

View File

@@ -102,7 +102,7 @@ from synapse.util.httpresourcetree import create_resource_tree
logger = logging.getLogger("synapse.app.generic_worker")
class GenericWorkerSlavedStore(
class GenericWorkerStore(
# FIXME(#3714): We need to add UserDirectoryStore as we write directly
# rather than going via the correct worker.
UserDirectoryStore,
@@ -154,7 +154,7 @@ class GenericWorkerSlavedStore(
class GenericWorkerServer(HomeServer):
DATASTORE_CLASS = GenericWorkerSlavedStore # type: ignore
DATASTORE_CLASS = GenericWorkerStore # type: ignore
def _listen_http(self, listener_config: ListenerConfig) -> None:
assert listener_config.http_options is not None

View File

@@ -127,10 +127,6 @@ async def phone_stats_home(
daily_sent_messages = await store.count_daily_sent_messages()
stats["daily_sent_messages"] = daily_sent_messages
r30_results = await store.count_r30_users()
for name, count in r30_results.items():
stats["r30_users_" + name] = count
r30v2_results = await store.count_r30v2_users()
for name, count in r30v2_results.items():
stats["r30v2_users_" + name] = count

View File

@@ -86,6 +86,7 @@ class ApplicationService:
url.rstrip("/") if isinstance(url, str) else None
) # url must not end with a slash
self.hs_token = hs_token
# The full Matrix ID for this application service's sender.
self.sender = sender
self.namespaces = self._check_namespaces(namespaces)
self.id = id
@@ -212,7 +213,7 @@ class ApplicationService:
True if the application service is interested in the user, False if not.
"""
return (
# User is the appservice's sender_localpart user
# User is the appservice's configured sender_localpart user
user_id == self.sender
# User is in the appservice's user namespace
or self.is_user_in_namespace(user_id)

View File

@@ -44,6 +44,7 @@ import jinja2
import pkg_resources
import yaml
from synapse.types import StrSequence
from synapse.util.templates import _create_mxc_to_http_filter, _format_ts_filter
logger = logging.getLogger(__name__)
@@ -58,7 +59,7 @@ class ConfigError(Exception):
the problem lies.
"""
def __init__(self, msg: str, path: Optional[Iterable[str]] = None):
def __init__(self, msg: str, path: Optional[StrSequence] = None):
self.msg = msg
self.path = path
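A small sketch of constructing a ConfigError with a path under the new annotation (the config keys are illustrative); StrSequence admits both lists and tuples of strings:
from synapse.config._base import ConfigError
err = ConfigError(
    "expected a positive integer",
    path=("caches", "global_factor"),  # a tuple now type-checks cleanly
)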

View File

@@ -61,9 +61,10 @@ from synapse.config import ( # noqa: F401
voip,
workers,
)
from synapse.types import StrSequence
class ConfigError(Exception):
def __init__(self, msg: str, path: Optional[Iterable[str]] = None):
def __init__(self, msg: str, path: Optional[StrSequence] = None):
self.msg = msg
self.path = path

View File

@@ -11,17 +11,17 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from typing import Any, Dict, Iterable, Type, TypeVar
from typing import Any, Dict, Type, TypeVar
import jsonschema
from pydantic import BaseModel, ValidationError, parse_obj_as
from synapse.config._base import ConfigError
from synapse.types import JsonDict
from synapse.types import JsonDict, StrSequence
def validate_config(
json_schema: JsonDict, config: Any, config_path: Iterable[str]
json_schema: JsonDict, config: Any, config_path: StrSequence
) -> None:
"""Validates a config setting against a JsonSchema definition
@@ -45,7 +45,7 @@ def validate_config(
def json_error_to_config_error(
e: jsonschema.ValidationError, config_path: Iterable[str]
e: jsonschema.ValidationError, config_path: StrSequence
) -> ConfigError:
"""Converts a json validation error to a user-readable ConfigError

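A usage sketch of the updated signature (the schema, config, and path values are illustrative; the module path is assumed from the file shown in this hunk):
from synapse.config._util import validate_config
schema = {"type": "object", "properties": {"enabled": {"type": "boolean"}}}
# Passes silently; a failing config raises ConfigError with the given path.
validate_config(schema, {"enabled": True}, config_path=("my_module",))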
Some files were not shown because too many files have changed in this diff.