
Compare commits


69 Commits

Author SHA1 Message Date
Andrew Morgan
1e64f9c255 Add support for devenv developer environments 2023-01-18 15:36:35 +00:00
Andrew Morgan
f4d2a734f9 Remove outdated commands from the code style doc & point to the contributing guide. (#14773) 2023-01-11 15:21:12 +00:00
reivilibre
5172c8c403 Faster remote room joins (worker mode): do not populate external hosts-in-room cache when sending events as this requires blocking for full state. (#14749)
Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>
Co-authored-by: Sean Quah <seanq@matrix.org>
2023-01-11 13:21:53 +00:00
Patrick Cloke
7f2cabf271 Fix-up type hints for tests.push module. (#14816) 2023-01-11 07:35:40 -05:00
reivilibre
d6bda5addd Add index to improve performance of the /timestamp_to_event endpoint used for jumping to a specific date in the timeline of a room. (#14799) 2023-01-11 12:29:13 +00:00
Dirk Klimpel
73f097888e Add listener health (#14747)
Fixes: #8780
2023-01-11 12:00:38 +00:00
Andrew Morgan
7b3a8f2b0c Add poetry.toml to .gitignore (#14807) 2023-01-11 11:44:13 +00:00
Dirk Klimpel
bc7ca704dd Add tag to listeners documentation (#14803)
* Add `tag` to `listeners` documentation

* newsfile
2023-01-11 10:47:44 +00:00
Richard van der Hoff
06ab64f201 Implement MSC3925: changes to bundling of edits (#14811)
Two parts to this:

 * Bundle the whole of the replacement with any edited events. This is backwards-compatible so I haven't put it behind a flag.
 * Optionally, inhibit server-side replacement of edited events. This has scope to break things, so it is currently disabled by default.
2023-01-10 16:31:28 +00:00
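A minimal sketch of what this bundling means for a client-visible event, assuming the MSC3925 field layout (taken from the proposal, not verified against Synapse's actual output):

```python
# Illustrative only: an edited event with the full replacement bundled under
# unsigned["m.relations"]["m.replace"], per MSC3925. Previously only a stub
# (event_id / origin_server_ts / sender) was bundled there.
edited_event = {
    "event_id": "$original_event",
    "type": "m.room.message",
    "content": {"msgtype": "m.text", "body": "helo"},
    "unsigned": {
        "m.relations": {
            "m.replace": {
                "event_id": "$edit_event",
                "type": "m.room.message",
                "sender": "@alice:example.org",
                "content": {
                    "msgtype": "m.text",
                    "body": "* hello",
                    "m.new_content": {"msgtype": "m.text", "body": "hello"},
                },
            }
        }
    },
}
```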
David Robertson
f417fb84b8 Update changelog 2 2023-01-10 12:30:01 +00:00
David Robertson
e5c01272a7 Update changelog 2023-01-10 12:26:19 +00:00
David Robertson
9a4c69f59f 1.75.0rc1 2023-01-10 12:18:50 +00:00
reivilibre
ba4ea7d13f Batch up replication requests to request the resyncing of remote users' devices. (#14716) 2023-01-10 11:17:59 +00:00
Dirk Klimpel
3479599387 Add missing worker settings to shared configuration (#14748)
* Add missing worker settings to shared configuration

* newsfile

* update docs after review

* more update for doc

* This -> These

Co-authored-by: David Robertson <david.m.robertson1@gmail.com>
2023-01-09 18:35:19 +00:00
Andrew Morgan
54a7228fa6 Skip testing pypy-3.7-linux wheels as we don't have openssl 3.x on manylinux2014 (#14802) 2023-01-09 17:51:37 +00:00
Jeyachandran Rathnam
58d2adc3da Remove undocumented device from pushrules (#14727)
* Remove undocumented device from pushrules

* Add changelog

* Update changelog.d/14727.misc

* Rename 14727.misc to 14727.bugfix

Co-authored-by: David Robertson <davidr@element.io>
2023-01-09 17:17:24 +00:00
Dirk Klimpel
c7b2c31161 Update link to towncrier in contribution guide (#14801)
* Update link to towncrier in contribution guide

* newsfile
2023-01-09 16:33:49 +00:00
David Robertson
c0145b06f5 Fix upgrade notes for installing ICU (#14797)
* Fix upgrade notes for installing ICU

As noticed in https://github.com/matrix-org/synapse/pull/14712/files#r1058433297

* Changelog
2023-01-09 14:43:46 +00:00
Jeyachandran Rathnam
babeeb4e7a Unescape HTML entities in oEmbed titles. (#14781)
It doesn't seem valid that HTML entities should appear in
the title field of oEmbed responses, but a popular WordPress
plug-in seems to do it.

There should be no harm in unescaping these.
2023-01-09 14:22:02 +00:00
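As a rough illustration of the fix (not the actual Synapse code path), unescaping such a title only needs the standard library:

```python
import html

# e.g. a WordPress oEmbed response might return an entity-escaped title
title = "Tom &amp; Jerry &#8211; Episode 1"
clean_title = html.unescape(title)  # "Tom & Jerry – Episode 1"
```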
Patrick Cloke
7e582a25f8 Improve /sync performance when passing filters with empty arrays. (#14786) 2023-01-09 08:43:50 -05:00
This has two related changes:

* It enables fast-path processing for an empty filter (`[]`) which was
  previously only used for wildcard not-filters (`["*"]`).
* It special-cases a `/sync` filter with no rooms to skip all room
  processing; previously we would only partially skip processing and would
  generally still calculate intermediate values for each room which were
  then unused.

Future changes might consider further optimizations:

* Skip calculating per-room account data when all rooms are filtered (currently
  this is thrown away).
* Make similar improvements to other endpoints which support filters.
2023-01-09 08:43:50 -05:00
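A simplified sketch of the fast-path idea (illustrative only; Synapse's real filtering classes differ): an empty allow-list `[]` can never match anything, so it can short-circuit just like the wildcard not-filter `["*"]`:

```python
def filter_matches_nothing(allow_list, not_list):
    """Illustrative check: True if this field filter can never match a value.

    An empty allow-list ([]) excludes everything, as does a wildcard
    not-filter (["*"]); both can skip further processing entirely.
    """
    return allow_list == [] or (not_list is not None and "*" in not_list)


# If the rooms filter matches nothing, /sync can skip per-room processing
# instead of computing intermediate values that are then thrown away.
if filter_matches_nothing(allow_list=[], not_list=None):
    rooms_to_process = []
```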
Jeyachandran Rathnam
5e0888076f Disable sending confirmation email when 3pid is disabled #14682 (#14725)
* Fixes #12277: Disable sending confirmation email when 3pid is disabled

* Fix test_add_email_if_disabled test case to reflect changes to enable_3pid_changes flag

* Add changelog file

* Rename newsfragment.

Co-authored-by: Patrick Cloke <clokep@users.noreply.github.com>
2023-01-09 11:12:03 +00:00
dependabot[bot]
b4de0c63df Bump peaceiris/actions-gh-pages from 3.9.0 to 3.9.1 (#14791)
* Bump peaceiris/actions-gh-pages from 3.9.0 to 3.9.1

Bumps [peaceiris/actions-gh-pages](https://github.com/peaceiris/actions-gh-pages) from 3.9.0 to 3.9.1.
- [Release notes](https://github.com/peaceiris/actions-gh-pages/releases)
- [Changelog](https://github.com/peaceiris/actions-gh-pages/blob/main/CHANGELOG.md)
- [Commits](de7ea6f8ef...64b46b4226)

---
updated-dependencies:
- dependency-name: peaceiris/actions-gh-pages
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-09 10:09:13 +00:00
dependabot[bot]
1438f93948 Bump importlib-metadata from 4.2.0 to 6.0.0 (#14795)
* Bump importlib-metadata from 4.2.0 to 6.0.0

Bumps [importlib-metadata](https://github.com/python/importlib_metadata) from 4.2.0 to 6.0.0.
- [Release notes](https://github.com/python/importlib_metadata/releases)
- [Changelog](https://github.com/python/importlib_metadata/blob/main/CHANGES.rst)
- [Commits](https://github.com/python/importlib_metadata/compare/v4.2.0...v6.0.0)

---
updated-dependencies:
- dependency-name: importlib-metadata
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-09 10:08:50 +00:00
dependabot[bot]
32c2ff8eab Bump ruff from 0.0.206 to 0.0.215 (#14796)
* Bump ruff from 0.0.206 to 0.0.215

Bumps [ruff](https://github.com/charliermarsh/ruff) from 0.0.206 to 0.0.215.
- [Release notes](https://github.com/charliermarsh/ruff/releases)
- [Changelog](https://github.com/charliermarsh/ruff/blob/main/BREAKING_CHANGES.md)
- [Commits](https://github.com/charliermarsh/ruff/compare/v0.0.206...v0.0.215)

---
updated-dependencies:
- dependency-name: ruff
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-09 10:08:43 +00:00
dependabot[bot]
51c8ebec33 Bump types-setuptools from 65.6.0.2 to 65.6.0.3 (#14794)
* Bump types-setuptools from 65.6.0.2 to 65.6.0.3

Bumps [types-setuptools](https://github.com/python/typeshed) from 65.6.0.2 to 65.6.0.3.
- [Release notes](https://github.com/python/typeshed/releases)
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-setuptools
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-09 10:08:03 +00:00
dependabot[bot]
0ae8feee18 Bump pyopenssl from 22.1.0 to 23.0.0 (#14793)
* Bump pyopenssl from 22.1.0 to 23.0.0

Bumps [pyopenssl](https://github.com/pyca/pyopenssl) from 22.1.0 to 23.0.0.
- [Release notes](https://github.com/pyca/pyopenssl/releases)
- [Changelog](https://github.com/pyca/pyopenssl/blob/main/CHANGELOG.rst)
- [Commits](https://github.com/pyca/pyopenssl/compare/22.1.0...23.0.0)

---
updated-dependencies:
- dependency-name: pyopenssl
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-09 10:07:52 +00:00
dependabot[bot]
331797586e Bump types-pillow from 9.3.0.4 to 9.4.0.0 (#14792)
* Bump types-pillow from 9.3.0.4 to 9.4.0.0

Bumps [types-pillow](https://github.com/python/typeshed) from 9.3.0.4 to 9.4.0.0.
- [Release notes](https://github.com/python/typeshed/releases)
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-pillow
  dependency-type: direct:development
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-09 10:07:24 +00:00
reivilibre
1984fc16f1 Use htmltest to check links in the Synapse documentation. (#14743)
* Add htmltest to check links in the documentation

* Newsfile

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>
2023-01-05 18:21:45 +00:00
reivilibre
4eb2f4e02b Fix broken links in the Synapse documentation. (#14744)
* Fix stale external links

* Fix some internal links

* Fix URLs without trailing / where needed

* Fix more links

* Newsfile

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>

* Reapply docs/openid.md fix after conflict

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>
2023-01-05 18:18:00 +00:00
dependabot[bot]
7b642167e6 Bump JasonEtco/create-an-issue from 2.8.2 to 2.9.1 (#14731)
* Bump JasonEtco/create-an-issue from 2.8.2 to 2.9.1

Bumps [JasonEtco/create-an-issue](https://github.com/JasonEtco/create-an-issue) from 2.8.2 to 2.9.1.
- [Release notes](https://github.com/JasonEtco/create-an-issue/releases)
- [Commits](3a8ba79651...e27dddc79c)

---
updated-dependencies:
- dependency-name: JasonEtco/create-an-issue
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
Co-authored-by: Mathieu Velten <mathieuv@matrix.org>
2023-01-05 10:10:43 +00:00
dependabot[bot]
70961911a8 Bump dawidd6/action-download-artifact from 2.24.2 to 2.24.3 (#14779)
* Bump dawidd6/action-download-artifact from 2.24.2 to 2.24.3

Bumps [dawidd6/action-download-artifact](https://github.com/dawidd6/action-download-artifact) from 2.24.2 to 2.24.3.
- [Release notes](https://github.com/dawidd6/action-download-artifact/releases)
- [Commits](e6e25ac3a2...bd10f381a9)

---
updated-dependencies:
- dependency-name: dawidd6/action-download-artifact
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-05 09:51:09 +00:00
dependabot[bot]
da911e9ddf Bump types-requests from 2.28.11.5 to 2.28.11.7 (#14763)
* Bump types-requests from 2.28.11.5 to 2.28.11.7

Bumps [types-requests](https://github.com/python/typeshed) from 2.28.11.5 to 2.28.11.7.
- [Release notes](https://github.com/python/typeshed/releases)
- [Commits](https://github.com/python/typeshed/commits)

---
updated-dependencies:
- dependency-name: types-requests
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-05 09:50:52 +00:00
dependabot[bot]
bd9ada3860 Bump pillow from 9.3.0 to 9.4.0 (#14762)
* Bump pillow from 9.3.0 to 9.4.0

Bumps [pillow](https://github.com/python-pillow/Pillow) from 9.3.0 to 9.4.0.
- [Release notes](https://github.com/python-pillow/Pillow/releases)
- [Changelog](https://github.com/python-pillow/Pillow/blob/main/CHANGES.rst)
- [Commits](https://github.com/python-pillow/Pillow/compare/9.3.0...9.4.0)

---
updated-dependencies:
- dependency-name: pillow
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-05 09:50:41 +00:00
dependabot[bot]
be26379d00 Bump gitpython from 3.1.29 to 3.1.30 (#14761)
* Bump gitpython from 3.1.29 to 3.1.30

Bumps [gitpython](https://github.com/gitpython-developers/GitPython) from 3.1.29 to 3.1.30.
- [Release notes](https://github.com/gitpython-developers/GitPython/releases)
- [Changelog](https://github.com/gitpython-developers/GitPython/blob/main/CHANGES)
- [Commits](https://github.com/gitpython-developers/GitPython/compare/3.1.29...3.1.30)

---
updated-dependencies:
- dependency-name: gitpython
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
2023-01-05 09:50:14 +00:00
dependabot[bot]
62aa5c514d Bump pydantic from 1.10.2 to 1.10.4 (#14760) 2023-01-05 09:50:03 +00:00
dependabot[bot]
f79ef37b8c Bump ruff from 0.0.189 to 0.0.206 (#14759) 2023-01-05 09:49:50 +00:00
dependabot[bot]
827678196e Bump serde from 1.0.151 to 1.0.152 (#14758) 2023-01-05 09:49:35 +00:00
Patrick Cloke
44b476b26e Document how to use Twitter as an OAuth 2.0 provider. (#14778)
This also alphabetizes the documentation for the various OpenID providers.
2023-01-04 15:00:27 -05:00
Patrick Cloke
630d0aeaf6 Support RFC7636 PKCE in the OAuth 2.0 flow. (#14750)
PKCE can protect against certain attacks and is enabled by default. Support
can be controlled manually by setting the pkce_method of each oidc_providers
entry to 'auto' (default), 'always', or 'never'.

This is required by Twitter OAuth 2.0 support.
2023-01-04 14:58:08 -05:00
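The RFC 7636 mechanics behind this are small; a sketch of the S256 code-challenge derivation using only the standard library (this is the spec's formula, not Synapse's implementation):

```python
import base64
import hashlib
import secrets

# The client keeps the verifier secret, sends only the challenge with the
# authorization request, and reveals the verifier at token-exchange time.
code_verifier = secrets.token_urlsafe(64)
code_challenge = (
    base64.urlsafe_b64encode(hashlib.sha256(code_verifier.encode("ascii")).digest())
    .rstrip(b"=")
    .decode("ascii")
)
```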
Erik Johnston
747f8eb231 Use env vars in GHA dependabot changelog (#14772) 2023-01-04 16:46:25 +00:00
Andrew Morgan
e787fb776c Switch to our fork of dh-virtualenv for compatibility with Python 3.11 (#14774) 2023-01-04 16:26:29 +00:00
Patrick Cloke
906dfaa2cf Support non-OpenID compliant user info endpoints (#14753)
OpenID specifies the format of the user info endpoint and some
OAuth 2.0 IdPs do not follow it, e.g. NextCloud and Twitter.

This adds subject_template and picture_template options to the
default mapping provider for more flexibility in matching those user
info responses.
2023-01-04 08:26:10 -05:00
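As a hedged illustration of how such templates might be applied (the exact template context used by Synapse's mapping provider is an assumption here), Jinja2 can pull the subject and picture out of whatever shape the IdP returns:

```python
from jinja2 import Template

# e.g. Twitter's OAuth 2.0 userinfo endpoint nests fields under "data"
userinfo = {"data": {"id": "12345", "profile_image_url": "https://example.com/p.jpg"}}

# Hypothetical templates mirroring the new subject_template / picture_template options.
subject = Template("{{ user.data.id }}").render(user=userinfo)                 # "12345"
picture = Template("{{ user.data.profile_image_url }}").render(user=userinfo)  # URL above
```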
Nick Mills-Barrett
db1cfe9c80 Update all stream IDs after processing replication rows (#14723)
This creates a new store method, `process_replication_position` that
is called after `process_replication_rows`. By moving stream ID advances
here this guarantees any relevant cache invalidations will have been
applied before the stream is advanced.

This avoids race conditions where Python switches between threads midway
through processing the `process_replication_rows` method, in which case
stream IDs could be advanced before caches are invalidated due to class
resolution ordering.

See this comment/issue for further discussion:
	https://github.com/matrix-org/synapse/issues/14158#issuecomment-1344048703
2023-01-04 11:49:26 +00:00
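A minimal sketch of the ordering this guarantees (not the real store code): side effects such as cache invalidation happen in `process_replication_rows`, and only the separate `process_replication_position` call advances the stream ID afterwards:

```python
class SketchStore:
    """Illustrative only: two-phase handling of replication rows."""

    def __init__(self):
        self._cache = {}
        self._current_stream_pos = 0

    def process_replication_rows(self, stream_name, instance_name, token, rows):
        # Phase 1: apply the rows' effects, e.g. invalidate cache entries.
        for row in rows:
            self._cache.pop(row.entity, None)

    def process_replication_position(self, stream_name, instance_name, token):
        # Phase 2: only now advance the stream ID, so readers never observe an
        # advanced position while stale cache entries are still present.
        self._current_stream_pos = max(self._current_stream_pos, token)
```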
Andrew Morgan
c4456114e1 Add experimental support for MSC3391: deleting account data (#14714) 2023-01-01 03:40:46 +00:00
Patrick Cloke
044fa1a1de Actually use the picture_claim as configured in OIDC config. (#14751)
Previously it was only using the default value ("picture") when
fetching the picture from the user info.
2022-12-29 12:18:06 -05:00
dependabot[bot]
eb9ae47799 Bump attrs from 22.1.0 to 22.2.0 (#14734)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
2022-12-29 11:21:56 +01:00
dependabot[bot]
368ad7c5c7 Bump isort from 5.10.1 to 5.11.4 (#14733)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
2022-12-29 10:49:30 +01:00
dependabot[bot]
8ea6fd8d0b Bump setuptools from 65.3.0 to 65.5.1 (#14738)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
2022-12-29 10:48:39 +01:00
dependabot[bot]
ba2d38f22d Bump black from 22.10.0 to 22.12.0 (#14735)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
2022-12-28 17:53:25 +00:00
dependabot[bot]
ee0e00a200 Bump sentry-sdk from 1.12.0 to 1.12.1 (#14736)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
2022-12-28 17:53:18 +00:00
dependabot[bot]
9aaf27b42a Bump towncrier from 22.8.0 to 22.12.0 (#14732)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
2022-12-28 17:53:11 +00:00
reivilibre
46993770e5 Suppress the update check in the ruff linter. (#14741)
* Suppress update check in ruff

* Newsfile

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>
2022-12-28 17:23:19 +00:00
Vertux
8d20b1ba1e Broken link "request_id_header" (#14740)
* Broken link "request_id_header"

The link above leads to a 404 error.

* Update docs/reverse_proxy.md

Co-authored-by: reivilibre <olivier@librepush.net>
2022-12-28 15:45:28 +00:00
Brendan Abolivier
3854d0f949 Add a cached helper to the module API (#14663) 2022-12-28 13:48:21 +00:00
Patrick Cloke
a4ca770655 Add missing type hints to tests. (#14687)
Adds type hints to tests.metrics and tests.crypto.
2022-12-28 08:29:35 -05:00
Dirk Klimpel
2fb4071c1f Move email to Server section in config file documentation (#14730)
* Move `email` to server in config file documentation

* changelog
2022-12-28 12:17:51 +00:00
Richard van der Hoff
a52822d39c Log to-device msgids when we return them over /sync (#14724) 2022-12-23 14:04:50 +00:00
Jeyachandran Rathnam
5c9be9c760 Check sqlite database file exists before porting. (#14692)
This avoids creating an empty SQLite file if the given path is incorrect.
2022-12-22 13:26:37 -05:00
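A rough sketch of the guard (names and paths are illustrative, not the actual synapse_port_db code): `sqlite3.connect()` silently creates a missing file, so the existence check has to come first:

```python
import os
import sqlite3

def open_existing_sqlite(path: str) -> sqlite3.Connection:
    # Connecting directly would create an empty database file at a mistyped
    # path, which is exactly what the porting script wants to avoid.
    if not os.path.isfile(path):
        raise FileNotFoundError(f"SQLite database file not found: {path}")
    return sqlite3.connect(path)
```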
Patrick Cloke
14abf22dd6 Update docs about ruff vs. flake8. 2022-12-21 13:08:20 -05:00
Patrick Cloke
7010a3d015 Switch to ruff instead of flake8. (#14633)
ruff is a flake8-compatible Python linter written in Rust.
It supports the flake8 plugins that we use and is significantly
faster in testing.
2022-12-21 13:05:21 -05:00
Patrick Cloke
5831bed450 Bump minimum PyYAML to 3.13. (#14720)
PyYAML 3.13 fixes some issues with Python 3.7 compatibility
and was released in 2018.
2022-12-21 12:29:19 -05:00
Olivier Wilkinson (reivilibre)
b624e010f1 (remove no-op changelog entry) 2022-12-21 12:28:55 +00:00
reivilibre
ec656be480 Revert update of hiredis in Poetry lockfile: revert from 2.1.0 to 2.0.0. (#14718)
* Revert "Bump hiredis from 2.0.0 to 2.1.0 (#14699)"

This reverts commit 9c89707b56.

* Newsfile

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>

Signed-off-by: Olivier Wilkinson (reivilibre) <oliverw@matrix.org>
2022-12-21 12:28:13 +00:00
Sean Quah
43c54ba753 Merge branch 'master' into develop 2022-12-20 18:09:30 +00:00
Sean Quah
774e20b570 1.74.0 2022-12-20 16:08:33 +00:00
Mathieu Velten
eb2defc2f7 Add release note and update doc regarding ICU (#14712)
Fixes #14704.

Signed-off-by: Mathieu Velten <mathieuv@matrix.org>
2022-12-20 16:06:26 +00:00
dependabot[bot]
4be998add4 Bump lxml from 4.9.1 to 4.9.2 (#14698)
* Bump lxml from 4.9.1 to 4.9.2

Bumps [lxml](https://github.com/lxml/lxml) from 4.9.1 to 4.9.2.
- [Release notes](https://github.com/lxml/lxml/releases)
- [Changelog](https://github.com/lxml/lxml/blob/master/CHANGES.txt)
- [Commits](https://github.com/lxml/lxml/compare/lxml-4.9.1...lxml-4.9.2)

---
updated-dependencies:
- dependency-name: lxml
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
2022-12-19 16:48:20 +00:00
dependabot[bot]
af347e4d69 Bump serde_json from 1.0.89 to 1.0.91 (#14696)
* Bump serde_json from 1.0.89 to 1.0.91

Bumps [serde_json](https://github.com/serde-rs/json) from 1.0.89 to 1.0.91.
- [Release notes](https://github.com/serde-rs/json/releases)
- [Commits](https://github.com/serde-rs/json/compare/v1.0.89...v1.0.91)

---
updated-dependencies:
- dependency-name: serde_json
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
2022-12-19 16:48:06 +00:00
dependabot[bot]
4de951180d Bump anyhow from 1.0.66 to 1.0.68 (#14694)
* Bump anyhow from 1.0.66 to 1.0.68

Bumps [anyhow](https://github.com/dtolnay/anyhow) from 1.0.66 to 1.0.68.
- [Release notes](https://github.com/dtolnay/anyhow/releases)
- [Commits](https://github.com/dtolnay/anyhow/compare/1.0.66...1.0.68)

---
updated-dependencies:
- dependency-name: anyhow
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

* Changelog

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: GitHub Actions <github-actions[bot]@users.noreply.github.com>
Co-authored-by: reivilibre <oliverw@matrix.org>
2022-12-19 16:47:56 +00:00
136 changed files with 2780 additions and 1658 deletions

.flake8 (18 lines changed)

@@ -1,18 +0,0 @@
# TODO: incorporate this into pyproject.toml if flake8 supports it in the future.
# See https://github.com/PyCQA/flake8/issues/234
[flake8]
# see https://pycodestyle.readthedocs.io/en/latest/intro.html#error-codes
# for error codes. The ones we ignore are:
# W503: line break before binary operator
# W504: line break after binary operator
# E203: whitespace before ':' (which is contrary to pep8?)
# E731: do not assign a lambda expression, use a def
# E501: Line too long (black enforces this for us)
#
# flake8-bugbear runs extra checks. Its error codes are described at
# https://github.com/PyCQA/flake8-bugbear#list-of-warnings
# B019: Use of functools.lru_cache or functools.cache on methods can lead to memory leaks
# B023: Functions defined inside a loop must not use variables redefined in the loop
# B024: Abstract base class with no abstract method.
ignore=W503,W504,E203,E731,E501,B019,B023,B024

View File

@@ -6,7 +6,7 @@ on:
- reopened # For debugging!
permissions:
# Needed to be able to push the commit. See
# Needed to be able to push the commit. See
# https://docs.github.com/en/code-security/dependabot/working-with-dependabot/automating-dependabot-with-github-actions#enable-auto-merge-on-a-pull-request
# for a similar example
contents: write
@@ -20,8 +20,11 @@ jobs:
with:
ref: ${{ github.event.pull_request.head.ref }}
- name: Write, commit and push changelog
env:
PR_TITLE: ${{ github.event.pull_request.title }}
PR_NUMBER: ${{ github.event.pull_request.number }}
run: |
echo "${{ github.event.pull_request.title }}." > "changelog.d/${{ github.event.pull_request.number }}".misc
echo "${PR_TITLE}." > "changelog.d/${PR_NUMBER}".misc
git add changelog.d
git config user.email "github-actions[bot]@users.noreply.github.com"
git config user.name "GitHub Actions"

View File

@@ -14,7 +14,7 @@ jobs:
# There's a 'download artifact' action, but it hasn't been updated for the workflow_run action
# (https://github.com/actions/download-artifact/issues/60) so instead we get this mess:
- name: 📥 Download artifact
uses: dawidd6/action-download-artifact@e6e25ac3a2b93187502a8be1ef9e9603afc34925 # v2.24.2
uses: dawidd6/action-download-artifact@bd10f381a96414ce2b13a11bfa89902ba7cea07f # v2.24.3
with:
workflow: docs-pr.yaml
run_id: ${{ github.event.workflow_run.id }}

View File

@@ -4,6 +4,8 @@ on:
pull_request:
paths:
- docs/**
- book.toml
- .github/workflows/docs-pr.yaml
jobs:
pages:
@@ -32,3 +34,27 @@ jobs:
path: book
# We'll only use this in a workflow_run, then we're done with it
retention-days: 1
link-check:
name: Check links in documentation
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Setup mdbook
uses: peaceiris/actions-mdbook@adeb05db28a0c0004681db83893d56c0388ea9ea # v1.2.0
with:
mdbook-version: '0.4.17'
- name: Setup htmltest
run: |
wget https://github.com/wjdp/htmltest/releases/download/v0.17.0/htmltest_0.17.0_linux_amd64.tar.gz
echo '775c597ee74899d6002cd2d93076f897f4ba68686bceabe2e5d72e84c57bc0fb htmltest_0.17.0_linux_amd64.tar.gz' | sha256sum -c
tar zxf htmltest_0.17.0_linux_amd64.tar.gz
- name: Test links with htmltest
# Build the book with `./` as the site URL (to make checks on 404.html possible)
# Then run htmltest (without checking external links since that involves the network and is slow).
run: |
MDBOOK_OUTPUT__HTML__SITE_URL="./" mdbook build
./htmltest book --skip-external

View File

@@ -58,7 +58,7 @@ jobs:
# Deploy to the target directory.
- name: Deploy to gh pages
uses: peaceiris/actions-gh-pages@de7ea6f8efb354206b205ef54722213d99067935 # v3.9.0
uses: peaceiris/actions-gh-pages@64b46b4226a4a12da2239ba3ea5aa73e3163c75b # v3.9.1
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./book

View File

@@ -208,7 +208,7 @@ jobs:
steps:
- uses: actions/checkout@v3
- uses: JasonEtco/create-an-issue@3a8ba796516b57db8cb2ee6dfc65bc76cd39d56d # v2.8.2
- uses: JasonEtco/create-an-issue@e27dddc79c92bc6e4562f268fffa5ed752639abd # v2.9.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:

View File

@@ -148,7 +148,7 @@ jobs:
env:
# Skip testing for platforms which various libraries don't have wheels
# for, and so need extra build deps.
CIBW_TEST_SKIP: pp39-* *i686* *musl* pp37-macosx*
CIBW_TEST_SKIP: pp3{7,9}-* *i686* *musl*
# Fix Rust OOM errors on emulated aarch64: https://github.com/rust-lang/cargo/issues/10583
CARGO_NET_GIT_FETCH_WITH_CLI: true
CIBW_ENVIRONMENT_PASS_LINUX: CARGO_NET_GIT_FETCH_WITH_CLI

View File

@@ -53,7 +53,7 @@ jobs:
- run: scripts-dev/check_schema_delta.py --force-colors
lint:
uses: "matrix-org/backend-meta/.github/workflows/python-poetry-ci.yml@v1"
uses: "matrix-org/backend-meta/.github/workflows/python-poetry-ci.yml@v2"
with:
typechecking-extras: "all"

View File

@@ -174,7 +174,7 @@ jobs:
steps:
- uses: actions/checkout@v3
- uses: JasonEtco/create-an-issue@3a8ba796516b57db8cb2ee6dfc65bc76cd39d56d # v2.8.2
- uses: JasonEtco/create-an-issue@e27dddc79c92bc6e4562f268fffa5ed752639abd # v2.9.1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:

.gitignore (vendored, 8 lines changed)

@@ -69,3 +69,11 @@ book/
# Poetry will create a setup.py, which we don't want to include.
/setup.py
# Don't include users' poetry configs
/poetry.toml
# Devenv
.devenv*
devenv.local.nix

View File

@@ -1,3 +1,106 @@
Synapse 1.75.0rc1 (2023-01-10)
==============================
Features
--------
- Add a `cached` function to `synapse.module_api` that returns a decorator to cache return values of functions. ([\#14663](https://github.com/matrix-org/synapse/issues/14663))
- Add experimental support for [MSC3391](https://github.com/matrix-org/matrix-spec-proposals/pull/3391) (removing account data). ([\#14714](https://github.com/matrix-org/synapse/issues/14714))
- Support [RFC7636](https://datatracker.ietf.org/doc/html/rfc7636) Proof Key for Code Exchange for OAuth single sign-on. ([\#14750](https://github.com/matrix-org/synapse/issues/14750))
- Support non-OpenID compliant userinfo claims for subject and picture. ([\#14753](https://github.com/matrix-org/synapse/issues/14753))
- Improve performance of `/sync` when filtering all rooms, message types, or senders. ([\#14786](https://github.com/matrix-org/synapse/issues/14786))
- Improve performance of the `/hierarchy` endpoint. ([\#14263](https://github.com/matrix-org/synapse/issues/14263))
Bugfixes
--------
- Fix the *MAU Limits* section of the Grafana dashboard relying on a specific `job` name for the workers of a Synapse deployment. ([\#14644](https://github.com/matrix-org/synapse/issues/14644))
- Fix a bug introduced in Synapse 1.70.0 which could cause spurious `UNIQUE constraint failed` errors in the `rotate_notifs` background job. ([\#14669](https://github.com/matrix-org/synapse/issues/14669))
- Ensure stream IDs are always updated after caches get invalidated with workers. Contributed by Nick @ Beeper (@fizzadar). ([\#14723](https://github.com/matrix-org/synapse/issues/14723))
- Remove the unspecced `device` field from `/pushrules` responses. ([\#14727](https://github.com/matrix-org/synapse/issues/14727))
- Fix a bug introduced in Synapse 1.73.0 where the `picture_claim` configured under `oidc_providers` was unused (the default value of `"picture"` was used instead). ([\#14751](https://github.com/matrix-org/synapse/issues/14751))
- Unescape HTML entities in URL preview titles making use of oEmbed responses. ([\#14781](https://github.com/matrix-org/synapse/issues/14781))
- Disable sending confirmation email when 3pid is disabled. ([\#14725](https://github.com/matrix-org/synapse/issues/14725))
Improved Documentation
----------------------
- Declare support for Python 3.11. ([\#14673](https://github.com/matrix-org/synapse/issues/14673))
- Fix `target_memory_usage` being used in the description for the actual `cache_autotune` sub-option `target_cache_memory_usage`. ([\#14674](https://github.com/matrix-org/synapse/issues/14674))
- Move `email` to Server section in config file documentation. ([\#14730](https://github.com/matrix-org/synapse/issues/14730))
- Fix broken links in the Synapse documentation. ([\#14744](https://github.com/matrix-org/synapse/issues/14744))
- Add missing worker settings to shared configuration documentation. ([\#14748](https://github.com/matrix-org/synapse/issues/14748))
- Document using Twitter as an OAuth 2.0 authentication provider. ([\#14778](https://github.com/matrix-org/synapse/issues/14778))
- Fix Synapse 1.74 upgrade notes to correctly explain how to install pyICU when installing Synapse from PyPI. ([\#14797](https://github.com/matrix-org/synapse/issues/14797))
- Update link to towncrier in contribution guide. ([\#14801](https://github.com/matrix-org/synapse/issues/14801))
- Use `htmltest` to check links in the Synapse documentation. ([\#14743](https://github.com/matrix-org/synapse/issues/14743))
Internal Changes
----------------
- Faster remote room joins: stream the un-partial-stating of events over replication. ([\#14545](https://github.com/matrix-org/synapse/issues/14545), [\#14546](https://github.com/matrix-org/synapse/issues/14546))
- Use [ruff](https://github.com/charliermarsh/ruff/) instead of flake8. ([\#14633](https://github.com/matrix-org/synapse/issues/14633), [\#14741](https://github.com/matrix-org/synapse/issues/14741))
- Change `handle_new_client_event` signature so that a 429 does not reach clients on `PartialStateConflictError`, and internally retry when needed instead. ([\#14665](https://github.com/matrix-org/synapse/issues/14665))
- Remove dependency on jQuery on reCAPTCHA page. ([\#14672](https://github.com/matrix-org/synapse/issues/14672))
- Faster joins: make `compute_state_after_events` consistent with other state-fetching functions that take a `StateFilter`. ([\#14676](https://github.com/matrix-org/synapse/issues/14676))
- Add missing type hints. ([\#14680](https://github.com/matrix-org/synapse/issues/14680), [\#14681](https://github.com/matrix-org/synapse/issues/14681), [\#14687](https://github.com/matrix-org/synapse/issues/14687))
- Improve type annotations for the helper methods on a `CachedFunction`. ([\#14685](https://github.com/matrix-org/synapse/issues/14685))
- Check that the SQLite database file exists before porting to PostgreSQL. ([\#14692](https://github.com/matrix-org/synapse/issues/14692))
- Add `.direnv/` directory to .gitignore to prevent local state generated by the [direnv](https://direnv.net/) development tool from being committed. ([\#14707](https://github.com/matrix-org/synapse/issues/14707))
- Batch up replication requests to request the resyncing of remote users' devices. ([\#14716](https://github.com/matrix-org/synapse/issues/14716))
- If debug logging is enabled, log the `msgid`s of any to-device messages that are returned over `/sync`. ([\#14724](https://github.com/matrix-org/synapse/issues/14724))
- Change GHA CI job to follow best practices. ([\#14772](https://github.com/matrix-org/synapse/issues/14772))
- Switch to our fork of `dh-virtualenv` to work around an upstream Python 3.11 incompatibility. ([\#14774](https://github.com/matrix-org/synapse/issues/14774))
- Skip testing built wheels for PyPy 3.7 on Linux x86_64 as we lack new required dependencies in the build environment. ([\#14802](https://github.com/matrix-org/synapse/issues/14802))
### Dependabot updates
<details>
- Bump JasonEtco/create-an-issue from 2.8.1 to 2.8.2. ([\#14693](https://github.com/matrix-org/synapse/issues/14693))
- Bump anyhow from 1.0.66 to 1.0.68. ([\#14694](https://github.com/matrix-org/synapse/issues/14694))
- Bump blake2 from 0.10.5 to 0.10.6. ([\#14695](https://github.com/matrix-org/synapse/issues/14695))
- Bump serde_json from 1.0.89 to 1.0.91. ([\#14696](https://github.com/matrix-org/synapse/issues/14696))
- Bump serde from 1.0.150 to 1.0.151. ([\#14697](https://github.com/matrix-org/synapse/issues/14697))
- Bump lxml from 4.9.1 to 4.9.2. ([\#14698](https://github.com/matrix-org/synapse/issues/14698))
- Bump types-jsonschema from 4.17.0.1 to 4.17.0.2. ([\#14700](https://github.com/matrix-org/synapse/issues/14700))
- Bump sentry-sdk from 1.11.1 to 1.12.0. ([\#14701](https://github.com/matrix-org/synapse/issues/14701))
- Bump types-setuptools from 65.6.0.1 to 65.6.0.2. ([\#14702](https://github.com/matrix-org/synapse/issues/14702))
- Bump minimum PyYAML to 3.13. ([\#14720](https://github.com/matrix-org/synapse/issues/14720))
- Bump JasonEtco/create-an-issue from 2.8.2 to 2.9.1. ([\#14731](https://github.com/matrix-org/synapse/issues/14731))
- Bump towncrier from 22.8.0 to 22.12.0. ([\#14732](https://github.com/matrix-org/synapse/issues/14732))
- Bump isort from 5.10.1 to 5.11.4. ([\#14733](https://github.com/matrix-org/synapse/issues/14733))
- Bump attrs from 22.1.0 to 22.2.0. ([\#14734](https://github.com/matrix-org/synapse/issues/14734))
- Bump black from 22.10.0 to 22.12.0. ([\#14735](https://github.com/matrix-org/synapse/issues/14735))
- Bump sentry-sdk from 1.12.0 to 1.12.1. ([\#14736](https://github.com/matrix-org/synapse/issues/14736))
- Bump setuptools from 65.3.0 to 65.5.1. ([\#14738](https://github.com/matrix-org/synapse/issues/14738))
- Bump serde from 1.0.151 to 1.0.152. ([\#14758](https://github.com/matrix-org/synapse/issues/14758))
- Bump ruff from 0.0.189 to 0.0.206. ([\#14759](https://github.com/matrix-org/synapse/issues/14759))
- Bump pydantic from 1.10.2 to 1.10.4. ([\#14760](https://github.com/matrix-org/synapse/issues/14760))
- Bump gitpython from 3.1.29 to 3.1.30. ([\#14761](https://github.com/matrix-org/synapse/issues/14761))
- Bump pillow from 9.3.0 to 9.4.0. ([\#14762](https://github.com/matrix-org/synapse/issues/14762))
- Bump types-requests from 2.28.11.5 to 2.28.11.7. ([\#14763](https://github.com/matrix-org/synapse/issues/14763))
- Bump dawidd6/action-download-artifact from 2.24.2 to 2.24.3. ([\#14779](https://github.com/matrix-org/synapse/issues/14779))
- Bump peaceiris/actions-gh-pages from 3.9.0 to 3.9.1. ([\#14791](https://github.com/matrix-org/synapse/issues/14791))
- Bump types-pillow from 9.3.0.4 to 9.4.0.0. ([\#14792](https://github.com/matrix-org/synapse/issues/14792))
- Bump pyopenssl from 22.1.0 to 23.0.0. ([\#14793](https://github.com/matrix-org/synapse/issues/14793))
- Bump types-setuptools from 65.6.0.2 to 65.6.0.3. ([\#14794](https://github.com/matrix-org/synapse/issues/14794))
- Bump importlib-metadata from 4.2.0 to 6.0.0. ([\#14795](https://github.com/matrix-org/synapse/issues/14795))
- Bump ruff from 0.0.206 to 0.0.215. ([\#14796](https://github.com/matrix-org/synapse/issues/14796))
</details>
Synapse 1.74.0 (2022-12-20)
===========================
Improved Documentation
----------------------
- Add release note and update documentation regarding optional ICU support in user search. ([\#14712](https://github.com/matrix-org/synapse/issues/14712))
Synapse 1.74.0rc1 (2022-12-13)
==============================

Cargo.lock (generated, 16 lines changed)

@@ -13,9 +13,9 @@ dependencies = [
[[package]]
name = "anyhow"
version = "1.0.66"
version = "1.0.68"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "216261ddc8289130e551ddcd5ce8a064710c0d064a4d2895c67151c92b5443f6"
checksum = "2cb2f989d18dd141ab8ae82f64d1a8cdd37e0840f73a406896cf5e99502fab61"
[[package]]
name = "arc-swap"
@@ -323,18 +323,18 @@ checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd"
[[package]]
name = "serde"
version = "1.0.151"
version = "1.0.152"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "97fed41fc1a24994d044e6db6935e69511a1153b52c15eb42493b26fa87feba0"
checksum = "bb7d1f0d3021d347a83e556fc4683dea2ea09d87bccdf88ff5c12545d89d5efb"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.151"
version = "1.0.152"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "255abe9a125a985c05190d687b320c12f9b1f0b99445e608c21ba0782c719ad8"
checksum = "af487d118eecd09402d70a5d72551860e788df87b464af30e5ea6a38c75c541e"
dependencies = [
"proc-macro2",
"quote",
@@ -343,9 +343,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.89"
version = "1.0.91"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "020ff22c755c2ed3f8cf162dbb41a7268d934702f3ed3631656ea597e08fc3db"
checksum = "877c235533714907a8c2464236f5c4b2a17262ef1bd71f38f35ea592c8da6883"
dependencies = [
"itoa",
"ryu",

View File

@@ -1 +0,0 @@
Improve performance of the `/hierarchy` endpoint.

View File

@@ -1 +0,0 @@
Faster remote room joins: stream the un-partial-stating of events over replication.

View File

@@ -1 +0,0 @@
Faster remote room joins: stream the un-partial-stating of events over replication.

View File

@@ -1 +0,0 @@
Fix the *MAU Limits* section of the Grafana dashboard relying on a specific `job` name for the workers of a Synapse deployment.

View File

@@ -1 +0,0 @@
Change `handle_new_client_event` signature so that a 429 does not reach clients on `PartialStateConflictError`, and internally retry when needed instead.

View File

@@ -1 +0,0 @@
Fix a bug introduced in Synapse 1.70.0 which could cause spurious `UNIQUE constraint failed` errors in the `rotate_notifs` background job.

View File

@@ -1 +0,0 @@
Remove dependency on jQuery on reCAPTCHA page.

View File

@@ -1 +0,0 @@
Declare support for Python 3.11.

View File

@@ -1 +0,0 @@
Fix `target_memory_usage` being used in the description for the actual `cache_autotune` sub-option `target_cache_memory_usage`.

View File

@@ -1 +0,0 @@
Faster joins: make `compute_state_after_events` consistent with other state-fetching functions that take a `StateFilter`.

View File

@@ -1 +0,0 @@
Add missing type hints.

View File

@@ -1 +0,0 @@
Improve type annotations for the helper methods on a `CachedFunction`.

View File

@@ -1 +0,0 @@
Bump JasonEtco/create-an-issue from 2.8.1 to 2.8.2.

View File

@@ -1 +0,0 @@
Bump blake2 from 0.10.5 to 0.10.6.

View File

@@ -1 +0,0 @@
Bump serde from 1.0.150 to 1.0.151.

View File

@@ -1 +0,0 @@
Bump hiredis from 2.0.0 to 2.1.0.

View File

@@ -1 +0,0 @@
Bump types-jsonschema from 4.17.0.1 to 4.17.0.2.

View File

@@ -1 +0,0 @@
Bump sentry-sdk from 1.11.1 to 1.12.0.

View File

@@ -1 +0,0 @@
Bump types-setuptools from 65.6.0.1 to 65.6.0.2.

View File

@@ -1 +0,0 @@
Add `.direnv/` directory to .gitignore to prevent local state generated by the [direnv](https://direnv.net/) development tool from being committed.

View File

@@ -0,0 +1 @@
Add a dedicated listener configuration for `health` endpoint.

changelog.d/14749.misc (new file)

@@ -0,0 +1 @@
Faster remote room joins (worker mode): do not populate external hosts-in-room cache when sending events as this requires blocking for full state.

changelog.d/14773.doc (new file)

@@ -0,0 +1 @@
Remove duplicate commands from the Code Style documentation page; point to the Contributing Guide instead.

changelog.d/14799.bugfix (new file)

@@ -0,0 +1 @@
Add index to improve performance of the `/timestamp_to_event` endpoint used for jumping to a specific date in the timeline of a room.

changelog.d/14803.doc (new file)

@@ -0,0 +1 @@
Add missing documentation for `tag` to `listeners` section.

changelog.d/14807.misc (new file)

@@ -0,0 +1 @@
Add local poetry config files (`poetry.toml`) to `.gitignore`.

View File

@@ -0,0 +1 @@
Per [MSC3925](https://github.com/matrix-org/matrix-spec-proposals/pull/3925), bundle the whole of the replacement with any edited events, and optionally inhibit server-side replacement.

debian/changelog (vendored, 12 lines changed)

@@ -1,3 +1,15 @@
matrix-synapse-py3 (1.75.0~rc1) stable; urgency=medium
* New Synapse release 1.75.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 10 Jan 2023 12:18:27 +0000
matrix-synapse-py3 (1.74.0) stable; urgency=medium
* New Synapse release 1.74.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 20 Dec 2022 16:07:38 +0000
matrix-synapse-py3 (1.74.0~rc1) stable; urgency=medium
* New dependency on libicu-dev to provide improved results for user

devenv.lock (new file, 177 lines)

@@ -0,0 +1,177 @@
{
"nodes": {
"devenv": {
"locked": {
"dir": "src/modules",
"lastModified": 1673960114,
"narHash": "sha256-YNCok1a8cy71nP0idJds2Dwn2B1T6zGw9+2H1A0lNa0=",
"owner": "cachix",
"repo": "devenv",
"rev": "0960585a7221e6ede718cc9a2c2eade7ce75c229",
"type": "github"
},
"original": {
"dir": "src/modules",
"owner": "cachix",
"repo": "devenv",
"type": "github"
}
},
"fenix": {
"inputs": {
"nixpkgs": [
"nixpkgs"
],
"rust-analyzer-src": "rust-analyzer-src"
},
"locked": {
"lastModified": 1674023045,
"narHash": "sha256-btQC+gTeVLmX9cYl/6Kig7BuDltVWJEh7TrITAc6QjA=",
"owner": "nix-community",
"repo": "fenix",
"rev": "75dbe699bc57323cdef636b82bcfef6028bd1530",
"type": "github"
},
"original": {
"owner": "nix-community",
"repo": "fenix",
"type": "github"
}
},
"flake-compat": {
"flake": false,
"locked": {
"lastModified": 1668681692,
"narHash": "sha256-Ht91NGdewz8IQLtWZ9LCeNXMSXHUss+9COoqu6JLmXU=",
"owner": "edolstra",
"repo": "flake-compat",
"rev": "009399224d5e398d03b22badca40a37ac85412a1",
"type": "github"
},
"original": {
"owner": "edolstra",
"repo": "flake-compat",
"type": "github"
}
},
"flake-utils": {
"locked": {
"lastModified": 1667395993,
"narHash": "sha256-nuEHfE/LcWyuSWnS8t12N1wc105Qtau+/OdUAjtQ0rA=",
"owner": "numtide",
"repo": "flake-utils",
"rev": "5aed5285a952e0b949eb3ba02c12fa4fcfef535f",
"type": "github"
},
"original": {
"owner": "numtide",
"repo": "flake-utils",
"type": "github"
}
},
"gitignore": {
"inputs": {
"nixpkgs": [
"pre-commit-hooks",
"nixpkgs"
]
},
"locked": {
"lastModified": 1660459072,
"narHash": "sha256-8DFJjXG8zqoONA1vXtgeKXy68KdJL5UaXR8NtVMUbx8=",
"owner": "hercules-ci",
"repo": "gitignore.nix",
"rev": "a20de23b925fd8264fd7fad6454652e142fd7f73",
"type": "github"
},
"original": {
"owner": "hercules-ci",
"repo": "gitignore.nix",
"type": "github"
}
},
"nixpkgs": {
"locked": {
"lastModified": 1673947312,
"narHash": "sha256-xx/2nRwRy3bXrtry6TtydKpJpqHahjuDB5sFkQ/XNDE=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "2d38b664b4400335086a713a0036aafaa002c003",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixpkgs-unstable",
"repo": "nixpkgs",
"type": "github"
}
},
"nixpkgs-stable": {
"locked": {
"lastModified": 1671271954,
"narHash": "sha256-cSvu+bnvN08sOlTBWbBrKaBHQZq8mvk8bgpt0ZJ2Snc=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "d513b448cc2a6da2c8803e3c197c9fc7e67b19e3",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-22.05",
"repo": "nixpkgs",
"type": "github"
}
},
"pre-commit-hooks": {
"inputs": {
"flake-compat": "flake-compat",
"flake-utils": "flake-utils",
"gitignore": "gitignore",
"nixpkgs": [
"nixpkgs"
],
"nixpkgs-stable": "nixpkgs-stable"
},
"locked": {
"lastModified": 1674046351,
"narHash": "sha256-vNErPj4gfO/G1vHuOh5/IbjLaydwePcRlD0fXlnUbmI=",
"owner": "cachix",
"repo": "pre-commit-hooks.nix",
"rev": "756cc26afb75b0f8bfec48bbc54a8836a04953fb",
"type": "github"
},
"original": {
"owner": "cachix",
"repo": "pre-commit-hooks.nix",
"type": "github"
}
},
"root": {
"inputs": {
"devenv": "devenv",
"fenix": "fenix",
"nixpkgs": "nixpkgs",
"pre-commit-hooks": "pre-commit-hooks"
}
},
"rust-analyzer-src": {
"flake": false,
"locked": {
"lastModified": 1673979364,
"narHash": "sha256-ecgQENol9XhhcYF+M9B8FMrsWYQ/ZvRsvgEWi8HI6D0=",
"owner": "rust-lang",
"repo": "rust-analyzer",
"rev": "3a7271336536f2cd558498755254ae8c0e73baa7",
"type": "github"
},
"original": {
"owner": "rust-lang",
"ref": "nightly",
"repo": "rust-analyzer",
"type": "github"
}
}
},
"root": "root",
"version": 7
}

devenv.nix (new file, 122 lines)

@@ -0,0 +1,122 @@
{ inputs, pkgs, ... }:
{
# Configure packages to install.
# Search for package names at https://search.nixos.org/packages?channel=unstable
packages = with pkgs; [
# Native dependencies for running Synapse.
icu
libffi
libjpeg
libpqxx
libwebp
libxml2
libxslt
sqlite
# Native dependencies for unit tests (SyTest also requires OpenSSL).
openssl
# Native dependencies for running Complement.
olm
# Development tools.
poetry
];
# Activate (and create if necessary) a poetry virtualenv on startup.
enterShell = ''
. "$(dirname $(poetry run which python))/activate"
'';
# Install dependencies for the additional programming languages
# involved with Synapse development. Python is already available
# from poetry's virtual environment.
#
# * Rust is used for developing and running Synapse.
# * Golang is needed to run the Complement test suite.
# * Perl is needed to run the SyTest test suite.
languages.go.enable = true;
languages.rust.enable = true;
languages.rust.version = "latest";
languages.perl.enable = true;
# Postgres is needed to run Synapse with postgres support and
# to run certain unit tests that require postgres.
services.postgres.enable = true;
# On the first invocation of `devenv up`, create a database for
# Synapse to store data in.
services.postgres.initdbArgs = ["--locale=C" "--encoding=UTF8"];
services.postgres.initialDatabases = [
{ name = "synapse"; }
];
# Redis is needed in order to run Synapse in worker mode.
services.redis.enable = true;
# We wrap `poetry` with a bash script that disables the download
# of binary wheels for certain packages if the user is running
# NixOS. NixOS is special in that you can have multiple versions
# of packages installed at once, including your libc linker!
#
# Some binaries built for Linux expect those to be in a certain
# filepath, but that is not the case on NixOS. In that case, we
# force compiling those binaries locally instead.
scripts.poetry.exec = ''
if [ -z "$__NIXOS_SET_ENVIRONMENT_DONE" ]; then
# We are running on NixOS.
#
# Prevent poetry from downloading known problematic,
# dynamically-linked binaries for python dependencies.
POETRY_INSTALLER_NO_BINARY=ruff ${pkgs.poetry}/bin/poetry $@
else
${pkgs.poetry}/bin/poetry $@
fi
'';
# Define the perl modules we require to run SyTest.
#
# This list was compiled by cross-referencing https://metacpan.org/
# with the modules defined in './cpanfile' and then finding the
# corresponding nix packages on https://search.nixos.org/packages.
#
# This was done until `./install-deps.pl --dryrun` produced no output.
env.PERL5LIB = "${with pkgs.perl536Packages; makePerlPath [
DBI
ClassMethodModifiers
CryptEd25519
DataDump
DBDPg
DigestHMAC
DigestSHA1
EmailAddressXS
EmailMIME
EmailSimple # required by Email::Mime
EmailMessageID # required by Email::Mime
EmailMIMEContentType # required by Email::Mime
TextUnidecode # required by Email::Mime
ModuleRuntime # required by Email::Mime
EmailMIMEEncodings # required by Email::Mime
FilePath
FileSlurper
Future
GetoptLong
HTTPMessage
IOAsync
IOAsyncSSL
IOSocketSSL
NetSSLeay
JSON
ListUtilsBy
ScalarListUtils
ModulePluggable
NetAsyncHTTP
MetricsAny # required by Net::Async::HTTP
NetAsyncHTTPServer
StructDumb
URI
YAMLLibYAML
]}";
}

devenv.yaml (new file, 8 lines)

@@ -0,0 +1,8 @@
inputs:
nixpkgs:
url: github:NixOS/nixpkgs/nixpkgs-unstable
fenix:
url: github:nix-community/fenix
inputs:
nixpkgs:
follows: nixpkgs

View File

@@ -167,6 +167,7 @@ RUN \
libwebp6 \
xmlsec1 \
libjemalloc2 \
libicu67 \
libssl-dev \
openssl \
&& rm -rf /var/lib/apt/lists/*

View File

@@ -36,8 +36,10 @@ RUN env DEBIAN_FRONTEND=noninteractive apt-get install \
wget
# fetch and unpack the package
# We are temporarily using a fork of dh-virtualenv due to an incompatibility with Python 3.11, which ships with
# Debian sid. TODO: Switch back to upstream once https://github.com/spotify/dh-virtualenv/pull/354 has merged.
RUN mkdir /dh-virtualenv
RUN wget -q -O /dh-virtualenv.tar.gz https://github.com/spotify/dh-virtualenv/archive/refs/tags/1.2.2.tar.gz
RUN wget -q -O /dh-virtualenv.tar.gz https://github.com/matrix-org/dh-virtualenv/archive/refs/tags/matrixorg-2023010302.tar.gz
RUN tar -xv --strip-components=1 -C /dh-virtualenv -f /dh-virtualenv.tar.gz
# install its build deps. We do another apt-cache-update here, because we might

View File

@@ -5,7 +5,7 @@ use it, you must enable the account validity feature (under
`account_validity`) in Synapse's configuration.
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api).
for a server admin: see [Admin API](../usage/administration/admin_api/).
## Renew account

View File

@@ -3,7 +3,7 @@
This API returns information about reported events.
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api).
for a server admin: see [Admin API](../usage/administration/admin_api/).
The api is:
```

View File

@@ -6,7 +6,7 @@ Details about the format of the `media_id` and storage of the media in the file
are documented under [media repository](../media_repository.md).
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api).
for a server admin: see [Admin API](../usage/administration/admin_api/).
## List all media in a room

View File

@@ -11,7 +11,7 @@ Note that Synapse requires at least one message in each room, so it will never
delete the last message in a room.
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api).
for a server admin: see [Admin API](../usage/administration/admin_api/).
The API is:

View File

@@ -6,7 +6,7 @@ local users. The server administrator must be in the room and have permission to
invite users.
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api).
for a server admin: see [Admin API](../usage/administration/admin_api/).
## Parameters

View File

@@ -5,7 +5,7 @@ server. There are various parameters available that allow for filtering and
sorting the returned list. This API supports pagination.
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api).
for a server admin: see [Admin API](../usage/administration/admin_api/).
**Parameters**
@@ -400,7 +400,7 @@ sent to a room in a given timeframe. There are various parameters available
that allow for filtering and ordering the returned list. This API supports pagination.
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api).
for a server admin: see [Admin API](../usage/administration/admin_api/).
This endpoint mirrors the [Matrix Spec defined Messages API](https://spec.matrix.org/v1.1/client-server-api/#get_matrixclientv3roomsroomidmessages).

View File

@@ -4,7 +4,7 @@ Returns information about all local media usage of users. Gives the
possibility to filter them by time and user.
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api).
for a server admin: see [Admin API](../usage/administration/admin_api/).
The API is:

View File

@@ -1,7 +1,7 @@
# User Admin API
To use it, you will need to authenticate by providing an `access_token`
for a server admin: see [Admin API](../usage/administration/admin_api).
for a server admin: see [Admin API](../usage/administration/admin_api/).
## Query User Account

View File

@@ -10,26 +10,17 @@ The necessary tools are:
- [black](https://black.readthedocs.io/en/stable/), a source code formatter;
- [isort](https://pycqa.github.io/isort/), which organises each file's imports;
- [flake8](https://flake8.pycqa.org/en/latest/), which can spot common errors; and
- [ruff](https://github.com/charliermarsh/ruff), which can spot common errors; and
- [mypy](https://mypy.readthedocs.io/en/stable/), a type checker.
Install them with:
```sh
pip install -e ".[lint,mypy]"
```
The easiest way to run the lints is to invoke the linter script as follows.
```sh
scripts-dev/lint.sh
```
See [the contributing guide](development/contributing_guide.md#run-the-linters) for instructions
on how to install the above tools and run the linters.
It's worth noting that modern IDEs and text editors can run these tools
automatically on save. It may be worth looking into whether this
functionality is supported in your editor for a more convenient
development workflow. It is not, however, recommended to run `flake8` or `mypy`
on save as they take a while and can be very resource intensive.
development workflow. It is not, however, recommended to run `mypy`
on save as it takes a while and can be very resource intensive.
## General rules

View File

@@ -24,6 +24,8 @@ The code of Synapse is written in Python 3. To do pretty much anything, you'll n
Synapse can connect to PostgreSQL via the [psycopg2](https://pypi.org/project/psycopg2/) Python library. Building this library from source requires access to PostgreSQL's C header files. On Debian or Ubuntu Linux, these can be installed with `sudo apt install libpq-dev`.
Synapse has an optional, improved user search with better Unicode support. For that you need the development package of `libicu`. On Debian or Ubuntu Linux, this can be installed with `sudo apt install libicu-dev`.
The source code of Synapse is hosted on GitHub. You will also need [a recent version of git](https://github.com/git-guides/install-git).
For some tests, you will need [a recent version of Docker](https://docs.docker.com/get-docker/).
@@ -104,8 +106,8 @@ regarding Synapse's Admin API, which is used mostly by sysadmins and external
service developers.
Synapse's code style is documented [here](../code_style.md). Please follow
it, including the conventions for the [sample configuration
file](../code_style.md#configuration-file-format).
it, including the conventions for [configuration
options and documentation](../code_style.md#configuration-code-and-documentation-format).
We welcome improvements and additions to our documentation itself! When
writing new pages, please
@@ -124,7 +126,7 @@ changes to the Rust code.
# 8. Test, test, test!
<a name="test-test-test"></a>
<a name="test-test-test" id="test-test-test"></a>
While you're developing and before submitting a patch, you'll
want to test your code.
@@ -380,7 +382,7 @@ To prepare a Pull Request, please:
## Changelog
All changes, even minor ones, need a corresponding changelog / newsfragment
entry. These are managed by [Towncrier](https://github.com/hawkowl/towncrier).
entry. These are managed by [Towncrier](https://github.com/twisted/towncrier).
To create a changelog entry, make a new file in the `changelog.d` directory named
in the format of `PRnumber.type`. The type can be one of the following:
@@ -422,8 +424,7 @@ chicken-and-egg problem.
There are two options for solving this:
1. Open the PR without a changelog file, see what number you got, and *then*
add the changelog file to your branch (see [Updating your pull
request](#updating-your-pull-request)), or:
add the changelog file to your branch, or:
1. Look at the [list of all
issues/PRs](https://github.com/matrix-org/synapse/issues?q=), add one to the

View File

@@ -59,8 +59,8 @@ namespace (such as anything under `/_matrix/client` for example). It is strongly
recommended that modules register their web resources under the `/_synapse/client`
namespace.
The provided resource is a Python class that implements Twisted's [IResource](https://twistedmatrix.com/documents/current/api/twisted.web.resource.IResource.html)
interface (such as [Resource](https://twistedmatrix.com/documents/current/api/twisted.web.resource.Resource.html)).
The provided resource is a Python class that implements Twisted's [IResource](https://docs.twistedmatrix.com/en/stable/api/twisted.web.resource.IResource.html)
interface (such as [Resource](https://docs.twistedmatrix.com/en/stable/api/twisted.web.resource.Resource.html)).
Only one resource can be registered for a given path. If several modules attempt to
register a resource for the same path, the module that appears first in Synapse's
@@ -82,4 +82,4 @@ the callback name as the argument name and the function as its value. A
`register_[...]_callbacks` method exists for each category.
Callbacks for each category can be found on their respective page of the
[Synapse documentation website](https://matrix-org.github.io/synapse).
[Synapse documentation website](https://matrix-org.github.io/synapse).
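For context, a module that registers such resources or callbacks is itself enabled through the `modules` setting in the homeserver configuration; a minimal sketch (the module path and option below are made-up examples) might look like:
```yaml
modules:
  # Made-up module path - replace with the Python path of your own module class.
  - module: my_project.my_module.ExampleModule
    config:
      # Arbitrary options, passed through to the module when it is instantiated.
      example_option: "value"
```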

View File

@@ -88,98 +88,41 @@ oidc_providers:
display_name_template: "{{ user.name }}"
```
### Dex
### Apple
[Dex][dex-idp] is a simple, open-source OpenID Connect Provider.
Although it is designed to help building a full-blown provider with an
external database, it can be configured with static passwords in a config file.
Configuring "Sign in with Apple" (SiWA) requires an Apple Developer account.
Follow the [Getting Started guide](https://dexidp.io/docs/getting-started/)
to install Dex.
You will need to create a new "Services ID" for SiWA, and create and download a
private key with "SiWA" enabled.
Edit `examples/config-dev.yaml` config file from the Dex repo to add a client:
As well as the private key file, you will need:
* Client ID: the "identifier" you gave the "Services ID"
* Team ID: a 10-character ID associated with your developer account.
* Key ID: the 10-character identifier for the key.
[Apple's developer documentation](https://help.apple.com/developer-account/?lang=en#/dev77c875b7e)
has more information on setting up SiWA.
The synapse config will look like this:
```yaml
staticClients:
- id: synapse
secret: secret
redirectURIs:
- '[synapse public baseurl]/_synapse/client/oidc/callback'
name: 'Synapse'
```
Run with `dex serve examples/config-dev.yaml`.
Synapse config:
```yaml
oidc_providers:
- idp_id: dex
idp_name: "My Dex server"
skip_verification: true # This is needed as Dex is served on an insecure endpoint
issuer: "http://127.0.0.1:5556/dex"
client_id: "synapse"
client_secret: "secret"
scopes: ["openid", "profile"]
- idp_id: apple
idp_name: Apple
issuer: "https://appleid.apple.com"
client_id: "your-client-id" # Set to the "identifier" for your "ServicesID"
client_auth_method: "client_secret_post"
client_secret_jwt_key:
key_file: "/path/to/AuthKey_KEYIDCODE.p8" # point to your key file
jwt_header:
alg: ES256
kid: "KEYIDCODE" # Set to the 10-char Key ID
jwt_payload:
iss: TEAMIDCODE # Set to the 10-char Team ID
scopes: ["name", "email", "openid"]
authorization_endpoint: https://appleid.apple.com/auth/authorize?response_mode=form_post
user_mapping_provider:
config:
localpart_template: "{{ user.name }}"
display_name_template: "{{ user.name|capitalize }}"
```
### Keycloak
[Keycloak][keycloak-idp] is an opensource IdP maintained by Red Hat.
Keycloak supports OIDC Back-Channel Logout, which sends logout notification to Synapse, so that Synapse users get logged out when they log out from Keycloak.
This can be optionally enabled by setting `backchannel_logout_enabled` to `true` in the Synapse configuration, and by setting the "Backchannel Logout URL" in Keycloak.
Follow the [Getting Started Guide](https://www.keycloak.org/getting-started) to install Keycloak and set up a realm.
1. Click `Clients` in the sidebar and click `Create`
2. Fill in the fields as below:
| Field | Value |
|-----------|-----------|
| Client ID | `synapse` |
| Client Protocol | `openid-connect` |
3. Click `Save`
4. Fill in the fields as below:
| Field | Value |
|-----------|-----------|
| Client ID | `synapse` |
| Enabled | `On` |
| Client Protocol | `openid-connect` |
| Access Type | `confidential` |
| Valid Redirect URIs | `[synapse public baseurl]/_synapse/client/oidc/callback` |
| Backchannel Logout URL (optional) | `[synapse public baseurl]/_synapse/client/oidc/backchannel_logout` |
| Backchannel Logout Session Required (optional) | `On` |
5. Click `Save`
6. On the Credentials tab, update the fields:
| Field | Value |
|-------|-------|
| Client Authenticator | `Client ID and Secret` |
7. Click `Regenerate Secret`
8. Copy Secret
```yaml
oidc_providers:
- idp_id: keycloak
idp_name: "My KeyCloak server"
issuer: "https://127.0.0.1:8443/realms/{realm_name}"
client_id: "synapse"
client_secret: "copy secret generated from above"
scopes: ["openid", "profile"]
user_mapping_provider:
config:
localpart_template: "{{ user.preferred_username }}"
display_name_template: "{{ user.name }}"
backchannel_logout_enabled: true # Optional
email_template: "{{ user.email }}"
```
### Auth0
@@ -262,285 +205,43 @@ oidc_providers:
display_name_template: "{{ user.preferred_username|capitalize }}" # TO BE FILLED: If your users have names in Authentik and you want those in Synapse, this should be replaced with user.name|capitalize.
```
### LemonLDAP
### Dex
[LemonLDAP::NG][lemonldap] is an open-source IdP solution.
[Dex][dex-idp] is a simple, open-source OpenID Connect Provider.
Although it is designed to help build a full-blown provider with an
external database, it can be configured with static passwords in a config file.
1. Create an OpenID Connect Relying Parties in LemonLDAP::NG
2. The parameters are:
- Client ID under the basic menu of the new Relying Parties (`Options > Basic >
Client ID`)
- Client secret (`Options > Basic > Client secret`)
- JWT Algorithm: RS256 within the security menu of the new Relying Parties
(`Options > Security > ID Token signature algorithm` and `Options > Security >
Access Token signature algorithm`)
- Scopes: OpenID, Email and Profile
- Allowed redirection addresses for login (`Options > Basic > Allowed
redirection addresses for login` ) :
`[synapse public baseurl]/_synapse/client/oidc/callback`
Follow the [Getting Started guide](https://dexidp.io/docs/getting-started/)
to install Dex.
Edit `examples/config-dev.yaml` config file from the Dex repo to add a client:
Synapse config:
```yaml
oidc_providers:
- idp_id: lemonldap
idp_name: lemonldap
discover: true
issuer: "https://auth.example.org/" # TO BE FILLED: replace with your domain
client_id: "your client id" # TO BE FILLED
client_secret: "your client secret" # TO BE FILLED
scopes:
- "openid"
- "profile"
- "email"
user_mapping_provider:
config:
localpart_template: "{{ user.preferred_username }}}"
# TO BE FILLED: If your users have names in LemonLDAP::NG and you want those in Synapse, this should be replaced with user.name|capitalize or any valid filter.
display_name_template: "{{ user.preferred_username|capitalize }}"
staticClients:
- id: synapse
secret: secret
redirectURIs:
- '[synapse public baseurl]/_synapse/client/oidc/callback'
name: 'Synapse'
```
### GitHub
[GitHub][github-idp] is a bit special as it is not an OpenID Connect compliant provider, but
just a regular OAuth2 provider.
The [`/user` API endpoint](https://developer.github.com/v3/users/#get-the-authenticated-user)
can be used to retrieve information on the authenticated user. As the Synapse
login mechanism needs an attribute to uniquely identify users, and that endpoint
does not return a `sub` property, an alternative `subject_claim` has to be set.
1. Create a new OAuth application: [https://github.com/settings/applications/new](https://github.com/settings/applications/new).
2. Set the callback URL to `[synapse public baseurl]/_synapse/client/oidc/callback`.
Run with `dex serve examples/config-dev.yaml`.
Synapse config:
```yaml
oidc_providers:
- idp_id: github
idp_name: Github
idp_brand: "github" # optional: styling hint for clients
discover: false
issuer: "https://github.com/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
authorization_endpoint: "https://github.com/login/oauth/authorize"
token_endpoint: "https://github.com/login/oauth/access_token"
userinfo_endpoint: "https://api.github.com/user"
scopes: ["read:user"]
user_mapping_provider:
config:
subject_claim: "id"
localpart_template: "{{ user.login }}"
display_name_template: "{{ user.name }}"
```
### Google
[Google][google-idp] is an OpenID certified authentication and authorisation provider.
1. Set up a project in the Google API Console (see
[documentation](https://developers.google.com/identity/protocols/oauth2/openid-connect#appsetup)).
3. Add an "OAuth Client ID" for a Web Application under "Credentials".
4. Copy the Client ID and Client Secret, and add the following to your synapse config:
```yaml
oidc_providers:
- idp_id: google
idp_name: Google
idp_brand: "google" # optional: styling hint for clients
issuer: "https://accounts.google.com/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
scopes: ["openid", "profile", "email"] # email is optional, read below
user_mapping_provider:
config:
localpart_template: "{{ user.given_name|lower }}"
display_name_template: "{{ user.name }}"
email_template: "{{ user.email }}" # needs "email" in scopes above
```
4. Back in the Google console, add this Authorized redirect URI: `[synapse
public baseurl]/_synapse/client/oidc/callback`.
### Twitch
1. Setup a developer account on [Twitch](https://dev.twitch.tv/)
2. Obtain the OAuth 2.0 credentials by [creating an app](https://dev.twitch.tv/console/apps/)
3. Add this OAuth Redirect URL: `[synapse public baseurl]/_synapse/client/oidc/callback`
Synapse config:
```yaml
oidc_providers:
- idp_id: twitch
idp_name: Twitch
issuer: "https://id.twitch.tv/oauth2/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
client_auth_method: "client_secret_post"
user_mapping_provider:
config:
localpart_template: "{{ user.preferred_username }}"
display_name_template: "{{ user.name }}"
```
### GitLab
1. Create a [new application](https://gitlab.com/profile/applications).
2. Add the `read_user` and `openid` scopes.
3. Add this Callback URL: `[synapse public baseurl]/_synapse/client/oidc/callback`
Synapse config:
```yaml
oidc_providers:
- idp_id: gitlab
idp_name: Gitlab
idp_brand: "gitlab" # optional: styling hint for clients
issuer: "https://gitlab.com/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
client_auth_method: "client_secret_post"
scopes: ["openid", "read_user"]
user_profile_method: "userinfo_endpoint"
user_mapping_provider:
config:
localpart_template: '{{ user.nickname }}'
display_name_template: '{{ user.name }}'
```
### Facebook
0. You will need a Facebook developer account. You can register for one
[here](https://developers.facebook.com/async/registration/).
1. On the [apps](https://developers.facebook.com/apps/) page of the developer
console, "Create App", and choose "Build Connected Experiences".
2. Once the app is created, add "Facebook Login" and choose "Web". You don't
need to go through the whole form here.
3. In the left-hand menu, open "Products"/"Facebook Login"/"Settings".
* Add `[synapse public baseurl]/_synapse/client/oidc/callback` as an OAuth Redirect
URL.
4. In the left-hand menu, open "Settings/Basic". Here you can copy the "App ID"
and "App Secret" for use below.
Synapse config:
```yaml
- idp_id: facebook
idp_name: Facebook
idp_brand: "facebook" # optional: styling hint for clients
discover: false
issuer: "https://www.facebook.com"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
scopes: ["openid", "email"]
authorization_endpoint: "https://facebook.com/dialog/oauth"
token_endpoint: "https://graph.facebook.com/v9.0/oauth/access_token"
jwks_uri: "https://www.facebook.com/.well-known/oauth/openid/jwks/"
user_mapping_provider:
config:
display_name_template: "{{ user.name }}"
email_template: "{{ user.email }}"
```
Relevant documents:
* [Manually Build a Login Flow](https://developers.facebook.com/docs/facebook-login/manually-build-a-login-flow)
* [Using Facebook's Graph API](https://developers.facebook.com/docs/graph-api/using-graph-api/)
* [Reference to the User endpoint](https://developers.facebook.com/docs/graph-api/reference/user)
Facebook do have an [OIDC discovery endpoint](https://www.facebook.com/.well-known/openid-configuration),
but it has a `response_types_supported` which excludes "code" (which we rely on, and
is even mentioned in their [documentation](https://developers.facebook.com/docs/facebook-login/manually-build-a-login-flow#login)),
so we have to disable discovery and configure the URIs manually.
### Gitea
Gitea is, like Github, not an OpenID provider, but just an OAuth2 provider.
The [`/user` API endpoint](https://try.gitea.io/api/swagger#/user/userGetCurrent)
can be used to retrieve information on the authenticated user. As the Synapse
login mechanism needs an attribute to uniquely identify users, and that endpoint
does not return a `sub` property, an alternative `subject_claim` has to be set.
1. Create a new application.
2. Add this Callback URL: `[synapse public baseurl]/_synapse/client/oidc/callback`
Synapse config:
```yaml
oidc_providers:
- idp_id: gitea
idp_name: Gitea
discover: false
issuer: "https://your-gitea.com/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
client_auth_method: client_secret_post
scopes: [] # Gitea doesn't support Scopes
authorization_endpoint: "https://your-gitea.com/login/oauth/authorize"
token_endpoint: "https://your-gitea.com/login/oauth/access_token"
userinfo_endpoint: "https://your-gitea.com/api/v1/user"
user_mapping_provider:
config:
subject_claim: "id"
localpart_template: "{{ user.login }}"
display_name_template: "{{ user.full_name }}"
```
### XWiki
Install [OpenID Connect Provider](https://extensions.xwiki.org/xwiki/bin/view/Extension/OpenID%20Connect/OpenID%20Connect%20Provider/) extension in your [XWiki](https://www.xwiki.org) instance.
Synapse config:
```yaml
oidc_providers:
- idp_id: xwiki
idp_name: "XWiki"
issuer: "https://myxwikihost/xwiki/oidc/"
client_id: "your-client-id" # TO BE FILLED
client_auth_method: none
- idp_id: dex
idp_name: "My Dex server"
skip_verification: true # This is needed as Dex is served on an insecure endpoint
issuer: "http://127.0.0.1:5556/dex"
client_id: "synapse"
client_secret: "secret"
scopes: ["openid", "profile"]
user_profile_method: "userinfo_endpoint"
user_mapping_provider:
config:
localpart_template: "{{ user.preferred_username }}"
display_name_template: "{{ user.name }}"
```
### Apple
Configuring "Sign in with Apple" (SiWA) requires an Apple Developer account.
You will need to create a new "Services ID" for SiWA, and create and download a
private key with "SiWA" enabled.
As well as the private key file, you will need:
* Client ID: the "identifier" you gave the "Services ID"
* Team ID: a 10-character ID associated with your developer account.
* Key ID: the 10-character identifier for the key.
[Apple's developer documentation](https://help.apple.com/developer-account/?lang=en#/dev77c875b7e)
has more information on setting up SiWA.
The synapse config will look like this:
```yaml
- idp_id: apple
idp_name: Apple
issuer: "https://appleid.apple.com"
client_id: "your-client-id" # Set to the "identifier" for your "ServicesID"
client_auth_method: "client_secret_post"
client_secret_jwt_key:
key_file: "/path/to/AuthKey_KEYIDCODE.p8" # point to your key file
jwt_header:
alg: ES256
kid: "KEYIDCODE" # Set to the 10-char Key ID
jwt_payload:
iss: TEAMIDCODE # Set to the 10-char Team ID
scopes: ["name", "email", "openid"]
authorization_endpoint: https://appleid.apple.com/auth/authorize?response_mode=form_post
user_mapping_provider:
config:
email_template: "{{ user.email }}"
localpart_template: "{{ user.name }}"
display_name_template: "{{ user.name|capitalize }}"
```
### Django OAuth Toolkit
@@ -591,6 +292,263 @@ oidc_providers:
email_template: "{{ user.email }}"
```
### Facebook
0. You will need a Facebook developer account. You can register for one
[here](https://developers.facebook.com/async/registration/).
1. On the [apps](https://developers.facebook.com/apps/) page of the developer
console, "Create App", and choose "Build Connected Experiences".
2. Once the app is created, add "Facebook Login" and choose "Web". You don't
need to go through the whole form here.
3. In the left-hand menu, open "Products"/"Facebook Login"/"Settings".
* Add `[synapse public baseurl]/_synapse/client/oidc/callback` as an OAuth Redirect
URL.
4. In the left-hand menu, open "Settings/Basic". Here you can copy the "App ID"
and "App Secret" for use below.
Synapse config:
```yaml
- idp_id: facebook
idp_name: Facebook
idp_brand: "facebook" # optional: styling hint for clients
discover: false
issuer: "https://www.facebook.com"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
scopes: ["openid", "email"]
authorization_endpoint: "https://facebook.com/dialog/oauth"
token_endpoint: "https://graph.facebook.com/v9.0/oauth/access_token"
jwks_uri: "https://www.facebook.com/.well-known/oauth/openid/jwks/"
user_mapping_provider:
config:
display_name_template: "{{ user.name }}"
email_template: "{{ user.email }}"
```
Relevant documents:
* [Manually Build a Login Flow](https://developers.facebook.com/docs/facebook-login/manually-build-a-login-flow)
* [Using Facebook's Graph API](https://developers.facebook.com/docs/graph-api/using-graph-api/)
* [Reference to the User endpoint](https://developers.facebook.com/docs/graph-api/reference/user)
Facebook do have an [OIDC discovery endpoint](https://www.facebook.com/.well-known/openid-configuration),
but it has a `response_types_supported` which excludes "code" (which we rely on, and
is even mentioned in their [documentation](https://developers.facebook.com/docs/facebook-login/manually-build-a-login-flow#login)),
so we have to disable discovery and configure the URIs manually.
### GitHub
[GitHub][github-idp] is a bit special as it is not an OpenID Connect compliant provider, but
just a regular OAuth2 provider.
The [`/user` API endpoint](https://developer.github.com/v3/users/#get-the-authenticated-user)
can be used to retrieve information on the authenticated user. As the Synapse
login mechanism needs an attribute to uniquely identify users, and that endpoint
does not return a `sub` property, an alternative `subject_claim` has to be set.
1. Create a new OAuth application: [https://github.com/settings/applications/new](https://github.com/settings/applications/new).
2. Set the callback URL to `[synapse public baseurl]/_synapse/client/oidc/callback`.
Synapse config:
```yaml
oidc_providers:
- idp_id: github
idp_name: Github
idp_brand: "github" # optional: styling hint for clients
discover: false
issuer: "https://github.com/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
authorization_endpoint: "https://github.com/login/oauth/authorize"
token_endpoint: "https://github.com/login/oauth/access_token"
userinfo_endpoint: "https://api.github.com/user"
scopes: ["read:user"]
user_mapping_provider:
config:
subject_claim: "id"
localpart_template: "{{ user.login }}"
display_name_template: "{{ user.name }}"
```
### GitLab
1. Create a [new application](https://gitlab.com/profile/applications).
2. Add the `read_user` and `openid` scopes.
3. Add this Callback URL: `[synapse public baseurl]/_synapse/client/oidc/callback`
Synapse config:
```yaml
oidc_providers:
- idp_id: gitlab
idp_name: Gitlab
idp_brand: "gitlab" # optional: styling hint for clients
issuer: "https://gitlab.com/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
client_auth_method: "client_secret_post"
scopes: ["openid", "read_user"]
user_profile_method: "userinfo_endpoint"
user_mapping_provider:
config:
localpart_template: '{{ user.nickname }}'
display_name_template: '{{ user.name }}'
```
### Gitea
Gitea is, like GitHub, not an OpenID provider, but just an OAuth2 provider.
The [`/user` API endpoint](https://try.gitea.io/api/swagger#/user/userGetCurrent)
can be used to retrieve information on the authenticated user. As the Synapse
login mechanism needs an attribute to uniquely identify users, and that endpoint
does not return a `sub` property, an alternative `subject_claim` has to be set.
1. Create a new application.
2. Add this Callback URL: `[synapse public baseurl]/_synapse/client/oidc/callback`
Synapse config:
```yaml
oidc_providers:
- idp_id: gitea
idp_name: Gitea
discover: false
issuer: "https://your-gitea.com/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
client_auth_method: client_secret_post
scopes: [] # Gitea doesn't support Scopes
authorization_endpoint: "https://your-gitea.com/login/oauth/authorize"
token_endpoint: "https://your-gitea.com/login/oauth/access_token"
userinfo_endpoint: "https://your-gitea.com/api/v1/user"
user_mapping_provider:
config:
subject_claim: "id"
localpart_template: "{{ user.login }}"
display_name_template: "{{ user.full_name }}"
```
### Google
[Google][google-idp] is an OpenID certified authentication and authorisation provider.
1. Set up a project in the Google API Console (see
[documentation](https://developers.google.com/identity/protocols/oauth2/openid-connect#appsetup)).
3. Add an "OAuth Client ID" for a Web Application under "Credentials".
4. Copy the Client ID and Client Secret, and add the following to your synapse config:
```yaml
oidc_providers:
- idp_id: google
idp_name: Google
idp_brand: "google" # optional: styling hint for clients
issuer: "https://accounts.google.com/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
scopes: ["openid", "profile", "email"] # email is optional, read below
user_mapping_provider:
config:
localpart_template: "{{ user.given_name|lower }}"
display_name_template: "{{ user.name }}"
email_template: "{{ user.email }}" # needs "email" in scopes above
```
4. Back in the Google console, add this Authorized redirect URI: `[synapse
public baseurl]/_synapse/client/oidc/callback`.
### Keycloak
[Keycloak][keycloak-idp] is an open-source IdP maintained by Red Hat.
Keycloak supports OIDC Back-Channel Logout, which sends logout notifications to Synapse so that Synapse users get logged out when they log out from Keycloak.
This can optionally be enabled by setting `backchannel_logout_enabled` to `true` in the Synapse configuration, and by setting the "Backchannel Logout URL" in Keycloak.
Follow the [Getting Started Guide](https://www.keycloak.org/guides) to install Keycloak and set up a realm.
1. Click `Clients` in the sidebar and click `Create`
2. Fill in the fields as below:
| Field | Value |
|-----------|-----------|
| Client ID | `synapse` |
| Client Protocol | `openid-connect` |
3. Click `Save`
4. Fill in the fields as below:
| Field | Value |
|-----------|-----------|
| Client ID | `synapse` |
| Enabled | `On` |
| Client Protocol | `openid-connect` |
| Access Type | `confidential` |
| Valid Redirect URIs | `[synapse public baseurl]/_synapse/client/oidc/callback` |
| Backchannel Logout URL (optional) | `[synapse public baseurl]/_synapse/client/oidc/backchannel_logout` |
| Backchannel Logout Session Required (optional) | `On` |
5. Click `Save`
6. On the Credentials tab, update the fields:
| Field | Value |
|-------|-------|
| Client Authenticator | `Client ID and Secret` |
7. Click `Regenerate Secret`
8. Copy Secret
```yaml
oidc_providers:
- idp_id: keycloak
idp_name: "My KeyCloak server"
issuer: "https://127.0.0.1:8443/realms/{realm_name}"
client_id: "synapse"
client_secret: "copy secret generated from above"
scopes: ["openid", "profile"]
user_mapping_provider:
config:
localpart_template: "{{ user.preferred_username }}"
display_name_template: "{{ user.name }}"
backchannel_logout_enabled: true # Optional
```
### LemonLDAP
[LemonLDAP::NG][lemonldap] is an open-source IdP solution.
1. Create an OpenID Connect Relying Party in LemonLDAP::NG
2. The parameters are:
- Client ID under the basic menu of the new Relying Party (`Options > Basic >
Client ID`)
- Client secret (`Options > Basic > Client secret`)
- JWT Algorithm: RS256 within the security menu of the new Relying Party
(`Options > Security > ID Token signature algorithm` and `Options > Security >
Access Token signature algorithm`)
- Scopes: OpenID, Email and Profile
- Allowed redirection addresses for login (`Options > Basic > Allowed
redirection addresses for login`):
`[synapse public baseurl]/_synapse/client/oidc/callback`
Synapse config:
```yaml
oidc_providers:
- idp_id: lemonldap
idp_name: lemonldap
discover: true
issuer: "https://auth.example.org/" # TO BE FILLED: replace with your domain
client_id: "your client id" # TO BE FILLED
client_secret: "your client secret" # TO BE FILLED
scopes:
- "openid"
- "profile"
- "email"
user_mapping_provider:
config:
localpart_template: "{{ user.preferred_username }}}"
# TO BE FILLED: If your users have names in LemonLDAP::NG and you want those in Synapse, this should be replaced with user.name|capitalize or any valid filter.
display_name_template: "{{ user.preferred_username|capitalize }}"
```
### Mastodon
[Mastodon](https://docs.joinmastodon.org/) instances provide an [OAuth API](https://docs.joinmastodon.org/spec/oauth/), allowing those instances to be used as a single sign-on provider for Synapse.
@@ -631,3 +589,81 @@ oidc_providers:
```
Note that the fields `client_id` and `client_secret` are taken from the CURL response above.
### Twitch
1. Set up a developer account on [Twitch](https://dev.twitch.tv/)
2. Obtain the OAuth 2.0 credentials by [creating an app](https://dev.twitch.tv/console/apps/)
3. Add this OAuth Redirect URL: `[synapse public baseurl]/_synapse/client/oidc/callback`
Synapse config:
```yaml
oidc_providers:
- idp_id: twitch
idp_name: Twitch
issuer: "https://id.twitch.tv/oauth2/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
client_auth_method: "client_secret_post"
user_mapping_provider:
config:
localpart_template: "{{ user.preferred_username }}"
display_name_template: "{{ user.name }}"
```
### Twitter
*Using Twitter as an identity provider requires using Synapse 1.75.0 or later.*
1. Set up a developer account on [Twitter](https://developer.twitter.com/en/portal/dashboard)
2. Create a project & app.
3. Enable user authentication and under "Type of App" choose "Web App, Automated App or Bot".
4. Under "App info" set the callback URL to `[synapse public baseurl]/_synapse/client/oidc/callback`.
5. Obtain the OAuth 2.0 credentials under the "Keys and tokens" tab and copy the OAuth 2.0 Client ID and Client Secret.
Synapse config:
```yaml
oidc_providers:
- idp_id: twitter
idp_name: Twitter
idp_brand: "twitter" # optional: styling hint for clients
discover: false # Twitter is not OpenID compliant.
issuer: "https://twitter.com/"
client_id: "your-client-id" # TO BE FILLED
client_secret: "your-client-secret" # TO BE FILLED
pkce_method: "always"
# offline.access provides refresh tokens; tweet.read and users.read are needed for the userinfo request.
scopes: ["offline.access", "tweet.read", "users.read"]
authorization_endpoint: https://twitter.com/i/oauth2/authorize
token_endpoint: https://api.twitter.com/2/oauth2/token
userinfo_endpoint: https://api.twitter.com/2/users/me?user.fields=profile_image_url
user_mapping_provider:
config:
subject_template: "{{ user.data.id }}"
localpart_template: "{{ user.data.username }}"
display_name_template: "{{ user.data.name }}"
picture_template: "{{ user.data.profile_image_url }}"
```
### XWiki
Install the [OpenID Connect Provider](https://extensions.xwiki.org/xwiki/bin/view/Extension/OpenID%20Connect/OpenID%20Connect%20Provider/) extension in your [XWiki](https://www.xwiki.org) instance.
Synapse config:
```yaml
oidc_providers:
- idp_id: xwiki
idp_name: "XWiki"
issuer: "https://myxwikihost/xwiki/oidc/"
client_id: "your-client-id" # TO BE FILLED
client_auth_method: none
scopes: ["openid", "profile"]
user_profile_method: "userinfo_endpoint"
user_mapping_provider:
config:
localpart_template: "{{ user.preferred_username }}"
display_name_template: "{{ user.name }}"
```

View File

@@ -16,7 +16,7 @@ connect to a postgres database.
- For other pre-built packages, please consult the documentation from
the relevant package.
- If you installed synapse [in a
virtualenv](setup/installation.md#installing-from-source), you can install
virtualenv](setup/installation.md#installing-as-a-python-module-from-pypi), you can install
the library with:
~/synapse/env/bin/pip install "matrix-synapse[postgres]"

View File

@@ -46,7 +46,7 @@ when using a containerized Synapse, as that will prevent it from responding
to proxied traffic.)
Optionally, you can also set
[`request_id_header`](../usage/configuration/config_documentation.md#listeners)
[`request_id_header`](./usage/configuration/config_documentation.md#listeners)
so that the server extracts and re-uses the same request ID format that the
reverse proxy is using.
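As a rough sketch, assuming your reverse proxy sets an `X-Request-Id` header (the header name here is an illustrative assumption), the listener configuration could look like:
```yaml
listeners:
  - port: 8008
    type: http
    x_forwarded: true
    # Re-use the request ID set by the reverse proxy; the header name is an
    # assumption - use whichever header your proxy actually sends.
    request_id_header: "X-Request-Id"
    resources:
      - names: [client, federation]
```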

View File

@@ -136,7 +136,7 @@ Unofficial package are built for SLES 15 in the openSUSE:Backports:SLE-15 reposi
#### ArchLinux
The quickest way to get up and running with ArchLinux is probably with the community package
<https://www.archlinux.org/packages/community/any/matrix-synapse/>, which should pull in most of
<https://archlinux.org/packages/community/x86_64/matrix-synapse/>, which should pull in most of
the necessary dependencies.
pip may be outdated (6.0.7-1 and needs to be upgraded to 6.0.8-1):
@@ -278,7 +278,7 @@ Installing prerequisites on Ubuntu or Debian:
```sh
sudo apt install build-essential python3-dev libffi-dev \
python3-pip python3-setuptools sqlite3 \
libssl-dev virtualenv libjpeg-dev libxslt1-dev
libssl-dev virtualenv libjpeg-dev libxslt1-dev libicu-dev
```
##### ArchLinux
@@ -287,7 +287,7 @@ Installing prerequisites on ArchLinux:
```sh
sudo pacman -S base-devel python python-pip \
python-setuptools python-virtualenv sqlite3
python-setuptools python-virtualenv sqlite3 icu
```
##### CentOS/Fedora
@@ -297,7 +297,8 @@ Installing prerequisites on CentOS or Fedora Linux:
```sh
sudo dnf install libtiff-devel libjpeg-devel libzip-devel freetype-devel \
libwebp-devel libxml2-devel libxslt-devel libpq-devel \
python3-virtualenv libffi-devel openssl-devel python3-devel
python3-virtualenv libffi-devel openssl-devel python3-devel \
libicu-devel
sudo dnf groupinstall "Development Tools"
```
@@ -310,8 +311,12 @@ You may need to install the latest Xcode developer tools:
xcode-select --install
```
On ARM-based Macs you may need to install libjpeg and libpq.
You can use Homebrew (https://brew.sh):
Some extra dependencies may be needed. You can use Homebrew (https://brew.sh) for them.
You may need to install icu, and make the icu binaries and libraries accessible.
Please follow [the official instructions of PyICU](https://pypi.org/project/PyICU/) to do so.
On ARM-based Macs you may also need to install libjpeg and libpq:
```sh
brew install jpeg libpq
```
@@ -332,7 +337,8 @@ Installing prerequisites on openSUSE:
```sh
sudo zypper in -t pattern devel_basis
sudo zypper in python-pip python-setuptools sqlite3 python-virtualenv \
python-devel libffi-devel libopenssl-devel libjpeg62-devel
python-devel libffi-devel libopenssl-devel libjpeg62-devel \
libicu-devel
```
##### OpenBSD

View File

@@ -120,7 +120,7 @@ specified in the config. It is located at
## SAML Mapping Providers
The SAML mapping provider can be customized by editing the
[`saml2_config.user_mapping_provider.module`](docs/usage/configuration/config_documentation.md#saml2_config)
[`saml2_config.user_mapping_provider.module`](usage/configuration/config_documentation.md#saml2_config)
config option.
`saml2_config.user_mapping_provider.config` allows you to provide custom

View File

@@ -88,6 +88,22 @@ process, for example:
dpkg -i matrix-synapse-py3_1.3.0+stretch1_amd64.deb
```
# Upgrading to v1.74.0
## Unicode support in user search
This version introduces optional support for an [improved user search dealing with Unicode characters](https://github.com/matrix-org/synapse/pull/14464).
If you want to take advantage of this feature you need to install PyICU,
as well as the native ICU library and its development headers,
so that PyICU can build, since no prebuilt wheels are available.
You can follow [the PyICU documentation](https://pypi.org/project/PyICU/) to do so,
and then do `pip install matrix-synapse[user-search]` for a PyPI install.
Docker images and Debian packages need nothing specific as they already
include or specify ICU as an explicit dependency.
# Upgrading to v1.73.0
## Legacy Prometheus metric names have now been removed
@@ -873,8 +889,8 @@ Any scripts still using the above APIs should be converted to use the
## User-interactive authentication fallback templates can now display errors
This may affect you if you make use of custom HTML templates for the
[reCAPTCHA](../synapse/res/templates/recaptcha.html) or
[terms](../synapse/res/templates/terms.html) fallback pages.
[reCAPTCHA (`synapse/res/templates/recaptcha.html`)](https://github.com/matrix-org/synapse/tree/develop/synapse/res/templates/recaptcha.html) or
[terms (`synapse/res/templates/terms.html`)](https://github.com/matrix-org/synapse/tree/develop/synapse/res/templates/terms.html) fallback pages.
The template is now provided an `error` variable if the authentication
process failed. See the default templates linked above for an example.
@@ -1472,7 +1488,7 @@ New templates (`sso_auth_confirm.html`, `sso_auth_success.html`, and
is configured to use SSO and a custom
`sso_redirect_confirm_template_dir` configuration then these templates
will need to be copied from
[synapse/res/templates](synapse/res/templates) into that directory.
[`synapse/res/templates`](https://github.com/matrix-org/synapse/tree/develop/synapse/res/templates) into that directory.
## Synapse SSO Plugins Method Deprecation

View File

@@ -7,7 +7,7 @@ server admin. (Note that a server admin is distinct from a room admin.)
An existing user can be marked as a server admin by updating the database directly.
Check your [database settings](config_documentation.md#database) in the configuration file, connect to the correct database using either `psql [database name]` (if using PostgreSQL) or `sqlite3 path/to/your/database.db` (if using SQLite) and elevate the user `@foo:bar.com` to administrator.
Check your [database settings](../../configuration/config_documentation.md#database) in the configuration file, connect to the correct database using either `psql [database name]` (if using PostgreSQL) or `sqlite3 path/to/your/database.db` (if using SQLite) and elevate the user `@foo:bar.com` to administrator.
```sql
UPDATE users SET admin = 1 WHERE name = '@foo:bar.com';
```
@@ -32,10 +32,10 @@ curl --header "Authorization: Bearer <access_token>" <the_rest_of_your_API_reque
```
For example, suppose we want to
[query the account](user_admin_api.md#query-user-account) of the user
[query the account](../../../admin_api/user_admin_api.md#query-user-account) of the user
`@foo:bar.com`. We need an admin access token (e.g.
`syt_AjfVef2_L33JNpafeif_0feKJfeaf0CQpoZk`), and we need to know which port
Synapse's [`client` listener](config_documentation.md#listeners) is listening
Synapse's [`client` listener](../../configuration/config_documentation.md#listeners) is listening
on (e.g. `8008`). Then we can use the following command to request the account
information from the Admin API.

View File

@@ -81,7 +81,7 @@ The following fields are returned in the JSON response body:
- `failure_ts` - nullable integer - The first time Synapse tried and failed to reach the
remote server, in ms. This is `null` if communication with the remote server has never failed.
- `last_successful_stream_ordering` - nullable integer - The stream ordering of the most
recent successfully-sent [PDU](understanding_synapse_through_grafana_graphs.md#federation)
recent successfully-sent [PDU](../understanding_synapse_through_grafana_graphs.md#federation)
to this destination, or `null` if this information has not been tracked yet.
- `next_token`: string representing a positive integer - Indication for pagination. See above.
- `total` - integer - Total number of destinations.
@@ -174,7 +174,7 @@ The following fields are returned in the JSON response body:
Room objects contain the following fields:
- `room_id` - string - The ID of the room.
- `stream_ordering` - integer - The stream ordering of the most recent
successfully-sent [PDU](understanding_synapse_through_grafana_graphs.md#federation)
successfully-sent [PDU](../understanding_synapse_through_grafana_graphs.md#federation)
to this destination in this room.
- `next_token`: string representing a positive integer - Indication for pagination. See above.
- `total` - integer - Total number of destinations.

View File

@@ -6,7 +6,7 @@ registration requests, as proposed in
and stabilised in version 1.2 of the Matrix specification.
To use it, you will need to enable the `registration_requires_token` config
option, and authenticate by providing an `access_token` for a server admin:
see [Admin API](../admin_api).
see [Admin API](../admin_api/).
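As a minimal sketch, the prerequisite option in `homeserver.yaml` would look something like this (the rest of your registration configuration is assumed to already be in place):
```yaml
# Require a registration token before new accounts can be created.
registration_requires_token: true
```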
## Registration token objects

View File

@@ -2,7 +2,7 @@
How do I become a server admin?
---
If your server already has an admin account you should use the [User Admin API](../../admin_api/user_admin_api.md#Change-whether-a-user-is-a-server-administrator-or-not) to promote other accounts to become admins.
If your server already has an admin account you should use the [User Admin API](../../admin_api/user_admin_api.md#change-whether-a-user-is-a-server-administrator-or-not) to promote other accounts to become admins.
If you don't have any admin accounts yet you won't be able to use the admin API, so you'll have to edit the database manually. Manually editing the database is generally not recommended so once you have an admin account: use the admin APIs to make further changes.
@@ -115,7 +115,7 @@ something like the following in their logs:
2019-09-11 19:32:04,271 - synapse.federation.transport.server - 288 - WARNING - GET-11752 - authenticate_request failed: 401: Invalid signature for server <server> with key ed25519:a_EqML: Unable to verify signature for <server>
This is normally caused by a misconfiguration in your reverse-proxy. See [the reverse proxy docs](docs/reverse_proxy.md) and double-check that your settings are correct.
This is normally caused by a misconfiguration in your reverse-proxy. See [the reverse proxy docs](../../reverse_proxy.md) and double-check that your settings are correct.
Help!! Synapse is slow and eats all my RAM/CPU!

View File

@@ -78,4 +78,4 @@ If you would like to set up your own statistics collection server and send metri
consider using one of the following known implementations:
* [Matrix.org's Panopticon](https://github.com/matrix-org/panopticon)
* [Famedly's Barad-dûr](https://gitlab.com/famedly/company/devops/services/barad-dur)
* [Famedly's Barad-dûr](https://gitlab.com/famedly/infra/services/barad-dur)
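For illustration, pointing Synapse at a self-hosted collector might look like the sketch below; the endpoint URL is a placeholder, not a real service.
```yaml
# Opt in to reporting anonymised usage statistics.
report_stats: true
# Send reports to your own collector instead of the default endpoint
# (placeholder URL - replace with your deployment's push endpoint).
report_stats_endpoint: "https://stats.example.com/report-usage-stats/push"
```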

View File

@@ -1,6 +1,6 @@
# Request log format
HTTP request logs are written by synapse (see [`site.py`](../synapse/http/site.py) for details).
HTTP request logs are written by synapse (see [`synapse/http/site.py`](https://github.com/matrix-org/synapse/tree/develop/synapse/http/site.py) for details).
See the following for how to decode the dense data available from the default logging configuration.
@@ -10,10 +10,10 @@ See the following for how to decode the dense data available from the default lo
```
| Part | Explanation |
| Part | Explanation |
| ----- | ------------ |
| AAAA | Timestamp request was logged (not received) |
| BBBB | Logger name (`synapse.access.(http\|https).<tag>`, where 'tag' is defined in the `listeners` config section, normally the port) |
| BBBB | Logger name (`synapse.access.(http\|https).<tag>`, where 'tag' is defined in the [`listeners`](../configuration/config_documentation.md#listeners) config section, normally the port) |
| CCCC | Line number in code |
| DDDD | Log Level |
| EEEE | Request Identifier (This identifier is shared by related log lines)|

View File

@@ -422,6 +422,10 @@ Sub-options for each listener include:
* `port`: the TCP port to bind to.
* `tag`: An alias for the port in the logger name. If set, the tag is logged instead
of the port. Defaults to `None`; this option is only valid for listeners with `type: http`.
See the [request log format](../administration/request_log.md) documentation.
* `bind_addresses`: a list of local addresses to listen on. The default is
'all local interfaces'.
@@ -476,6 +480,12 @@ Valid resource names are:
* `static`: static resources under synapse/static (/_matrix/static). (Mostly useful for 'fallback authentication'.)
* `health`: the [health check endpoint](../../reverse_proxy.md#health-check-endpoint). This endpoint
is by default active for all other resources and does not have to be activated separately.
This is only useful if you want to use the health endpoint explicitly on a dedicated port, or
for [workers](../../workers.md) and containers without a listener, e.g.
[application services](../../workers.md#notifying-application-services).
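As a sketch tying the `tag` and `health` options together, a dedicated health-check listener might look like this (the port and tag values are arbitrary examples):
```yaml
listeners:
  # Dedicated listener that only serves /health, e.g. for a container healthcheck.
  - port: 8081
    type: http
    bind_addresses: ['127.0.0.1']
    tag: healthcheck   # logged in place of the port in the synapse.access.* logger name
    resources:
      - names: [health]
```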
Example configuration #1:
```yaml
listeners:
@@ -569,6 +579,115 @@ Example configuration:
```yaml
delete_stale_devices_after: 1y
```
---
### `email`
Configuration for sending emails from Synapse.
Server admins can configure custom templates for email content. See
[here](../../templates.md) for more information.
This setting has the following sub-options:
* `smtp_host`: The hostname of the outgoing SMTP server to use. Defaults to 'localhost'.
* `smtp_port`: The port on the mail server for outgoing SMTP. Defaults to 465 if `force_tls` is true, else 25.
_Changed in Synapse 1.64.0:_ the default port is now aware of `force_tls`.
* `smtp_user` and `smtp_pass`: Username/password for authentication to the SMTP server. By default, no
authentication is attempted.
* `force_tls`: By default, Synapse connects over plain text and then optionally upgrades
to TLS via STARTTLS. If this option is set to true, TLS is used from the start (Implicit TLS),
and the option `require_transport_security` is ignored.
It is recommended to enable this if supported by your mail server.
_New in Synapse 1.64.0._
* `require_transport_security`: Set to true to require TLS transport security for SMTP.
By default, Synapse will connect over plain text, and will then switch to
TLS via STARTTLS *if the SMTP server supports it*. If this option is set,
Synapse will refuse to connect unless the server supports STARTTLS.
* `enable_tls`: By default, if the server supports TLS, it will be used, and the server
must present a certificate that is valid for 'smtp_host'. If this option
is set to false, TLS will not be used.
* `notif_from`: defines the "From" address to use when sending emails.
It must be set if email sending is enabled. The placeholder '%(app)s' will be replaced by the application name,
which is normally set in `app_name`, but may be overridden by the
Matrix client application. Note that the placeholder must be written '%(app)s', including the
trailing 's'.
* `app_name`: `app_name` defines the default value for '%(app)s' in `notif_from` and email
subjects. It defaults to 'Matrix'.
* `enable_notifs`: Set to true to enable sending emails for messages that the user
has missed. Disabled by default.
* `notif_for_new_users`: Set to false to disable automatic subscription to email
notifications for new users. Enabled by default.
* `client_base_url`: Custom URL for client links within the email notifications. By default
links will be based on "https://matrix.to". (This setting used to be called `riot_base_url`;
the old name is still supported for backwards-compatibility but is now deprecated.)
* `validation_token_lifetime`: Configures the time that a validation email will expire after sending.
Defaults to 1h.
* `invite_client_location`: The web client location to direct users to during an invite. This is passed
to the identity server as the `org.matrix.web_client_location` key. Defaults
to unset, giving no guidance to the identity server.
* `subjects`: Subjects to use when sending emails from Synapse. The placeholder '%(app)s' will
be replaced with the value of the `app_name` setting, or by a value dictated by the Matrix client application.
In addition, each subject can use the following placeholders: '%(person)s', which will be replaced by the displayname
of the user(s) that sent the message(s), e.g. "Alice and Bob", and '%(room)s', which will be replaced by the name of the room the
message(s) have been sent to, e.g. "My super room". In addition, emails related to account administration
can use the '%(server_name)s' placeholder, which will be replaced by the value of the
`server_name` setting in your Synapse configuration.
Here is a list of subjects for notification emails that can be set:
* `message_from_person_in_room`: Subject to use to notify about one message from one or more user(s) in a
room which has a name. Defaults to "[%(app)s] You have a message on %(app)s from %(person)s in the %(room)s room..."
* `message_from_person`: Subject to use to notify about one message from one or more user(s) in a
room which doesn't have a name. Defaults to "[%(app)s] You have a message on %(app)s from %(person)s..."
* `messages_from_person`: Subject to use to notify about multiple messages from one or more users in
a room which doesn't have a name. Defaults to "[%(app)s] You have messages on %(app)s from %(person)s..."
* `messages_in_room`: Subject to use to notify about multiple messages in a room which has a
name. Defaults to "[%(app)s] You have messages on %(app)s in the %(room)s room..."
* `messages_in_room_and_others`: Subject to use to notify about multiple messages in multiple rooms.
Defaults to "[%(app)s] You have messages on %(app)s in the %(room)s room and others..."
* `messages_from_person_and_others`: Subject to use to notify about multiple messages from multiple persons in
multiple rooms. This is similar to the setting above except it's used when
the room in which the notification was triggered has no name. Defaults to
"[%(app)s] You have messages on %(app)s from %(person)s and others..."
* `invite_from_person_to_room`: Subject to use to notify about an invite to a room which has a name.
Defaults to "[%(app)s] %(person)s has invited you to join the %(room)s room on %(app)s..."
* `invite_from_person`: Subject to use to notify about an invite to a room which doesn't have a
name. Defaults to "[%(app)s] %(person)s has invited you to chat on %(app)s..."
* `password_reset`: Subject to use when sending a password reset email. Defaults to "[%(server_name)s] Password reset"
* `email_validation`: Subject to use when sending a verification email to assert an address's
ownership. Defaults to "[%(server_name)s] Validate your email"
Example configuration:
```yaml
email:
smtp_host: mail.server
smtp_port: 587
smtp_user: "exampleusername"
smtp_pass: "examplepassword"
force_tls: true
require_transport_security: true
enable_tls: false
notif_from: "Your Friendly %(app)s homeserver <noreply@example.com>"
app_name: my_branded_matrix_server
enable_notifs: true
notif_for_new_users: false
client_base_url: "http://localhost/riot"
validation_token_lifetime: 15m
invite_client_location: https://app.element.io
subjects:
message_from_person_in_room: "[%(app)s] You have a message on %(app)s from %(person)s in the %(room)s room..."
message_from_person: "[%(app)s] You have a message on %(app)s from %(person)s..."
messages_from_person: "[%(app)s] You have messages on %(app)s from %(person)s..."
messages_in_room: "[%(app)s] You have messages on %(app)s in the %(room)s room..."
messages_in_room_and_others: "[%(app)s] You have messages on %(app)s in the %(room)s room and others..."
messages_from_person_and_others: "[%(app)s] You have messages on %(app)s from %(person)s and others..."
invite_from_person_to_room: "[%(app)s] %(person)s has invited you to join the %(room)s room on %(app)s..."
invite_from_person: "[%(app)s] %(person)s has invited you to chat on %(app)s..."
password_reset: "[%(server_name)s] Password reset"
email_validation: "[%(server_name)s] Validate your email"
```
## Homeserver blocking
Useful options for Synapse admins.
@@ -1212,7 +1331,7 @@ Associated sub-options:
connection pool. For a reference to valid arguments, see:
* for [sqlite](https://docs.python.org/3/library/sqlite3.html#sqlite3.connect)
* for [postgres](https://www.postgresql.org/docs/current/libpq-connect.html#LIBPQ-PARAMKEYWORDS)
* for [the connection pool](https://twistedmatrix.com/documents/current/api/twisted.enterprise.adbapi.ConnectionPool.html#__init__)
* for [the connection pool](https://docs.twistedmatrix.com/en/stable/api/twisted.enterprise.adbapi.ConnectionPool.html#__init__)
For more information on using Synapse with Postgres,
see [here](../../postgres.md).
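For context, a typical PostgreSQL `database` block using these sub-options might look like the following sketch (credentials and pool sizes are placeholders):
```yaml
database:
  name: psycopg2
  args:
    user: synapse_user
    password: CHANGEME   # placeholder - use a real secret
    dbname: synapse
    host: localhost
    cp_min: 5            # connection pool lower bound
    cp_max: 10           # connection pool upper bound
```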
@@ -2514,18 +2633,18 @@ state events are shared with users:
- `m.room.topic`
To change the default behavior, use the following sub-options:
* `disable_default_event_types`: boolean. Set to `true` to disable the above
* `disable_default_event_types`: boolean. Set to `true` to disable the above
defaults. If this is enabled, only the event types listed in
`additional_event_types` are shared. Defaults to `false`.
* `additional_event_types`: A list of additional state events to include in the
events to be shared. By default, this list is empty (so only the default event
* `additional_event_types`: A list of additional state events to include in the
events to be shared. By default, this list is empty (so only the default event
types are shared).
Each entry in this list should be either a single string or a list of two
strings.
strings.
* A standalone string `t` represents all events with type `t` (i.e.
with no restrictions on state keys).
* A pair of strings `[t, s]` represents a single event with type `t` and
* A pair of strings `[t, s]` represents a single event with type `t` and
state key `s`. The same type can appear in two entries with different state
keys: in this situation, both state keys are included in prejoin state.
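As a sketch, assuming these sub-options live under the `room_prejoin_state` setting and using made-up event types for illustration:
```yaml
room_prejoin_state:
  # Keep the default event types and share some additional state with invitees.
  disable_default_event_types: false
  additional_event_types:
    # All events of this type, regardless of state key:
    - "org.example.custom.presentation"
    # Only the event with this exact type and state key:
    - ["org.example.room.settings", "main"]
```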
@@ -2944,8 +3063,13 @@ Options for each entry include:
values are `client_secret_basic` (default), `client_secret_post` and
`none`.
* `pkce_method`: Whether to use proof key for code exchange when requesting
and exchanging the token. Valid values are: `auto`, `always`, or `never`. Defaults
to `auto`, which uses PKCE if supported during metadata discovery. Set to `always`
to force enable PKCE or `never` to force disable PKCE.
* `scopes`: list of scopes to request. This should normally include the "openid"
scope. Defaults to ["openid"].
scope. Defaults to `["openid"]`.
* `authorization_endpoint`: the oauth2 authorization endpoint. Required if
provider discovery is disabled.
@@ -2989,17 +3113,35 @@ Options for each entry include:
For the default provider, the following settings are available:
* `subject_template`: Jinja2 template for a unique identifier for the user.
Defaults to `{{ user.sub }}`, which OpenID Connect compliant providers should provide.
This replaces and overrides `subject_claim`.
* `subject_claim`: name of the claim containing a unique identifier
for the user. Defaults to 'sub', which OpenID Connect
compliant providers should provide.
*Deprecated in Synapse v1.75.0.*
* `picture_template`: Jinja2 template for a URL for the user's profile picture.
Defaults to `{{ user.picture }}`, which OpenID Connect compliant providers should
provide. The value has to refer to a direct image file such as a PNG, JPEG, or GIF file.
This replaces and overrides `picture_claim`.
Currently only supported in monolithic (single-process) server configurations
where the media repository runs within the Synapse process.
* `picture_claim`: name of the claim containing a URL for the user's profile picture.
Defaults to 'picture', which OpenID Connect compliant providers should provide.
The value has to refer to a direct image file such as a PNG, JPEG, or GIF file.
Currently only supported in monolithic (single-process) server configurations
where the media repository runs within the Synapse process.
*Deprecated in Synapse v1.75.0.*
* `localpart_template`: Jinja2 template for the localpart of the MXID.
If this is not set, the user will be prompted to choose their
own username (see the documentation for the `sso_auth_account_details.html`
@@ -3259,114 +3401,6 @@ ui_auth:
session_timeout: "15s"
```
---
### `email`
Configuration for sending emails from Synapse.
Server admins can configure custom templates for email content. See
[here](../../templates.md) for more information.
This setting has the following sub-options:
* `smtp_host`: The hostname of the outgoing SMTP server to use. Defaults to 'localhost'.
* `smtp_port`: The port on the mail server for outgoing SMTP. Defaults to 465 if `force_tls` is true, else 25.
_Changed in Synapse 1.64.0:_ the default port is now aware of `force_tls`.
* `smtp_user` and `smtp_pass`: Username/password for authentication to the SMTP server. By default, no
authentication is attempted.
* `force_tls`: By default, Synapse connects over plain text and then optionally upgrades
to TLS via STARTTLS. If this option is set to true, TLS is used from the start (Implicit TLS),
and the option `require_transport_security` is ignored.
It is recommended to enable this if supported by your mail server.
_New in Synapse 1.64.0._
* `require_transport_security`: Set to true to require TLS transport security for SMTP.
By default, Synapse will connect over plain text, and will then switch to
TLS via STARTTLS *if the SMTP server supports it*. If this option is set,
Synapse will refuse to connect unless the server supports STARTTLS.
* `enable_tls`: By default, if the server supports TLS, it will be used, and the server
must present a certificate that is valid for 'smtp_host'. If this option
is set to false, TLS will not be used.
* `notif_from`: defines the "From" address to use when sending emails.
It must be set if email sending is enabled. The placeholder '%(app)s' will be replaced by the application name,
which is normally set in `app_name`, but may be overridden by the
Matrix client application. Note that the placeholder must be written '%(app)s', including the
trailing 's'.
* `app_name`: `app_name` defines the default value for '%(app)s' in `notif_from` and email
subjects. It defaults to 'Matrix'.
* `enable_notifs`: Set to true to enable sending emails for messages that the user
has missed. Disabled by default.
* `notif_for_new_users`: Set to false to disable automatic subscription to email
notifications for new users. Enabled by default.
* `client_base_url`: Custom URL for client links within the email notifications. By default
links will be based on "https://matrix.to". (This setting used to be called `riot_base_url`;
the old name is still supported for backwards-compatibility but is now deprecated.)
* `validation_token_lifetime`: Configures the time that a validation email will expire after sending.
Defaults to 1h.
* `invite_client_location`: The web client location to direct users to during an invite. This is passed
to the identity server as the `org.matrix.web_client_location` key. Defaults
to unset, giving no guidance to the identity server.
* `subjects`: Subjects to use when sending emails from Synapse. The placeholder '%(app)s' will
be replaced with the value of the `app_name` setting, or by a value dictated by the Matrix client application.
In addition, each subject can use the following placeholders: '%(person)s', which will be replaced by the displayname
of the user(s) that sent the message(s), e.g. "Alice and Bob", and '%(room)s', which will be replaced by the name of the room the
message(s) have been sent to, e.g. "My super room". In addition, emails related to account administration
can also use the '%(server_name)s' placeholder, which will be replaced by the value of the
`server_name` setting in your Synapse configuration.
Here is a list of subjects for notification emails that can be set:
* `message_from_person_in_room`: Subject to use to notify about one message from one or more user(s) in a
room which has a name. Defaults to "[%(app)s] You have a message on %(app)s from %(person)s in the %(room)s room..."
* `message_from_person`: Subject to use to notify about one message from one or more user(s) in a
room which doesn't have a name. Defaults to "[%(app)s] You have a message on %(app)s from %(person)s..."
* `messages_from_person`: Subject to use to notify about multiple messages from one or more users in
a room which doesn't have a name. Defaults to "[%(app)s] You have messages on %(app)s from %(person)s..."
* `messages_in_room`: Subject to use to notify about multiple messages in a room which has a
name. Defaults to "[%(app)s] You have messages on %(app)s in the %(room)s room..."
* `messages_in_room_and_others`: Subject to use to notify about multiple messages in multiple rooms.
Defaults to "[%(app)s] You have messages on %(app)s in the %(room)s room and others..."
* `messages_from_person_and_others`: Subject to use to notify about multiple messages from multiple persons in
multiple rooms. This is similar to the setting above except it's used when
the room in which the notification was triggered has no name. Defaults to
"[%(app)s] You have messages on %(app)s from %(person)s and others..."
* `invite_from_person_to_room`: Subject to use to notify about an invite to a room which has a name.
Defaults to "[%(app)s] %(person)s has invited you to join the %(room)s room on %(app)s..."
* `invite_from_person`: Subject to use to notify about an invite to a room which doesn't have a
name. Defaults to "[%(app)s] %(person)s has invited you to chat on %(app)s..."
* `password_reset`: Subject to use when sending a password reset email. Defaults to "[%(server_name)s] Password reset"
* `email_validation`: Subject to use when sending a verification email to assert an address's
ownership. Defaults to "[%(server_name)s] Validate your email"
Example configuration:
```yaml
email:
smtp_host: mail.server
smtp_port: 587
smtp_user: "exampleusername"
smtp_pass: "examplepassword"
force_tls: true
require_transport_security: true
enable_tls: false
notif_from: "Your Friendly %(app)s homeserver <noreply@example.com>"
app_name: my_branded_matrix_server
enable_notifs: true
notif_for_new_users: false
client_base_url: "http://localhost/riot"
validation_token_lifetime: 15m
invite_client_location: https://app.element.io
subjects:
message_from_person_in_room: "[%(app)s] You have a message on %(app)s from %(person)s in the %(room)s room..."
message_from_person: "[%(app)s] You have a message on %(app)s from %(person)s..."
messages_from_person: "[%(app)s] You have messages on %(app)s from %(person)s..."
messages_in_room: "[%(app)s] You have messages on %(app)s in the %(room)s room..."
messages_in_room_and_others: "[%(app)s] You have messages on %(app)s in the %(room)s room and others..."
messages_from_person_and_others: "[%(app)s] You have messages on %(app)s from %(person)s and others..."
invite_from_person_to_room: "[%(app)s] %(person)s has invited you to join the %(room)s room on %(app)s..."
invite_from_person: "[%(app)s] %(person)s has invited you to chat on %(app)s..."
password_reset: "[%(server_name)s] Password reset"
email_validation: "[%(server_name)s] Validate your email"
```
---
## Push
Configuration settings related to push notifications
@@ -3840,6 +3874,48 @@ Example configuration:
```yaml
run_background_tasks_on: worker1
```
---
### `update_user_directory_from_worker`
The [worker](../../workers.md#updating-the-user-directory) that is used to
update the user directory. If not provided this defaults to the main process.
Example configuration:
```yaml
update_user_directory_from_worker: worker1
```
_Added in Synapse 1.59.0._
---
### `notify_appservices_from_worker`
The [worker](../../workers.md#notifying-application-services) that is used to
send output traffic to Application Services. If not provided this defaults
to the main process.
Example configuration:
```yaml
notify_appservices_from_worker: worker1
```
_Added in Synapse 1.59.0._
---
### `media_instance_running_background_jobs`
The [worker](../../workers.md#synapseappmedia_repository) that is used to run
background tasks for the media repository. If running multiple media repositories
you must configure a single instance to run the background tasks. If not provided
this defaults to the main process or your single `media_repository` worker.
Example configuration:
```yaml
media_instance_running_background_jobs: worker1
```
_Added in Synapse 1.16.0._
---
### `redis`
@@ -3957,7 +4033,7 @@ worker_listeners:
### `worker_daemonize`
Specifies whether the worker should be started as a daemon process.
If Synapse is being managed by [systemd](../../systemd-with-workers/README.md), this option
If Synapse is being managed by [systemd](../../systemd-with-workers/), this option
must be omitted or set to `false`.
Defaults to `false`.
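An example configuration, following the convention of the other worker options (the value shown is the default):
```yaml
worker_daemonize: false
```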

View File

@@ -157,7 +157,7 @@ Finally, you need to start your worker processes. This can be done with either
`synctl` or your distribution's preferred service manager such as `systemd`. We
recommend the use of `systemd` where available: for information on setting up
`systemd` to start synapse workers, see
[Systemd with Workers](systemd-with-workers). To use `synctl`, see
[Systemd with Workers](systemd-with-workers/). To use `synctl`, see
[Using synctl with Workers](synctl_workers.md).
@@ -386,7 +386,7 @@ so. It will then pass those events over HTTP replication to any configured event
persisters (or the main process if none are configured).
Note that `event_creator`s and `event_persister`s are implemented using the same
[`synapse.app.generic_worker`](#synapse.app.generic_worker).
[`synapse.app.generic_worker`](#synapseappgeneric_worker).
An example [`stream_writers`](usage/configuration/config_documentation.md#stream_writers)
configuration with multiple writers:
@@ -465,7 +465,8 @@ An example for a dedicated background worker instance:
You can designate one generic worker to update the user directory.
Specify its name in the shared configuration as follows:
Specify its name in the [shared configuration](usage/configuration/config_documentation.md#update_user_directory_from_worker)
as follows:
```yaml
update_user_directory_from_worker: worker_name
@@ -490,7 +491,8 @@ worker application type.
You can designate one generic worker to send output traffic to Application Services.
Doesn't handle any REST endpoints itself, but you should specify its name in the
shared configuration as follows:
[shared configuration](usage/configuration/config_documentation.md#notify_appservices_from_worker)
as follows:
```yaml
notify_appservices_from_worker: worker_name
@@ -502,11 +504,38 @@ after setting this option in the shared configuration!
This style of configuration supersedes the legacy `synapse.app.appservice`
worker application type.
#### Push Notifications
You can designate a generic worker to send push notifications to
a [push gateway](https://spec.matrix.org/v1.5/push-gateway-api/) such as
[sygnal](https://github.com/matrix-org/sygnal), and to send email notifications.
This will stop the main process from sending push notifications.
The workers responsible for sending push notifications can be defined using the
[`pusher_instances`](usage/configuration/config_documentation.md#pusher_instances)
option. For example:
```yaml
pusher_instances:
- pusher_worker1
- pusher_worker2
```
Multiple workers can be added to this map, in which case the work is balanced
across them. Ensure the main process and all pusher workers are restarted after changing
this option.
These workers don't need to accept incoming HTTP requests to send push notifications,
so no additional reverse proxy configuration is required for pusher workers.
This style of configuration supersedes the legacy `synapse.app.pusher`
worker application type.
### `synapse.app.pusher`
It is likely this option will be deprecated in the future and is not recommended for new
installations. Instead, [use `synapse.app.generic_worker` with the `pusher_instances`](usage/configuration/config_documentation.md#pusher_instances).
installations. Instead, [use `synapse.app.generic_worker` with the `pusher_instances`](#push-notifications).
Handles sending push notifications to sygnal and email. Doesn't handle any
REST endpoints itself, but you should set
@@ -547,7 +576,7 @@ Note this worker cannot be load-balanced: only one instance should be active.
### `synapse.app.federation_sender`
It is likely this option will be deprecated in the future and not recommended for
new installations. Instead, [use `synapse.app.generic_worker` with the `federation_sender_instances`](usage/configuration/config_documentation.md#federation_sender_instances).
new installations. Instead, [use `synapse.app.generic_worker` with the `federation_sender_instances`](usage/configuration/config_documentation.md#federation_sender_instances).
Handles sending federation traffic to other servers. Doesn't handle any
REST endpoints itself, but you should set
@@ -606,7 +635,9 @@ expose the `media` resource. For example:
```
Note that if running multiple media repositories they must be on the same server
and you must configure a single instance to run the background tasks, e.g.:
and you must specify a single instance to run the background tasks in the
[shared configuration](usage/configuration/config_documentation.md#media_instance_running_background_jobs),
e.g.:
```yaml
media_instance_running_background_jobs: "media-repository-1"

View File

@@ -36,7 +36,6 @@ exclude = (?x)
|tests/api/test_ratelimiting.py
|tests/app/test_openid_listener.py
|tests/appservice/test_scheduler.py
|tests/crypto/test_keyring.py
|tests/events/test_presence_router.py
|tests/events/test_utils.py
|tests/federation/test_federation_catch_up.py
@@ -49,9 +48,6 @@ exclude = (?x)
|tests/logging/__init__.py
|tests/logging/test_terse_json.py
|tests/module_api/test_api.py
|tests/push/test_email.py
|tests/push/test_presentable_names.py
|tests/push/test_push_rule_evaluator.py
|tests/rest/client/test_transactions.py
|tests/rest/media/v1/test_media_storage.py
|tests/server.py
@@ -90,16 +86,19 @@ disallow_untyped_defs = False
[mypy-tests.config.*]
disallow_untyped_defs = True
[mypy-tests.crypto.*]
disallow_untyped_defs = True
[mypy-tests.federation.transport.test_client]
disallow_untyped_defs = True
[mypy-tests.handlers.*]
disallow_untyped_defs = True
[mypy-tests.metrics.test_background_process_metrics]
[mypy-tests.metrics.*]
disallow_untyped_defs = True
[mypy-tests.push.test_bulk_push_rule_evaluator]
[mypy-tests.push.*]
disallow_untyped_defs = True
[mypy-tests.rest.*]

872  poetry.lock (generated): diff suppressed because it is too large.

View File

@@ -40,6 +40,46 @@ target-version = ['py37', 'py38', 'py39', 'py310']
# https://black.readthedocs.io/en/stable/usage_and_configuration/file_collection_and_discovery.html#gitignore
# Use `extend-exclude` if you want to exclude something in addition to this.
[tool.ruff]
line-length = 88
# See https://github.com/charliermarsh/ruff/#pycodestyle
# for error codes. The ones we ignore are:
# E731: do not assign a lambda expression, use a def
# E501: Line too long (black enforces this for us)
#
# See https://github.com/charliermarsh/ruff/#pyflakes
# F401: unused import
# F811: Redefinition of unused
# F821: Undefined name
#
# flake8-bugbear compatible checks. Its error codes are described at
# https://github.com/charliermarsh/ruff/#flake8-bugbear
# B019: Use of functools.lru_cache or functools.cache on methods can lead to memory leaks
# B023: Functions defined inside a loop must not use variables redefined in the loop
# B024: Abstract base class with no abstract method.
ignore = [
"B019",
"B023",
"B024",
"E501",
"E731",
"F401",
"F811",
"F821",
]
select = [
# pycodestyle checks.
"E",
"W",
# pyflakes checks.
"F",
# flake8-bugbear checks.
"B0",
# flake8-comprehensions checks.
"C4",
]
[tool.isort]
line_length = 88
sections = ["FUTURE", "STDLIB", "THIRDPARTY", "TWISTED", "FIRSTPARTY", "TESTS", "LOCALFOLDER"]
@@ -57,7 +97,7 @@ manifest-path = "rust/Cargo.toml"
[tool.poetry]
name = "matrix-synapse"
version = "1.74.0rc1"
version = "1.75.0rc1"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "Apache-2.0"
@@ -136,7 +176,7 @@ Twisted = {extras = ["tls"], version = ">=18.9.0"}
treq = ">=15.1"
# Twisted has required pyopenssl 16.0 since about Twisted 16.6.
pyOpenSSL = ">=16.0.0"
PyYAML = ">=3.11"
PyYAML = ">=3.13"
pyasn1 = ">=0.1.9"
pyasn1-modules = ">=0.0.7"
bcrypt = ">=3.1.7"
@@ -274,12 +314,10 @@ all = [
]
[tool.poetry.dev-dependencies]
## We pin black so that our tests don't start failing on new releases.
# We pin black so that our tests don't start failing on new releases.
isort = ">=5.10.1"
black = ">=22.3.0"
flake8-comprehensions = "*"
flake8-bugbear = ">=21.3.2"
flake8 = "*"
ruff = "0.0.215"
# Typechecking
mypy = "*"

View File

@@ -1,9 +1,8 @@
#!/usr/bin/env bash
#
# Runs linting scripts over the local Synapse checkout
# isort - sorts import statements
# black - opinionated code formatter
# flake8 - lints and finds mistakes
# ruff - lints and finds mistakes
set -e
@@ -105,6 +104,7 @@ set -x
isort "${files[@]}"
python3 -m black "${files[@]}"
./scripts-dev/config-lint.sh
flake8 "${files[@]}"
# --quiet suppresses the update check.
ruff --quiet "${files[@]}"
./scripts-dev/check_pydantic_models.py lint
mypy

View File

@@ -14,6 +14,8 @@
# Stub for frozendict.
from __future__ import annotations
from typing import Any, Hashable, Iterable, Iterator, Mapping, Tuple, TypeVar, overload
_KT = TypeVar("_KT", bound=Hashable) # Key type.

View File

@@ -14,6 +14,8 @@
# Stub for PyICU.
from __future__ import annotations
class Locale:
@staticmethod
def getDefault() -> Locale: ...

View File

@@ -2,6 +2,8 @@
# https://github.com/grantjenks/python-sortedcontainers/blob/eea42df1f7bad2792e8da77335ff888f04b9e5ae/sortedcontainers/sorteddict.pyi
# (from https://github.com/grantjenks/python-sortedcontainers/pull/107)
from __future__ import annotations
from typing import (
Any,
Callable,

View File

@@ -2,6 +2,8 @@
# https://github.com/grantjenks/python-sortedcontainers/blob/a419ffbd2b1c935b09f11f0971696e537fd0c510/sortedcontainers/sortedlist.pyi
# (from https://github.com/grantjenks/python-sortedcontainers/pull/107)
from __future__ import annotations
from typing import (
Any,
Callable,

View File

@@ -2,6 +2,8 @@
# https://github.com/grantjenks/python-sortedcontainers/blob/d0a225d7fd0fb4c54532b8798af3cbeebf97e2d5/sortedcontainers/sortedset.pyi
# (from https://github.com/grantjenks/python-sortedcontainers/pull/107)
from __future__ import annotations
from typing import (
AbstractSet,
Any,

View File

@@ -1,4 +1,4 @@
from typing import Any, Collection, Dict, Mapping, Optional, Sequence, Tuple, Union
from typing import Any, Collection, Dict, Mapping, Optional, Sequence, Set, Tuple, Union
from synapse.types import JsonDict
@@ -54,3 +54,6 @@ class PushRuleEvaluator:
user_id: Optional[str],
display_name: Optional[str],
) -> Collection[Union[Mapping, str]]: ...
def matches(
self, condition: JsonDict, user_id: Optional[str], display_name: Optional[str]
) -> bool: ...

View File

@@ -1307,7 +1307,7 @@ def main() -> None:
sqlite_config = {
"name": "sqlite3",
"args": {
"database": args.sqlite_database,
"database": "file:{}?mode=rw".format(args.sqlite_database),
"cp_min": 1,
"cp_max": 1,
"check_same_thread": False,

View File

@@ -283,6 +283,9 @@ class FilterCollection:
await self._room_filter.filter(events)
)
def blocks_all_rooms(self) -> bool:
return self._room_filter.filters_all_rooms()
def blocks_all_presence(self) -> bool:
return (
self._presence_filter.filters_all_types()
@@ -351,13 +354,13 @@ class Filter:
self.not_rel_types = filter_json.get("org.matrix.msc3874.not_rel_types", [])
def filters_all_types(self) -> bool:
return "*" in self.not_types
return self.types == [] or "*" in self.not_types
def filters_all_senders(self) -> bool:
return "*" in self.not_senders
return self.senders == [] or "*" in self.not_senders
def filters_all_rooms(self) -> bool:
return "*" in self.not_rooms
return self.rooms == [] or "*" in self.not_rooms
def _check(self, event: FilterEvent) -> bool:
"""Checks whether the filter matches the given event.
@@ -450,8 +453,8 @@ class Filter:
if any(map(match_func, disallowed_values)):
return False
# Other the event does not match at least one of the allowed values,
# reject it.
# Otherwise if the event does not match at least one of the allowed
# values, reject it.
allowed_values = getattr(self, name)
if allowed_values is not None:
if not any(map(match_func, allowed_values)):
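For context, with the change above a client filter that supplies an empty allow-list is now treated as blocking the whole section, so `/sync` can skip the corresponding work entirely. A hypothetical example of such a filter, written as the decoded JSON dict:
```python
# An empty "rooms" allow-list now means "no rooms can ever match", so
# blocks_all_rooms() returns True and the room section of /sync is skipped
# outright instead of filtering every event down to nothing.
filter_json = {"room": {"rooms": []}}
```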

View File

@@ -199,6 +199,9 @@ class GenericWorkerServer(HomeServer):
"A 'media' listener is configured but the media"
" repository is disabled. Ignoring."
)
elif name == "health":
# Skip loading, health resource is always included
continue
if name == "openid" and "federation" not in res.names:
# Only load the openid resource separately if federation resource

View File

@@ -96,6 +96,9 @@ class SynapseHomeServer(HomeServer):
# Skip loading openid resource if federation is defined
# since federation resource will include openid
continue
if name == "health":
# Skip loading, health resource is always included
continue
resources.update(self._configure_named_resource(name, res.compress))
additional_resources = listener_config.http_options.additional_resources

View File

@@ -1,3 +1,5 @@
from __future__ import annotations
import argparse
from typing import (
Any,

View File

@@ -139,3 +139,6 @@ class ExperimentalConfig(Config):
# MSC3391: Removing account data.
self.msc3391_enabled = experimental.get("msc3391_enabled", False)
# MSC3925: do not replace events with their edits
self.msc3925_inhibit_edit = experimental.get("msc3925_inhibit_edit", False)
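For reference, a hedged sketch of how this experimental flag could be enabled in the homeserver configuration, assuming the usual `experimental_features` block:
```yaml
experimental_features:
  # MSC3925: do not replace events with the content of their edits when
  # serving them to clients. Disabled by default.
  msc3925_inhibit_edit: true
```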

View File

@@ -117,6 +117,7 @@ OIDC_PROVIDER_CONFIG_SCHEMA = {
# to avoid importing authlib here.
"enum": ["client_secret_basic", "client_secret_post", "none"],
},
"pkce_method": {"type": "string", "enum": ["auto", "always", "never"]},
"scopes": {"type": "array", "items": {"type": "string"}},
"authorization_endpoint": {"type": "string"},
"token_endpoint": {"type": "string"},
@@ -289,6 +290,7 @@ def _parse_oidc_config_dict(
client_secret=oidc_config.get("client_secret"),
client_secret_jwt_key=client_secret_jwt_key,
client_auth_method=oidc_config.get("client_auth_method", "client_secret_basic"),
pkce_method=oidc_config.get("pkce_method", "auto"),
scopes=oidc_config.get("scopes", ["openid"]),
authorization_endpoint=oidc_config.get("authorization_endpoint"),
token_endpoint=oidc_config.get("token_endpoint"),
@@ -357,6 +359,10 @@ class OidcProviderConfig:
# 'none'.
client_auth_method: str
# Whether to enable PKCE when exchanging the authorization & token.
# Valid values are 'auto', 'always', and 'never'.
pkce_method: str
# list of scopes to request
scopes: Collection[str]

View File

@@ -403,6 +403,14 @@ class EventClientSerializer:
clients.
"""
def __init__(self, inhibit_replacement_via_edits: bool = False):
"""
Args:
inhibit_replacement_via_edits: If this is set to True, then events are
never replaced by their edits.
"""
self._inhibit_replacement_via_edits = inhibit_replacement_via_edits
def serialize_event(
self,
event: Union[JsonDict, EventBase],
@@ -422,6 +430,8 @@ class EventClientSerializer:
into the event.
apply_edits: Whether the content of the event should be modified to reflect
any replacement in `bundle_aggregations[<event_id>].replace`.
See also the `inhibit_replacement_via_edits` constructor arg: if that is
set to True, then this argument is ignored.
Returns:
The serialized event
"""
@@ -495,7 +505,8 @@ class EventClientSerializer:
again for additional events in a recursive manner.
serialized_event: The serialized event which may be modified.
apply_edits: Whether the content of the event should be modified to reflect
any replacement in `aggregations.replace`.
any replacement in `aggregations.replace` (subject to the
`inhibit_replacement_via_edits` constructor arg).
"""
# We have already checked that aggregations exist for this event.
@@ -518,15 +529,21 @@ class EventClientSerializer:
if event_aggregations.replace:
# If there is an edit, optionally apply it to the event.
edit = event_aggregations.replace
if apply_edits:
if apply_edits and not self._inhibit_replacement_via_edits:
self._apply_edit(event, serialized_event, edit)
# Include information about it in the relations dict.
serialized_aggregations[RelationTypes.REPLACE] = {
"event_id": edit.event_id,
"origin_server_ts": edit.origin_server_ts,
"sender": edit.sender,
}
#
# Matrix spec v1.5 (https://spec.matrix.org/v1.5/client-server-api/#server-side-aggregation-of-mreplace-relationships)
# said that we should only include the `event_id`, `origin_server_ts` and
# `sender` of the edit; however MSC3925 proposes extending it to the whole
# of the edit, which is what we do here.
serialized_aggregations[RelationTypes.REPLACE] = self.serialize_event(
edit,
time_now,
config=config,
apply_edits=False,
)
# Include any threaded replies to this event.
if event_aggregations.thread:

View File

@@ -14,6 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import logging
from http import HTTPStatus
from typing import (
TYPE_CHECKING,
Any,
@@ -33,6 +34,7 @@ from synapse.api.errors import (
Codes,
FederationDeniedError,
HttpResponseException,
InvalidAPICallError,
RequestSendFailed,
SynapseError,
)
@@ -45,6 +47,7 @@ from synapse.types import (
JsonDict,
StreamKeyType,
StreamToken,
UserID,
get_domain_from_id,
get_verify_key_from_cross_signing_key,
)
@@ -893,12 +896,47 @@ class DeviceListWorkerUpdater:
def __init__(self, hs: "HomeServer"):
from synapse.replication.http.devices import (
ReplicationMultiUserDevicesResyncRestServlet,
ReplicationUserDevicesResyncRestServlet,
)
self._user_device_resync_client = (
ReplicationUserDevicesResyncRestServlet.make_client(hs)
)
self._multi_user_device_resync_client = (
ReplicationMultiUserDevicesResyncRestServlet.make_client(hs)
)
async def multi_user_device_resync(
self, user_ids: List[str], mark_failed_as_stale: bool = True
) -> Dict[str, Optional[JsonDict]]:
"""
Like `user_device_resync` but operates on multiple users **from the same origin**
at once.
Returns:
Dict from User ID to the same Dict as `user_device_resync`.
"""
# mark_failed_as_stale is not sent. Ensure this doesn't break expectations.
assert mark_failed_as_stale
if not user_ids:
# Shortcut empty requests
return {}
try:
return await self._multi_user_device_resync_client(user_ids=user_ids)
except SynapseError as err:
if not (
err.code == HTTPStatus.NOT_FOUND and err.errcode == Codes.UNRECOGNIZED
):
raise
# Fall back to single requests
result: Dict[str, Optional[JsonDict]] = {}
for user_id in user_ids:
result[user_id] = await self._user_device_resync_client(user_id=user_id)
return result
async def user_device_resync(
self, user_id: str, mark_failed_as_stale: bool = True
@@ -913,8 +951,10 @@ class DeviceListWorkerUpdater:
A dict with device info as under the "devices" in the result of this
request:
https://matrix.org/docs/spec/server_server/r0.1.2#get-matrix-federation-v1-user-devices-userid
None when we weren't able to fetch the device info for some reason,
e.g. due to a connection problem.
"""
return await self._user_device_resync_client(user_id=user_id)
return (await self.multi_user_device_resync([user_id]))[user_id]
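For illustration, a hedged sketch of how a caller might use the batched resync; the method signature comes from the diff above, while the handler reference and user IDs are hypothetical:
```python
# Both users must belong to the same remote server: mixing origins raises
# InvalidAPICallError in the DeviceListUpdater implementation below.
async def resync_remote_users(device_list_updater) -> None:
    results = await device_list_updater.multi_user_device_resync(
        ["@alice:remote.example", "@bob:remote.example"]
    )
    for user_id, device_info in results.items():
        if device_info is None:
            # The resync for this user failed; their device cache will have
            # been marked as stale so it can be retried later.
            pass
```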
class DeviceListUpdater(DeviceListWorkerUpdater):
@@ -1160,19 +1200,66 @@ class DeviceListUpdater(DeviceListWorkerUpdater):
# Allow future calls to retry resyncing out-of-sync device lists.
self._resync_retry_in_progress = False
async def multi_user_device_resync(
self, user_ids: List[str], mark_failed_as_stale: bool = True
) -> Dict[str, Optional[JsonDict]]:
"""
Like `user_device_resync` but operates on multiple users **from the same origin**
at once.
Returns:
Dict from User ID to the same Dict as `user_device_resync`.
"""
if not user_ids:
return {}
origins = {UserID.from_string(user_id).domain for user_id in user_ids}
if len(origins) != 1:
raise InvalidAPICallError(f"Only one origin permitted, got {origins!r}")
result = {}
failed = set()
# TODO(Perf): Actually batch these up
for user_id in user_ids:
user_result, user_failed = await self._user_device_resync_returning_failed(
user_id
)
result[user_id] = user_result
if user_failed:
failed.add(user_id)
if mark_failed_as_stale:
await self.store.mark_remote_users_device_caches_as_stale(failed)
return result
async def user_device_resync(
self, user_id: str, mark_failed_as_stale: bool = True
) -> Optional[JsonDict]:
result, failed = await self._user_device_resync_returning_failed(user_id)
if failed and mark_failed_as_stale:
# Mark the remote user's device list as stale so we know we need to retry
# it later.
await self.store.mark_remote_users_device_caches_as_stale((user_id,))
return result
async def _user_device_resync_returning_failed(
self, user_id: str
) -> Tuple[Optional[JsonDict], bool]:
"""Fetches all devices for a user and updates the device cache with them.
Args:
user_id: The user's id whose device_list will be updated.
mark_failed_as_stale: Whether to mark the user's device list as stale
if the attempt to resync failed.
Returns:
A dict with device info as under the "devices" in the result of this
request:
https://matrix.org/docs/spec/server_server/r0.1.2#get-matrix-federation-v1-user-devices-userid
- A dict with device info as under the "devices" in the result of this
request:
https://matrix.org/docs/spec/server_server/r0.1.2#get-matrix-federation-v1-user-devices-userid
None when we weren't able to fetch the device info for some reason,
e.g. due to a connection problem.
- True iff the resync failed and the device list should be marked as stale.
"""
logger.debug("Attempting to resync the device list for %s", user_id)
log_kv({"message": "Doing resync to update device list."})
@@ -1181,12 +1268,7 @@ class DeviceListUpdater(DeviceListWorkerUpdater):
try:
result = await self.federation.query_user_devices(origin, user_id)
except NotRetryingDestination:
if mark_failed_as_stale:
# Mark the remote user's device list as stale so we know we need to retry
# it later.
await self.store.mark_remote_user_device_cache_as_stale(user_id)
return None
return None, True
except (RequestSendFailed, HttpResponseException) as e:
logger.warning(
"Failed to handle device list update for %s: %s",
@@ -1194,23 +1276,18 @@ class DeviceListUpdater(DeviceListWorkerUpdater):
e,
)
if mark_failed_as_stale:
# Mark the remote user's device list as stale so we know we need to retry
# it later.
await self.store.mark_remote_user_device_cache_as_stale(user_id)
# We abort on exceptions rather than accepting the update
# as otherwise synapse will 'forget' that its device list
# is out of date. If we bail then we will retry the resync
# next time we get a device list update for this user_id.
# This makes it more likely that the device lists will
# eventually become consistent.
return None
return None, True
except FederationDeniedError as e:
set_tag("error", True)
log_kv({"reason": "FederationDeniedError"})
logger.info(e)
return None
return None, False
except Exception as e:
set_tag("error", True)
log_kv(
@@ -1218,12 +1295,7 @@ class DeviceListUpdater(DeviceListWorkerUpdater):
)
logger.exception("Failed to handle device list update for %s", user_id)
if mark_failed_as_stale:
# Mark the remote user's device list as stale so we know we need to retry
# it later.
await self.store.mark_remote_user_device_cache_as_stale(user_id)
return None
return None, True
log_kv({"result": result})
stream_id = result["stream_id"]
devices = result["devices"]
@@ -1305,7 +1377,7 @@ class DeviceListUpdater(DeviceListWorkerUpdater):
# point.
self._seen_updates[user_id] = {stream_id}
return result
return result, False
async def process_cross_signing_key_update(
self,

View File

@@ -195,7 +195,7 @@ class DeviceMessageHandler:
sender_user_id,
unknown_devices,
)
await self.store.mark_remote_user_device_cache_as_stale(sender_user_id)
await self.store.mark_remote_users_device_caches_as_stale((sender_user_id,))
# Immediately attempt a resync in the background
run_in_background(self._user_device_resync, user_id=sender_user_id)

View File

@@ -36,8 +36,8 @@ from synapse.types import (
get_domain_from_id,
get_verify_key_from_cross_signing_key,
)
from synapse.util import json_decoder, unwrapFirstError
from synapse.util.async_helpers import Linearizer, delay_cancellation
from synapse.util import json_decoder
from synapse.util.async_helpers import Linearizer, concurrently_execute
from synapse.util.cancellation import cancellable
from synapse.util.retryutils import NotRetryingDestination
@@ -238,24 +238,28 @@ class E2eKeysHandler:
# Now fetch any devices that we don't have in our cache
# TODO It might make sense to propagate cancellations into the
# deferreds which are querying remote homeservers.
await make_deferred_yieldable(
delay_cancellation(
defer.gatherResults(
[
run_in_background(
self._query_devices_for_destination,
results,
cross_signing_keys,
failures,
destination,
queries,
timeout,
)
for destination, queries in remote_queries_not_in_cache.items()
],
consumeErrors=True,
).addErrback(unwrapFirstError)
logger.debug(
"%d destinations to query devices for", len(remote_queries_not_in_cache)
)
async def _query(
destination_queries: Tuple[str, Dict[str, Iterable[str]]]
) -> None:
destination, queries = destination_queries
return await self._query_devices_for_destination(
results,
cross_signing_keys,
failures,
destination,
queries,
timeout,
)
await concurrently_execute(
_query,
remote_queries_not_in_cache.items(),
10,
delay_cancellation=True,
)
ret = {"device_keys": results, "failures": failures}
@@ -300,28 +304,41 @@ class E2eKeysHandler:
# queries. We use the more efficient batched query_client_keys for all
# remaining users
user_ids_updated = []
for (user_id, device_list) in destination_query.items():
if user_id in user_ids_updated:
continue
if device_list:
continue
# Perform a user device resync for each user only once and only as long as:
# - they have an empty device_list
# - they are in some rooms that this server can see
users_to_resync_devices = {
user_id
for (user_id, device_list) in destination_query.items()
if (not device_list) and (await self.store.get_rooms_for_user(user_id))
}
room_ids = await self.store.get_rooms_for_user(user_id)
if not room_ids:
continue
logger.debug(
"%d users to resync devices for from destination %s",
len(users_to_resync_devices),
destination,
)
# We've decided we're sharing a room with this user and should
# probably be tracking their device lists. However, we haven't
# done an initial sync on the device list so we do it now.
try:
resync_results = (
await self.device_handler.device_list_updater.user_device_resync(
user_id
)
try:
user_resync_results = (
await self.device_handler.device_list_updater.multi_user_device_resync(
list(users_to_resync_devices)
)
)
for user_id in users_to_resync_devices:
resync_results = user_resync_results[user_id]
if resync_results is None:
raise ValueError("Device resync failed")
# TODO: It's weird that we'll store a failure against a
# destination, yet continue processing users from that
# destination.
# We might want to consider changing this, but for now
# I'm leaving it as I found it.
failures[destination] = _exception_to_failure(
ValueError(f"Device resync failed for {user_id!r}")
)
continue
# Add the device keys to the results.
user_devices = resync_results["devices"]
@@ -339,8 +356,8 @@ class E2eKeysHandler:
if self_signing_key:
cross_signing_keys["self_signing_keys"][user_id] = self_signing_key
except Exception as e:
failures[destination] = _exception_to_failure(e)
except Exception as e:
failures[destination] = _exception_to_failure(e)
if len(destination_query) == len(user_ids_updated):
# We've updated all the users in the query and we do not need to

View File

@@ -1423,7 +1423,7 @@ class FederationEventHandler:
"""
try:
await self._store.mark_remote_user_device_cache_as_stale(sender)
await self._store.mark_remote_users_device_caches_as_stale((sender,))
# Immediately attempt a resync in the background
if self._config.worker.worker_app:

View File

@@ -1531,12 +1531,23 @@ class EventCreationHandler:
external federation senders don't have to recalculate it themselves.
"""
for event, _ in events_and_context:
if not self._external_cache.is_enabled():
return
if not self._external_cache.is_enabled():
return
# If external cache is enabled we should always have this.
assert self._external_cache_joined_hosts_updates is not None
# If external cache is enabled we should always have this.
assert self._external_cache_joined_hosts_updates is not None
for event, event_context in events_and_context:
if event_context.partial_state:
# To populate the cache for a partial-state event, we either have to
# block until full state, which the code below does, or change the
# meaning of cache values to be the list of hosts to which we plan to
# send events and calculate that instead.
#
# The federation senders don't use the external cache when sending
# events in partial-state rooms anyway, so let's not bother populating
# the cache.
continue
# We actually store two mappings, event ID -> prev state group,
# state group -> joined hosts, which is much more space efficient

View File

@@ -36,6 +36,7 @@ from authlib.jose import JsonWebToken, JWTClaims
from authlib.jose.errors import InvalidClaimError, JoseError, MissingClaimError
from authlib.oauth2.auth import ClientAuth
from authlib.oauth2.rfc6749.parameters import prepare_grant_uri
from authlib.oauth2.rfc7636.challenge import create_s256_code_challenge
from authlib.oidc.core import CodeIDToken, UserInfo
from authlib.oidc.discovery import OpenIDProviderMetadata, get_well_known_url
from jinja2 import Environment, Template
@@ -475,6 +476,16 @@ class OidcProvider:
)
)
# If PKCE support is advertised ensure the wanted method is available.
if m.get("code_challenge_methods_supported") is not None:
m.validate_code_challenge_methods_supported()
if "S256" not in m["code_challenge_methods_supported"]:
raise ValueError(
'"S256" not in "code_challenge_methods_supported" ({supported!r})'.format(
supported=m["code_challenge_methods_supported"],
)
)
if m.get("response_types_supported") is not None:
m.validate_response_types_supported()
@@ -602,6 +613,11 @@ class OidcProvider:
if self._config.jwks_uri:
metadata["jwks_uri"] = self._config.jwks_uri
if self._config.pkce_method == "always":
metadata["code_challenge_methods_supported"] = ["S256"]
elif self._config.pkce_method == "never":
metadata.pop("code_challenge_methods_supported", None)
self._validate_metadata(metadata)
return metadata
@@ -653,7 +669,7 @@ class OidcProvider:
return jwk_set
async def _exchange_code(self, code: str) -> Token:
async def _exchange_code(self, code: str, code_verifier: str) -> Token:
"""Exchange an authorization code for a token.
This calls the ``token_endpoint`` with the authorization code we
@@ -666,6 +682,7 @@ class OidcProvider:
Args:
code: The authorization code we got from the callback.
code_verifier: The PKCE code verifier to send, blank if unused.
Returns:
A dict containing various tokens.
@@ -696,6 +713,8 @@ class OidcProvider:
"code": code,
"redirect_uri": self._callback_url,
}
if code_verifier:
args["code_verifier"] = code_verifier
body = urlencode(args, True)
# Fill the body/headers with credentials
@@ -914,11 +933,14 @@ class OidcProvider:
- ``scope``: the list of scopes set in ``oidc_config.scopes``
- ``state``: a random string
- ``nonce``: a random string
- ``code_challenge``: a RFC7636 code challenge (if PKCE is supported)
In addition generating a redirect URL, we are setting a cookie with
a signed macaroon token containing the state, the nonce and the
client_redirect_url params. Those are then checked when the client
comes back from the provider.
In addition to generating a redirect URL, we are setting a cookie with
a signed macaroon token containing the state, the nonce, the
client_redirect_url, and (optionally) the code_verifier params. The state,
nonce, and client_redirect_url are then checked when the client comes back
from the provider. The code_verifier is passed back to the server during
the token exchange and compared to the code_challenge sent in this request.
Args:
request: the incoming request from the browser.
@@ -935,10 +957,25 @@ class OidcProvider:
state = generate_token()
nonce = generate_token()
code_verifier = ""
if not client_redirect_url:
client_redirect_url = b""
metadata = await self.load_metadata()
# Automatically enable PKCE if it is supported.
extra_grant_values = {}
if metadata.get("code_challenge_methods_supported"):
code_verifier = generate_token(48)
# Note that we verified the server supports S256 earlier (in
# OidcProvider._validate_metadata).
extra_grant_values = {
"code_challenge_method": "S256",
"code_challenge": create_s256_code_challenge(code_verifier),
}
cookie = self._macaroon_generaton.generate_oidc_session_token(
state=state,
session_data=OidcSessionData(
@@ -946,6 +983,7 @@ class OidcProvider:
nonce=nonce,
client_redirect_url=client_redirect_url.decode(),
ui_auth_session_id=ui_auth_session_id or "",
code_verifier=code_verifier,
),
)
@@ -966,7 +1004,6 @@ class OidcProvider:
)
)
metadata = await self.load_metadata()
authorization_endpoint = metadata.get("authorization_endpoint")
return prepare_grant_uri(
authorization_endpoint,
@@ -976,6 +1013,7 @@ class OidcProvider:
scope=self._scopes,
state=state,
nonce=nonce,
**extra_grant_values,
)
async def handle_oidc_callback(
@@ -1003,7 +1041,9 @@ class OidcProvider:
# Exchange the code with the provider
try:
logger.debug("Exchanging OAuth2 code for a token")
token = await self._exchange_code(code)
token = await self._exchange_code(
code, code_verifier=session_data.code_verifier
)
except OidcError as e:
logger.warning("Could not exchange OAuth2 code: %s", e)
self._sso_handler.render_error(request, e.error, e.error_description)
@@ -1520,8 +1560,8 @@ env.filters.update(
@attr.s(slots=True, frozen=True, auto_attribs=True)
class JinjaOidcMappingConfig:
subject_claim: str
picture_claim: str
subject_template: Template
picture_template: Template
localpart_template: Optional[Template]
display_name_template: Optional[Template]
email_template: Optional[Template]
@@ -1540,8 +1580,23 @@ class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
@staticmethod
def parse_config(config: dict) -> JinjaOidcMappingConfig:
subject_claim = config.get("subject_claim", "sub")
picture_claim = config.get("picture_claim", "picture")
def parse_template_config_with_claim(
option_name: str, default_claim: str
) -> Template:
template_name = f"{option_name}_template"
template = config.get(template_name)
if not template:
# Convert the legacy subject_claim into a template.
claim = config.get(f"{option_name}_claim", default_claim)
template = "{{ user.%s }}" % (claim,)
try:
return env.from_string(template)
except Exception as e:
raise ConfigError("invalid jinja template", path=[template_name]) from e
subject_template = parse_template_config_with_claim("subject", "sub")
picture_template = parse_template_config_with_claim("picture", "picture")
def parse_template_config(option_name: str) -> Optional[Template]:
if option_name not in config:
@@ -1574,8 +1629,8 @@ class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
raise ConfigError("must be a bool", path=["confirm_localpart"])
return JinjaOidcMappingConfig(
subject_claim=subject_claim,
picture_claim=picture_claim,
subject_template=subject_template,
picture_template=picture_template,
localpart_template=localpart_template,
display_name_template=display_name_template,
email_template=email_template,
@@ -1584,7 +1639,7 @@ class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
)
def get_remote_user_id(self, userinfo: UserInfo) -> str:
return userinfo[self._config.subject_claim]
return self._config.subject_template.render(user=userinfo).strip()
async def map_user_attributes(
self, userinfo: UserInfo, token: Token, failures: int
@@ -1615,7 +1670,7 @@ class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
if email:
emails.append(email)
picture = userinfo.get("picture")
picture = self._config.picture_template.render(user=userinfo).strip()
return UserAttributeDict(
localpart=localpart,

View File

@@ -275,7 +275,7 @@ class SearchHandler:
)
room_ids = {r.room_id for r in rooms}
# If doing a subset of all rooms seearch, check if any of the rooms
# If doing a subset of all rooms search, check if any of the rooms
# are from an upgraded room, and search their contents as well
if search_filter.rooms:
historical_room_ids: List[str] = []

View File

@@ -37,6 +37,7 @@ from synapse.api.presence import UserPresenceState
from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
from synapse.events import EventBase
from synapse.handlers.relations import BundledAggregations
from synapse.logging import issue9533_logger
from synapse.logging.context import current_context
from synapse.logging.opentracing import (
SynapseTags,
@@ -1402,11 +1403,14 @@ class SyncHandler:
logger.debug("Fetching room data")
res = await self._generate_sync_entry_for_rooms(
(
newly_joined_rooms,
newly_joined_or_invited_or_knocked_users,
newly_left_rooms,
newly_left_users,
) = await self._generate_sync_entry_for_rooms(
sync_result_builder, account_data_by_room
)
newly_joined_rooms, newly_joined_or_invited_or_knocked_users, _, _ = res
_, _, newly_left_rooms, newly_left_users = res
block_all_presence_data = (
since_token is None and sync_config.filter_collection.blocks_all_presence()
@@ -1623,13 +1627,18 @@ class SyncHandler:
}
)
logger.debug(
"Returning %d to-device messages between %d and %d (current token: %d)",
len(messages),
since_stream_id,
stream_id,
now_token.to_device_key,
)
if messages and issue9533_logger.isEnabledFor(logging.DEBUG):
issue9533_logger.debug(
"Returning to-device messages with stream_ids (%d, %d]; now: %d;"
" msgids: %s",
since_stream_id,
stream_id,
now_token.to_device_key,
[
message["content"].get(EventContentFields.TO_DEVICE_MSGID)
for message in messages
],
)
sync_result_builder.now_token = now_token.copy_and_replace(
StreamKeyType.TO_DEVICE, stream_id
)
@@ -1783,6 +1792,11 @@ class SyncHandler:
- newly_left_rooms
- newly_left_users
"""
# If the request doesn't care about rooms then nothing to do!
if sync_result_builder.sync_config.filter_collection.blocks_all_rooms():
return set(), set(), set(), set()
since_token = sync_result_builder.since_token
# 1. Start by fetching all ephemeral events in rooms we've joined (if required).

View File

@@ -18,6 +18,7 @@ from typing import (
TYPE_CHECKING,
Any,
Callable,
Collection,
Dict,
Generator,
Iterable,
@@ -126,7 +127,7 @@ from synapse.types import (
from synapse.types.state import StateFilter
from synapse.util import Clock
from synapse.util.async_helpers import maybe_awaitable
from synapse.util.caches.descriptors import CachedFunction, cached
from synapse.util.caches.descriptors import CachedFunction, cached as _cached
from synapse.util.frozenutils import freeze
if TYPE_CHECKING:
@@ -136,6 +137,7 @@ if TYPE_CHECKING:
T = TypeVar("T")
P = ParamSpec("P")
F = TypeVar("F", bound=Callable[..., Any])
"""
This package defines the 'stable' API which can be used by extension modules which
@@ -185,6 +187,42 @@ class UserIpAndAgent:
last_seen: int
def cached(
*,
max_entries: int = 1000,
num_args: Optional[int] = None,
uncached_args: Optional[Collection[str]] = None,
) -> Callable[[F], CachedFunction[F]]:
"""Returns a decorator that applies a memoizing cache around the function. This
decorator behaves similarly to functools.lru_cache.
Example:
@cached()
def foo(a, b):
...
Added in Synapse v1.74.0.
Args:
max_entries: The maximum number of entries in the cache. If the cache is full
and a new entry is added, the least recently accessed entry will be evicted
from the cache.
num_args: The number of positional arguments (excluding `self`) to use as cache
keys. Defaults to all named args of the function.
uncached_args: A list of argument names to not use as the cache key. (`self` is
always ignored.) Cannot be used with num_args.
Returns:
A decorator that applies a memoizing cache around the function.
"""
return _cached(
max_entries=max_entries,
num_args=num_args,
uncached_args=uncached_args,
)
class ModuleApi:
"""A proxy object that gets passed to various plugin modules so they
can register new users etc if necessary.
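For illustration, a hedged sketch of how a module might use the newly exposed decorator; it assumes `cached` is importable from `synapse.module_api` (where the wrapper above is defined), and the module class itself is hypothetical:
```python
from synapse.module_api import cached


class ExampleModule:
    @cached(max_entries=100)
    async def lookup_display_name(self, user_id: str) -> str:
        # Some expensive lookup that benefits from memoization; results are
        # cached per user_id, and the least recently accessed entry is
        # evicted once 100 entries are cached.
        return user_id.lstrip("@").split(":")[0]
```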

View File

@@ -26,10 +26,7 @@ def format_push_rules_for_user(
"""Converts a list of rawrules and a enabled map into nested dictionaries
to match the Matrix client-server format for push rules"""
rules: Dict[str, Dict[str, List[Dict[str, Any]]]] = {
"global": {},
"device": {},
}
rules: Dict[str, Dict[str, List[Dict[str, Any]]]] = {"global": {}}
rules["global"] = _add_empty_priority_class_arrays(rules["global"])

View File

@@ -49,7 +49,6 @@ class ReplicationAddUserAccountDataRestServlet(ReplicationEndpoint):
super().__init__(hs)
self.handler = hs.get_account_data_handler()
self.clock = hs.get_clock()
@staticmethod
async def _serialize_payload( # type: ignore[override]
@@ -94,7 +93,6 @@ class ReplicationRemoveUserAccountDataRestServlet(ReplicationEndpoint):
super().__init__(hs)
self.handler = hs.get_account_data_handler()
self.clock = hs.get_clock()
@staticmethod
async def _serialize_payload( # type: ignore[override]
@@ -133,7 +131,6 @@ class ReplicationAddRoomAccountDataRestServlet(ReplicationEndpoint):
super().__init__(hs)
self.handler = hs.get_account_data_handler()
self.clock = hs.get_clock()
@staticmethod
async def _serialize_payload( # type: ignore[override]
@@ -178,7 +175,6 @@ class ReplicationRemoveRoomAccountDataRestServlet(ReplicationEndpoint):
super().__init__(hs)
self.handler = hs.get_account_data_handler()
self.clock = hs.get_clock()
@staticmethod
async def _serialize_payload( # type: ignore[override]
@@ -217,7 +213,6 @@ class ReplicationAddTagRestServlet(ReplicationEndpoint):
super().__init__(hs)
self.handler = hs.get_account_data_handler()
self.clock = hs.get_clock()
@staticmethod
async def _serialize_payload( # type: ignore[override]
@@ -264,7 +259,6 @@ class ReplicationRemoveTagRestServlet(ReplicationEndpoint):
super().__init__(hs)
self.handler = hs.get_account_data_handler()
self.clock = hs.get_clock()
@staticmethod
async def _serialize_payload(user_id: str, room_id: str, tag: str) -> JsonDict: # type: ignore[override]
@@ -285,8 +279,10 @@ class ReplicationRemoveTagRestServlet(ReplicationEndpoint):
def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
ReplicationAddUserAccountDataRestServlet(hs).register(http_server)
ReplicationRemoveUserAccountDataRestServlet(hs).register(http_server)
ReplicationAddRoomAccountDataRestServlet(hs).register(http_server)
ReplicationRemoveRoomAccountDataRestServlet(hs).register(http_server)
ReplicationAddTagRestServlet(hs).register(http_server)
ReplicationRemoveTagRestServlet(hs).register(http_server)
if hs.config.experimental.msc3391_enabled:
ReplicationRemoveUserAccountDataRestServlet(hs).register(http_server)
ReplicationRemoveRoomAccountDataRestServlet(hs).register(http_server)

Some files were not shown because too many files have changed in this diff.