
Compare commits


15 Commits

Author SHA1 Message Date
Andrew Morgan
4c781a1662 Changelog 2020-11-13 16:23:00 +00:00
Andrew Morgan
55163a0c80 Add knock membership events to stats generation 2020-11-13 16:23:00 +00:00
Andrew Morgan
a2ab56a068 Implement locally rescinding a federated knock
As mentioned in the MSC, a user can rescind (take back) a knock while it
is pending by sending a leave event to the room. This will set their
membership to leave instead of knock.

Now, this functionality already worked before this commit for rooms that
the homeserver was already in. What didn't work was:

* Rescinding a knock over federation to a room with active homeservers
* Rescinding a knock over federation to a room with inactive homeservers

This commit addresses the second bullet point, and leaves the first
bullet point as a TODO (as it is an edge case and it is not immediately obvious
how it would be done).

What this commit does is crib off the same functionality as locally
rejecting an invite. That occurs when we are unable to contact the
homeserver that originally sent us an invite. Instead an out-of-band
leave membership event will be generated and sent to clients locally.

The same is happening here. You can mostly ignore the new
generate_local_out_of_band_membership methods; those are just some
structural bits to allow us to call that method from RoomMemberHandler.

The real meat of this commit is moving about and adding some logic in
`update_membership_locked`, specifically for when we're updating a
user's membership to "leave". There was already some code in there to
check whether the room to send the leave to was a room the homeserver is
not currently a part of. In that case, we'd remote reject the knock.
This commit just extends that to also rescind knocks if the user's
membership in the room is currently "knock". We skip the remote attempt
for now and go straight to generating a local leave membership event.
2020-11-13 16:23:00 +00:00
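
To make the new branch concrete, here is a minimal, hypothetical sketch of the
logic described in the commit above; the function and helper names are
illustrative only and are not Synapse's real API.

```python
# Hypothetical sketch of the "leave" branch described in the commit above.
# The helper names and the overall shape are illustrative, not Synapse's API.

def handle_leave_for_unjoined_room(current_membership,
                                   remote_reject_invite,
                                   generate_local_out_of_band_leave):
    """Turn a user's membership into "leave" for a room this homeserver is
    not currently part of."""
    if current_membership == "invite":
        # Existing behaviour: try to reject the invite over federation,
        # falling back to a local out-of-band leave if that fails.
        return remote_reject_invite()

    if current_membership == "knock":
        # New behaviour: skip the remote attempt for now (left as a TODO in
        # the commit) and generate a local out-of-band leave event instead.
        return generate_local_out_of_band_leave()

    raise ValueError("nothing to rescind or reject")
```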
Andrew Morgan
9bdd420f9b Generalise _locally_reject_invite
We'll be using this to rescind knocks locally too, so generalise the method.
2020-11-13 16:23:00 +00:00
Andrew Morgan
f10ee502c7 Add some handy db methods for knocking
This just adds some database methods that we'll need to use when
implementing rescinding of knocks for clients. They are equivalent to
their invite-related counterparts.
2020-11-13 16:23:00 +00:00
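
The commit doesn't spell the new methods out, but as a purely illustrative
sketch, a knock-flavoured counterpart to an invite-related storage helper
might look like the following; the function name, table and column names are
assumptions made for the example, not Synapse's actual schema or API.

```python
# Illustrative only: a hypothetical knock analogue of an invite-related
# storage helper.  The table and column names are assumptions for the sake
# of the example.
def get_rooms_knocked_on_by_local_user(db_conn, user_id):
    """Return the room IDs in which the given local user currently has a
    'knock' membership, by analogy with the invite-related helper."""
    cur = db_conn.cursor()
    cur.execute(
        "SELECT room_id FROM local_current_membership "
        "WHERE user_id = ? AND membership = 'knock'",
        (user_id,),
    )
    return [row[0] for row in cur.fetchall()]
```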
Andrew Morgan
fa7f378bba Auto-add displaynames to knock events if they're missing
Tiny commit to just bring knocking up to feature parity.
2020-11-13 16:23:00 +00:00
Andrew Morgan
3dbe05d022 Extend sync to inform clients about the progress of their knocks
So we've got federation so that homeservers can communicate knocking
information between them - but how does that information actually get
down to the client? The client knows that it knocked successfully from a
200 in its original request, but what else does it need? This commit
adds a new "knock" section to /sync (in addition to "invite", "join",
and "leave") all help give the client the information it needs.

The new "knock" section is used for sending down the stripped state
events we collected earlier. The client will use these to display the
room and its metadata in a little "pending knocks" section or similar.

This is all this commit adds. If the user's knock has been accepted or
rejected, they will receive that information in the "join" or "leave"
sections of /sync.

Most of this code is just cribbing off the invite and join sync code yet
again, with some minor differences. For instance, we don't need to
exclude knock events from sync if the sender is in your ignore list, as
you are the only one who can send knocks for yourself.

The structure of the "knock" dict in sync is modeled after "invite", as
clients also receive stripped state in that. The structure can be viewed
in the linked MSC.
2020-11-13 16:23:00 +00:00
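
For a rough idea of the shape, a /sync response containing the new section
might look something like the dict below. The field names follow the linked
MSC, but treat this as an approximation rather than the exact wire format.

```python
# Approximate shape of the new "knock" section in a /sync response, next to
# the existing "join"/"invite"/"leave" sections.  Field names follow the
# linked MSC; this is an illustration, not the exact wire format.
sync_response = {
    "rooms": {
        "knock": {
            "!abcdef:example.org": {
                # Stripped state events collected during the knock, enough
                # for the client to render the room in a "pending knocks"
                # list (name, avatar, the knock membership event, etc.).
                "knock_state": {
                    "events": [
                        {
                            "type": "m.room.name",
                            "state_key": "",
                            "sender": "@someone:example.org",
                            "content": {"name": "Some Room"},
                        },
                        {
                            "type": "m.room.member",
                            "state_key": "@knocker:other.example.org",
                            "sender": "@knocker:other.example.org",
                            "content": {"membership": "knock"},
                        },
                    ]
                }
            }
        },
        "join": {},
        "invite": {},
        "leave": {},
    }
}
```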
Andrew Morgan
09e1942b04 Send stripped state events back to the knocking homeserver
Here we finally send the stripped state events back to the knocking
homeserver, which then ingests and stores those events.

A future commit will actually start sending those events down /sync to
the relevant user.
2020-11-13 16:23:00 +00:00
Andrew Morgan
a4dafd407d Federation: make_knock and send_knock implementations
Most of this is explained in the linked MSC (and don't miss the sequence
diagram in the MSC comments), but roughly speaking, knocking takes
inspiration from room joins and room invites. This commit covers the
join-style part of that flow.

First the knocking homeserver POSTs to the make_knock endpoint on
another homeserver. The other homeserver will send back a knock event
that is valid for the knocking user and the room that they are knocking
on. The knocking homeserver will sign the event and send it back, before
the other homeserver takes that event and then sends it into the room on
the knocking homeserver's behalf.

It's worth noting that accepting/rejecting knocks happens over the
existing room invite/leave flows. A homeserver rescinding its knock is
likewise just sending a leave.

Once the event has been inserted into the room, the homeserver that's in
the room will send a 200 and an empty JSON dict back to the knocker to
confirm everything went well. In a future commit, this dict will
instead be filled with some stripped state events from the room which
the knocking homeserver will pass back to the knocking user.

And yes, the logging statements in this commit are intentional. They're
consistent with the rest of the file :)
2020-11-13 16:22:56 +00:00
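
As a hedged sketch of the handshake from the knocking homeserver's side, the
flow described above boils down to something like this; every helper passed
in here is a placeholder for illustration, not a real Synapse function.

```python
# Hypothetical sketch of the knock handshake described in the commit above,
# seen from the knocking homeserver.  request_knock_template, sign_event and
# submit_signed_knock are placeholders, not real Synapse APIs.

def knock_on_remote_room(destination, room_id, user_id,
                         request_knock_template, sign_event,
                         submit_signed_knock):
    # 1. Ask a homeserver already in the room (via make_knock) for a knock
    #    event template that is valid for this user and this room.
    knock_event = request_knock_template(destination, room_id, user_id)

    # 2. Sign the event ourselves, then hand it back (via send_knock) so the
    #    resident homeserver can send it into the room on our behalf.
    signed_event = sign_event(knock_event)
    response = submit_signed_knock(destination, room_id, signed_event)

    # 3. At this point the response is an empty dict; a later commit fills it
    #    with stripped state events to pass back to the knocking user.
    return response
```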
Andrew Morgan
fa1de1d858 Rename maybe_store_room_on_{invite,outlier_membership}
There's a handy function called maybe_store_room_on_invite which allows us to
create an entry in the rooms table for a room (and its version) that we
aren't joined to yet, but which we need to reference when ingesting events
about it.

This is currently used for invites where we receive some stripped state about
the room and pass it down via /sync to the client, without us being in the
room yet.

There is a similar requirement for knocking, where we will eventually do the
same thing and need an entry in the rooms table as well. Thus, reusing this
function works; however, its name needs to be generalised a bit.

So that is what this commit does.
2020-11-11 17:19:13 +00:00
Andrew Morgan
63c767ac10 Add room_knock_state_types config option
This option serves the same purpose as the existing
room_invite_state_types option, which defines what state events are sent
over to a user that is invited to a room. This information is necessary
for the user - who isn't in the room yet - to get some metadata about
the room in order to display it in a pretty fashion in the user's
pending-invites list.

It includes information such as the room's name, avatar, topic,
canonical alias, room encryption state etc. as well as the invite
membership event which the invited user's homeserver can reference.

This new option works the same way, except that the state is sent by a
homeserver in the room to the knocker during the knock process. This
option will actually be utilised in a later commit.
2020-11-11 16:46:17 +00:00
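
Assuming it mirrors room_invite_state_types, the option boils down to a list
of state event types. The list below only illustrates the kind of events
described above (name, avatar, topic, canonical alias, encryption, join
rules); the actual default is whatever the commit defines.

```python
# Illustrative only: the kind of state event types such an option carries,
# based on the description above.  The real default list is defined by the
# commit, not guessed here.
room_knock_state_types = [
    "m.room.name",
    "m.room.avatar",
    "m.room.topic",
    "m.room.canonical_alias",
    "m.room.encryption",
    "m.room.join_rules",
]
```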
Andrew Morgan
c4ff5ddd00 Add CS /_matrix/client/r0/knock/{roomIdOrAlias} endpoint
We're ditching the usual idea of having two endpoints for each
membership-related action, as per the MSC. Thus knocking only gets the
more powerful variant (the one that supports room aliases as well as
IDs). The reason field is also optional.

The other small change is just to ensure displaynames get added to the
content of this particular membership event.
2020-11-11 16:40:41 +00:00
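
A hypothetical client-side call to the new endpoint might look like the
following; the path is taken from the commit above, while the request body
(just an optional reason) and the response handling are assumptions made for
illustration.

```python
# Illustrative client-side call to the new knock endpoint.  The path comes
# from the commit above; the optional "reason" body field and the response
# handling are assumptions for the sake of the example.
import json
import urllib.parse
import urllib.request


def knock_on_room(homeserver_url, access_token, room_id_or_alias, reason=None):
    path = "/_matrix/client/r0/knock/" + urllib.parse.quote(room_id_or_alias)
    body = {"reason": reason} if reason else {}
    request = urllib.request.Request(
        homeserver_url + path,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": "Bearer " + access_token,
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        # Return whatever JSON the server sends back (e.g. the resolved
        # room ID on success).
        return json.loads(response.read())


# Example (hypothetical homeserver and token):
# knock_on_room("https://matrix.example.org", "MDAx...token",
#               "#some-room:example.org", reason="Let me in please!")
```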
Andrew Morgan
6bd57f4bf3 Update the event auth rules for knocking
Hopefully most of these changes are explained through the added comments and error
messages. The changes are also described conceptually in the MSC:
https://github.com/Sorunome/matrix-doc/blob/soru/knock/proposals/2403-knock.md#join-rules
2020-11-11 16:40:38 +00:00
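
Conceptually, the join-rules change in the MSC amounts to something like the
check below: a knock is only allowed when the room's join rule is "knock",
the knocker is acting for themselves, and they are not already joined,
invited or banned. This is a summary sketch, not the exact rule ordering in
Synapse's event auth code.

```python
# Hedged summary of the conceptual auth check for a knock membership event,
# per the join-rules section of the linked MSC.  Not the exact rule ordering
# used by Synapse's event_auth module.
def can_knock(join_rule, sender, state_key, sender_current_membership):
    if join_rule != "knock":
        return False  # the room must use the "knock" join rule
    if sender != state_key:
        return False  # you can only knock on your own behalf
    if sender_current_membership in ("ban", "invite", "join"):
        return False  # banned, already invited, or already in the room
    return True
```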
Andrew Morgan
f7d11de227 Add xyz.amorgan.knock /versions string 2020-11-10 19:04:49 +00:00
Andrew Morgan
8b79ea03e5 Add new experimental room version for knocking 2020-11-10 19:04:10 +00:00
886 changed files with 18220 additions and 45156 deletions


@@ -1,4 +1,5 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Copyright 2019 The Matrix.org Foundation C.I.C.
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -14,7 +15,6 @@
# limitations under the License.
import logging
from synapse.storage.engines import create_engine
logger = logging.getLogger("create_postgres_db")


@@ -1,16 +1,13 @@
#!/usr/bin/env bash
#!/bin/bash
# this script is run by buildkite in a plain `bionic` container; it installs the
# minimal requirements for tox and hands over to the py3-old tox environment.
# this script is run by buildkite in a plain `xenial` container; it installs the
# minimal requirements for tox and hands over to the py35-old tox environment.
set -ex
apt-get update
apt-get install -y python3 python3-dev python3-pip libxml2-dev libxslt-dev xmlsec1 zlib1g-dev tox
apt-get install -y python3.5 python3.5-dev python3-pip libxml2-dev libxslt-dev zlib1g-dev tox
export LANG="C.UTF-8"
# Prevent virtualenv from auto-updating pip to an incompatible version
export VIRTUALENV_NO_DOWNLOAD=1
exec tox -e py3-old,combine
exec tox -e py35-old,combine


@@ -1,4 +1,4 @@
#!/usr/bin/env bash
#!/bin/bash
#
# Test script for 'synapse_port_db', which creates a virtualenv, installs Synapse along
# with additional dependencies needed for the test (such as coverage or the PostgreSQL

Binary file not shown.


@@ -5,29 +5,30 @@ jobs:
- image: docker:git
steps:
- checkout
- setup_remote_docker
- docker_prepare
- run: docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
# for release builds, we want to get the amd64 image out asap, so first
# we do an amd64-only build, before following up with a multiarch build.
- docker_build:
tag: -t matrixdotorg/synapse:${CIRCLE_TAG}
platforms: linux/amd64
- docker_build:
tag: -t matrixdotorg/synapse:${CIRCLE_TAG}
platforms: linux/amd64,linux/arm64
platforms: linux/amd64,linux/arm/v7,linux/arm64
dockerhubuploadlatest:
docker:
- image: docker:git
steps:
- checkout
- setup_remote_docker
- docker_prepare
- run: docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
# for `latest`, we don't want the arm images to disappear, so don't update the tag
# until all of the platforms are built.
- docker_build:
tag: -t matrixdotorg/synapse:latest
platforms: linux/amd64,linux/arm64
platforms: linux/amd64
- docker_build:
tag: -t matrixdotorg/synapse:latest
platforms: linux/amd64,linux/arm/v7,linux/arm64
workflows:
build:
@@ -45,16 +46,12 @@ workflows:
commands:
docker_prepare:
description: Sets up a remote docker server, downloads the buildx cli plugin, and enables multiarch images
description: Downloads the buildx cli plugin and enables multiarch images
parameters:
buildx_version:
type: string
default: "v0.4.1"
steps:
- setup_remote_docker:
# 19.03.13 was the most recent available on circleci at the time of
# writing.
version: 19.03.13
- run: apk add --no-cache curl
- run: mkdir -vp ~/.docker/cli-plugins/ ~/dockercache
- run: curl --silent -L "https://github.com/docker/buildx/releases/download/<< parameters.buildx_version >>/buildx-<< parameters.buildx_version >>.linux-amd64" > ~/.docker/cli-plugins/docker-buildx


@@ -1,8 +0,0 @@
# Black reformatting (#5482).
32e7c9e7f20b57dd081023ac42d6931a8da9b3a3
# Target Python 3.5 with black (#8664).
aff1eb7c671b0a3813407321d2702ec46c71fa56
# Update black to 20.8b1 (#9381).
0a00b7ff14890987f09112a2ae696c61001e6cf1


@@ -1,322 +0,0 @@
name: Tests
on:
push:
branches: ["develop", "release-*"]
pull_request:
jobs:
lint:
runs-on: ubuntu-latest
strategy:
matrix:
toxenv:
- "check-sampleconfig"
- "check_codestyle"
- "check_isort"
- "mypy"
- "packaging"
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- run: pip install tox
- run: tox -e ${{ matrix.toxenv }}
lint-crlf:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Check line endings
run: scripts-dev/check_line_terminators.sh
lint-newsfile:
if: ${{ github.base_ref == 'develop' || contains(github.base_ref, 'release-') }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- run: pip install tox
- name: Patch Buildkite-specific test script
run: |
sed -i -e 's/\$BUILDKITE_PULL_REQUEST/${{ github.event.number }}/' \
scripts-dev/check-newsfragment
- run: scripts-dev/check-newsfragment
lint-sdist:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: "3.x"
- run: pip install wheel
- run: python setup.py sdist bdist_wheel
- uses: actions/upload-artifact@v2
with:
name: Python Distributions
path: dist/*
# Dummy step to gate other tests on without repeating the whole list
linting-done:
if: ${{ always() }} # Run this even if prior jobs were skipped
needs: [lint, lint-crlf, lint-newsfile, lint-sdist]
runs-on: ubuntu-latest
steps:
- run: "true"
trial:
if: ${{ !failure() }} # Allow previous steps to be skipped, but not fail
needs: linting-done
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.6", "3.7", "3.8", "3.9"]
database: ["sqlite"]
include:
# Newest Python without optional deps
- python-version: "3.9"
toxenv: "py-noextras,combine"
# Oldest Python with PostgreSQL
- python-version: "3.6"
database: "postgres"
postgres-version: "9.6"
# Newest Python with PostgreSQL
- python-version: "3.9"
database: "postgres"
postgres-version: "13"
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- name: Set up PostgreSQL ${{ matrix.postgres-version }}
if: ${{ matrix.postgres-version }}
run: |
docker run -d -p 5432:5432 \
-e POSTGRES_PASSWORD=postgres \
-e POSTGRES_INITDB_ARGS="--lc-collate C --lc-ctype C --encoding UTF8" \
postgres:${{ matrix.postgres-version }}
- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- run: pip install tox
- name: Await PostgreSQL
if: ${{ matrix.postgres-version }}
timeout-minutes: 2
run: until pg_isready -h localhost; do sleep 1; done
- run: tox -e py,combine
env:
TRIAL_FLAGS: "--jobs=2"
SYNAPSE_POSTGRES: ${{ matrix.database == 'postgres' || '' }}
SYNAPSE_POSTGRES_HOST: localhost
SYNAPSE_POSTGRES_USER: postgres
SYNAPSE_POSTGRES_PASSWORD: postgres
- name: Dump logs
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
trial-olddeps:
if: ${{ !failure() }} # Allow previous steps to be skipped, but not fail
needs: linting-done
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Test with old deps
uses: docker://ubuntu:bionic # For old python and sqlite
with:
workdir: /github/workspace
entrypoint: .buildkite/scripts/test_old_deps.sh
env:
TRIAL_FLAGS: "--jobs=2"
- name: Dump logs
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
trial-pypy:
# Very slow; only run if the branch name includes 'pypy'
if: ${{ contains(github.ref, 'pypy') && !failure() }}
needs: linting-done
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["pypy-3.6"]
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1 libxml2-dev libxslt-dev
- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- run: pip install tox
- run: tox -e py,combine
env:
TRIAL_FLAGS: "--jobs=2"
- name: Dump logs
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
sytest:
if: ${{ !failure() }}
needs: linting-done
runs-on: ubuntu-latest
container:
image: matrixdotorg/sytest-synapse:${{ matrix.sytest-tag }}
volumes:
- ${{ github.workspace }}:/src
env:
BUILDKITE_BRANCH: ${{ github.head_ref }}
POSTGRES: ${{ matrix.postgres && 1}}
MULTI_POSTGRES: ${{ (matrix.postgres == 'multi-postgres') && 1}}
WORKERS: ${{ matrix.workers && 1 }}
REDIS: ${{ matrix.redis && 1 }}
BLACKLIST: ${{ matrix.workers && 'synapse-blacklist-with-workers' }}
strategy:
fail-fast: false
matrix:
include:
- sytest-tag: bionic
- sytest-tag: bionic
postgres: postgres
- sytest-tag: testing
postgres: postgres
- sytest-tag: bionic
postgres: multi-postgres
workers: workers
- sytest-tag: buster
postgres: multi-postgres
workers: workers
- sytest-tag: buster
postgres: postgres
workers: workers
redis: redis
steps:
- uses: actions/checkout@v2
- name: Prepare test blacklist
run: cat sytest-blacklist .buildkite/worker-blacklist > synapse-blacklist-with-workers
- name: Run SyTest
run: /bootstrap.sh synapse
working-directory: /src
- name: Dump results.tap
if: ${{ always() }}
run: cat /logs/results.tap
- name: Upload SyTest logs
uses: actions/upload-artifact@v2
if: ${{ always() }}
with:
name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
path: |
/logs/results.tap
/logs/**/*.log*
portdb:
if: ${{ !failure() }} # Allow previous steps to be skipped, but not fail
needs: linting-done
runs-on: ubuntu-latest
strategy:
matrix:
include:
- python-version: "3.6"
postgres-version: "9.6"
- python-version: "3.9"
postgres-version: "13"
services:
postgres:
image: postgres:${{ matrix.postgres-version }}
ports:
- 5432:5432
env:
POSTGRES_PASSWORD: "postgres"
POSTGRES_INITDB_ARGS: "--lc-collate C --lc-ctype C --encoding UTF8"
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Patch Buildkite-specific test scripts
run: |
sed -i -e 's/host="postgres"/host="localhost"/' .buildkite/scripts/create_postgres_db.py
sed -i -e 's/host: postgres/host: localhost/' .buildkite/postgres-config.yaml
sed -i -e 's|/src/||' .buildkite/{sqlite,postgres}-config.yaml
sed -i -e 's/\$TOP/\$GITHUB_WORKSPACE/' .coveragerc
- run: .buildkite/scripts/test_synapse_port_db.sh
complement:
if: ${{ !failure() }}
needs: linting-done
runs-on: ubuntu-latest
container:
# https://github.com/matrix-org/complement/blob/master/dockerfiles/ComplementCIBuildkite.Dockerfile
image: matrixdotorg/complement:latest
env:
CI: true
ports:
- 8448:8448
volumes:
- /var/run/docker.sock:/var/run/docker.sock
steps:
- name: Run actions/checkout@v2 for synapse
uses: actions/checkout@v2
with:
path: synapse
- name: Run actions/checkout@v2 for complement
uses: actions/checkout@v2
with:
repository: "matrix-org/complement"
path: complement
# Build initial Synapse image
- run: docker build -t matrixdotorg/synapse:latest -f docker/Dockerfile .
working-directory: synapse
# Build a ready-to-run Synapse image based on the initial image above.
# This new image includes a config file, keys for signing and TLS, and
# other settings to make it suitable for testing under Complement.
- run: docker build -t complement-synapse -f Synapse.Dockerfile .
working-directory: complement/dockerfiles
# Run Complement
- run: go test -v -tags synapse_blacklist ./tests
env:
COMPLEMENT_BASE_IMAGE: complement-synapse:latest
working-directory: complement

.gitignore

@@ -6,19 +6,16 @@
*.egg
*.egg-info
*.lock
*.py[cod]
*.pyc
*.snap
*.tac
_trial_temp/
_trial_temp*/
/out
.DS_Store
__pycache__/
# stuff that is likely to exist when you run a server locally
/*.db
/*.log
/*.log.*
/*.log.config
/*.pid
/.python-version

CHANGES.md

File diff suppressed because it is too large


@@ -1,31 +1,4 @@
Welcome to Synapse
This document aims to get you started with contributing to this repo!
- [1. Who can contribute to Synapse?](#1-who-can-contribute-to-synapse)
- [2. What do I need?](#2-what-do-i-need)
- [3. Get the source.](#3-get-the-source)
- [4. Install the dependencies](#4-install-the-dependencies)
* [Under Unix (macOS, Linux, BSD, ...)](#under-unix-macos-linux-bsd-)
* [Under Windows](#under-windows)
- [5. Get in touch.](#5-get-in-touch)
- [6. Pick an issue.](#6-pick-an-issue)
- [7. Turn coffee and documentation into code and documentation!](#7-turn-coffee-and-documentation-into-code-and-documentation)
- [8. Test, test, test!](#8-test-test-test)
* [Run the linters.](#run-the-linters)
* [Run the unit tests.](#run-the-unit-tests)
* [Run the integration tests.](#run-the-integration-tests)
- [9. Submit your patch.](#9-submit-your-patch)
* [Changelog](#changelog)
+ [How do I know what to call the changelog file before I create the PR?](#how-do-i-know-what-to-call-the-changelog-file-before-i-create-the-pr)
+ [Debian changelog](#debian-changelog)
* [Sign off](#sign-off)
- [10. Turn feedback into better code.](#10-turn-feedback-into-better-code)
- [11. Find a new issue.](#11-find-a-new-issue)
- [Notes for maintainers on merging PRs etc](#notes-for-maintainers-on-merging-prs-etc)
- [Conclusion](#conclusion)
# 1. Who can contribute to Synapse?
# Contributing code to Synapse
Everyone is welcome to contribute code to [matrix.org
projects](https://github.com/matrix-org), provided that they are willing to
@@ -36,179 +9,70 @@ license the code under the same terms as the project's overall 'outbound'
license - in our case, this is almost always Apache Software License v2 (see
[LICENSE](LICENSE)).
# 2. What do I need?
The code of Synapse is written in Python 3. To do pretty much anything, you'll need [a recent version of Python 3](https://wiki.python.org/moin/BeginnersGuide/Download).
The source code of Synapse is hosted on GitHub. You will also need [a recent version of git](https://github.com/git-guides/install-git).
For some tests, you will need [a recent version of Docker](https://docs.docker.com/get-docker/).
# 3. Get the source.
## How to contribute
The preferred and easiest way to contribute changes is to fork the relevant
project on GitHub, and then [create a pull request](
project on github, and then [create a pull request](
https://help.github.com/articles/using-pull-requests/) to ask us to pull your
changes into our repo.
Please base your changes on the `develop` branch.
Some other points to follow:
```sh
git clone git@github.com:YOUR_GITHUB_USER_NAME/synapse.git
git checkout develop
```
* Please base your changes on the `develop` branch.
If you need help getting started with git, this is beyond the scope of the document, but you
can find many good git tutorials on the web.
* Please follow the [code style requirements](#code-style).
# 4. Install the dependencies
* Please include a [changelog entry](#changelog) with each PR.
## Under Unix (macOS, Linux, BSD, ...)
* Please [sign off](#sign-off) your contribution.
Once you have installed Python 3 and added the source, please open a terminal and
setup a *virtualenv*, as follows:
* Please keep an eye on the pull request for feedback from the [continuous
integration system](#continuous-integration-and-testing) and try to fix any
errors that come up.
```sh
cd path/where/you/have/cloned/the/repository
python3 -m venv ./env
source ./env/bin/activate
pip install -e ".[all,lint,mypy,test]"
pip install tox
```
* If you need to [update your PR](#updating-your-pull-request), just add new
commits to your branch rather than rebasing.
This will install the developer dependencies for the project.
## Under Windows
TBD
# 5. Get in touch.
Join our developer community on Matrix: #synapse-dev:matrix.org !
# 6. Pick an issue.
Fix your favorite problem or perhaps find a [Good First Issue](https://github.com/matrix-org/synapse/issues?q=is%3Aopen+is%3Aissue+label%3A%22Good+First+Issue%22)
to work on.
# 7. Turn coffee and documentation into code and documentation!
## Code style
Synapse's code style is documented [here](docs/code_style.md). Please follow
it, including the conventions for the [sample configuration
file](docs/code_style.md#configuration-file-format).
There is a growing amount of documentation located in the [docs](docs)
directory. This documentation is intended primarily for sysadmins running their
own Synapse instance, as well as developers interacting externally with
Synapse. [docs/dev](docs/dev) exists primarily to house documentation for
Synapse developers. [docs/admin_api](docs/admin_api) houses documentation
regarding Synapse's Admin API, which is used mostly by sysadmins and external
service developers.
Many of the conventions are enforced by scripts which are run as part of the
[continuous integration system](#continuous-integration-and-testing). To help
check if you have followed the code style, you can run `scripts-dev/lint.sh`
locally. You'll need python 3.6 or later, and to install a number of tools:
If you add new files added to either of these folders, please use [GitHub-Flavoured
Markdown](https://guides.github.com/features/mastering-markdown/).
```
# Install the dependencies
pip install -e ".[lint,mypy]"
Some documentation also exists in [Synapse's GitHub
Wiki](https://github.com/matrix-org/synapse/wiki), although this is primarily
contributed to by community authors.
# 8. Test, test, test!
<a name="test-test-test"></a>
While you're developing and before submitting a patch, you'll
want to test your code.
## Run the linters.
The linters look at your code and do two things:
- ensure that your code follows the coding style adopted by the project;
- catch a number of errors in your code.
They're pretty fast, don't hesitate!
```sh
source ./env/bin/activate
# Run the linter script
./scripts-dev/lint.sh
```
Note that this script *will modify your files* to fix styling errors.
Make sure that you have saved all your files.
**Note that the script does not just test/check, but also reformats code, so you
may wish to ensure any new code is committed first**.
If you wish to restrict the linters to only the files changed since the last commit
(much faster!), you can instead run:
By default, this script checks all files and can take some time; if you alter
only certain files, you might wish to specify paths as arguments to reduce the
run-time:
```sh
source ./env/bin/activate
./scripts-dev/lint.sh -d
```
Or if you know exactly which files you wish to lint, you can instead run:
```sh
source ./env/bin/activate
./scripts-dev/lint.sh path/to/file1.py path/to/file2.py path/to/folder
```
## Run the unit tests.
You can also provide the `-d` option, which will lint the files that have been
changed since the last git commit. This will often be significantly faster than
linting the whole codebase.
The unit tests run parts of Synapse, including your changes, to see if anything
was broken. They are slower than the linters but will typically catch more errors.
```sh
source ./env/bin/activate
trial tests
```
If you wish to only run *some* unit tests, you may specify
another module instead of `tests` - or a test class or a method:
```sh
source ./env/bin/activate
trial tests.rest.admin.test_room tests.handlers.test_admin.ExfiltrateData.test_invite
```
If your tests fail, you may wish to look at the logs:
```sh
less _trial_temp/test.log
```
## Run the integration tests.
The integration tests are a more comprehensive suite of tests. They
run a full version of Synapse, including your changes, to check if
anything was broken. They are slower than the unit tests but will
typically catch more errors.
The following command will let you run the integration test with the most common
configuration:
```sh
$ docker run --rm -it -v /path/where/you/have/cloned/the/repository\:/src:ro -v /path/to/where/you/want/logs\:/logs matrixdotorg/sytest-synapse:py37
```
This configuration should generally cover your needs. For more details about other configurations, see [documentation in the SyTest repo](https://github.com/matrix-org/sytest/blob/develop/docker/README.md).
# 9. Submit your patch.
Once you're happy with your patch, it's time to prepare a Pull Request.
To prepare a Pull Request, please:
1. verify that [all the tests pass](#test-test-test), including the coding style;
2. [sign off](#sign-off) your contribution;
3. `git push` your commit to your fork of Synapse;
4. on GitHub, [create the Pull Request](https://docs.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request);
5. add a [changelog entry](#changelog) and push it to your Pull Request;
6. for most contributors, that's all - however, if you are a member of the organization `matrix-org`, on GitHub, please request a review from `matrix.org / Synapse Core`.
Before pushing new changes, ensure they don't produce linting errors. Commit any
files that were corrected.
Please ensure your changes match the cosmetic style of the existing project,
and **never** mix cosmetic and functional changes in the same commit, as it
makes it horribly hard to review otherwise.
## Changelog
@@ -292,6 +156,24 @@ directory, you will need both a regular newsfragment *and* an entry in the
debian changelog. (Though typically such changes should be submitted as two
separate pull requests.)
## Documentation
There is a growing amount of documentation located in the [docs](docs)
directory. This documentation is intended primarily for sysadmins running their
own Synapse instance, as well as developers interacting externally with
Synapse. [docs/dev](docs/dev) exists primarily to house documentation for
Synapse developers. [docs/admin_api](docs/admin_api) houses documentation
regarding Synapse's Admin API, which is used mostly by sysadmins and external
service developers.
New files added to both folders should be written in [Github-Flavoured
Markdown](https://guides.github.com/features/mastering-markdown/), and attempts
should be made to migrate existing documents to markdown where possible.
Some documentation also exists in [Synapse's Github
Wiki](https://github.com/matrix-org/synapse/wiki), although this is primarily
contributed to by community authors.
## Sign off
In order to have a concrete record that your contribution is intentional
@@ -358,36 +240,47 @@ Git allows you to add this signoff automatically when using the `-s`
flag to `git commit`, which uses the name and email set in your
`user.name` and `user.email` git configs.
## Continuous integration and testing
# 10. Turn feedback into better code.
[Buildkite](https://buildkite.com/matrix-dot-org/synapse) will automatically
run a series of checks and tests against any PR which is opened against the
project; if your change breaks the build, this will be shown in GitHub, with
links to the build results. If your build fails, please try to fix the errors
and update your branch.
Once the Pull Request is opened, you will see a few things:
To run unit tests in a local development environment, you can use:
1. our automated CI (Continuous Integration) pipeline will run (again) the linters, the unit tests, the integration tests and more;
2. one or more of the developers will take a look at your Pull Request and offer feedback.
- ``tox -e py35`` (requires tox to be installed by ``pip install tox``)
for SQLite-backed Synapse on Python 3.5.
- ``tox -e py36`` for SQLite-backed Synapse on Python 3.6.
- ``tox -e py36-postgres`` for PostgreSQL-backed Synapse on Python 3.6
(requires a running local PostgreSQL with access to create databases).
- ``./test_postgresql.sh`` for PostgreSQL-backed Synapse on Python 3.5
(requires Docker). Entirely self-contained, recommended if you don't want to
set up PostgreSQL yourself.
From this point, you should:
Docker images are available for running the integration tests (SyTest) locally,
see the [documentation in the SyTest repo](
https://github.com/matrix-org/sytest/blob/develop/docker/README.md) for more
information.
1. Look at the results of the CI pipeline.
- If there is any error, fix the error.
2. If a developer has requested changes, make these changes and let us know if it is ready for a developer to review again.
3. Create a new commit with the changes.
- Please do NOT overwrite the history. New commits make the reviewer's life easier.
- Push this commits to your Pull Request.
4. Back to 1.
## Updating your pull request
Once both the CI and the developers are happy, the patch will be merged into Synapse and released shortly!
If you decide to make changes to your pull request - perhaps to address issues
raised in a review, or to fix problems highlighted by [continuous
integration](#continuous-integration-and-testing) - just add new commits to your
branch, and push to GitHub. The pull request will automatically be updated.
# 11. Find a new issue.
Please **avoid** rebasing your branch, especially once the PR has been
reviewed: doing so makes it very difficult for a reviewer to see what has
changed since a previous review.
By now, you know the drill!
# Notes for maintainers on merging PRs etc
## Notes for maintainers on merging PRs etc
There are some notes for those with commit access to the project on how we
manage git [here](docs/dev/git.md).
# Conclusion
## Conclusion
That's it! Matrix is a very open and collaborative project as you might expect
given our obsession with open communication. If we're going to successfully


@@ -1,45 +1,19 @@
# Installation Instructions
- [Choosing your server name](#choosing-your-server-name)
- [Picking a database engine](#picking-a-database-engine)
- [Installing Synapse](#installing-synapse)
- [Installing from source](#installing-from-source)
- [Platform-Specific Instructions](#platform-specific-instructions)
- [Prebuilt packages](#prebuilt-packages)
- [Setting up Synapse](#setting-up-synapse)
- [TLS certificates](#tls-certificates)
- [Client Well-Known URI](#client-well-known-uri)
- [Email](#email)
- [Registering a user](#registering-a-user)
- [Setting up a TURN server](#setting-up-a-turn-server)
- [URL previews](#url-previews)
- [Troubleshooting Installation](#troubleshooting-installation)
There are 3 steps to follow under **Installation Instructions**.
- [Installation Instructions](#installation-instructions)
- [Choosing your server name](#choosing-your-server-name)
- [Installing Synapse](#installing-synapse)
- [Installing from source](#installing-from-source)
- [Platform-specific prerequisites](#platform-specific-prerequisites)
- [Debian/Ubuntu/Raspbian](#debianubunturaspbian)
- [ArchLinux](#archlinux)
- [CentOS/Fedora](#centosfedora)
- [macOS](#macos)
- [OpenSUSE](#opensuse)
- [OpenBSD](#openbsd)
- [Windows](#windows)
- [Prebuilt packages](#prebuilt-packages)
- [Docker images and Ansible playbooks](#docker-images-and-ansible-playbooks)
- [Debian/Ubuntu](#debianubuntu)
- [Matrix.org packages](#matrixorg-packages)
- [Downstream Debian packages](#downstream-debian-packages)
- [Downstream Ubuntu packages](#downstream-ubuntu-packages)
- [Fedora](#fedora)
- [OpenSUSE](#opensuse-1)
- [SUSE Linux Enterprise Server](#suse-linux-enterprise-server)
- [ArchLinux](#archlinux-1)
- [Void Linux](#void-linux)
- [FreeBSD](#freebsd)
- [OpenBSD](#openbsd-1)
- [NixOS](#nixos)
- [Setting up Synapse](#setting-up-synapse)
- [Using PostgreSQL](#using-postgresql)
- [TLS certificates](#tls-certificates)
- [Client Well-Known URI](#client-well-known-uri)
- [Email](#email)
- [Registering a user](#registering-a-user)
- [Setting up a TURN server](#setting-up-a-turn-server)
- [URL previews](#url-previews)
- [Troubleshooting Installation](#troubleshooting-installation)
## Choosing your server name
# Choosing your server name
It is important to choose the name for your server before you install Synapse,
because it cannot be changed later.
@@ -55,24 +29,46 @@ that your email address is probably `user@example.com` rather than
`user@email.example.com`) - but doing so may require more advanced setup: see
[Setting up Federation](docs/federate.md).
## Installing Synapse
# Picking a database engine
### Installing from source
Synapse offers two database engines:
* [PostgreSQL](https://www.postgresql.org)
* [SQLite](https://sqlite.org/)
Almost all installations should opt to use PostgreSQL. Advantages include:
* significant performance improvements due to the superior threading and
caching model, smarter query optimiser
* allowing the DB to be run on separate hardware
For information on how to install and use PostgreSQL, please see
[docs/postgres.md](docs/postgres.md)
By default Synapse uses SQLite and in doing so trades performance for convenience.
SQLite is only recommended in Synapse for testing purposes or for servers with
light workloads.
# Installing Synapse
## Installing from source
(Prebuilt packages are available for some platforms - see [Prebuilt packages](#prebuilt-packages).)
When installing from source please make sure that the [Platform-specific prerequisites](#platform-specific-prerequisites) are already installed.
System requirements:
- POSIX-compliant system (tested on Linux & OS X)
- Python 3.5.2 or later, up to Python 3.9.
- At least 1GB of free RAM if you want to join large public rooms like #matrix:matrix.org
Synapse is written in Python but some of the libraries it uses are written in
C. So before we can install Synapse itself we need a working C compiler and the
header files for Python C extensions. See [Platform-Specific
Instructions](#platform-specific-instructions) for information on installing
these on various platforms.
To install the Synapse homeserver run:
```sh
```
mkdir -p ~/synapse
virtualenv -p python3 ~/synapse/env
source ~/synapse/env/bin/activate
@@ -89,7 +85,7 @@ prefer.
This Synapse installation can then be later upgraded by using pip again with the
update flag:
```sh
```
source ~/synapse/env/bin/activate
pip install -U matrix-synapse
```
@@ -97,7 +93,7 @@ pip install -U matrix-synapse
Before you can start Synapse, you will need to generate a configuration
file. To do this, run (in your virtualenv, as before):
```sh
```
cd ~/synapse
python -m synapse.app.homeserver \
--server-name my.domain.name \
@@ -115,58 +111,70 @@ wise to back them up somewhere safe. (If, for whatever reason, you do need to
change your homeserver's keys, you may find that other homeserver have the
old key cached. If you update the signing key, you should change the name of the
key in the `<server name>.signing.key` file (the second word) to something
different. See the [spec](https://matrix.org/docs/spec/server_server/latest.html#retrieving-server-keys) for more information on key management).
different. See the
[spec](https://matrix.org/docs/spec/server_server/latest.html#retrieving-server-keys)
for more information on key management).
To actually run your new homeserver, pick a working directory for Synapse to
run (e.g. `~/synapse`), and:
```sh
```
cd ~/synapse
source env/bin/activate
synctl start
```
#### Platform-specific prerequisites
### Platform-Specific Instructions
Synapse is written in Python but some of the libraries it uses are written in
C. So before we can install Synapse itself we need a working C compiler and the
header files for Python C extensions.
##### Debian/Ubuntu/Raspbian
#### Debian/Ubuntu/Raspbian
Installing prerequisites on Ubuntu or Debian:
```sh
sudo apt install build-essential python3-dev libffi-dev \
```
sudo apt-get install build-essential python3-dev libffi-dev \
python3-pip python3-setuptools sqlite3 \
libssl-dev virtualenv libjpeg-dev libxslt1-dev
```
##### ArchLinux
#### ArchLinux
Installing prerequisites on ArchLinux:
```sh
```
sudo pacman -S base-devel python python-pip \
python-setuptools python-virtualenv sqlite3
```
##### CentOS/Fedora
#### CentOS/Fedora
Installing prerequisites on CentOS or Fedora Linux:
Installing prerequisites on CentOS 8 or Fedora>26:
```sh
```
sudo dnf install libtiff-devel libjpeg-devel libzip-devel freetype-devel \
libwebp-devel libxml2-devel libxslt-devel libpq-devel \
python3-virtualenv libffi-devel openssl-devel python3-devel
libwebp-devel tk-devel redhat-rpm-config \
python3-virtualenv libffi-devel openssl-devel
sudo dnf groupinstall "Development Tools"
```
##### macOS
Installing prerequisites on CentOS 7 or Fedora<=25:
```
sudo yum install libtiff-devel libjpeg-devel libzip-devel freetype-devel \
lcms2-devel libwebp-devel tcl-devel tk-devel redhat-rpm-config \
python3-virtualenv libffi-devel openssl-devel
sudo yum groupinstall "Development Tools"
```
Note that Synapse does not support versions of SQLite before 3.11, and CentOS 7
uses SQLite 3.7. You may be able to work around this by installing a more
recent SQLite version, but it is recommended that you instead use a Postgres
database: see [docs/postgres.md](docs/postgres.md).
#### macOS
Installing prerequisites on macOS:
```sh
```
xcode-select --install
sudo easy_install pip
sudo pip install virtualenv
@@ -176,23 +184,22 @@ brew install pkg-config libffi
On macOS Catalina (10.15) you may need to explicitly install OpenSSL
via brew and inform `pip` about it so that `psycopg2` builds:
```sh
```
brew install openssl@1.1
export LDFLAGS="-L/usr/local/opt/openssl/lib"
export CPPFLAGS="-I/usr/local/opt/openssl/include"
export LDFLAGS=-L/usr/local/Cellar/openssl\@1.1/1.1.1d/lib/
```
##### OpenSUSE
#### OpenSUSE
Installing prerequisites on openSUSE:
```sh
```
sudo zypper in -t pattern devel_basis
sudo zypper in python-pip python-setuptools sqlite3 python-virtualenv \
python-devel libffi-devel libopenssl-devel libjpeg62-devel
```
##### OpenBSD
#### OpenBSD
A port of Synapse is available under `net/synapse`. The filesystem
underlying the homeserver directory (defaults to `/var/synapse`) has to be
@@ -206,72 +213,73 @@ mounted with `wxallowed` (cf. `mount(8)`).
Creating a `WRKOBJDIR` for building python under `/usr/local` (which on a
default OpenBSD installation is mounted with `wxallowed`):
```sh
```
doas mkdir /usr/local/pobj_wxallowed
```
Assuming `PORTS_PRIVSEP=Yes` (cf. `bsd.port.mk(5)`) and `SUDO=doas` are
configured in `/etc/mk.conf`:
```sh
```
doas chown _pbuild:_pbuild /usr/local/pobj_wxallowed
```
Setting the `WRKOBJDIR` for building python:
```sh
```
echo WRKOBJDIR_lang/python/3.7=/usr/local/pobj_wxallowed \\nWRKOBJDIR_lang/python/2.7=/usr/local/pobj_wxallowed >> /etc/mk.conf
```
Building Synapse:
```sh
```
cd /usr/ports/net/synapse
make install
```
##### Windows
#### Windows
If you wish to run or develop Synapse on Windows, the Windows Subsystem For
Linux provides a Linux environment on Windows 10 which is capable of using the
Debian, Fedora, or source installation methods. More information about WSL can
be found at <https://docs.microsoft.com/en-us/windows/wsl/install-win10> for
Windows 10 and <https://docs.microsoft.com/en-us/windows/wsl/install-on-server>
be found at https://docs.microsoft.com/en-us/windows/wsl/install-win10 for
Windows 10 and https://docs.microsoft.com/en-us/windows/wsl/install-on-server
for Windows Server.
### Prebuilt packages
## Prebuilt packages
As an alternative to installing from source, prebuilt packages are available
for a number of platforms.
#### Docker images and Ansible playbooks
### Docker images and Ansible playbooks
There is an official synapse image available at
<https://hub.docker.com/r/matrixdotorg/synapse> which can be used with
There is an offical synapse image available at
https://hub.docker.com/r/matrixdotorg/synapse which can be used with
the docker-compose file available at [contrib/docker](contrib/docker). Further
information on this including configuration options is available in the README
on hub.docker.com.
Alternatively, Andreas Peters (previously Silvio Fricke) has contributed a
Dockerfile to automate a synapse server in a single Docker image, at
<https://hub.docker.com/r/avhost/docker-matrix/tags/>
https://hub.docker.com/r/avhost/docker-matrix/tags/
Slavi Pantaleev has created an Ansible playbook,
which installs the offical Docker image of Matrix Synapse
along with many other Matrix-related services (Postgres database, Element, coturn,
ma1sd, SSL support, etc.).
For more details, see
<https://github.com/spantaleev/matrix-docker-ansible-deploy>
https://github.com/spantaleev/matrix-docker-ansible-deploy
#### Debian/Ubuntu
##### Matrix.org packages
### Debian/Ubuntu
#### Matrix.org packages
Matrix.org provides Debian/Ubuntu packages of the latest stable version of
Synapse via <https://packages.matrix.org/debian/>. They are available for Debian
Synapse via https://packages.matrix.org/debian/. They are available for Debian
9 (Stretch), Ubuntu 16.04 (Xenial), and later. To use them:
```sh
```
sudo apt install -y lsb-release wget apt-transport-https
sudo wget -O /usr/share/keyrings/matrix-org-archive-keyring.gpg https://packages.matrix.org/debian/matrix-org-archive-keyring.gpg
echo "deb [signed-by=/usr/share/keyrings/matrix-org-archive-keyring.gpg] https://packages.matrix.org/debian/ $(lsb_release -cs) main" |
@@ -291,7 +299,7 @@ The fingerprint of the repository signing key (as shown by `gpg
/usr/share/keyrings/matrix-org-archive-keyring.gpg`) is
`AAF9AE843A7584B5A3E4CD2BCF45A512DE2DA058`.
##### Downstream Debian packages
#### Downstream Debian packages
We do not recommend using the packages from the default Debian `buster`
repository at this time, as they are old and suffer from known security
@@ -303,49 +311,49 @@ for information on how to use backports.
If you are using Debian `sid` or testing, Synapse is available in the default
repositories and it should be possible to install it simply with:
```sh
```
sudo apt install matrix-synapse
```
##### Downstream Ubuntu packages
#### Downstream Ubuntu packages
We do not recommend using the packages in the default Ubuntu repository
at this time, as they are old and suffer from known security vulnerabilities.
The latest version of Synapse can be installed from [our repository](#matrixorg-packages).
#### Fedora
### Fedora
Synapse is in the Fedora repositories as `matrix-synapse`:
```sh
```
sudo dnf install matrix-synapse
```
Oleg Girko provides Fedora RPMs at
<https://obs.infoserver.lv/project/monitor/matrix-synapse>
https://obs.infoserver.lv/project/monitor/matrix-synapse
#### OpenSUSE
### OpenSUSE
Synapse is in the OpenSUSE repositories as `matrix-synapse`:
```sh
```
sudo zypper install matrix-synapse
```
#### SUSE Linux Enterprise Server
### SUSE Linux Enterprise Server
Unofficial package are built for SLES 15 in the openSUSE:Backports:SLE-15 repository at
<https://download.opensuse.org/repositories/openSUSE:/Backports:/SLE-15/standard/>
https://download.opensuse.org/repositories/openSUSE:/Backports:/SLE-15/standard/
#### ArchLinux
### ArchLinux
The quickest way to get up and running with ArchLinux is probably with the community package
<https://www.archlinux.org/packages/community/any/matrix-synapse/>, which should pull in most of
https://www.archlinux.org/packages/community/any/matrix-synapse/, which should pull in most of
the necessary dependencies.
pip may be outdated (6.0.7-1 and needs to be upgraded to 6.0.8-1 ):
```sh
```
sudo pip install --upgrade pip
```
@@ -354,28 +362,28 @@ ELFCLASS32 (x64 Systems), you may need to reinstall py-bcrypt to correctly
compile it under the right architecture. (This should not be needed if
installing under virtualenv):
```sh
```
sudo pip uninstall py-bcrypt
sudo pip install py-bcrypt
```
#### Void Linux
### Void Linux
Synapse can be found in the void repositories as 'synapse':
```sh
```
xbps-install -Su
xbps-install -S synapse
```
#### FreeBSD
### FreeBSD
Synapse can be installed via FreeBSD Ports or Packages contributed by Brendan Molloy from:
- Ports: `cd /usr/ports/net-im/py-matrix-synapse && make install clean`
- Packages: `pkg install py37-matrix-synapse`
- Ports: `cd /usr/ports/net-im/py-matrix-synapse && make install clean`
- Packages: `pkg install py37-matrix-synapse`
#### OpenBSD
### OpenBSD
As of OpenBSD 6.7 Synapse is available as a pre-compiled binary. The filesystem
underlying the homeserver directory (defaults to `/var/synapse`) has to be
@@ -384,35 +392,20 @@ and mounting it to `/var/synapse` should be taken into consideration.
Installing Synapse:
```sh
```
doas pkg_add synapse
```
#### NixOS
### NixOS
Robin Lambertz has packaged Synapse for NixOS at:
<https://github.com/NixOS/nixpkgs/blob/master/nixos/modules/services/misc/matrix-synapse.nix>
https://github.com/NixOS/nixpkgs/blob/master/nixos/modules/services/misc/matrix-synapse.nix
## Setting up Synapse
# Setting up Synapse
Once you have installed synapse as above, you will need to configure it.
### Using PostgreSQL
By default Synapse uses [SQLite](https://sqlite.org/) and in doing so trades performance for convenience.
SQLite is only recommended in Synapse for testing purposes or for servers with
very light workloads.
Almost all installations should opt to use [PostgreSQL](https://www.postgresql.org). Advantages include:
- significant performance improvements due to the superior threading and
caching model, smarter query optimiser
- allowing the DB to be run on separate hardware
For information on how to install and use PostgreSQL in Synapse, please see
[docs/postgres.md](docs/postgres.md)
### TLS certificates
## TLS certificates
The default configuration exposes a single HTTP port on the local
interface: `http://localhost:8008`. It is suitable for local testing,
@@ -426,19 +419,19 @@ The recommended way to do so is to set up a reverse proxy on port
Alternatively, you can configure Synapse to expose an HTTPS port. To do
so, you will need to edit `homeserver.yaml`, as follows:
- First, under the `listeners` section, uncomment the configuration for the
* First, under the `listeners` section, uncomment the configuration for the
TLS-enabled listener. (Remove the hash sign (`#`) at the start of
each line). The relevant lines are like this:
```yaml
- port: 8448
type: http
tls: true
resources:
- names: [client, federation]
```
- port: 8448
type: http
tls: true
resources:
- names: [client, federation]
```
- You will also need to uncomment the `tls_certificate_path` and
* You will also need to uncomment the `tls_certificate_path` and
`tls_private_key_path` lines under the `TLS` section. You will need to manage
provisioning of these certificates yourself — Synapse had built-in ACME
support, but the ACMEv1 protocol Synapse implements is deprecated, not
@@ -453,7 +446,7 @@ so, you will need to edit `homeserver.yaml`, as follows:
For a more detailed guide to configuring your server for federation, see
[federate.md](docs/federate.md).
### Client Well-Known URI
## Client Well-Known URI
Setting up the client Well-Known URI is optional but if you set it up, it will
allow users to enter their full username (e.g. `@user:<server_name>`) into clients
@@ -464,7 +457,7 @@ about the actual homeserver URL you are using.
The URL `https://<server_name>/.well-known/matrix/client` should return JSON in
the following format.
```json
```
{
"m.homeserver": {
"base_url": "https://<matrix.example.com>"
@@ -474,7 +467,7 @@ the following format.
It can optionally contain identity server information as well.
```json
```
{
"m.homeserver": {
"base_url": "https://<matrix.example.com>"
@@ -491,11 +484,10 @@ Cross-Origin Resource Sharing (CORS) headers. A recommended value would be
view it.
In nginx this would be something like:
```nginx
```
location /.well-known/matrix/client {
return 200 '{"m.homeserver": {"base_url": "https://<matrix.example.com>"}}';
default_type application/json;
add_header Content-Type application/json;
add_header Access-Control-Allow-Origin *;
}
```
@@ -505,11 +497,11 @@ correctly. `public_baseurl` should be set to the URL that clients will use to
connect to your server. This is the same URL you put for the `m.homeserver`
`base_url` above.
```yaml
```
public_baseurl: "https://<matrix.example.com>"
```
### Email
## Email
It is desirable for Synapse to have the capability to send email. This allows
Synapse to send password reset emails, send verifications when an email address
@@ -524,28 +516,18 @@ and `notif_from` fields filled out. You may also need to set `smtp_user`,
If email is not configured, password reset, registration and notifications via
email will be disabled.
### Registering a user
## Registering a user
The easiest way to create a new user is to do so from a client like [Element](https://element.io/).
Alternatively, you can do so from the command line. This can be done as follows:
Alternatively you can do so from the command line if you have installed via pip.
1. If synapse was installed via pip, activate the virtualenv as follows (if Synapse was
installed via a prebuilt package, `register_new_matrix_user` should already be
on the search path):
```sh
cd ~/synapse
source env/bin/activate
synctl start # if not already running
```
2. Run the following command:
```sh
register_new_matrix_user -c homeserver.yaml http://localhost:8008
```
This can be done as follows:
This will prompt you to add details for the new user, and will then connect to
the running Synapse to create the new user. For example:
```
$ source ~/synapse/env/bin/activate
$ synctl start # if not already running
$ register_new_matrix_user -c homeserver.yaml http://localhost:8008
New user localpart: erikj
Password:
Confirm password:
@@ -560,12 +542,12 @@ value is generated by `--generate-config`), but it should be kept secret, as
anyone with knowledge of it can register users, including admin accounts,
on your server even if `enable_registration` is `false`.
### Setting up a TURN server
## Setting up a TURN server
For reliable VoIP calls to be routed via this homeserver, you MUST configure
a TURN server. See [docs/turn-howto.md](docs/turn-howto.md) for details.
### URL previews
## URL previews
Synapse includes support for previewing URLs, which is disabled by default. To
turn it on you must enable the `url_preview_enabled: True` config parameter
@@ -575,18 +557,19 @@ This is critical from a security perspective to stop arbitrary Matrix users
spidering 'internal' URLs on your network. At the very least we recommend that
your loopback and RFC1918 IP addresses are blacklisted.
This also requires the optional `lxml` python dependency to be installed. This
in turn requires the `libxml2` library to be available - on Debian/Ubuntu this
means `apt-get install libxml2-dev`, or equivalent for your OS.
This also requires the optional `lxml` and `netaddr` python dependencies to be
installed. This in turn requires the `libxml2` library to be available - on
Debian/Ubuntu this means `apt-get install libxml2-dev`, or equivalent for
your OS.
### Troubleshooting Installation
# Troubleshooting Installation
`pip` seems to leak *lots* of memory during installation. For instance, a Linux
host with 512MB of RAM may run out of memory whilst installing Twisted. If this
happens, you will have to individually install the dependencies which are
failing, e.g.:
```sh
```
pip install twisted
```


@@ -20,10 +20,9 @@ recursive-include scripts *
recursive-include scripts-dev *
recursive-include synapse *.pyi
recursive-include tests *.py
recursive-include tests *.pem
recursive-include tests *.p8
recursive-include tests *.crt
recursive-include tests *.key
include tests/http/ca.crt
include tests/http/ca.key
include tests/http/server.key
recursive-include synapse/res *
recursive-include synapse/static *.css


@@ -183,9 +183,8 @@ Using a reverse proxy with Synapse
It is recommended to put a reverse proxy such as
`nginx <https://nginx.org/en/docs/http/ngx_http_proxy_module.html>`_,
`Apache <https://httpd.apache.org/docs/current/mod/mod_proxy_http.html>`_,
`Caddy <https://caddyserver.com/docs/quick-starts/reverse-proxy>`_,
`HAProxy <https://www.haproxy.org/>`_ or
`relayd <https://man.openbsd.org/relayd.8>`_ in front of Synapse. One advantage of
`Caddy <https://caddyserver.com/docs/quick-starts/reverse-proxy>`_ or
`HAProxy <https://www.haproxy.org/>`_ in front of Synapse. One advantage of
doing so is that it means that you can expose the default https port (443) to
Matrix clients without needing to run Synapse with root privileges.
@@ -244,8 +243,6 @@ Then update the ``users`` table in the database::
Synapse Development
===================
Join our developer community on Matrix: `#synapse-dev:matrix.org <https://matrix.to/#/#synapse-dev:matrix.org>`_
Before setting up a development environment for synapse, make sure you have the
system dependencies (such as the python header files) installed - see
`Installing from source <INSTALL.md#installing-from-source>`_.
@@ -264,43 +261,18 @@ to install using pip and a virtualenv::
pip install -e ".[all,test]"
This will run a process of downloading and installing all the needed
dependencies into a virtual env. If any dependencies fail to install,
try installing the failing modules individually::
dependencies into a virtual env.
pip install -e "module-name"
Once this is done, you may wish to run Synapse's unit tests to
check that everything is installed correctly::
Once this is done, you may wish to run Synapse's unit tests, to
check that everything is installed as it should be::
python -m twisted.trial tests
This should end with a 'PASSED' result (note that exact numbers will
differ)::
Ran 1337 tests in 716.064s
PASSED (skips=15, successes=1322)
We recommend using the demo which starts 3 federated instances running on ports `8080` - `8082`
./demo/start.sh
(to stop, you can use `./demo/stop.sh`)
If you just want to start a single instance of the app and run it directly::
# Create the homeserver.yaml config once
python -m synapse.app.homeserver \
--server-name my.domain.name \
--config-path homeserver.yaml \
--generate-config \
--report-stats=[yes|no]
# Start the app
python -m synapse.app.homeserver --config-path homeserver.yaml
This should end with a 'PASSED' result::
Ran 1266 tests in 643.930s
PASSED (skips=15, successes=1251)
Running the Integration Tests
=============================
@@ -314,15 +286,6 @@ Testing with SyTest is recommended for verifying that changes related to the
Client-Server API are functioning correctly. See the `installation instructions
<https://github.com/matrix-org/sytest#installing>`_ for details.
Platform dependencies
=====================
Synapse uses a number of platform dependencies such as Python and PostgreSQL,
and aims to follow supported upstream versions. See the
`<docs/deprecation_policy.md>`_ document for more details.
Troubleshooting
===============
@@ -393,17 +356,12 @@ massive excess of outgoing federation requests (see `discussion
indicate that your server is also issuing far more outgoing federation
requests than can be accounted for by your users' activity, this is a
likely cause. The misbehavior can be worked around by setting
the following in the Synapse config file:
.. code-block:: yaml
presence:
enabled: false
``use_presence: false`` in the Synapse config file.
People can't accept room invitations from me
--------------------------------------------
The typical failure mode here is that you send an invitation to someone
The typical failure mode here is that you send an invitation to someone
to join a room or direct chat, but when they go to accept it, they get an
error (typically along the lines of "Invalid signature"). They might see
something like the following in their logs::

View File

@@ -5,16 +5,6 @@ Before upgrading check if any special steps are required to upgrade from the
version you currently have installed to the current version of Synapse. The extra
instructions that may be required are listed later in this document.
* Check that your versions of Python and PostgreSQL are still supported.
Synapse follows upstream lifecycles for `Python`_ and `PostgreSQL`_, and
removes support for versions which are no longer maintained.
The website https://endoflife.date also offers convenient summaries.
.. _Python: https://devguide.python.org/devcycle/#end-of-life-branches
.. _PostgreSQL: https://www.postgresql.org/support/versioning/
* If Synapse was installed using `prebuilt packages
<INSTALL.md#prebuilt-packages>`_, you will need to follow the normal process
for upgrading those packages.
@@ -85,237 +75,6 @@ for example:
wget https://packages.matrix.org/debian/pool/main/m/matrix-synapse-py3/matrix-synapse-py3_1.3.0+stretch1_amd64.deb
dpkg -i matrix-synapse-py3_1.3.0+stretch1_amd64.deb
Upgrading to v1.32.0
====================
Removal of old List Accounts Admin API
--------------------------------------
The deprecated v1 "list accounts" admin API (``GET /_synapse/admin/v1/users/<user_id>``) has been removed in this version.
The `v2 list accounts API <https://github.com/matrix-org/synapse/blob/master/docs/admin_api/user_admin_api.rst#list-accounts>`_
has been available since Synapse 1.7.0 (2019-12-13), and is accessible under ``GET /_synapse/admin/v2/users``.
The deprecation of the old endpoint was announced with Synapse 1.28.0 (released on 2021-02-25).
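For illustration only (the listener address and query parameters below are assumptions, not part of the release notes), a request against the v2 endpoint might look like:

.. code-block:: sh

    # List the first 10 accounts using an admin user's access token
    curl --header "Authorization: Bearer <admin_access_token>" \
        "http://localhost:8008/_synapse/admin/v2/users?from=0&limit=10"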
Upgrading to v1.29.0
====================
Requirement for X-Forwarded-Proto header
----------------------------------------
When using Synapse with a reverse proxy (in particular, when using the
`x_forwarded` option on an HTTP listener), Synapse now expects to receive an
`X-Forwarded-Proto` header on incoming HTTP requests. If it is not set, Synapse
will log a warning on each received request.
To avoid the warning, administrators using a reverse proxy should ensure that
the reverse proxy sets `X-Forwarded-Proto` header to `https` or `http` to
indicate the protocol used by the client.
Synapse also requires the `Host` header to be preserved.
See the `reverse proxy documentation <docs/reverse_proxy.md>`_, where the
example configurations have been updated to show how to set these headers.
(Users of `Caddy <https://caddyserver.com/>`_ are unaffected, since we believe it
sets `X-Forwarded-Proto` by default.)
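As a minimal sketch for nginx (the location block and upstream port are assumptions; the maintained examples live in the reverse proxy documentation linked above):

.. code-block:: nginx

    location /_matrix {
        proxy_pass http://localhost:8008;
        # Preserve the original Host header and report the client-facing protocol
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Forwarded-For $remote_addr;
    }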
Upgrading to v1.27.0
====================
Changes to callback URI for OAuth2 / OpenID Connect and SAML2
-------------------------------------------------------------
This version changes the URI used for callbacks from OAuth2 and SAML2 identity providers:
* If your server is configured for single sign-on via an OpenID Connect or OAuth2 identity
provider, you will need to add ``[synapse public baseurl]/_synapse/client/oidc/callback``
to the list of permitted "redirect URIs" at the identity provider.
See `docs/openid.md <docs/openid.md>`_ for more information on setting up OpenID
Connect.
* If your server is configured for single sign-on via a SAML2 identity provider, you will
need to add ``[synapse public baseurl]/_synapse/client/saml2/authn_response`` as a permitted
"ACS location" (also known as "allowed callback URLs") at the identity provider.
The "Issuer" in the "AuthnRequest" to the SAML2 identity provider is also updated to
``[synapse public baseurl]/_synapse/client/saml2/metadata.xml``. If your SAML2 identity
provider uses this property to validate or otherwise identify Synapse, its configuration
will need to be updated to use the new URL. Alternatively you could create a new, separate
"EntityDescriptor" in your SAML2 identity provider with the new URLs and leave the URLs in
the existing "EntityDescriptor" as they were.
Changes to HTML templates
-------------------------
The HTML templates for SSO and email notifications now have `Jinja2's autoescape <https://jinja.palletsprojects.com/en/2.11.x/api/#autoescaping>`_
enabled for files ending in ``.html``, ``.htm``, and ``.xml``. If you have customised
these templates and see issues when viewing them you might need to update them.
It is expected that most configurations will need no changes.
If you have customised the *names* of these templates, it is recommended
to verify that they end in ``.html`` to ensure autoescape is enabled.
The above applies to the following templates:
* ``add_threepid.html``
* ``add_threepid_failure.html``
* ``add_threepid_success.html``
* ``notice_expiry.html``
* ``notif_mail.html`` (which, by default, includes ``room.html`` and ``notif.html``)
* ``password_reset.html``
* ``password_reset_confirmation.html``
* ``password_reset_failure.html``
* ``password_reset_success.html``
* ``registration.html``
* ``registration_failure.html``
* ``registration_success.html``
* ``sso_account_deactivated.html``
* ``sso_auth_bad_user.html``
* ``sso_auth_confirm.html``
* ``sso_auth_success.html``
* ``sso_error.html``
* ``sso_login_idp_picker.html``
* ``sso_redirect_confirm.html``
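For context, the autoescape-by-extension behaviour described above corresponds roughly to the following Jinja2 usage (an illustrative sketch, not Synapse's actual template-loading code):

.. code-block:: python

    from jinja2 import Environment, FileSystemLoader, select_autoescape

    env = Environment(
        loader=FileSystemLoader("templates"),
        # Escape HTML only in templates whose filenames end in these extensions
        autoescape=select_autoescape(enabled_extensions=("html", "htm", "xml")),
    )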
Upgrading to v1.26.0
====================
Rolling back to v1.25.0 after a failed upgrade
----------------------------------------------
v1.26.0 includes a lot of large changes. If something problematic occurs, you
may want to roll back to a previous version of Synapse. Because v1.26.0 also
includes a new database schema version, reverting that version is also required
alongside the generic rollback instructions mentioned above. In short, to roll
back to v1.25.0 you need to:
1. Stop the server
2. Decrease the schema version in the database:
.. code:: sql
UPDATE schema_version SET version = 58;
3. Delete the ignored users & chain cover data:
.. code:: sql
DROP TABLE IF EXISTS ignored_users;
UPDATE rooms SET has_auth_chain_index = false;
For PostgreSQL run:
.. code:: sql
TRUNCATE event_auth_chain_links;
TRUNCATE event_auth_chains;
For SQLite run:
.. code:: sql
DELETE FROM event_auth_chain_links;
DELETE FROM event_auth_chains;
4. Mark the deltas as not run (so they will re-run on upgrade).
.. code:: sql
DELETE FROM applied_schema_deltas WHERE version = 59 AND file = "59/01ignored_user.py";
DELETE FROM applied_schema_deltas WHERE version = 59 AND file = "59/06chain_cover_index.sql";
5. Downgrade Synapse by following the instructions for your installation method
in the "Rolling back to older versions" section above.
Upgrading to v1.25.0
====================
Last release supporting Python 3.5
----------------------------------
This is the last release of Synapse which guarantees support with Python 3.5,
which passed its upstream End of Life date several months ago.
We will attempt to maintain support through March 2021, but without guarantees.
In the future, Synapse will follow upstream schedules for ending support of
older versions of Python and PostgreSQL. Please upgrade to at least Python 3.6
and PostgreSQL 9.6 as soon as possible.
Blacklisting IP ranges
----------------------
Synapse v1.25.0 includes new settings, ``ip_range_blacklist`` and
``ip_range_whitelist``, for controlling outgoing requests from Synapse for federation,
identity servers, push, and for checking key validity for third-party invite events.
The previous setting, ``federation_ip_range_blacklist``, is deprecated. The new
``ip_range_blacklist`` defaults to private IP ranges if it is not defined.
If you have never customised ``federation_ip_range_blacklist`` it is recommended
that you remove that setting.
If you have customised ``federation_ip_range_blacklist`` you should update the
setting name to ``ip_range_blacklist``.
If you have a custom push server that is reached via private IP space you may
need to customise ``ip_range_blacklist`` or ``ip_range_whitelist``.
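A minimal sketch of the new settings (the ranges and the whitelisted address below are purely illustrative):

.. code-block:: yaml

    ip_range_blacklist:
      - '127.0.0.0/8'
      - '10.0.0.0/8'
      - '172.16.0.0/12'
      - '192.168.0.0/16'
    # Allow a push server that lives in private IP space through the blacklist
    ip_range_whitelist:
      - '10.1.2.3'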
Upgrading to v1.24.0
====================
Custom OpenID Connect mapping provider breaking change
------------------------------------------------------
This release allows the OpenID Connect mapping provider to perform normalisation
of the localpart of the Matrix ID. This allows for the mapping provider to
specify different algorithms, instead of the `default way <https://matrix.org/docs/spec/appendices#mapping-from-other-character-sets>`_.
If your Synapse configuration uses a custom mapping provider
(``oidc_config.user_mapping_provider.module`` is specified and not equal to
``synapse.handlers.oidc_handler.JinjaOidcMappingProvider``) then you *must* ensure
that ``map_user_attributes`` of the mapping provider performs some normalisation
of the ``localpart`` returned. To match previous behaviour you can use the
``map_username_to_mxid_localpart`` function provided by Synapse. An example is
shown below:
.. code-block:: python
from synapse.types import map_username_to_mxid_localpart
class MyMappingProvider:
def map_user_attributes(self, userinfo, token):
# ... your custom logic ...
sso_user_id = ...
localpart = map_username_to_mxid_localpart(sso_user_id)
return {"localpart": localpart}
Removal of historical Synapse Admin API
----------------------------------------
Historically, the Synapse Admin API has been accessible under:
* ``/_matrix/client/api/v1/admin``
* ``/_matrix/client/unstable/admin``
* ``/_matrix/client/r0/admin``
* ``/_synapse/admin/v1``
The endpoints with ``/_matrix/client/*`` prefixes have been removed as of v1.24.0.
The Admin API is now only accessible under:
* ``/_synapse/admin/v1``
The only exception is the `/admin/whois` endpoint, which is
`also available via the client-server API <https://matrix.org/docs/spec/client_server/r0.6.1#get-matrix-client-r0-admin-whois-userid>`_.
The deprecation of the old endpoints was announced with Synapse 1.20.0 (released
on 2020-09-22) and makes it easier for homeserver admins to lock down external
access to the Admin API endpoints.
Upgrading to v1.23.0
====================
@@ -328,7 +87,7 @@ then it should be modified based on the `structured logging documentation
<https://github.com/matrix-org/synapse/blob/master/docs/structured_logging.md>`_.
The ``structured`` and ``drains`` logging options are now deprecated and should
be replaced by standard logging configuration of ``handlers`` and ``formatters``.
be replaced by standard logging configuration of ``handlers`` and ``formatters`.
A future release of Synapse will make using ``structured: true`` an error.

1
changelog.d/6739.feature Normal file
View File

@@ -0,0 +1 @@
Implement "room knocking" as per MSC2403. Contributed by Sorunome and anoa.

1
changelog.d/8286.feature Normal file
View File

@@ -0,0 +1 @@
Add a push rule that highlights when a Jitsi conference is created in a room.

1
changelog.d/8455.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix fetching of E2E cross signing keys over federation when only one of the master key and device signing key is cached already.

1
changelog.d/8519.feature Normal file
View File

@@ -0,0 +1 @@
Add an admin API to delete a single file, or files that were not used for a defined time, from the server. Contributed by @dklimpel.

1
changelog.d/8539.feature Normal file
View File

@@ -0,0 +1 @@
Split admin API for reported events (`GET /_synapse/admin/v1/event_reports`) into detail and list endpoints. This is a breaking change to #8217 which was introduced in Synapse v1.21.0. Those who already use this API should check their scripts. Contributed by @dklimpel.

1
changelog.d/8559.misc Normal file
View File

@@ -0,0 +1 @@
Optimise `/createRoom` with multiple invited users.

1
changelog.d/8580.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix a bug where Synapse would blindly forward bad responses from federation to clients when retrieving profile information.

1
changelog.d/8582.doc Normal file
View File

@@ -0,0 +1 @@
Instructions for Azure AD in the OpenID Connect documentation. Contributed by peterk.

1
changelog.d/8595.misc Normal file
View File

@@ -0,0 +1 @@
Implement and use an @lru_cache decorator.

1
changelog.d/8607.feature Normal file
View File

@@ -0,0 +1 @@
Support generating structured logs via the standard logging configuration.

1
changelog.d/8610.feature Normal file
View File

@@ -0,0 +1 @@
Add an admin API to allow server admins to list users' pushers. Contributed by @dklimpel.

1
changelog.d/8614.misc Normal file
View File

@@ -0,0 +1 @@
Don't instantiate Requester directly.

1
changelog.d/8615.misc Normal file
View File

@@ -0,0 +1 @@
Type hints for `RegistrationStore`.

1
changelog.d/8616.misc Normal file
View File

@@ -0,0 +1 @@
Change schema to support access tokens belonging to one user but granting access to another.

1
changelog.d/8620.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix a bug where the account validity endpoint would silently fail if the user ID did not have an expiration time. It now returns a 400 error.

1
changelog.d/8621.misc Normal file
View File

@@ -0,0 +1 @@
Remove unused OPTIONS handlers.

1
changelog.d/8627.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix email notifications for invites without local state.

1
changelog.d/8628.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix handling of invalid group IDs to return a 400 rather than log an exception and return a 500.

1
changelog.d/8632.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix handling of User-Agent headers that are invalid UTF-8, which caused users' user agents not to be recorded correctly.

1
changelog.d/8633.misc Normal file
View File

@@ -0,0 +1 @@
Run `mypy` as part of the lint.sh script.

1
changelog.d/8634.misc Normal file
View File

@@ -0,0 +1 @@
Correct Synapse's PyPI package name in the OpenID Connect installation instructions.

1
changelog.d/8635.doc Normal file
View File

@@ -0,0 +1 @@
Improve the sample configuration for single sign-on providers.

1
changelog.d/8639.misc Normal file
View File

@@ -0,0 +1 @@
Fix typos and spelling errors in the code.

1
changelog.d/8640.misc Normal file
View File

@@ -0,0 +1 @@
Reduce number of OpenTracing spans started.

1
changelog.d/8643.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix a bug in the `joined_rooms` admin API if the user has never joined any rooms. The bug was introduced, along with the API, in v1.21.0.

1
changelog.d/8644.misc Normal file
View File

@@ -0,0 +1 @@
Add field `total` to device list in admin API.

1
changelog.d/8647.feature Normal file
View File

@@ -0,0 +1 @@
Add an admin API `GET /_synapse/admin/v1/users/<user_id>/media` to get information about uploaded media. Contributed by @dklimpel.

1
changelog.d/8655.misc Normal file
View File

@@ -0,0 +1 @@
Add more type hints to the application services code.

1
changelog.d/8657.doc Normal file
View File

@@ -0,0 +1 @@
Fix the filepath of Dex's example config and the link to Dex's Getting Started guide in the OpenID Connect docs.

1
changelog.d/8664.misc Normal file
View File

@@ -0,0 +1 @@
Tell Black to format code for Python 3.5.

1
changelog.d/8665.doc Normal file
View File

@@ -0,0 +1 @@
Note support for Python 3.9.

1
changelog.d/8666.doc Normal file
View File

@@ -0,0 +1 @@
Minor updates to docs on running tests.

1
changelog.d/8667.doc Normal file
View File

@@ -0,0 +1 @@
Interlink prometheus/grafana documentation.

1
changelog.d/8668.misc Normal file
View File

@@ -0,0 +1 @@
Reduce number of OpenTracing spans started.

1
changelog.d/8669.misc Normal file
View File

@@ -0,0 +1 @@
Don't pull events from the DB when handling replication traffic.

1
changelog.d/8670.misc Normal file
View File

@@ -0,0 +1 @@
Reduce number of OpenTracing spans started.

1
changelog.d/8671.misc Normal file
View File

@@ -0,0 +1 @@
Abstract some invite-related code in preparation for landing knocking.

1
changelog.d/8679.misc Normal file
View File

@@ -0,0 +1 @@
Clarify representation of events in logfiles.

1
changelog.d/8680.misc Normal file
View File

@@ -0,0 +1 @@
Don't require `hiredis` package to be installed to run unit tests.

1
changelog.d/8682.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix exception during handling multiple concurrent requests for remote media when using multiple media repositories.

1
changelog.d/8684.misc Normal file
View File

@@ -0,0 +1 @@
Fix typing info on cache call signature to accept `on_invalidate`.

1
changelog.d/8685.feature Normal file
View File

@@ -0,0 +1 @@
Support generating structured logs via the standard logging configuration.

1
changelog.d/8688.misc Normal file
View File

@@ -0,0 +1 @@
Abstract some invite-related code in preparation for landing knocking.

1
changelog.d/8689.feature Normal file
View File

@@ -0,0 +1 @@
Add an admin API to allow server admins to list users' pushers. Contributed by @dklimpel.

1
changelog.d/8690.misc Normal file
View File

@@ -0,0 +1 @@
Fail tests if they do not await coroutines.

1
changelog.d/8693.misc Normal file
View File

@@ -0,0 +1 @@
Add more type hints to the application services code.

1
changelog.d/8694.misc Normal file
View File

@@ -0,0 +1 @@
Improve start time by adding an index to `e2e_cross_signing_keys.stream_id`.

1
changelog.d/8697.misc Normal file
View File

@@ -0,0 +1 @@
Re-organize the structured logging code to separate the TCP transport handling from the JSON formatting.

1
changelog.d/8698.misc Normal file
View File

@@ -0,0 +1 @@
Use Python 3.8 in Docker images by default.

1
changelog.d/8700.feature Normal file
View File

@@ -0,0 +1 @@
Add an admin API for local user media statistics. Contributed by @dklimpel.

1
changelog.d/8701.doc Normal file
View File

@@ -0,0 +1 @@
Notes on SSO logins and media_repository worker.

1
changelog.d/8702.misc Normal file
View File

@@ -0,0 +1 @@
Remove the "draft" status of the Room Details Admin API.

1
changelog.d/8705.misc Normal file
View File

@@ -0,0 +1 @@
Improve the error returned when a non-string displayname or avatar_url is used when updating a user's profile.

1
changelog.d/8706.doc Normal file
View File

@@ -0,0 +1 @@
Document experimental support for running multiple event persisters.

1
changelog.d/8708.misc Normal file
View File

@@ -0,0 +1 @@
Block attempts by clients to send server ACLs, or redactions of server ACLs, that would result in the local server being blocked from the room.

1
changelog.d/8713.misc Normal file
View File

@@ -0,0 +1 @@
Consolidate duplicated lists of purged tables that are checked in tests.

1
changelog.d/8714.doc Normal file
View File

@@ -0,0 +1 @@
Add information regarding the various sources of, and expected contributions to, Synapse's documentation to `CONTRIBUTING.md`.

1
changelog.d/8722.feature Normal file
View File

@@ -0,0 +1 @@
Add `displayname` to Shared-Secret Registration for admins.

View File

@@ -1 +0,0 @@
Add a dockerfile for running Synapse in worker-mode under Complement.

View File

@@ -1 +0,0 @@
Speed up federation transmission by using fewer database calls. Contributed by @ShadowJonathan.

View File

@@ -1 +0,0 @@
Apply `pyupgrade` across the codebase.

View File

@@ -1 +0,0 @@
Fix thumbnail generation for some sites with non-standard content types. Contributed by @rkfg.

View File

@@ -1 +0,0 @@
Move some replication processing out of `generic_worker`.

View File

@@ -1 +0,0 @@
Update experimental support for [MSC3083](https://github.com/matrix-org/matrix-doc/pull/3083): restricting room access via group membership.

View File

@@ -1 +0,0 @@
Add a note to the docker docs mentioning that we mirror upstream's supported Docker platforms.

View File

@@ -1 +0,0 @@
Replace `HomeServer.get_config()` with inline references.

View File

@@ -1 +0,0 @@
Add experimental support for handling presence on a worker.

View File

@@ -24,7 +24,6 @@ import sys
import time
import urllib
from http import TwistedHttpClient
from typing import Optional
import nacl.encoding
import nacl.signing
@@ -93,7 +92,7 @@ class SynapseCmd(cmd.Cmd):
return self.config["user"].split(":")[1]
def do_config(self, line):
"""Show the config for this client: "config"
""" Show the config for this client: "config"
Edit a key value mapping: "config key value" e.g. "config token 1234"
Config variables:
user: The username to auth with.
@@ -361,7 +360,7 @@ class SynapseCmd(cmd.Cmd):
print(e)
def do_topic(self, line):
""" "topic [set|get] <roomid> [<newtopic>]"
""""topic [set|get] <roomid> [<newtopic>]"
Set the topic for a room: topic set <roomid> <newtopic>
Get the topic for a room: topic get <roomid>
"""
@@ -691,7 +690,7 @@ class SynapseCmd(cmd.Cmd):
self._do_presence_state(2, line)
def _parse(self, line, keys, force_keys=False):
"""Parses the given line.
""" Parses the given line.
Args:
line : The line to parse
@@ -719,10 +718,10 @@ class SynapseCmd(cmd.Cmd):
method,
path,
data=None,
query_params: Optional[dict] = None,
query_params={"access_token": None},
alt_text=None,
):
"""Runs an HTTP request and pretty prints the output.
""" Runs an HTTP request and pretty prints the output.
Args:
method: HTTP method
@@ -730,8 +729,6 @@ class SynapseCmd(cmd.Cmd):
data: Raw JSON data if any
query_params: dict of query parameters to add to the url
"""
query_params = query_params or {"access_token": None}
url = self._url() + path
if "access_token" in query_params:
query_params["access_token"] = self._tok()

View File

@@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# Copyright 2014-2016 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -15,7 +16,6 @@
import json
import urllib
from pprint import pformat
from typing import Optional
from twisted.internet import defer, reactor
from twisted.web.client import Agent, readBody
@@ -23,10 +23,11 @@ from twisted.web.http_headers import Headers
class HttpClient:
"""Interface for talking json over http"""
""" Interface for talking json over http
"""
def put_json(self, url, data):
"""Sends the specifed json data using PUT
""" Sends the specifed json data using PUT
Args:
url (str): The URL to PUT data to.
@@ -40,7 +41,7 @@ class HttpClient:
pass
def get_json(self, url, args=None):
"""Gets some json from the given host homeserver and path
""" Gets some json from the given host homeserver and path
Args:
url (str): The URL to GET data from.
@@ -57,7 +58,7 @@ class HttpClient:
class TwistedHttpClient(HttpClient):
"""Wrapper around the twisted HTTP client api.
""" Wrapper around the twisted HTTP client api.
Attributes:
agent (twisted.web.client.Agent): The twisted Agent used to send the
@@ -85,9 +86,9 @@ class TwistedHttpClient(HttpClient):
body = yield readBody(response)
defer.returnValue(json.loads(body))
def _create_put_request(self, url, json_data, headers_dict: Optional[dict] = None):
"""Wrapper of _create_request to issue a PUT request"""
headers_dict = headers_dict or {}
def _create_put_request(self, url, json_data, headers_dict={}):
""" Wrapper of _create_request to issue a PUT request
"""
if "Content-Type" not in headers_dict:
raise defer.error(RuntimeError("Must include Content-Type header for PUTs"))
@@ -96,22 +97,15 @@ class TwistedHttpClient(HttpClient):
"PUT", url, producer=_JsonProducer(json_data), headers_dict=headers_dict
)
def _create_get_request(self, url, headers_dict: Optional[dict] = None):
"""Wrapper of _create_request to issue a GET request"""
return self._create_request("GET", url, headers_dict=headers_dict or {})
def _create_get_request(self, url, headers_dict={}):
""" Wrapper of _create_request to issue a GET request
"""
return self._create_request("GET", url, headers_dict=headers_dict)
@defer.inlineCallbacks
def do_request(
self,
method,
url,
data=None,
qparams=None,
jsonreq=True,
headers: Optional[dict] = None,
self, method, url, data=None, qparams=None, jsonreq=True, headers={}
):
headers = headers or {}
if qparams:
url = "%s?%s" % (url, urllib.urlencode(qparams, True))
@@ -132,12 +126,9 @@ class TwistedHttpClient(HttpClient):
defer.returnValue(json.loads(body))
@defer.inlineCallbacks
def _create_request(
self, method, url, producer=None, headers_dict: Optional[dict] = None
):
"""Creates and sends a request to the given url"""
headers_dict = headers_dict or {}
def _create_request(self, method, url, producer=None, headers_dict={}):
""" Creates and sends a request to the given url
"""
headers_dict["User-Agent"] = ["Synapse Cmd Client"]
retries_left = 5
@@ -194,7 +185,8 @@ class _RawProducer:
class _JsonProducer:
"""Used by the twisted http client to create the HTTP body from json"""
""" Used by the twisted http client to create the HTTP body from json
"""
def __init__(self, jsn):
self.data = jsn

View File

@@ -63,7 +63,8 @@ class CursesStdIO:
self.redraw()
def redraw(self):
"""method for redisplaying lines based on internal list of lines"""
""" method for redisplaying lines
based on internal list of lines """
self.stdscr.clear()
self.paintStatus(self.statusText)

View File

@@ -1,3 +1,4 @@
# -*- coding: utf-8 -*-
# Copyright 2014-2016 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
@@ -55,7 +56,7 @@ def excpetion_errback(failure):
class InputOutput:
"""This is responsible for basic I/O so that a user can interact with
""" This is responsible for basic I/O so that a user can interact with
the example app.
"""
@@ -67,7 +68,8 @@ class InputOutput:
self.server = server
def on_line(self, line):
"""This is where we process commands."""
""" This is where we process commands.
"""
try:
m = re.match(r"^join (\S+)$", line)
@@ -131,7 +133,7 @@ class IOLoggerHandler(logging.Handler):
class Room:
"""Used to store (in memory) the current membership state of a room, and
""" Used to store (in memory) the current membership state of a room, and
which home servers we should send PDUs associated with the room to.
"""
@@ -146,7 +148,8 @@ class Room:
self.have_got_metadata = False
def add_participant(self, participant):
"""Someone has joined the room"""
""" Someone has joined the room
"""
self.participants.add(participant)
self.invited.discard(participant)
@@ -157,13 +160,14 @@ class Room:
self.oldest_server = server
def add_invited(self, invitee):
"""Someone has been invited to the room"""
""" Someone has been invited to the room
"""
self.invited.add(invitee)
self.servers.add(origin_from_ucid(invitee))
class HomeServer(ReplicationHandler):
"""A very basic home server implentation that allows people to join a
""" A very basic home server implentation that allows people to join a
room and then invite other people.
"""
@@ -177,7 +181,8 @@ class HomeServer(ReplicationHandler):
self.output = output
def on_receive_pdu(self, pdu):
"""We just received a PDU"""
""" We just received a PDU
"""
pdu_type = pdu.pdu_type
if pdu_type == "sy.room.message":
@@ -194,20 +199,23 @@ class HomeServer(ReplicationHandler):
)
def _on_message(self, pdu):
"""We received a message"""
""" We received a message
"""
self.output.print_line(
"#%s %s %s" % (pdu.context, pdu.content["sender"], pdu.content["body"])
)
def _on_join(self, context, joinee):
"""Someone has joined a room, either a remote user or a local user"""
""" Someone has joined a room, either a remote user or a local user
"""
room = self._get_or_create_room(context)
room.add_participant(joinee)
self.output.print_line("#%s %s %s" % (context, joinee, "*** JOINED"))
def _on_invite(self, origin, context, invitee):
"""Someone has been invited"""
""" Someone has been invited
"""
room = self._get_or_create_room(context)
room.add_invited(invitee)
@@ -220,27 +228,27 @@ class HomeServer(ReplicationHandler):
@defer.inlineCallbacks
def send_message(self, room_name, sender, body):
"""Send a message to a room!"""
""" Send a message to a room!
"""
destinations = yield self.get_servers_for_context(room_name)
try:
yield self.replication_layer.send_pdus(
[
Pdu.create_new(
context=room_name,
pdu_type="sy.room.message",
content={"sender": sender, "body": body},
origin=self.server_name,
destinations=destinations,
)
]
yield self.replication_layer.send_pdu(
Pdu.create_new(
context=room_name,
pdu_type="sy.room.message",
content={"sender": sender, "body": body},
origin=self.server_name,
destinations=destinations,
)
)
except Exception as e:
logger.exception(e)
@defer.inlineCallbacks
def join_room(self, room_name, sender, joinee):
"""Join a room!"""
""" Join a room!
"""
self._on_join(room_name, joinee)
destinations = yield self.get_servers_for_context(room_name)
@@ -255,30 +263,29 @@ class HomeServer(ReplicationHandler):
origin=self.server_name,
destinations=destinations,
)
yield self.replication_layer.send_pdus([pdu])
yield self.replication_layer.send_pdu(pdu)
except Exception as e:
logger.exception(e)
@defer.inlineCallbacks
def invite_to_room(self, room_name, sender, invitee):
"""Invite someone to a room!"""
""" Invite someone to a room!
"""
self._on_invite(self.server_name, room_name, invitee)
destinations = yield self.get_servers_for_context(room_name)
try:
yield self.replication_layer.send_pdus(
[
Pdu.create_new(
context=room_name,
is_state=True,
pdu_type="sy.room.member",
state_key=invitee,
content={"membership": "invite"},
origin=self.server_name,
destinations=destinations,
)
]
yield self.replication_layer.send_pdu(
Pdu.create_new(
context=room_name,
is_state=True,
pdu_type="sy.room.member",
state_key=invitee,
content={"membership": "invite"},
origin=self.server_name,
destinations=destinations,
)
)
except Exception as e:
logger.exception(e)

View File

@@ -193,12 +193,15 @@ class TrivialXmppClient:
time.sleep(7)
print("SSRC spammer started")
while self.running:
ssrcMsg = "<presence to='%(tojid)s' xmlns='jabber:client'><x xmlns='http://jabber.org/protocol/muc'/><c xmlns='http://jabber.org/protocol/caps' hash='sha-1' node='http://jitsi.org/jitsimeet' ver='0WkSdhFnAUxrz4ImQQLdB80GFlE='/><nick xmlns='http://jabber.org/protocol/nick'>%(nick)s</nick><stats xmlns='http://jitsi.org/jitmeet/stats'><stat name='bitrate_download' value='175'/><stat name='bitrate_upload' value='176'/><stat name='packetLoss_total' value='0'/><stat name='packetLoss_download' value='0'/><stat name='packetLoss_upload' value='0'/></stats><media xmlns='http://estos.de/ns/mjs'><source type='audio' ssrc='%(assrc)s' direction='sendre'/><source type='video' ssrc='%(vssrc)s' direction='sendre'/></media></presence>" % {
"tojid": "%s@%s/%s" % (ROOMNAME, ROOMDOMAIN, self.shortJid),
"nick": self.userId,
"assrc": self.ssrcs["audio"],
"vssrc": self.ssrcs["video"],
}
ssrcMsg = (
"<presence to='%(tojid)s' xmlns='jabber:client'><x xmlns='http://jabber.org/protocol/muc'/><c xmlns='http://jabber.org/protocol/caps' hash='sha-1' node='http://jitsi.org/jitsimeet' ver='0WkSdhFnAUxrz4ImQQLdB80GFlE='/><nick xmlns='http://jabber.org/protocol/nick'>%(nick)s</nick><stats xmlns='http://jitsi.org/jitmeet/stats'><stat name='bitrate_download' value='175'/><stat name='bitrate_upload' value='176'/><stat name='packetLoss_total' value='0'/><stat name='packetLoss_download' value='0'/><stat name='packetLoss_upload' value='0'/></stats><media xmlns='http://estos.de/ns/mjs'><source type='audio' ssrc='%(assrc)s' direction='sendre'/><source type='video' ssrc='%(vssrc)s' direction='sendre'/></media></presence>"
% {
"tojid": "%s@%s/%s" % (ROOMNAME, ROOMDOMAIN, self.shortJid),
"nick": self.userId,
"assrc": self.ssrcs["audio"],
"vssrc": self.ssrcs["video"],
}
)
res = self.sendIq(ssrcMsg)
print("reply from ssrc announce: ", res)
time.sleep(10)

View File

@@ -20,7 +20,6 @@ Add a new job to the main prometheus.conf file:
```
### for Prometheus v2
Add a new job to the main prometheus.yml file:
```yaml
@@ -30,17 +29,14 @@ Add a new job to the main prometheus.yml file:
scheme: "https"
static_configs:
- targets: ["my.server.here:port"]
- targets: ['SERVER.LOCATION:PORT']
```
An example of a Prometheus configuration with workers can be found in
[metrics-howto.md](https://github.com/matrix-org/synapse/blob/master/docs/metrics-howto.md).
To use `synapse.rules` add
```yaml
rule_files:
- "/PATH/TO/synapse-v2.rules"
rule_files:
- "/PATH/TO/synapse-v2.rules"
```
Metrics are disabled by default when running synapse; they must be enabled

View File

@@ -9,7 +9,7 @@
new PromConsole.Graph({
node: document.querySelector("#process_resource_utime"),
expr: "rate(process_cpu_seconds_total[2m]) * 100",
name: "[[job]]-[[index]]",
name: "[[job]]",
min: 0,
max: 100,
renderer: "line",
@@ -22,12 +22,12 @@ new PromConsole.Graph({
</script>
<h3>Memory</h3>
<div id="process_resident_memory_bytes"></div>
<div id="process_resource_maxrss"></div>
<script>
new PromConsole.Graph({
node: document.querySelector("#process_resident_memory_bytes"),
expr: "process_resident_memory_bytes",
name: "[[job]]-[[index]]",
node: document.querySelector("#process_resource_maxrss"),
expr: "process_psutil_rss:max",
name: "Maxrss",
min: 0,
renderer: "line",
height: 150,
@@ -43,8 +43,8 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#process_fds"),
expr: "process_open_fds",
name: "[[job]]-[[index]]",
expr: "process_open_fds{job='synapse'}",
name: "FDs",
min: 0,
renderer: "line",
height: 150,
@@ -62,8 +62,8 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#reactor_total_time"),
expr: "rate(python_twisted_reactor_tick_time_sum[2m])",
name: "[[job]]-[[index]]",
expr: "rate(python_twisted_reactor_tick_time:total[2m]) / 1000",
name: "time",
max: 1,
min: 0,
renderer: "area",
@@ -80,8 +80,8 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#reactor_average_time"),
expr: "rate(python_twisted_reactor_tick_time_sum[2m]) / rate(python_twisted_reactor_tick_time_count[2m])",
name: "[[job]]-[[index]]",
expr: "rate(python_twisted_reactor_tick_time:total[2m]) / rate(python_twisted_reactor_tick_time:count[2m]) / 1000",
name: "time",
min: 0,
renderer: "line",
height: 150,
@@ -97,14 +97,14 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#reactor_pending_calls"),
expr: "rate(python_twisted_reactor_pending_calls_sum[30s]) / rate(python_twisted_reactor_pending_calls_count[30s])",
name: "[[job]]-[[index]]",
expr: "rate(python_twisted_reactor_pending_calls:total[30s])/rate(python_twisted_reactor_pending_calls:count[30s])",
name: "calls",
min: 0,
renderer: "line",
height: 150,
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yTitle: "Pending Calls"
yTitle: "Pending Cals"
})
</script>
@@ -115,7 +115,7 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#synapse_storage_query_time"),
expr: "sum(rate(synapse_storage_query_time_count[2m])) by (verb)",
expr: "rate(synapse_storage_query_time:count[2m])",
name: "[[verb]]",
yAxisFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
yHoverFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
@@ -129,8 +129,8 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#synapse_storage_transaction_time"),
expr: "topk(10, rate(synapse_storage_transaction_time_count[2m]))",
name: "[[job]]-[[index]] [[desc]]",
expr: "rate(synapse_storage_transaction_time:count[2m])",
name: "[[desc]]",
min: 0,
yAxisFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
yHoverFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
@@ -140,12 +140,12 @@ new PromConsole.Graph({
</script>
<h3>Transaction execution time</h3>
<div id="synapse_storage_transactions_time_sec"></div>
<div id="synapse_storage_transactions_time_msec"></div>
<script>
new PromConsole.Graph({
node: document.querySelector("#synapse_storage_transactions_time_sec"),
expr: "rate(synapse_storage_transaction_time_sum[2m])",
name: "[[job]]-[[index]] [[desc]]",
node: document.querySelector("#synapse_storage_transactions_time_msec"),
expr: "rate(synapse_storage_transaction_time:total[2m]) / 1000",
name: "[[desc]]",
min: 0,
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
@@ -154,33 +154,34 @@ new PromConsole.Graph({
})
</script>
<h3>Average time waiting for database connection</h3>
<div id="synapse_storage_avg_waiting_time"></div>
<h3>Database scheduling latency</h3>
<div id="synapse_storage_schedule_time"></div>
<script>
new PromConsole.Graph({
node: document.querySelector("#synapse_storage_avg_waiting_time"),
expr: "rate(synapse_storage_schedule_time_sum[2m]) / rate(synapse_storage_schedule_time_count[2m])",
name: "[[job]]-[[index]]",
node: document.querySelector("#synapse_storage_schedule_time"),
expr: "rate(synapse_storage_schedule_time:total[2m]) / 1000",
name: "Total latency",
min: 0,
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "s",
yTitle: "Time"
yUnits: "s/s",
yTitle: "Usage"
})
</script>
<h3>Cache request rate</h3>
<div id="synapse_cache_request_rate"></div>
<h3>Cache hit ratio</h3>
<div id="synapse_cache_ratio"></div>
<script>
new PromConsole.Graph({
node: document.querySelector("#synapse_cache_request_rate"),
expr: "rate(synapse_util_caches_cache:total[2m])",
name: "[[job]]-[[index]] [[name]]",
node: document.querySelector("#synapse_cache_ratio"),
expr: "rate(synapse_util_caches_cache:total[2m]) * 100",
name: "[[name]]",
min: 0,
max: 100,
yAxisFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
yHoverFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
yUnits: "rps",
yTitle: "Cache request rate"
yUnits: "%",
yTitle: "Percentage"
})
</script>
@@ -190,7 +191,7 @@ new PromConsole.Graph({
new PromConsole.Graph({
node: document.querySelector("#synapse_cache_size"),
expr: "synapse_util_caches_cache:size",
name: "[[job]]-[[index]] [[name]]",
name: "[[name]]",
yAxisFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
yHoverFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
yUnits: "",
@@ -205,8 +206,8 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#synapse_http_server_request_count_servlet"),
expr: "rate(synapse_http_server_in_flight_requests_count[2m])",
name: "[[job]]-[[index]] [[method]] [[servlet]]",
expr: "rate(synapse_http_server_request_count:servlet[2m])",
name: "[[servlet]]",
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "req/s",
@@ -218,8 +219,8 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#synapse_http_server_request_count_servlet_minus_events"),
expr: "rate(synapse_http_server_in_flight_requests_count{servlet!=\"EventStreamRestServlet\", servlet!=\"SyncRestServlet\"}[2m])",
name: "[[job]]-[[index]] [[method]] [[servlet]]",
expr: "rate(synapse_http_server_request_count:servlet{servlet!=\"EventStreamRestServlet\", servlet!=\"SyncRestServlet\"}[2m])",
name: "[[servlet]]",
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "req/s",
@@ -232,8 +233,8 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#synapse_http_server_response_time_avg"),
expr: "rate(synapse_http_server_response_time_seconds_sum[2m]) / rate(synapse_http_server_response_count[2m])",
name: "[[job]]-[[index]] [[servlet]]",
expr: "rate(synapse_http_server_response_time_seconds[2m]) / rate(synapse_http_server_response_count[2m]) / 1000",
name: "[[servlet]]",
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "s/req",
@@ -276,7 +277,7 @@ new PromConsole.Graph({
new PromConsole.Graph({
node: document.querySelector("#synapse_http_server_response_ru_utime"),
expr: "rate(synapse_http_server_response_ru_utime_seconds[2m])",
name: "[[job]]-[[index]] [[servlet]]",
name: "[[servlet]]",
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "s/s",
@@ -291,7 +292,7 @@ new PromConsole.Graph({
new PromConsole.Graph({
node: document.querySelector("#synapse_http_server_response_db_txn_duration"),
expr: "rate(synapse_http_server_response_db_txn_duration_seconds[2m])",
name: "[[job]]-[[index]] [[servlet]]",
name: "[[servlet]]",
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "s/s",
@@ -305,8 +306,8 @@ new PromConsole.Graph({
<script>
new PromConsole.Graph({
node: document.querySelector("#synapse_http_server_send_time_avg"),
expr: "rate(synapse_http_server_response_time_seconds_sum{servlet='RoomSendEventRestServlet'}[2m]) / rate(synapse_http_server_response_count{servlet='RoomSendEventRestServlet'}[2m])",
name: "[[job]]-[[index]] [[servlet]]",
expr: "rate(synapse_http_server_response_time_second{servlet='RoomSendEventRestServlet'}[2m]) / rate(synapse_http_server_response_count{servlet='RoomSendEventRestServlet'}[2m]) / 1000",
name: "[[servlet]]",
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "s/req",
@@ -322,7 +323,7 @@ new PromConsole.Graph({
new PromConsole.Graph({
node: document.querySelector("#synapse_federation_client_sent"),
expr: "rate(synapse_federation_client_sent[2m])",
name: "[[job]]-[[index]] [[type]]",
name: "[[type]]",
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "req/s",
@@ -336,7 +337,7 @@ new PromConsole.Graph({
new PromConsole.Graph({
node: document.querySelector("#synapse_federation_server_received"),
expr: "rate(synapse_federation_server_received[2m])",
name: "[[job]]-[[index]] [[type]]",
name: "[[type]]",
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "req/s",
@@ -366,7 +367,7 @@ new PromConsole.Graph({
new PromConsole.Graph({
node: document.querySelector("#synapse_notifier_listeners"),
expr: "synapse_notifier_listeners",
name: "[[job]]-[[index]]",
name: "listeners",
min: 0,
yAxisFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
yHoverFormatter: PromConsole.NumberFormatter.humanizeNoSmallPrefix,
@@ -381,7 +382,7 @@ new PromConsole.Graph({
new PromConsole.Graph({
node: document.querySelector("#synapse_notifier_notified_events"),
expr: "rate(synapse_notifier_notified_events[2m])",
name: "[[job]]-[[index]]",
name: "events",
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yUnits: "events/s",

View File

@@ -58,21 +58,3 @@ groups:
labels:
type: "PDU"
expr: 'synapse_federation_transaction_queue_pending_pdus + 0'
- record: synapse_storage_events_persisted_by_source_type
expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep{origin_type="remote"})
labels:
type: remote
- record: synapse_storage_events_persisted_by_source_type
expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep{origin_entity="*client*",origin_type="local"})
labels:
type: local
- record: synapse_storage_events_persisted_by_source_type
expr: sum without(type, origin_type, origin_entity) (synapse_storage_events_persisted_events_sep{origin_entity!="*client*",origin_type="local"})
labels:
type: bridges
- record: synapse_storage_events_persisted_by_event_type
expr: sum without(origin_entity, origin_type) (synapse_storage_events_persisted_events_sep)
- record: synapse_storage_events_persisted_by_origin
expr: sum without(type) (synapse_storage_events_persisted_events_sep)

View File

@@ -1,4 +1,4 @@
#!/usr/bin/env bash
#!/bin/bash
# this script will use the api:
# https://github.com/matrix-org/synapse/blob/master/docs/admin_api/purge_history_api.rst

View File

@@ -1,4 +1,4 @@
#!/usr/bin/env bash
#!/bin/bash
DOMAIN=yourserver.tld
# add this user as admin in your home server:

View File

@@ -33,13 +33,11 @@ esac
# Use --builtin-venv to use the better `venv` module from CPython 3.4+ rather
# than the 2/3 compatible `virtualenv`.
# Pin pip to 20.3.4 to fix breakage in 21.0 on py3.5 (xenial)
dh_virtualenv \
--install-suffix "matrix-synapse" \
--builtin-venv \
--python "$SNAKE" \
--upgrade-pip-to="20.3.4" \
--upgrade-pip \
--preinstall="lxml" \
--preinstall="mock" \
--extra-pip-arg="--no-cache-dir" \
@@ -50,27 +48,18 @@ PACKAGE_BUILD_DIR="debian/matrix-synapse-py3"
VIRTUALENV_DIR="${PACKAGE_BUILD_DIR}${DH_VIRTUALENV_INSTALL_ROOT}/matrix-synapse"
TARGET_PYTHON="${VIRTUALENV_DIR}/bin/python"
case "$DEB_BUILD_OPTIONS" in
*nocheck*)
# Skip running tests if "nocheck" present in $DEB_BUILD_OPTIONS
;;
# we copy the tests to a temporary directory so that we can put them on the
# PYTHONPATH without putting the uninstalled synapse on the pythonpath.
tmpdir=`mktemp -d`
trap "rm -r $tmpdir" EXIT
*)
# Copy tests to a temporary directory so that we can put them on the
# PYTHONPATH without putting the uninstalled synapse on the pythonpath.
tmpdir=`mktemp -d`
trap "rm -r $tmpdir" EXIT
cp -r tests "$tmpdir"
cp -r tests "$tmpdir"
PYTHONPATH="$tmpdir" \
"${TARGET_PYTHON}" -m twisted.trial --reporter=text -j2 tests
;;
esac
PYTHONPATH="$tmpdir" \
"${TARGET_PYTHON}" -B -m twisted.trial --reporter=text -j2 tests
# build the config file
"${TARGET_PYTHON}" "${VIRTUALENV_DIR}/bin/generate_config" \
"${TARGET_PYTHON}" -B "${VIRTUALENV_DIR}/bin/generate_config" \
--config-dir="/etc/matrix-synapse" \
--data-dir="/var/lib/matrix-synapse" |
perl -pe '
@@ -96,7 +85,7 @@ esac
' > "${PACKAGE_BUILD_DIR}/etc/matrix-synapse/homeserver.yaml"
# build the log config file
"${TARGET_PYTHON}" "${VIRTUALENV_DIR}/bin/generate_log_config" \
"${TARGET_PYTHON}" -B "${VIRTUALENV_DIR}/bin/generate_log_config" \
--output-file="${PACKAGE_BUILD_DIR}/etc/matrix-synapse/log.yaml"
# add a dependency on the right version of python to substvars.

89
debian/changelog vendored
View File

@@ -1,92 +1,3 @@
matrix-synapse-py3 (1.31.0+nmu1) UNRELEASED; urgency=medium
* Skip tests when DEB_BUILD_OPTIONS contains "nocheck".
-- Dan Callahan <danc@element.io> Mon, 12 Apr 2021 13:07:36 +0000
matrix-synapse-py3 (1.31.0) stable; urgency=medium
* New synapse release 1.31.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 06 Apr 2021 13:08:29 +0100
matrix-synapse-py3 (1.30.1) stable; urgency=medium
* New synapse release 1.30.1.
-- Synapse Packaging team <packages@matrix.org> Fri, 26 Mar 2021 12:01:28 +0000
matrix-synapse-py3 (1.30.0) stable; urgency=medium
* New synapse release 1.30.0.
-- Synapse Packaging team <packages@matrix.org> Mon, 22 Mar 2021 13:15:34 +0000
matrix-synapse-py3 (1.29.0) stable; urgency=medium
[ Jonathan de Jong ]
* Remove the python -B flag (don't generate bytecode) in scripts and documentation.
[ Synapse Packaging team ]
* New synapse release 1.29.0.
-- Synapse Packaging team <packages@matrix.org> Mon, 08 Mar 2021 13:51:50 +0000
matrix-synapse-py3 (1.28.0) stable; urgency=medium
* New synapse release 1.28.0.
-- Synapse Packaging team <packages@matrix.org> Thu, 25 Feb 2021 10:21:57 +0000
matrix-synapse-py3 (1.27.0) stable; urgency=medium
[ Dan Callahan ]
* Fix build on Ubuntu 16.04 LTS (Xenial).
[ Synapse Packaging team ]
* New synapse release 1.27.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 16 Feb 2021 13:11:28 +0000
matrix-synapse-py3 (1.26.0) stable; urgency=medium
[ Richard van der Hoff ]
* Remove dependency on `python3-distutils`.
[ Synapse Packaging team ]
* New synapse release 1.26.0.
-- Synapse Packaging team <packages@matrix.org> Wed, 27 Jan 2021 12:43:35 -0500
matrix-synapse-py3 (1.25.0) stable; urgency=medium
[ Dan Callahan ]
* Update dependencies to account for the removal of the transitional
dh-systemd package from Debian Bullseye.
[ Synapse Packaging team ]
* New synapse release 1.25.0.
-- Synapse Packaging team <packages@matrix.org> Wed, 13 Jan 2021 10:14:55 +0000
matrix-synapse-py3 (1.24.0) stable; urgency=medium
* New synapse release 1.24.0.
-- Synapse Packaging team <packages@matrix.org> Wed, 09 Dec 2020 10:14:30 +0000
matrix-synapse-py3 (1.23.1) stable; urgency=medium
* New synapse release 1.23.1.
-- Synapse Packaging team <packages@matrix.org> Wed, 09 Dec 2020 10:40:39 +0000
matrix-synapse-py3 (1.23.0) stable; urgency=medium
* New synapse release 1.23.0.
-- Synapse Packaging team <packages@matrix.org> Wed, 18 Nov 2020 11:41:28 +0000
matrix-synapse-py3 (1.22.1) stable; urgency=medium
* New synapse release 1.22.1.

7
debian/control vendored
View File

@@ -3,11 +3,9 @@ Section: contrib/python
Priority: extra
Maintainer: Synapse Packaging team <packages@matrix.org>
# keep this list in sync with the build dependencies in docker/Dockerfile-dhvirtualenv.
# TODO: Remove the dependency on dh-systemd after dropping support for Ubuntu xenial
# On all other supported releases, it's merely a transitional package which
# does nothing but depends on debhelper (> 9.20160709)
Build-Depends:
debhelper (>= 9.20160709) | dh-systemd,
debhelper (>= 9),
dh-systemd,
dh-virtualenv (>= 1.1),
libsystemd-dev,
libpq-dev,
@@ -31,6 +29,7 @@ Pre-Depends: dpkg (>= 1.16.1)
Depends:
adduser,
debconf,
python3-distutils|libpython3-stdlib (<< 3.6),
${misc:Depends},
${shlibs:Depends},
${synapse:pydepends},

2
debian/synctl.1 vendored
View File

@@ -44,7 +44,7 @@ Configuration file may be generated as follows:
.
.nf
$ python \-m synapse\.app\.homeserver \-c config\.yaml \-\-generate\-config \-\-server\-name=<server name>
$ python \-B \-m synapse\.app\.homeserver \-c config\.yaml \-\-generate\-config \-\-server\-name=<server name>
.
.fi
.

2
debian/synctl.ronn vendored
View File

@@ -41,7 +41,7 @@ process.
Configuration file may be generated as follows:
$ python -m synapse.app.homeserver -c config.yaml --generate-config --server-name=<server name>
$ python -B -m synapse.app.homeserver -c config.yaml --generate-config --server-name=<server name>
## ENVIRONMENT

View File

@@ -1,4 +1,4 @@
#!/usr/bin/env bash
#!/bin/bash
set -e

View File

@@ -1,4 +1,4 @@
#!/usr/bin/env bash
#!/bin/bash
DIR="$( cd "$( dirname "$0" )" && pwd )"

View File

@@ -1,4 +1,4 @@
#!/usr/bin/env bash
#!/bin/bash
DIR="$( cd "$( dirname "$0" )" && pwd )"

59
demo/webserver.py Normal file
View File

@@ -0,0 +1,59 @@
import argparse
import BaseHTTPServer
import os
import SimpleHTTPServer
import cgi, logging
from daemonize import Daemonize
class SimpleHTTPRequestHandlerWithPOST(SimpleHTTPServer.SimpleHTTPRequestHandler):
UPLOAD_PATH = "upload"
"""
Accept all post request as file upload
"""
def do_POST(self):
path = os.path.join(self.UPLOAD_PATH, os.path.basename(self.path))
length = self.headers["content-length"]
data = self.rfile.read(int(length))
with open(path, "wb") as fh:
fh.write(data)
self.send_response(200)
self.send_header("Content-Type", "application/json")
self.end_headers()
# Return the absolute path of the uploaded file
self.wfile.write('{"url":"/%s"}' % path)
def setup():
parser = argparse.ArgumentParser()
parser.add_argument("directory")
parser.add_argument("-p", "--port", dest="port", type=int, default=8080)
parser.add_argument("-P", "--pid-file", dest="pid", default="web.pid")
args = parser.parse_args()
# Get absolute path to directory to serve, as daemonize changes to '/'
os.chdir(args.directory)
dr = os.getcwd()
httpd = BaseHTTPServer.HTTPServer(("", args.port), SimpleHTTPRequestHandlerWithPOST)
def run():
os.chdir(dr)
httpd.serve_forever()
daemon = Daemonize(
app="synapse-webclient", pid=args.pid, action=run, auto_close_fds=False
)
daemon.start()
if __name__ == "__main__":
setup()

Some files were not shown because too many files have changed in this diff.