
Compare commits


126 Commits

Author SHA1 Message Date
Erik Johnston
92dbb8a62f update proxy 2019-02-13 15:16:05 +00:00
Erik Johnston
816f5a502f Update proxy 2019-02-13 15:16:05 +00:00
Erik Johnston
561fa571c3 Fix fetching media when using proxy 2019-02-13 15:16:05 +00:00
Erik Johnston
b76f1e2cb8 Fix bug in DTG 2019-02-13 15:16:05 +00:00
Erik Johnston
549e580dc9 Reduce send invite request size 2019-02-13 15:16:05 +00:00
Erik Johnston
c3f36414bf Update proxy 2019-02-13 15:16:05 +00:00
Erik Johnston
17eb4504a8 Update flate file 2019-02-13 15:16:05 +00:00
Erik Johnston
a066b00487 Compress some client data 2019-02-13 15:16:05 +00:00
Erik Johnston
4e0ac33053 Handle slow/lossy connections better when sending transactions 2019-02-13 15:16:05 +00:00
Erik Johnston
76d888cf48 pep8 2019-02-13 15:16:05 +00:00
Erik Johnston
dde7110c0d Reduce transaction response size 2019-02-13 15:16:05 +00:00
Erik Johnston
5e6b5ccd26 Actually fix exceptions 2019-02-13 15:16:04 +00:00
Erik Johnston
1d7420ed2f Don't log ERROR when no profile exists 2019-02-13 15:16:04 +00:00
Travis Ralston
a527fbaae6 Catch room profile errors and anything else that can go wrong
Fixes an issue where things become unhappy when the room profile for a user is missing.
2019-02-13 15:16:04 +00:00
Erik Johnston
b951f35572 Reduce size of fed transaction IDs 2019-02-13 15:16:04 +00:00
Brendan Abolivier
6eca7dc3e8 Update maps and proxy 2019-02-13 15:16:04 +00:00
Erik Johnston
1466adf427 Make event_ids smaller 2019-02-13 15:16:04 +00:00
Erik Johnston
a99c2f56b5 Mangle some more PDU fields 2019-02-13 15:16:04 +00:00
Brendan Abolivier
306b670371 Update proxy maps 2019-02-13 15:16:04 +00:00
Brendan Abolivier
31825c10d6 Update proxy & maps 2019-02-13 15:16:04 +00:00
Erik Johnston
a01468c1a8 Change access tokens to be base64'ed 4 bytes 2019-02-13 15:16:04 +00:00
Brendan Abolivier
31c910a9a2 Update proxy 2019-02-13 15:16:04 +00:00
Erik Johnston
62fa8570ec Route full mesh if message contains 'mesh' 2019-02-13 15:16:04 +00:00
Brendan Abolivier
5f52a2c25e Update proxy 2019-02-13 15:16:04 +00:00
Travis Ralston
645d5c8c35 Use run_as_background_process 2019-02-13 15:16:04 +00:00
Travis Ralston
0463d9ba75 Safer execution 2019-02-13 15:16:04 +00:00
Travis Ralston
b26d8cea66 Preserve log contexts in the room_member_handler 2019-02-13 15:16:04 +00:00
Travis Ralston
de6d002d01 Proof of concept for auto-accepting invites
This is for demonstration purposes only. In practice this would actually look up the right profile and use the right thing, not to mention be in a more reasonable location.
2019-02-13 15:16:04 +00:00
Neil Johnson
2b77c8d50e Remove riot.im from the list of trusted Identity Servers in the default configuration (#4207) 2019-02-13 15:16:04 +00:00
Richard van der Hoff
fa78a83ac3 changelog 2019-02-13 15:16:04 +00:00
Richard van der Hoff
5eceb4dc0f Fix logcontext leak in test_url_preview 2019-02-13 15:16:04 +00:00
Richard van der Hoff
07577e0542 Fix logcontext leak in http pusher test 2019-02-13 15:16:04 +00:00
Richard van der Hoff
3cda7da827 Fix some tests which leaked logcontexts 2019-02-13 15:16:04 +00:00
Richard van der Hoff
cb7c2ad85a Fix logcontext leak in EmailPusher 2019-02-13 15:16:04 +00:00
Amber Brown
a29da814c6 towncrier 2019-02-13 15:16:04 +00:00
Amber Brown
5e499c58fd version 2019-02-13 15:16:03 +00:00
Neil Johnson
fa574331fb release 0.33.9rc1 2019-02-13 15:16:03 +00:00
Amber Brown
ca05b679e3 Fix fallback auth on Python 3 (#4197) 2019-02-13 15:16:03 +00:00
Aaron Raimist
83ed2c494b Fix case
Signed-off-by: Aaron Raimist <aaron@raim.ist>
2019-02-13 15:16:03 +00:00
Aaron Raimist
5d4dfc0313 Add SUPPORT.md
https://help.github.com/articles/adding-support-resources-to-your-project/
2019-02-13 15:16:03 +00:00
Aaron Raimist
7d4b700204 Add changelog
Signed-off-by: Aaron Raimist <aaron@raim.ist>
2019-02-13 15:16:03 +00:00
Aaron Raimist
c01605da24 Add a pull request template and add multiple issue templates
Signed-off-by: Aaron Raimist <aaron@raim.ist>
2019-02-13 15:16:03 +00:00
Aaron Raimist
76b251c599 Add changelog
Signed-off-by: Aaron Raimist <aaron@raim.ist>
2019-02-13 15:16:03 +00:00
Aaron Raimist
26708be7f8 Add a note saying you need to manually reclaim disk space
People keep asking why their database hasn't gotten smaller after using this API.

Signed-off-by: Aaron Raimist <aaron@raim.ist>
2019-02-13 15:16:03 +00:00
Erik Johnston
6baa32fd65 Update coap-proxy 2019-02-13 15:16:03 +00:00
Matthew Hodgson
32df91cbcb de-hardcode IP for jaeger 2019-02-13 15:16:03 +00:00
Erik Johnston
42f555393a GAH FILES WHY 2019-02-13 15:16:03 +00:00
Erik Johnston
224df403ea Fixup opentracing error logging 2019-02-13 15:16:03 +00:00
Erik Johnston
2d8da62feb Only relay 'live' events 2019-02-13 15:16:03 +00:00
Erik Johnston
e2230b28fb Mangle PDUs some more. Disable presence/typing/receipts. Don't die if we can't parse an EDU 2019-02-13 15:16:03 +00:00
Erik Johnston
28c3a43a7e Make using proxy optional 2019-02-13 15:16:03 +00:00
Brendan Abolivier
c11388b1ce Update proxy version and maps 2019-02-13 15:16:03 +00:00
Erik Johnston
9dfb6b6c52 Drop unnecessary keys from transactions 2019-02-13 15:16:03 +00:00
Erik Johnston
caa0004466 Track PDU in opentracing 2019-02-13 15:16:03 +00:00
Erik Johnston
ae7460b9f4 Make room ID smaller 2019-02-13 15:16:03 +00:00
Erik Johnston
fe6e221cfa Opentracing. Reduce event ID size 2019-02-13 15:16:03 +00:00
Erik Johnston
b071101729 Strip signatures and hashes on outgoing events 2019-02-13 15:16:03 +00:00
Matthew Hodgson
3405156c4b use right script 2019-02-13 15:16:03 +00:00
Matthew Hodgson
1682ee95ac switch to registering users via add_users.sh 2019-02-13 15:16:03 +00:00
Erik Johnston
e7b70e272f Fix jaeger over federation 2019-02-13 15:16:03 +00:00
Erik Johnston
ec288b48fd add basic jaeger support 2019-02-13 15:16:03 +00:00
Erik Johnston
27ca009b0a Reenable retries for sending transactions 2019-02-13 15:16:03 +00:00
Erik Johnston
6284acf910 Add API to force new threads 2019-02-13 15:16:03 +00:00
Erik Johnston
93db2124ec Add timestamp lookup API 2019-02-13 15:16:03 +00:00
Brendan Abolivier
ca0e0892ca Fix proxy 2019-02-13 15:16:03 +00:00
Brendan Abolivier
8cccbc6f47 Use UDP-able proxy 2019-02-13 15:16:02 +00:00
Brendan Abolivier
799112b0fd Fix cbor encoding in the proxy and enable it by default 2019-02-13 15:16:02 +00:00
Brendan Abolivier
4f7b42c20f Update proxy 2019-02-13 15:16:02 +00:00
Brendan Abolivier
72779ec93f Start synapse + proxy in the same container 2019-02-13 15:16:02 +00:00
Brendan Abolivier
fc99d3dab3 Make the Docker image run both synapse and the proxy 2019-02-13 15:16:02 +00:00
Brendan Abolivier
55bfb3caa8 Make synapse talk HTTP to the local proxy only when federating 2019-02-13 15:16:02 +00:00
Erik Johnston
e8be4ca1ad Join via closest server 2019-02-13 15:16:02 +00:00
Erik Johnston
781bd4fb96 FILES 2019-02-13 15:16:02 +00:00
Erik Johnston
4b5ad3dd12 Add SYNAPSE_LOG_HOST to enable HTTP logging for PDU tracking 2019-02-13 15:15:56 +00:00
Matthew Hodgson
a688d10bca secret password; more timeout 2019-02-13 14:24:42 +00:00
Matthew Hodgson
fe3b9d085f meshsim Dockerfile 2019-02-13 14:24:42 +00:00
Ashe Connor
dad89a4902 add changelog.d entry 2019-02-13 14:24:42 +00:00
Ashe Connor
d5243f0ff3 add jpeg to OpenBSD prereq list
Signed-off-by: Ashe Connor <ashe@kivikakk.ee>
2019-02-13 14:24:42 +00:00
Travis Ralston
037a5b48a6 Fix the terms UI auth tests
By setting the config value directly, we skip the block that adds the slash automatically for us.
2019-02-13 14:24:42 +00:00
Travis Ralston
5abcb455b2 Changelog 2019-02-13 14:24:42 +00:00
Travis Ralston
8d98c4e3e3 Remove duplicate slashes in generated consent URLs 2019-02-13 14:24:42 +00:00
Amber Brown
e0581ccf0e Fix Content-Disposition in media repository (#4176) 2019-02-13 14:24:42 +00:00
Travis Ralston
ac9b734e31 Add option to track MAU stats (but not limit people) (#3830) 2019-02-13 14:24:42 +00:00
Amber Brown
dc768f208e Use <meta> tags to discover the per-page encoding of html previews (#4183) 2019-02-13 14:24:42 +00:00
Amber Brown
404cee9853 Add a coveragerc (#4180) 2019-02-13 14:24:42 +00:00
Richard van der Hoff
166cc35a48 Update README for #1491 fix 2019-02-13 14:24:42 +00:00
Richard van der Hoff
0d934b9ae1 changelog 2019-02-13 14:24:42 +00:00
Richard van der Hoff
ba2b6229c1 Add a test for the public T&Cs form 2019-02-13 14:24:42 +00:00
Richard van der Hoff
7cee15c47d Fix an internal server error when viewing the public privacy policy 2019-02-13 14:24:42 +00:00
David Baker
71f866d54d pep8 2019-02-13 14:24:42 +00:00
David Baker
785f5ef0f3 add docs 2019-02-13 14:24:42 +00:00
David Baker
daf28668d0 Remove unnecessary str() 2019-02-13 14:24:42 +00:00
David Baker
e750d031c8 Cast to int here too 2019-02-13 14:24:42 +00:00
David Baker
efb77b87d1 Cast bacjup version to int when querying 2019-02-13 14:24:42 +00:00
David Baker
b0ac23319a Convert version back to a string 2019-02-13 14:24:42 +00:00
David Baker
515a6cb0d3 news fragment 2019-02-13 14:24:42 +00:00
David Baker
8f46b61aed Try & make it work on postgres 2019-02-13 14:24:42 +00:00
David Baker
f814a1ec5a Make e2e backup versions numeric in the DB
We were doing max(version) which does not do what we wanted
on a column of type TEXT.
2019-02-13 14:24:42 +00:00
Brendan Abolivier
7b28b058e1 Add a Content-Type header on POST requests to the federation client 2019-02-13 14:24:42 +00:00
Erik Johnston
06132f1f0b Add test to assert set_e2e_device_keys correctly returns False on no-op 2019-02-13 14:24:42 +00:00
Erik Johnston
b8077ca8cd Lets convert bytes to unicode instead 2019-02-13 14:24:42 +00:00
Erik Johnston
e56d0456cb Newsfile 2019-02-13 14:24:42 +00:00
Erik Johnston
755f42d769 Fix noop checks when updating device keys
Clients often reupload their device keys (for some reason) so its
important for the server to check for no-ops before sending out device
list update notifications.

The check is broken in python 3 due to the fact comparing bytes and
unicode always fails, and that we write bytes to the DB but get unicode
when we read.
2019-02-13 14:24:42 +00:00
Richard van der Hoff
ef77ab59a7 fix parse_string docstring 2019-02-13 14:24:42 +00:00
Richard van der Hoff
d26852e9d8 changelog 2019-02-13 14:24:42 +00:00
hera
1f8a82077e Fix encoding error for consent form on python3
The form was rendering this as "b'01234....'".

-- richvdh
2019-02-13 14:24:42 +00:00
Erik Johnston
14059e2300 pep8 2019-02-13 14:24:41 +00:00
Erik Johnston
3223f415e2 Add server health apis and server presence 2019-02-13 14:23:21 +00:00
Erik Johnston
f57e71645a Missing file 2019-02-13 14:22:59 +00:00
Erik Johnston
c400d9dcca Add backchatter 2019-02-13 14:22:58 +00:00
Erik Johnston
ed43a63fcf Don't verify stuff 2019-02-13 14:22:18 +00:00
Erik Johnston
e6896040c7 Merge branch 'erikj/thread_demo' of github.com:matrix-org/synapse into erikj/add_routing_hooks 2018-11-21 11:45:11 +00:00
Erik Johnston
d0d3c63705 Fix threading when pulling in via get_missing_events 2018-11-21 10:45:35 +00:00
Erik Johnston
5ae1644d3d Send down new thread marker 2018-11-20 17:42:43 +00:00
Erik Johnston
115e4bb4c6 Fix threading 2018-11-20 17:04:19 +00:00
Erik Johnston
607ac7ea37 Lower all the timeouts 2018-11-20 13:32:47 +00:00
Erik Johnston
775441105a Reduce timeouts for sending transaction 2018-11-20 11:30:43 +00:00
Erik Johnston
e644f49b46 Delta file 2018-11-19 15:09:07 +00:00
Erik Johnston
712caeba60 Add hooks in federation for funky event routing 2018-11-14 16:12:33 +00:00
Erik Johnston
956b47da2b Dont' log so aggressively 2018-11-14 15:32:33 +00:00
Erik Johnston
822fcc3bb8 Add concept of internal events 2018-11-13 15:33:54 +00:00
Erik Johnston
5daa2b9dbc Fix sync for archived rooms 2018-11-13 15:13:03 +00:00
Erik Johnston
08395c7f89 Implemented thread support for backfills 2018-11-13 14:56:38 +00:00
Erik Johnston
c67953748d Add thread_id to filter 2018-11-13 10:34:38 +00:00
Erik Johnston
78fec6b3c9 Add flag to sync to exclude threads 2018-11-12 16:20:14 +00:00
Erik Johnston
dfa830e61a Store and fetch thread IDs 2018-11-12 15:44:22 +00:00
1711 changed files with 85881 additions and 291107 deletions

View File

@@ -1,4 +0,0 @@
---
title: CI run against latest deps is failing
---
See https://github.com/{{env.GITHUB_REPOSITORY}}/actions/runs/{{env.GITHUB_RUN_ID}}

View File

@@ -1,19 +0,0 @@
# Configuration file used for testing the 'synapse_port_db' script.
# Tells the script to connect to the postgresql database that will be available in the
# CI's Docker setup at the point where this file is considered.
server_name: "localhost:8800"
signing_key_path: ".ci/test.signing.key"
report_stats: false
database:
name: "psycopg2"
args:
user: postgres
host: localhost
password: postgres
database: synapse
# Suppress the key server warning.
trusted_key_servers: []

View File

@@ -1,31 +0,0 @@
#!/usr/bin/env python
# Copyright 2019 The Matrix.org Foundation C.I.C.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import sys
import psycopg2
# a very simple replacment for `psql`, to make up for the lack of the postgres client
# libraries in the synapse docker image.
# We use "postgres" as a database because it's bound to exist and the "synapse" one
# doesn't exist yet.
db_conn = psycopg2.connect(
user="postgres", host="localhost", password="postgres", dbname="postgres"
)
db_conn.autocommit = True
cur = db_conn.cursor()
for c in sys.argv[1:]:
cur.execute(c)
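
As a usage sketch: the port-db and export-data test scripts later in this compare call this helper with each positional argument run as its own SQL statement against the local "postgres" maintenance database, for example:

# Create the empty database that synapse_port_db will port into.
poetry run .ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
# Several statements can be passed in a single call, e.g. to reset the database between runs.
poetry run .ci/scripts/postgres_exec.py \
    "DROP DATABASE synapse" \
    "CREATE DATABASE synapse"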

View File

@@ -1,52 +0,0 @@
#!/usr/bin/env bash
# Test for the export-data admin command against sqlite and postgres
# Expects Synapse to have been already installed with `poetry install --extras postgres`.
# Expects `poetry` to be available on the `PATH`.
set -xe
cd "$(dirname "$0")/../.."
echo "--- Generate the signing key"
# Generate the server's signing key.
poetry run synapse_homeserver --generate-keys -c .ci/sqlite-config.yaml
echo "--- Prepare test database"
# Make sure the SQLite3 database is using the latest schema and has no pending background update.
poetry run update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
# Run the export-data command on the sqlite test database
poetry run python -m synapse.app.admin_cmd -c .ci/sqlite-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
--output-directory /tmp/export_data
# Test that the output directory exists and contains the rooms directory
dir="/tmp/export_data/rooms"
if [ -d "$dir" ]; then
echo "Command successful, this test passes"
else
echo "No output directories found, the command fails against a sqlite database."
exit 1
fi
# Create the PostgreSQL database.
poetry run .ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
# Port the SQLite databse to postgres so we can check command works against postgres
echo "+++ Port SQLite3 databse to postgres"
poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
# Run the export-data command on postgres database
poetry run python -m synapse.app.admin_cmd -c .ci/postgres-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
--output-directory /tmp/export_data2
# Test that the output directory exists and contains the rooms directory
dir2="/tmp/export_data2/rooms"
if [ -d "$dir2" ]; then
echo "Command successful, this test passes"
else
echo "No output directories found, the command fails against a postgres database."
exit 1
fi

View File

@@ -1,81 +0,0 @@
#!/usr/bin/env bash
# this script is run by GitHub Actions in a plain `focal` container; it
# - installs the minimal system requirements, and poetry;
# - patches the project definition file to refer to old versions only;
# - creates a venv with these old versions using poetry; and finally
# - invokes `trial` to run the tests with old deps.
# Prevent tzdata from asking for user input
export DEBIAN_FRONTEND=noninteractive
set -ex
apt-get update
apt-get install -y \
python3 python3-dev python3-pip python3-venv pipx \
libxml2-dev libxslt-dev xmlsec1 zlib1g-dev libjpeg-dev libwebp-dev
export LANG="C.UTF-8"
# Prevent virtualenv from auto-updating pip to an incompatible version
export VIRTUALENV_NO_DOWNLOAD=1
# TODO: in the future, we could use an implementation of
# https://github.com/python-poetry/poetry/issues/3527
# https://github.com/pypa/pip/issues/8085
# to select the lowest possible versions, rather than resorting to this sed script.
# Patch the project definitions in-place:
# - Replace all lower and tilde bounds with exact bounds
# - Make the pyopenssl 17.0, which is the oldest version that works with
# a `cryptography` compiled against OpenSSL 1.1.
# - Delete all lines referring to psycopg2 --- so no testing of postgres support.
# - Omit systemd: we're not logging to journal here.
# TODO: also replace caret bounds, see https://python-poetry.org/docs/dependency-specification/#version-constraints
# We don't use these yet, but IIRC they are the default bound used when you `poetry add`.
# The sed expression 's/\^/==/g' ought to do the trick. But it would also change
# `python = "^3.7"` to `python = "==3.7", which would mean we fail because olddeps
# runs on 3.8 (#12343).
sed -i \
-e "s/[~>]=/==/g" \
-e "/psycopg2/d" \
-e 's/pyOpenSSL = "==16.0.0"/pyOpenSSL = "==17.0.0"/' \
-e '/systemd/d' \
pyproject.toml
# Use poetry to do the installation. This ensures that the versions are all mutually
# compatible (as far the package metadata declares, anyway); pip's package resolver
# is more lax.
#
# Rather than `poetry install --no-dev`, we drop all dev dependencies from the
# toml file. This means we don't have to ensure compatibility between old deps and
# dev tools.
pip install --user toml
REMOVE_DEV_DEPENDENCIES="
import toml
with open('pyproject.toml', 'r') as f:
data = toml.loads(f.read())
del data['tool']['poetry']['dev-dependencies']
with open('pyproject.toml', 'w') as f:
toml.dump(data, f)
"
python3 -c "$REMOVE_DEV_DEPENDENCIES"
pipx install poetry==1.1.12
~/.local/bin/poetry lock
echo "::group::Patched pyproject.toml"
cat pyproject.toml
echo "::endgroup::"
echo "::group::Lockfile after patch"
cat poetry.lock
echo "::endgroup::"
~/.local/bin/poetry install -E "all test"
~/.local/bin/poetry run trial --jobs=2 tests
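
To make the sed patch above concrete, here is a hedged illustration (the dependency name and version are hypothetical, not taken from the real pyproject.toml): each lower or tilde bound is rewritten into an exact pin, so the old-deps run installs the oldest version the project claims to support.

# Hypothetical example of the bound-pinning rewrite performed by the sed expression above.
echo 'attrs = ">=19.2.0"' | sed -e "s/[~>]=/==/g"
# prints: attrs = "==19.2.0"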

View File

@@ -1,53 +0,0 @@
#!/usr/bin/env bash
#
# Test script for 'synapse_port_db'.
# - configures synapse and a postgres server.
# - runs the port script on a prepopulated test sqlite db
# - also runs it against an new sqlite db
#
# Expects Synapse to have been already installed with `poetry install --extras postgres`.
# Expects `poetry` to be available on the `PATH`.
set -xe
cd "$(dirname "$0")/../.."
echo "--- Generate the signing key"
# Generate the server's signing key.
poetry run synapse_homeserver --generate-keys -c .ci/sqlite-config.yaml
echo "--- Prepare test database"
# Make sure the SQLite3 database is using the latest schema and has no pending background update.
poetry run update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
# Create the PostgreSQL database.
poetry run .ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
echo "+++ Run synapse_port_db against test database"
# TODO: this invocation of synapse_port_db (and others below) used to be prepended with `coverage run`,
# but coverage seems unable to find the entrypoints installed by `pip install -e .`.
poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
# We should be able to run twice against the same database.
echo "+++ Run synapse_port_db a second time"
poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
#####
# Now do the same again, on an empty database.
echo "--- Prepare empty SQLite database"
# we do this by deleting the sqlite db, and then doing the same again.
rm .ci/test_db.db
poetry run update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
# re-create the PostgreSQL database.
poetry run .ci/scripts/postgres_exec.py \
"DROP DATABASE synapse" \
"CREATE DATABASE synapse"
echo "+++ Run synapse_port_db against empty database"
poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml

View File

@@ -1,16 +0,0 @@
# Configuration file used for testing the 'synapse_port_db' script.
# Tells the 'update_database' script to connect to the test SQLite database to upgrade its
# schema and run background updates on it.
server_name: "localhost:8800"
signing_key_path: ".ci/test.signing.key"
report_stats: false
database:
name: "sqlite3"
args:
database: ".ci/test_db.db"
# Suppress the key server warning.
trusted_key_servers: []

Binary file not shown.

View File

@@ -1,4 +0,0 @@
---
title: CI run against Twisted trunk is failing
---
See https://github.com/{{env.GITHUB_REPOSITORY}}/actions/runs/{{env.GITHUB_RUN_ID}}

View File

@@ -1,2 +0,0 @@
# This file serves as a blacklist for SyTest tests that we expect will fail in
# Synapse when run under worker mode. For more details, see sytest-blacklist.

.circleci/config.yml Normal file (172 changed lines)
View File

@@ -0,0 +1,172 @@
version: 2
jobs:
dockerhubuploadrelease:
machine: true
steps:
- checkout
- run: docker build -f docker/Dockerfile -t matrixdotorg/synapse:${CIRCLE_TAG} .
- run: docker build -f docker/Dockerfile -t matrixdotorg/synapse:${CIRCLE_TAG}-py3 --build-arg PYTHON_VERSION=3.6 .
- run: docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
- run: docker push matrixdotorg/synapse:${CIRCLE_TAG}
- run: docker push matrixdotorg/synapse:${CIRCLE_TAG}-py3
dockerhubuploadlatest:
machine: true
steps:
- checkout
- run: docker build -f docker/Dockerfile -t matrixdotorg/synapse:${CIRCLE_SHA1} .
- run: docker build -f docker/Dockerfile -t matrixdotorg/synapse:${CIRCLE_SHA1}-py3 --build-arg PYTHON_VERSION=3.6 .
- run: docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
- run: docker tag matrixdotorg/synapse:${CIRCLE_SHA1} matrixdotorg/synapse:latest
- run: docker tag matrixdotorg/synapse:${CIRCLE_SHA1}-py3 matrixdotorg/synapse:latest-py3
- run: docker push matrixdotorg/synapse:${CIRCLE_SHA1}
- run: docker push matrixdotorg/synapse:${CIRCLE_SHA1}-py3
- run: docker push matrixdotorg/synapse:latest
- run: docker push matrixdotorg/synapse:latest-py3
sytestpy2:
docker:
- image: matrixdotorg/sytest-synapsepy2
working_directory: /src
steps:
- checkout
- run: /synapse_sytest.sh
- store_artifacts:
path: /logs
destination: logs
- store_test_results:
path: /logs
sytestpy2postgres:
docker:
- image: matrixdotorg/sytest-synapsepy2
working_directory: /src
steps:
- checkout
- run: POSTGRES=1 /synapse_sytest.sh
- store_artifacts:
path: /logs
destination: logs
- store_test_results:
path: /logs
sytestpy2merged:
docker:
- image: matrixdotorg/sytest-synapsepy2
working_directory: /src
steps:
- checkout
- run: bash .circleci/merge_base_branch.sh
- run: /synapse_sytest.sh
- store_artifacts:
path: /logs
destination: logs
- store_test_results:
path: /logs
sytestpy2postgresmerged:
docker:
- image: matrixdotorg/sytest-synapsepy2
working_directory: /src
steps:
- checkout
- run: bash .circleci/merge_base_branch.sh
- run: POSTGRES=1 /synapse_sytest.sh
- store_artifacts:
path: /logs
destination: logs
- store_test_results:
path: /logs
sytestpy3:
docker:
- image: matrixdotorg/sytest-synapsepy3
working_directory: /src
steps:
- checkout
- run: /synapse_sytest.sh
- store_artifacts:
path: /logs
destination: logs
- store_test_results:
path: /logs
sytestpy3postgres:
docker:
- image: matrixdotorg/sytest-synapsepy3
working_directory: /src
steps:
- checkout
- run: POSTGRES=1 /synapse_sytest.sh
- store_artifacts:
path: /logs
destination: logs
- store_test_results:
path: /logs
sytestpy3merged:
docker:
- image: matrixdotorg/sytest-synapsepy3
working_directory: /src
steps:
- checkout
- run: bash .circleci/merge_base_branch.sh
- run: /synapse_sytest.sh
- store_artifacts:
path: /logs
destination: logs
- store_test_results:
path: /logs
sytestpy3postgresmerged:
docker:
- image: matrixdotorg/sytest-synapsepy3
working_directory: /src
steps:
- checkout
- run: bash .circleci/merge_base_branch.sh
- run: POSTGRES=1 /synapse_sytest.sh
- store_artifacts:
path: /logs
destination: logs
- store_test_results:
path: /logs
workflows:
version: 2
build:
jobs:
- sytestpy2:
filters:
branches:
only: /develop|master|release-.*/
- sytestpy2postgres:
filters:
branches:
only: /develop|master|release-.*/
- sytestpy3:
filters:
branches:
only: /develop|master|release-.*/
- sytestpy3postgres:
filters:
branches:
only: /develop|master|release-.*/
- sytestpy2merged:
filters:
branches:
ignore: /develop|master|release-.*/
- sytestpy2postgresmerged:
filters:
branches:
ignore: /develop|master|release-.*/
- sytestpy3merged:
filters:
branches:
ignore: /develop|master|release-.*/
- sytestpy3postgresmerged:
filters:
branches:
ignore: /develop|master|release-.*/
- dockerhubuploadrelease:
filters:
tags:
only: /v[0-9].[0-9]+.[0-9]+.*/
branches:
ignore: /.*/
- dockerhubuploadlatest:
filters:
branches:
only: master

.circleci/merge_base_branch.sh Executable file (34 changed lines)
View File

@@ -0,0 +1,34 @@
#!/usr/bin/env bash
set -e
# CircleCI doesn't give CIRCLE_PR_NUMBER in the environment for non-forked PRs. Wonderful.
# In this case, we just need to do some ~shell magic~ to strip it out of the PULL_REQUEST URL.
echo 'export CIRCLE_PR_NUMBER="${CIRCLE_PR_NUMBER:-${CIRCLE_PULL_REQUEST##*/}}"' >> $BASH_ENV
source $BASH_ENV
if [[ -z "${CIRCLE_PR_NUMBER}" ]]
then
echo "Can't figure out what the PR number is! Assuming merge target is develop."
# It probably hasn't had a PR opened yet. Since all PRs land on develop, we
# can probably assume it's based on it and will be merged into it.
GITBASE="develop"
else
# Get the reference, using the GitHub API
GITBASE=`wget -O- https://api.github.com/repos/matrix-org/synapse/pulls/${CIRCLE_PR_NUMBER} | jq -r '.base.ref'`
fi
# Show what we are before
git show -s
# Set up username so it can do a merge
git config --global user.email bot@matrix.org
git config --global user.name "A robot"
# Fetch and merge. If it doesn't work, it will raise due to set -e.
git fetch -u origin $GITBASE
git merge --no-edit origin/$GITBASE
# Show what we are after.
git show -s
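
For reference, the sytestpy2merged and sytestpy3merged jobs in the .circleci/config.yml above run this helper immediately before SyTest, so feature branches are tested as if already merged into their target branch:

# Steps taken from the merged SyTest jobs above.
bash .circleci/merge_base_branch.sh
/synapse_sytest.sh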

View File

@@ -1,14 +0,0 @@
comment: off
coverage:
status:
project:
default:
target: 0 # Target % coverage, can be auto. Turned off for now
threshold: null
base: auto
patch:
default:
target: 0
threshold: null
base: auto

View File

@@ -1,8 +1,12 @@
[run]
branch = True
parallel = True
include=$TOP/synapse/*
data_file = $TOP/.coverage
source = synapse
[paths]
source=
coverage
[report]
precision = 2
ignore_errors = True

View File

@@ -1,11 +1,7 @@
# ignore everything by default
*
# things to include
!docker
!synapse
!README.rst
!pyproject.toml
!poetry.lock
**/__pycache__
Dockerfile
.travis.yml
.gitignore
demo/etc
tox.ini
.git/*
.tox/*

View File

@@ -1,9 +0,0 @@
# EditorConfig https://EditorConfig.org
# top-most EditorConfig file
root = true
# 4 space indentation
[*.py]
indent_style = space
indent_size = 4

.flake8 (11 changed lines)
View File

@@ -1,11 +0,0 @@
# TODO: incorporate this into pyproject.toml if flake8 supports it in the future.
# See https://github.com/PyCQA/flake8/issues/234
[flake8]
# see https://pycodestyle.readthedocs.io/en/latest/intro.html#error-codes
# for error codes. The ones we ignore are:
# W503: line break before binary operator
# W504: line break after binary operator
# E203: whitespace before ':' (which is contrary to pep8?)
# E731: do not assign a lambda expression, use a def
# E501: Line too long (black enforces this for us)
ignore=W503,W504,E203,E731,E501

View File

@@ -1,8 +0,0 @@
# Black reformatting (#5482).
32e7c9e7f20b57dd081023ac42d6931a8da9b3a3
# Target Python 3.5 with black (#8664).
aff1eb7c671b0a3813407321d2702ec46c71fa56
# Update black to 20.8b1 (#9381).
0a00b7ff14890987f09112a2ae696c61001e6cf1

.github/CODEOWNERS vendored (2 changed lines)
View File

@@ -1,2 +0,0 @@
# Automatically request reviews from the synapse-core team when a pull request comes in.
* @matrix-org/synapse-core

.github/FUNDING.yml vendored (4 changed lines)
View File

@@ -1,4 +0,0 @@
# One username per supported platform and one custom link
patreon: matrixdotorg
liberapay: matrixdotorg
custom: https://paypal.me/matrixdotorg

View File

@@ -1,5 +0,0 @@
**If you are looking for support** please ask in **#synapse:matrix.org**
(using a matrix.org account if necessary). We do not use GitHub issues for
support.
**If you want to report a security issue** please see https://matrix.org/security-disclosure-policy/

View File

@@ -4,13 +4,11 @@ about: Create a report to help us improve
---
<!--
<!--
**THIS IS NOT A SUPPORT CHANNEL!**
**IF YOU HAVE SUPPORT QUESTIONS ABOUT RUNNING OR CONFIGURING YOUR OWN HOME SERVER**,
please ask in **#synapse:matrix.org** (using a matrix.org account if necessary)
**IF YOU HAVE SUPPORT QUESTIONS ABOUT RUNNING OR CONFIGURING YOUR OWN HOME SERVER**:
You will likely get better support more quickly if you ask in ** #matrix:matrix.org ** ;)
If you want to report a security issue, please see https://matrix.org/security-disclosure-policy/
This is a bug report template. By following the instructions below and
filling out the sections with your information, you will help the us to get all
@@ -19,7 +17,7 @@ the necessary data to fix your issue.
You can also preview your report before submitting it. You may remove sections
that aren't relevant to your particular case.
Text between <!-- and --> marks will be invisible in the report.
Text between <!-- and --> marks will be invisible in the report.
-->
@@ -33,7 +31,7 @@ Text between <!-- and --> marks will be invisible in the report.
- that reproduce the bug
- using hyphens as bullet points
<!--
<!--
Describe how what happens differs from what you expected.
If you can identify any relevant log snippets from _homeserver.log_, please include
@@ -46,26 +44,22 @@ those (please be careful to remove any personal or private data). Please surroun
<!-- IMPORTANT: please answer the following questions, to help us narrow down the problem -->
<!-- Was this issue identified on matrix.org or another homeserver? -->
- **Homeserver**:
- **Homeserver**:
If not matrix.org:
<!--
What version of Synapse is running?
You can find the Synapse version with this command:
$ curl http://localhost:8008/_synapse/admin/v1/server_version
(You may need to replace `localhost:8008` if Synapse is not configured to
listen on that port.)
<!--
What version of Synapse is running?
You can find the Synapse version by inspecting the server headers (replace matrix.org with
your own homeserver domain):
$ curl -v https://matrix.org/_matrix/client/versions 2>&1 | grep "Server:"
-->
- **Version**:
- **Version**:
- **Install method**:
- **Install method**:
<!-- examples: package manager/git clone/pip -->
- **Platform**:
- **Platform**:
<!--
Tell us about the environment in which your homeserver is operating
distro, hardware, if it's running in a vm/container, etc.

View File

@@ -4,7 +4,6 @@ about: I need support for Synapse
---
Please don't file github issues asking for support.
# Please ask for support in [**#matrix:matrix.org**](https://matrix.to/#/#matrix:matrix.org)
Instead, please join [`#synapse:matrix.org`](https://matrix.to/#/#synapse:matrix.org)
(from a matrix.org account if necessary), and ask there.
## Don't file an issue as a support request.

View File

@@ -1,14 +1,7 @@
### Pull Request Checklist
<!-- Please read https://matrix-org.github.io/synapse/latest/development/contributing_guide.html before submitting your pull request -->
<!-- Please read CONTRIBUTING.rst before submitting your pull request -->
* [ ] Pull request is based on the develop branch
* [ ] Pull request includes a [changelog file](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#changelog). The entry should:
- Be a short description of your change which makes sense to users. "Fixed a bug that prevented receiving messages from other servers." instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
- Use markdown where necessary, mostly for `code blocks`.
- End with either a period (.) or an exclamation mark (!).
- Start with a capital letter.
- Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry.
* [ ] Pull request includes a [sign off](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#sign-off)
* [ ] [Code style](https://matrix-org.github.io/synapse/latest/code_style.html) is correct
(run the [linters](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
* [ ] Pull request includes a [changelog file](CONTRIBUTING.rst#changelog)
* [ ] Pull request includes a [sign off](CONTRIBUTING.rst#sign-off)

.github/SUPPORT.md vendored (6 changed lines)
View File

@@ -1,3 +1,3 @@
[**#synapse:matrix.org**](https://matrix.to/#/#synapse:matrix.org) is the official support room for
Synapse, and can be accessed by any client from https://matrix.org/docs/projects/try-matrix-now.html.
Please ask for support there, rather than filing github issues.
[**#matrix:matrix.org**](https://matrix.to/#/#matrix:matrix.org) is the official support room for Matrix, and can be accessed by any client from https://matrix.org/docs/projects/try-matrix-now.html
It can also be access via IRC bridge at irc://irc.freenode.net/matrix or on the web here: https://webchat.freenode.net/?channels=matrix

View File

@@ -1,57 +0,0 @@
# GitHub actions workflow which builds and publishes the docker images.
name: Build docker images
on:
push:
tags: ["v*"]
branches: [ master, main, develop ]
workflow_dispatch:
permissions:
contents: read
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: Set up QEMU
id: qemu
uses: docker/setup-qemu-action@v1
with:
platforms: arm64
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v1
- name: Inspect builder
run: docker buildx inspect
- name: Log in to DockerHub
uses: docker/login-action@v1
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Calculate docker image tag
id: set-tag
uses: docker/metadata-action@master
with:
images: matrixdotorg/synapse
flavor: |
latest=false
tags: |
type=raw,value=develop,enable=${{ github.ref == 'refs/heads/develop' }}
type=raw,value=latest,enable=${{ github.ref == 'refs/heads/master' }}
type=raw,value=latest,enable=${{ github.ref == 'refs/heads/main' }}
type=pep440,pattern={{raw}}
- name: Build and push all platforms
uses: docker/build-push-action@v2
with:
push: true
labels: "gitsha1=${{ github.sha }}"
tags: "${{ steps.set-tag.outputs.tags }}"
file: "docker/Dockerfile"
platforms: linux/amd64,linux/arm64

View File

@@ -1,65 +0,0 @@
name: Deploy the documentation
on:
push:
branches:
# For bleeding-edge documentation
- develop
# For documentation specific to a release
- 'release-v*'
# stable docs
- master
workflow_dispatch:
jobs:
pages:
name: GitHub Pages
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Setup mdbook
uses: peaceiris/actions-mdbook@4b5ef36b314c2599664ca107bb8c02412548d79d # v1.1.14
with:
mdbook-version: '0.4.17'
- name: Build the documentation
# mdbook will only create an index.html if we're including docs/README.md in SUMMARY.md.
# However, we're using docs/README.md for other purposes and need to pick a new page
# as the default. Let's opt for the welcome page instead.
run: |
mdbook build
cp book/welcome_and_overview.html book/index.html
# Figure out the target directory.
#
# The target directory depends on the name of the branch
#
- name: Get the target directory name
id: vars
run: |
# first strip the 'refs/heads/' prefix with some shell foo
branch="${GITHUB_REF#refs/heads/}"
case $branch in
release-*)
# strip 'release-' from the name for release branches.
branch="${branch#release-}"
;;
master)
# deploy to "latest" for the master branch.
branch="latest"
;;
esac
# finally, set the 'branch-version' var.
echo "::set-output name=branch-version::$branch"
# Deploy to the target directory.
- name: Deploy to gh pages
uses: peaceiris/actions-gh-pages@068dc23d9710f1ba62e86896f84735d869951305 # v3.8.0
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./book
destination_dir: ./${{ steps.vars.outputs.branch-version }}

View File

@@ -1,159 +0,0 @@
# People who are freshly `pip install`ing from PyPI will pull in the latest versions of
# dependencies which match the broad requirements. Since most CI runs are against
# the locked poetry environment, run specifically against the latest dependencies to
# know if there's an upcoming breaking change.
#
# As an overview this workflow:
# - checks out develop,
# - installs from source, pulling in the dependencies like a fresh `pip install` would, and
# - runs mypy and test suites in that checkout.
#
# Based on the twisted trunk CI job.
name: Latest dependencies
on:
schedule:
- cron: 0 7 * * *
workflow_dispatch:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
mypy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
# The dev dependencies aren't exposed in the wheel metadata (at least with current
# poetry-core versions), so we install with poetry.
- uses: matrix-org/setup-python-poetry@v1
with:
python-version: "3.x"
poetry-version: "1.2.0b1"
extras: "all"
# Dump installed versions for debugging.
- run: poetry run pip list > before.txt
# Upgrade all runtime dependencies only. This is intended to mimic a fresh
# `pip install matrix-synapse[all]` as closely as possible.
- run: poetry update --no-dev
- run: poetry run pip list > after.txt && (diff -u before.txt after.txt || true)
- name: Remove warn_unused_ignores from mypy config
run: sed '/warn_unused_ignores = True/d' -i mypy.ini
- run: poetry run mypy
trial:
runs-on: ubuntu-latest
strategy:
matrix:
include:
- database: "sqlite"
- database: "postgres"
postgres-version: "14"
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- name: Set up PostgreSQL ${{ matrix.postgres-version }}
if: ${{ matrix.postgres-version }}
run: |
docker run -d -p 5432:5432 \
-e POSTGRES_PASSWORD=postgres \
-e POSTGRES_INITDB_ARGS="--lc-collate C --lc-ctype C --encoding UTF8" \
postgres:${{ matrix.postgres-version }}
- uses: actions/setup-python@v2
with:
python-version: "3.x"
- run: pip install .[all,test]
- name: Await PostgreSQL
if: ${{ matrix.postgres-version }}
timeout-minutes: 2
run: until pg_isready -h localhost; do sleep 1; done
- run: python -m twisted.trial --jobs=2 tests
env:
SYNAPSE_POSTGRES: ${{ matrix.database == 'postgres' || '' }}
SYNAPSE_POSTGRES_HOST: localhost
SYNAPSE_POSTGRES_USER: postgres
SYNAPSE_POSTGRES_PASSWORD: postgres
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
sytest:
runs-on: ubuntu-latest
container:
image: matrixdotorg/sytest-synapse:testing
volumes:
- ${{ github.workspace }}:/src
strategy:
fail-fast: false
matrix:
include:
- sytest-tag: focal
- sytest-tag: focal
postgres: postgres
workers: workers
redis: redis
env:
POSTGRES: ${{ matrix.postgres && 1}}
WORKERS: ${{ matrix.workers && 1 }}
REDIS: ${{ matrix.redis && 1 }}
BLACKLIST: ${{ matrix.workers && 'synapse-blacklist-with-workers' }}
steps:
- uses: actions/checkout@v2
- name: Ensure sytest runs `pip install`
# Delete the lockfile so sytest will `pip install` rather than `poetry install`
run: rm /src/poetry.lock
working-directory: /src
- name: Prepare test blacklist
run: cat sytest-blacklist .ci/worker-blacklist > synapse-blacklist-with-workers
- name: Run SyTest
run: /bootstrap.sh synapse
working-directory: /src
- name: Summarise results.tap
if: ${{ always() }}
run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
- name: Upload SyTest logs
uses: actions/upload-artifact@v2
if: ${{ always() }}
with:
name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
path: |
/logs/results.tap
/logs/**/*.log*
# TODO: run complement (as with twisted trunk, see #12473).
# open an issue if the build fails, so we know about it.
open-issue:
if: failure()
needs:
# TODO: should mypy be included here? It feels more brittle than the other two.
- mypy
- trial
- sytest
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: JasonEtco/create-an-issue@5d9504915f79f9cc6d791934b8ef34f2353dd74d # v2.5.0, 2020-12-06
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
update_existing: true
filename: .ci/latest_deps_build_failed_issue_template.md

View File

@@ -1,121 +0,0 @@
# GitHub actions workflow which builds the release artifacts.
name: Build release artifacts
on:
# we build on PRs and develop to (hopefully) get early warning
# of things breaking (but only build one set of debs)
pull_request:
push:
branches: ["develop", "release-*"]
# we do the full build on tags.
tags: ["v*"]
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
permissions:
contents: write
jobs:
get-distros:
name: "Calculate list of debian distros"
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- id: set-distros
run: |
# if we're running from a tag, get the full list of distros; otherwise just use debian:sid
dists='["debian:sid"]'
if [[ $GITHUB_REF == refs/tags/* ]]; then
dists=$(scripts-dev/build_debian_packages.py --show-dists-json)
fi
echo "::set-output name=distros::$dists"
# map the step outputs to job outputs
outputs:
distros: ${{ steps.set-distros.outputs.distros }}
# now build the packages with a matrix build.
build-debs:
needs: get-distros
name: "Build .deb packages"
runs-on: ubuntu-latest
strategy:
matrix:
distro: ${{ fromJson(needs.get-distros.outputs.distros) }}
steps:
- name: Checkout
uses: actions/checkout@v2
with:
path: src
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v1
with:
install: true
- name: Set up docker layer caching
uses: actions/cache@v2
with:
path: /tmp/.buildx-cache
key: ${{ runner.os }}-buildx-${{ github.sha }}
restore-keys: |
${{ runner.os }}-buildx-
- name: Set up python
uses: actions/setup-python@v2
- name: Build the packages
# see https://github.com/docker/build-push-action/issues/252
# for the cache magic here
run: |
./src/scripts-dev/build_debian_packages.py \
--docker-build-arg=--cache-from=type=local,src=/tmp/.buildx-cache \
--docker-build-arg=--cache-to=type=local,mode=max,dest=/tmp/.buildx-cache-new \
--docker-build-arg=--progress=plain \
--docker-build-arg=--load \
"${{ matrix.distro }}"
rm -rf /tmp/.buildx-cache
mv /tmp/.buildx-cache-new /tmp/.buildx-cache
- name: Upload debs as artifacts
uses: actions/upload-artifact@v2
with:
name: debs
path: debs/*
build-sdist:
name: "Build pypi distribution files"
uses: "matrix-org/backend-meta/.github/workflows/packaging.yml@v1"
# if it's a tag, create a release and attach the artifacts to it
attach-assets:
name: "Attach assets to release"
if: ${{ !failure() && !cancelled() && startsWith(github.ref, 'refs/tags/') }}
needs:
- build-debs
- build-sdist
runs-on: ubuntu-latest
steps:
- name: Download all workflow run artifacts
uses: actions/download-artifact@v2
- name: Build a tarball for the debs
run: tar -cvJf debs.tar.xz debs
- name: Attach to release
uses: softprops/action-gh-release@a929a66f232c1b11af63782948aa2210f981808a # PR#109
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
files: |
Sdist/*
Wheel/*
debs.tar.xz
# if it's not already published, keep the release as a draft.
draft: true
# mark it as a prerelease if the tag contains 'rc'.
prerelease: ${{ contains(github.ref, 'rc') }}

View File

@@ -1,385 +0,0 @@
name: Tests
on:
push:
branches: ["develop", "release-*"]
pull_request:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
check-sampleconfig:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- run: pip install .
- run: scripts-dev/generate_sample_config.sh --check
- run: scripts-dev/config-lint.sh
lint:
uses: "matrix-org/backend-meta/.github/workflows/python-poetry-ci.yml@v1"
with:
typechecking-extras: "all"
lint-crlf:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Check line endings
run: scripts-dev/check_line_terminators.sh
lint-newsfile:
if: ${{ github.base_ref == 'develop' || contains(github.base_ref, 'release-') }}
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- uses: actions/setup-python@v2
- run: "pip install 'towncrier>=18.6.0rc1'"
- run: scripts-dev/check-newsfragment.sh
env:
PULL_REQUEST_NUMBER: ${{ github.event.number }}
# Dummy step to gate other tests on without repeating the whole list
linting-done:
if: ${{ !cancelled() }} # Run this even if prior jobs were skipped
needs: [lint, lint-crlf, lint-newsfile, check-sampleconfig]
runs-on: ubuntu-latest
steps:
- run: "true"
trial:
if: ${{ !cancelled() && !failure() }} # Allow previous steps to be skipped, but not fail
needs: linting-done
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.7", "3.8", "3.9", "3.10"]
database: ["sqlite"]
extras: ["all"]
include:
# Newest Python without optional deps
- python-version: "3.10"
extras: ""
# Oldest Python with PostgreSQL
- python-version: "3.7"
database: "postgres"
postgres-version: "10"
extras: "all"
# Newest Python with newest PostgreSQL
- python-version: "3.10"
database: "postgres"
postgres-version: "14"
extras: "all"
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- name: Set up PostgreSQL ${{ matrix.postgres-version }}
if: ${{ matrix.postgres-version }}
run: |
docker run -d -p 5432:5432 \
-e POSTGRES_PASSWORD=postgres \
-e POSTGRES_INITDB_ARGS="--lc-collate C --lc-ctype C --encoding UTF8" \
postgres:${{ matrix.postgres-version }}
- uses: matrix-org/setup-python-poetry@v1
with:
python-version: ${{ matrix.python-version }}
extras: ${{ matrix.extras }}
- name: Await PostgreSQL
if: ${{ matrix.postgres-version }}
timeout-minutes: 2
run: until pg_isready -h localhost; do sleep 1; done
- run: poetry run trial --jobs=2 tests
env:
SYNAPSE_POSTGRES: ${{ matrix.database == 'postgres' || '' }}
SYNAPSE_POSTGRES_HOST: localhost
SYNAPSE_POSTGRES_USER: postgres
SYNAPSE_POSTGRES_PASSWORD: postgres
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
trial-olddeps:
# Note: sqlite only; no postgres
if: ${{ !cancelled() && !failure() }} # Allow previous steps to be skipped, but not fail
needs: linting-done
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Test with old deps
uses: docker://ubuntu:focal # For old python and sqlite
# Note: focal seems to be using 3.8, but the oldest is 3.7?
# See https://github.com/matrix-org/synapse/issues/12343
with:
workdir: /github/workspace
entrypoint: .ci/scripts/test_old_deps.sh
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
trial-pypy:
# Very slow; only run if the branch name includes 'pypy'
# Note: sqlite only; no postgres. Completely untested since poetry move.
if: ${{ contains(github.ref, 'pypy') && !failure() && !cancelled() }}
needs: linting-done
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["pypy-3.7"]
extras: ["all"]
steps:
- uses: actions/checkout@v2
# Install libs necessary for PyPy to build binary wheels for dependencies
- run: sudo apt-get -qq install xmlsec1 libxml2-dev libxslt-dev
- uses: matrix-org/setup-python-poetry@v1
with:
python-version: ${{ matrix.python-version }}
extras: ${{ matrix.extras }}
- run: poetry run trial --jobs=2 tests
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
sytest:
if: ${{ !failure() && !cancelled() }}
needs: linting-done
runs-on: ubuntu-latest
container:
image: matrixdotorg/sytest-synapse:${{ matrix.sytest-tag }}
volumes:
- ${{ github.workspace }}:/src
env:
SYTEST_BRANCH: ${{ github.head_ref }}
POSTGRES: ${{ matrix.postgres && 1}}
MULTI_POSTGRES: ${{ (matrix.postgres == 'multi-postgres') && 1}}
WORKERS: ${{ matrix.workers && 1 }}
REDIS: ${{ matrix.redis && 1 }}
BLACKLIST: ${{ matrix.workers && 'synapse-blacklist-with-workers' }}
TOP: ${{ github.workspace }}
strategy:
fail-fast: false
matrix:
include:
- sytest-tag: focal
- sytest-tag: focal
postgres: postgres
- sytest-tag: testing
postgres: postgres
- sytest-tag: focal
postgres: multi-postgres
workers: workers
- sytest-tag: buster
postgres: multi-postgres
workers: workers
- sytest-tag: buster
postgres: postgres
workers: workers
redis: redis
steps:
- uses: actions/checkout@v2
- name: Prepare test blacklist
run: cat sytest-blacklist .ci/worker-blacklist > synapse-blacklist-with-workers
- name: Run SyTest
run: /bootstrap.sh synapse
working-directory: /src
- name: Summarise results.tap
if: ${{ always() }}
run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
- name: Upload SyTest logs
uses: actions/upload-artifact@v2
if: ${{ always() }}
with:
name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
path: |
/logs/results.tap
/logs/**/*.log*
export-data:
if: ${{ !failure() && !cancelled() }} # Allow previous steps to be skipped, but not fail
needs: [linting-done, portdb]
runs-on: ubuntu-latest
env:
TOP: ${{ github.workspace }}
services:
postgres:
image: postgres
ports:
- 5432:5432
env:
POSTGRES_PASSWORD: "postgres"
POSTGRES_INITDB_ARGS: "--lc-collate C --lc-ctype C --encoding UTF8"
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- uses: matrix-org/setup-python-poetry@v1
with:
python-version: ${{ matrix.python-version }}
extras: "postgres"
- run: .ci/scripts/test_export_data_command.sh
portdb:
if: ${{ !failure() && !cancelled() }} # Allow previous steps to be skipped, but not fail
needs: linting-done
runs-on: ubuntu-latest
env:
TOP: ${{ github.workspace }}
strategy:
matrix:
include:
- python-version: "3.7"
postgres-version: "10"
- python-version: "3.10"
postgres-version: "14"
services:
postgres:
image: postgres:${{ matrix.postgres-version }}
ports:
- 5432:5432
env:
POSTGRES_PASSWORD: "postgres"
POSTGRES_INITDB_ARGS: "--lc-collate C --lc-ctype C --encoding UTF8"
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- uses: matrix-org/setup-python-poetry@v1
with:
python-version: ${{ matrix.python-version }}
extras: "postgres"
- run: .ci/scripts/test_synapse_port_db.sh
complement:
if: ${{ !failure() && !cancelled() }}
needs: linting-done
runs-on: ubuntu-latest
steps:
# The path is set via a file given by $GITHUB_PATH. We need both Go 1.17 and GOPATH on the path to run Complement.
# See https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions#adding-a-system-path
- name: "Set Go Version"
run: |
# Add Go 1.17 to the PATH: see https://github.com/actions/virtual-environments/blob/main/images/linux/Ubuntu2004-Readme.md#environment-variables-2
echo "$GOROOT_1_17_X64/bin" >> $GITHUB_PATH
# Add the Go path to the PATH: We need this so we can call gotestfmt
echo "~/go/bin" >> $GITHUB_PATH
- name: "Install Complement Dependencies"
run: |
sudo apt-get update && sudo apt-get install -y libolm3 libolm-dev
go get -v github.com/haveyoudebuggedit/gotestfmt/v2/cmd/gotestfmt@latest
- name: Run actions/checkout@v2 for synapse
uses: actions/checkout@v2
with:
path: synapse
# Attempt to check out the same branch of Complement as the PR. If it
# doesn't exist, fallback to HEAD.
- name: Checkout complement
shell: bash
run: |
mkdir -p complement
# Attempt to use the version of complement which best matches the current
# build. Depending on whether this is a PR or release, etc. we need to
# use different fallbacks.
#
# 1. First check if there's a similarly named branch (GITHUB_HEAD_REF
# for pull requests, otherwise GITHUB_REF).
# 2. Attempt to use the base branch, e.g. when merging into release-vX.Y
# (GITHUB_BASE_REF for pull requests).
# 3. Use the default complement branch ("HEAD").
for BRANCH_NAME in "$GITHUB_HEAD_REF" "$GITHUB_BASE_REF" "${GITHUB_REF#refs/heads/}" "HEAD"; do
# Skip empty branch names and merge commits.
if [[ -z "$BRANCH_NAME" || $BRANCH_NAME =~ ^refs/pull/.* ]]; then
continue
fi
(wget -O - "https://github.com/matrix-org/complement/archive/$BRANCH_NAME.tar.gz" | tar -xz --strip-components=1 -C complement) && break
done
- run: |
set -o pipefail
COMPLEMENT_DIR=`pwd`/complement synapse/scripts-dev/complement.sh -json 2>&1 | gotestfmt
shell: bash
name: Run Complement Tests
# a job which marks all the other jobs as complete, thus allowing PRs to be merged.
tests-done:
if: ${{ always() }}
needs:
- check-sampleconfig
- lint
- lint-crlf
- lint-newsfile
- trial
- trial-olddeps
- sytest
- export-data
- portdb
- complement
runs-on: ubuntu-latest
steps:
- uses: matrix-org/done-action@v2
with:
needs: ${{ toJSON(needs) }}
# The newsfile lint may be skipped on non PR builds
skippable:
lint-newsfile

View File

@@ -1,116 +0,0 @@
name: Twisted Trunk
on:
schedule:
- cron: 0 8 * * *
workflow_dispatch:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
mypy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: matrix-org/setup-python-poetry@v1
with:
python-version: "3.x"
extras: "all"
- run: |
poetry remove twisted
poetry add --extras tls git+https://github.com/twisted/twisted.git#trunk
poetry install --no-interaction --extras "all test"
- name: Remove warn_unused_ignores from mypy config
run: sed '/warn_unused_ignores = True/d' -i mypy.ini
- run: poetry run mypy
trial:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- uses: matrix-org/setup-python-poetry@v1
with:
python-version: "3.x"
extras: "all test"
- run: |
poetry remove twisted
poetry add --extras tls git+https://github.com/twisted/twisted.git#trunk
poetry install --no-interaction --extras "all test"
- run: poetry run trial --jobs 2 tests
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
sytest:
runs-on: ubuntu-latest
container:
image: matrixdotorg/sytest-synapse:buster
volumes:
- ${{ github.workspace }}:/src
steps:
- uses: actions/checkout@v2
- name: Patch dependencies
# Note: The poetry commands want to create a virtualenv in /src/.venv/,
# but the sytest-synapse container expects it to be in /venv/.
# We symlink it before running poetry so that poetry actually
# ends up installing to `/venv`.
run: |
ln -s -T /venv /src/.venv
poetry remove twisted
poetry add --extras tls git+https://github.com/twisted/twisted.git#trunk
poetry install --no-interaction --extras "all test"
working-directory: /src
- name: Run SyTest
run: /bootstrap.sh synapse
working-directory: /src
env:
# Use offline mode to avoid reinstalling the pinned version of
# twisted.
OFFLINE: 1
- name: Summarise results.tap
if: ${{ always() }}
run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
- name: Upload SyTest logs
uses: actions/upload-artifact@v2
if: ${{ always() }}
with:
name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
path: |
/logs/results.tap
/logs/**/*.log*
# open an issue if the build fails, so we know about it.
open-issue:
if: failure()
needs:
- mypy
- trial
- sytest
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: JasonEtco/create-an-issue@5d9504915f79f9cc6d791934b8ef34f2353dd74d # v2.5.0, 2020-12-06
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
update_existing: true
filename: .ci/twisted_trunk_build_failed_issue_template.md

99
.gitignore vendored

@@ -1,62 +1,59 @@
# filename patterns
*~
*.pyc
.*.swp
.#*
*.deb
*.egg
*.egg-info
*~
*.lock
*.py[cod]
*.snap
*.tac
.DS_Store
_trial_temp/
_trial_temp*/
/out
.DS_Store
__pycache__/
logs/
dbs/
*.egg
dist/
docs/build/
*.egg-info
# We do want the poetry lockfile.
!poetry.lock
cmdclient_config.json
homeserver*.db
homeserver*.log
homeserver*.log.*
homeserver*.pid
homeserver*.yaml
# stuff that is likely to exist when you run a server locally
/*.db
/*.log
/*.log.*
/*.log.config
/*.pid
/.python-version
/*.signing.key
/env/
/.venv*/
/homeserver*.yaml
/logs
/media_store/
/uploads
*.signing.key
*.tls.crt
*.tls.dh
*.tls.key
# For direnv users
/.envrc
.coverage
htmlcov
# IDEs
/.idea/
/.ropeproject/
/.vscode/
demo/*/*.db
demo/*/*.log
demo/*/*.log.*
demo/*/*.pid
demo/media_store.*
demo/etc
# build products
!/.coveragerc
/.coverage*
/.mypy_cache/
/.tox
/.tox-pg-container
/build/
/coverage.*
/dist/
/docs/build/
/htmlcov
/pip-wheel-metadata/
uploads
cache
# docs
book/
.idea/
media_store/
# complement
/complement-*
/master.tar.gz
*.tac
build/
venv/
venv*/
*venv/
localhost-800*/
static/client/register/register_config.js
.tox
env/
*.config
.vscode/
.ropeproject/

73
.travis.yml Normal file

@@ -0,0 +1,73 @@
sudo: false
language: python
cache:
directories:
# we only bother to cache the wheels; parts of the http cache get
# invalidated every build (because they get served with a max-age of 600
# seconds), which means that we end up re-uploading the whole cache for
  # every build, which is time-consuming. In any case, it's not obvious that
# downloading the cache from S3 would be much faster than downloading the
# originals from pypi.
#
- $HOME/.cache/pip/wheels
# don't clone the whole repo history, one commit will do
git:
depth: 1
# only build branches we care about (PRs are built separately)
branches:
only:
- master
- develop
- /^release-v/
# When running the tox environments that call Twisted Trial, we can pass the -j
# flag to run the tests concurrently. We set this to 2 for CPU bound tests
# (SQLite) and 4 for I/O bound tests (PostgreSQL).
matrix:
fast_finish: true
include:
- python: 2.7
env: TOX_ENV=packaging
- python: 3.6
env: TOX_ENV="pep8,check_isort"
- python: 2.7
env: TOX_ENV=py27 TRIAL_FLAGS="-j 2"
- python: 2.7
env: TOX_ENV=py27-old TRIAL_FLAGS="-j 2"
- python: 2.7
env: TOX_ENV=py27-postgres TRIAL_FLAGS="-j 4"
services:
- postgresql
- python: 3.5
env: TOX_ENV=py35 TRIAL_FLAGS="-j 2"
- python: 3.6
env: TOX_ENV=py36 TRIAL_FLAGS="-j 2"
- python: 3.6
env: TOX_ENV=py36-postgres TRIAL_FLAGS="-j 4"
services:
- postgresql
- # we only need to check for the newsfragment if it's a PR build
if: type = pull_request
python: 3.6
env: TOX_ENV=check-newsfragment
script:
- git remote set-branches --add origin develop
- git fetch origin develop
- tox -e $TOX_ENV
install:
- pip install tox
script:
- tox -e $TOX_ENV


@@ -1,8 +1,34 @@
The following is an incomplete list of people outside the core team who have
contributed to Synapse. It is no longer maintained: more recent contributions
are listed in the `changelog <CHANGES.md>`_.
Erik Johnston <erik at matrix.org>
* HS core
* Federation API impl
----
Mark Haines <mark at matrix.org>
* HS core
* Crypto
* Content repository
* CS v2 API impl
Kegan Dougal <kegan at matrix.org>
* HS core
* CS v1 API impl
* AS API impl
Paul "LeoNerd" Evans <paul at matrix.org>
* HS core
* Presence
* Typing Notifications
* Performance metrics and caching layer
Dave Baker <dave at matrix.org>
* Push notifications
* Auth CS v2 impl
Matthew Hodgson <matthew at matrix.org>
* General doc & housekeeping
* Vertobot/vertobridge matrix<->verto PoC
Emmanuel Rohee <manu at matrix.org>
* Supporting iOS clients (testability and fallback registration)
Turned to Dust <dwinslow86 at gmail.com>
* ArchLinux installation instructions
@@ -36,16 +62,7 @@ Christoph Witzany <christoph at web.crofting.com>
* Add LDAP support for authentication
Pierre Jaury <pierre at jaury.eu>
* Docker packaging
* Docker packaging
Serban Constantin <serban.constantin at gmail dot com>
* Small bug fix
Joseph Weston <joseph at weston.cloud>
* Add admin API for querying HS version
Benjamin Saunders <ben.e.saunders at gmail dot com>
* Documentation improvements
Werner Sembach <werner.sembach at fau dot de>
* Automatically remove a group/community when it is empty
* Small bug fix

3684
CHANGES.md

File diff suppressed because it is too large


@@ -1,3 +0,0 @@
# Welcome to Synapse
Please see the [contributors' guide](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html) in our rendered documentation.

169
CONTRIBUTING.rst Normal file

@@ -0,0 +1,169 @@
Contributing code to Matrix
===========================
Everyone is welcome to contribute code to Matrix
(https://github.com/matrix-org), provided that they are willing to license
their contributions under the same license as the project itself. We follow a
simple 'inbound=outbound' model for contributions: the act of submitting an
'inbound' contribution means that the contributor agrees to license the code
under the same terms as the project's overall 'outbound' license - in our
case, this is almost always Apache Software License v2 (see LICENSE).
How to contribute
~~~~~~~~~~~~~~~~~
The preferred and easiest way to contribute changes to Matrix is to fork the
relevant project on github, and then create a pull request to ask us to pull
your changes into our repo
(https://help.github.com/articles/using-pull-requests/).
**The single biggest thing you need to know is: please base your changes on
the develop branch - /not/ master.**
We use the master branch to track the most recent release, so that folks who
blindly clone the repo and automatically check out master get something that
works. Develop is the unstable branch where all the development actually
happens: the workflow is that contributors should fork the develop branch to
make a 'feature' branch for a particular contribution, and then make a pull
request to merge this back into the matrix.org 'official' develop branch. We
use github's pull request workflow to review the contribution, and either ask
you to make any refinements needed or merge it and make them ourselves. The
changes will then land on master when we next do a release.
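A typical sequence for getting started looks something like the following (a
sketch only; substitute your own GitHub username and branch name)::

    git clone https://github.com/<your-username>/synapse.git
    cd synapse
    git checkout develop
    git checkout -b my-feature-branch

Work on your feature branch, push it to your fork, and then open the pull
request against the matrix.org ``develop`` branch as described above.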
We use `CircleCI <https://circleci.com/gh/matrix-org>`_ and `Travis CI
<https://travis-ci.org/matrix-org/synapse>`_ for continuous integration. All
pull requests to synapse get automatically tested by Travis and CircleCI.
If your change breaks the build, this will be shown in GitHub, so please
keep an eye on the pull request for feedback.
To run unit tests in a local development environment, you can use:
- ``tox -e py27`` (requires tox to be installed by ``pip install tox``) for
SQLite-backed Synapse on Python 2.7.
- ``tox -e py35`` for SQLite-backed Synapse on Python 3.5.
- ``tox -e py36`` for SQLite-backed Synapse on Python 3.6.
- ``tox -e py27-postgres`` for PostgreSQL-backed Synapse on Python 2.7
(requires a running local PostgreSQL with access to create databases).
- ``./test_postgresql.sh`` for PostgreSQL-backed Synapse on Python 2.7
(requires Docker). Entirely self-contained, recommended if you don't want to
set up PostgreSQL yourself.
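If you only want to run a subset of the unit tests, tox generally forwards any
extra arguments on to trial (a sketch, assuming the default tox configuration;
the module name is just an example)::

    tox -e py36 -- tests.api.test_filtering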
Docker images are available for running the integration tests (SyTest) locally,
see the `documentation in the SyTest repo
<https://github.com/matrix-org/sytest/blob/develop/docker/README.md>`_ for more
information.
Code style
~~~~~~~~~~
All Matrix projects have a well-defined code-style - and sometimes we've even
got as far as documenting it... For instance, synapse's code style doc lives
at https://github.com/matrix-org/synapse/tree/master/docs/code_style.rst.
Please ensure your changes match the cosmetic style of the existing project,
and **never** mix cosmetic and functional changes in the same commit, as it
makes it horribly hard to review otherwise.
Changelog
~~~~~~~~~
All changes, even minor ones, need a corresponding changelog / newsfragment
entry. These are managed by Towncrier
(https://github.com/hawkowl/towncrier).
To create a changelog entry, make a new file in the ``changelog.d``
directory named in the format of ``PRnumber.type``. The type can be
one of ``feature``, ``bugfix``, ``removal`` (also used for
deprecations), or ``misc`` (for internal-only changes). The content of
the file is your changelog entry, which can contain Markdown
formatting. Adding credits to the changelog is encouraged; we value
your contributions and would like to have you shouted out in the
release notes!
For example, a fix in PR #1234 would have its changelog entry in
``changelog.d/1234.bugfix``, and contain content like "The security levels of
Florbs are now validated when received over federation. Contributed by Jane
Matrix".
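In practice that is just a one-line file, for example (reusing the hypothetical
PR number and wording from above)::

    echo "The security levels of Florbs are now validated when received over federation. Contributed by Jane Matrix." > changelog.d/1234.bugfix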
Attribution
~~~~~~~~~~~
Everyone who contributes anything to Matrix is welcome to be listed in the
AUTHORS.rst file for the project in question. Please feel free to include a
change to AUTHORS.rst in your pull request to list yourself and a short
description of the area(s) you've worked on. Also, we sometimes have swag to
give away to contributors - if you feel that Matrix-branded apparel is missing
from your life, please mail us your shipping address to matrix at matrix.org and
we'll try to fix it :)
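An AUTHORS.rst entry typically looks something like the following (the name,
email and description here are purely illustrative)::

    Jane Matrix <jane at example.org>
     * Validation of Florb security levels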
Sign off
~~~~~~~~
In order to have a concrete record that your contribution is intentional
and you agree to license it under the same terms as the project's license, we've adopted the
same lightweight approach that the Linux Kernel
(https://www.kernel.org/doc/Documentation/SubmittingPatches), Docker
(https://github.com/docker/docker/blob/master/CONTRIBUTING.md), and many other
projects use: the DCO (Developer Certificate of Origin:
http://developercertificate.org/). This is a simple declaration that you wrote
the contribution or otherwise have the right to contribute it to Matrix::
Developer Certificate of Origin
Version 1.1
Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
660 York Street, Suite 102,
San Francisco, CA 94110 USA
Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.
Developer's Certificate of Origin 1.1
By making a contribution to this project, I certify that:
(a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or
(b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or
(c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.
(d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.
If you agree to this for your contribution, then all that's needed is to
include the line in your commit or pull request comment::
Signed-off-by: Your Name <your@email.example.org>
We accept contributions under a legally identifiable name, such as
your name on government documentation or common-law names (names
claimed by legitimate usage or repute). Unfortunately, we cannot
accept anonymous contributions at this time.
Git allows you to add this signoff automatically when using the ``-s``
flag to ``git commit``, which uses the name and email set in your
``user.name`` and ``user.email`` git configs.
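For example, to set those configs and then sign off a commit (the name, email
and message are illustrative)::

    git config user.name "Jane Matrix"
    git config user.email jane@example.org
    git commit -s -m "Validate Florb security levels over federation"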
Conclusion
~~~~~~~~~~
That's it! Matrix is a very open and collaborative project as you might expect
given our obsession with open communication. If we're going to successfully
matrix together all the fragmented communication technologies out there we are
reliant on contributions and collaboration from the community to do so. So
please get involved - and we hope you have as much fun hacking on Matrix as we
do!


@@ -1,7 +0,0 @@
# Installation Instructions
This document has moved to the
[Synapse documentation website](https://matrix-org.github.io/synapse/latest/setup/installation.html).
Please update your links.
The markdown source is available in [docs/setup/installation.md](docs/setup/installation.md).

40
MANIFEST.in Normal file

@@ -0,0 +1,40 @@
include synctl
include LICENSE
include VERSION
include *.rst
include *.md
include demo/README
include demo/demo.tls.dh
include demo/*.py
include demo/*.sh
recursive-include synapse/storage/schema *.sql
recursive-include synapse/storage/schema *.py
recursive-include docs *
recursive-include scripts *
recursive-include scripts-dev *
recursive-include synapse *.pyi
recursive-include tests *.py
recursive-include synapse/res *
recursive-include synapse/static *.css
recursive-include synapse/static *.gif
recursive-include synapse/static *.html
recursive-include synapse/static *.js
exclude Dockerfile
exclude .dockerignore
exclude test_postgresql.sh
include pyproject.toml
recursive-include changelog.d *
prune .github
prune demo/etc
prune docker
prune .circleci
prune .coveragerc
exclude jenkins*
recursive-exclude jenkins *.sh

File diff suppressed because it is too large


@@ -1,7 +1,319 @@
Upgrading Synapse
=================
This document has moved to the `Synapse documentation website <https://matrix-org.github.io/synapse/latest/upgrade>`_.
Please update your links.
Before upgrading, check whether any special steps are required to upgrade from
the version you currently have installed to the current version of Synapse. The
extra instructions that may be required are listed later in this document.
The markdown source is available in `docs/upgrade.md <docs/upgrade.md>`_.
1. If synapse was installed in a virtualenv then activate that virtualenv before
upgrading. If synapse is installed in a virtualenv in ``~/.synapse/`` then
run:
.. code:: bash
source ~/.synapse/bin/activate
2. If synapse was installed using pip then upgrade to the latest version by
running:
.. code:: bash
pip install --upgrade --process-dependency-links matrix-synapse
# restart synapse
synctl restart
If synapse was installed using git then upgrade to the latest version by
running:
.. code:: bash
# Pull the latest version of the master branch.
git pull
# Update the versions of synapse's python dependencies.
python synapse/python_dependencies.py | xargs pip install --upgrade
# restart synapse
./synctl restart
To check whether your update was successful, you can check the Server header
returned by the Client-Server API:
.. code:: bash
# replace <host.name> with the hostname of your synapse homeserver.
# You may need to specify a port (eg, :8448) if your server is not
# configured on port 443.
curl -kv https://<host.name>/_matrix/client/versions 2>&1 | grep "Server:"
Upgrading to v0.33.7
====================
This release removes the example email notification templates from
``res/templates`` (they are now internal to the python package). This should
only affect you if you (a) deploy your Synapse instance from a git checkout or
a github snapshot URL, and (b) have email notifications enabled.
If you have email notifications enabled, you should ensure that
``email.template_dir`` is either configured to point at a directory where you
have installed customised templates, or leave it unset to use the default
templates.
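For example, a customised template directory would be configured along these
lines in ``homeserver.yaml`` (a sketch only; the path is illustrative)::

    email:
      template_dir: "/etc/matrix-synapse/custom_templates"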
Upgrading to v0.27.3
====================
This release expands the anonymous usage stats sent if the opt-in
``report_stats`` configuration is set to ``true``. We now capture RSS memory
and cpu use at a very coarse level. This requires administrators to install
the optional ``psutil`` python module.
We would appreciate it if you could assist by ensuring this module is available
and ``report_stats`` is enabled. This will let us see if performance changes to
synapse are having an impact on the general community.
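To make the module available, install it into the same environment that synapse
runs in, for example (assuming the ``~/.synapse/`` virtualenv mentioned in the
general upgrade instructions above)::

    source ~/.synapse/bin/activate
    pip install psutil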
Upgrading to v0.15.0
====================
If you want to use the new URL previewing API (/_matrix/media/r0/preview_url)
then you have to explicitly enable it in the config and update your
dependencies. See README.rst for details.
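A minimal sketch of the relevant ``homeserver.yaml`` settings (see README.rst
for the authoritative details, in particular the full recommended blacklist)::

    url_preview_enabled: true
    url_preview_ip_range_blacklist:
      - '127.0.0.0/8'
      - '10.0.0.0/8'
      - '192.168.0.0/16'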
Upgrading to v0.11.0
====================
This release includes the option to send anonymous usage stats to matrix.org,
and requires that administrators explicitly opt in or out by setting the
``report_stats`` option to either ``true`` or ``false``.
We would really appreciate it if you could help our project out by reporting
anonymized usage statistics from your homeserver. Only very basic aggregate
data (e.g. number of users) will be reported, but it helps us to track the
growth of the Matrix community, and helps us to make Matrix a success, as well
as to convince other networks that they should peer with us.
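For example, to opt in, set the following in ``homeserver.yaml``::

    report_stats: true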
Upgrading to v0.9.0
===================
Application services have had a breaking API change in this version.
They can no longer register themselves with a home server using the AS HTTP API. This
decision was made because a compromised application service with free rein to register
any regex in effect grants full read/write access to the home server if a regex of ``.*``
is used. An attack where a compromised AS re-registers itself with ``.*`` was deemed too
big of a security risk to ignore, and so the ability to register with the HS remotely has
been removed.
It has been replaced by specifying a list of application service registrations in
``homeserver.yaml``::
app_service_config_files: ["registration-01.yaml", "registration-02.yaml"]
Where ``registration-01.yaml`` looks like::
url: <String> # e.g. "https://my.application.service.com"
as_token: <String>
hs_token: <String>
sender_localpart: <String> # This is a new field which denotes the user_id localpart when using the AS token
namespaces:
users:
- exclusive: <Boolean>
regex: <String> # e.g. "@prefix_.*"
aliases:
- exclusive: <Boolean>
regex: <String>
rooms:
- exclusive: <Boolean>
regex: <String>
Upgrading to v0.8.0
===================
Servers which use captchas will need to add their public key to::
static/client/register/register_config.js
window.matrixRegistrationConfig = {
recaptcha_public_key: "YOUR_PUBLIC_KEY"
};
This is required in order to support registration fallback (typically used on
mobile devices).
Upgrading to v0.7.0
===================
New dependencies are:
- pydenticon
- simplejson
- syutil
- matrix-angular-sdk
To pull in these dependencies in a virtual env, run::
python synapse/python_dependencies.py | xargs -n 1 pip install
Upgrading to v0.6.0
===================
To pull in new dependencies, run::
python setup.py develop --user
This update includes a change to the database schema. To upgrade you first need
to upgrade the database by running::
python scripts/upgrade_db_to_v0.6.0.py <db> <server_name> <signing_key>
Where `<db>` is the location of the database, `<server_name>` is the
server name as specified in the synapse configuration, and `<signing_key>` is
the location of the signing key as specified in the synapse configuration.
This may take some time to complete. Failures of signatures and content hashes
can safely be ignored.
Upgrading to v0.5.1
===================
Depending on precisely when you installed v0.5.0 you may have ended up with
a stale release of the reference matrix webclient installed as a python module.
To uninstall it and ensure you are depending on the latest module, please run::
$ pip uninstall syweb
Upgrading to v0.5.0
===================
The webclient has been split out into a separate repository/package in this
release. Before you restart your homeserver you will need to pull in the
webclient package by running::
python setup.py develop --user
This release completely changes the database schema and so requires upgrading
it before starting the new version of the homeserver.
The script "database-prepare-for-0.5.0.sh" should be used to upgrade the
database. This will save all user information, such as logins and profiles,
but will otherwise purge the database. This includes messages, which
rooms the home server was a member of, and room alias mappings.
If you would like to keep your history, please take a copy of your database
file and ask for help in #matrix:matrix.org. The upgrade process is,
unfortunately, non-trivial and requires human intervention to resolve any
resulting conflicts during the upgrade process.
Before running the command, the homeserver should first be completely
shut down. To run it, simply specify the location of the database, e.g.:
./scripts/database-prepare-for-0.5.0.sh "homeserver.db"
Once this has successfully completed it will be safe to restart the
homeserver. You may notice that the homeserver takes a few seconds longer to
restart than usual as it reinitializes the database.
On startup of the new version, users can either rejoin remote rooms using room
aliases or by being reinvited. Alternatively, if any other homeserver sends a
message to a room that the homeserver was previously in, the local HS will
automatically rejoin the room.
Upgrading to v0.4.0
===================
This release needs an updated syutil version. Run::
python setup.py develop
You will also need to upgrade your configuration as the signing key format has
changed. Run::
python -m synapse.app.homeserver --config-path <CONFIG> --generate-config
Upgrading to v0.3.0
===================
The registration API now closely matches the login API. This introduces a bit
more backwards and forwards between the HS and the client, but this improves
the overall flexibility of the API. You can now GET on /register to retrieve a list
of valid registration flows. Upon choosing one, they are submitted in the same
way as login, e.g::
{
  "type": "m.login.password",
  "user": "foo",
  "password": "bar"
}
The default HS supports 2 flows, with and without Identity Server email
authentication. Enabling captcha on the HS will add in an extra step to all
flows: ``m.login.recaptcha`` which must be completed before you can transition
to the next stage. There is a new login type: ``m.login.email.identity`` which
contains the ``threepidCreds`` key which was previously sent in the original
register request. For more information on this, see the specification.
Web Client
----------
The VoIP specification has changed between v0.2.0 and v0.3.0. Users should
refresh any browser tabs to get the latest web client code. Users on
v0.2.0 of the web client will not be able to call those on v0.3.0 and
vice versa.
Upgrading to v0.2.0
===================
The home server now requires setting up of SSL config before it can run. To
automatically generate default config use::
$ python synapse/app/homeserver.py \
--server-name machine.my.domain.name \
--bind-port 8448 \
--config-path homeserver.config \
--generate-config
This config can be edited if desired, for example to specify a different SSL
certificate to use. Once done you can run the home server using::
$ python synapse/app/homeserver.py --config-path homeserver.config
See the README.rst for more information.
Also note that some config options have been renamed, including:
- "host" to "server-name"
- "database" to "database-path"
- "port" to "bind-port" and "unsecure-port"
Upgrading to v0.0.1
===================
This release completely changes the database schema and so requires upgrading
it before starting the new version of the homeserver.
The script "database-prepare-for-0.0.1.sh" should be used to upgrade the
database. This will save all user information, such as logins and profiles,
but will otherwise purge the database. This includes messages, which
rooms the home server was a member of, and room alias mappings.
Before running the command, the homeserver should first be completely
shut down. To run it, simply specify the location of the database, e.g.:
./scripts/database-prepare-for-0.0.1.sh "homeserver.db"
Once this has successfully completed it will be safe to restart the
homeserver. You may notice that the homeserver takes a few seconds longer to
restart than usual as it reinitializes the database.
On startup of the new version, users can either rejoin remote rooms using room
aliases or by being reinvited. Alternatively, if any other homeserver sends a
message to a room that the homeserver was previously in, the local HS will
automatically rejoin the room.


@@ -1,39 +0,0 @@
# Documentation for possible options in this file is at
# https://rust-lang.github.io/mdBook/format/config.html
[book]
title = "Synapse"
authors = ["The Matrix.org Foundation C.I.C."]
language = "en"
multilingual = false
# The directory that documentation files are stored in
src = "docs"
[build]
# Prevent markdown pages from being automatically generated when they're
# linked to in SUMMARY.md
create-missing = false
[output.html]
# The URL visitors will be directed to when they try to edit a page
edit-url-template = "https://github.com/matrix-org/synapse/edit/develop/{path}"
# Remove the numbers that appear before each item in the sidebar, as they can
# get quite messy as we nest deeper
no-section-label = true
# The source code URL of the repository
git-repository-url = "https://github.com/matrix-org/synapse"
# The path that the docs are hosted on
site-url = "/synapse/"
# Additional HTML, JS, CSS that's injected into each page of the book.
# More information available in docs/website_files/README.md
additional-css = [
"docs/website_files/table-of-contents.css",
"docs/website_files/remove-nav-buttons.css",
"docs/website_files/indent-section-headers.css",
]
additional-js = ["docs/website_files/table-of-contents.js"]
theme = "docs/website_files/theme"


@@ -1 +0,0 @@
Improve event caching mechanism to avoid having multiple copies of an event in memory at a time.


@@ -1 +0,0 @@
Add some type hints to datastore.


@@ -1 +0,0 @@
Preparation for faster-room-join work: return subsets of room state which we already have, immediately.


@@ -1 +0,0 @@
Measure the time taken in spam-checking callbacks and expose those measurements as metrics.


@@ -1 +0,0 @@
Replace string literal instances of stream key types with typed constants.


@@ -1 +0,0 @@
Add `@cancellable` decorator, for use on endpoint methods that can be cancelled when clients disconnect.


@@ -1 +0,0 @@
Add ability to cancel disconnected requests to `SynapseRequest`.


@@ -1 +0,0 @@
Add a `default_power_level_content_override` config option to set default room power levels per room preset.


@@ -1 +0,0 @@
Add support for [MSC3787: Allowing knocks to restricted rooms](https://github.com/matrix-org/matrix-spec-proposals/pull/3787).


@@ -1 +0,0 @@
Add a helper class for testing request cancellation.


@@ -1 +0,0 @@
Synapse will now reload [cache config](https://matrix-org.github.io/synapse/latest/usage/configuration/config_documentation.html#caching) when it receives a [SIGHUP](https://en.wikipedia.org/wiki/SIGHUP) signal.


@@ -1 +0,0 @@
Improve documentation of the `synapse.push` module.


@@ -1 +0,0 @@
Refactor functions on `PushRuleEvaluatorForEvent`.


@@ -1 +0,0 @@
Preparation for database schema simplifications: stop writing to `event_reference_hashes`.


@@ -1 +0,0 @@
Remove code which updates unused database column `application_services_state.last_txn`.


@@ -1 +0,0 @@
Fix a bug introduced in Synapse 1.57.0 where `/messages` would throw a 500 error when querying for a non-existent room.


@@ -1 +0,0 @@
Refactor `EventContext` class.


@@ -1 +0,0 @@
Remove an unneeded class in the push code.


@@ -1 +0,0 @@
Consolidate parsing of relation information from events.


@@ -1 +0,0 @@
Capture the `Deferred` for request cancellation in `_AsyncResource`.


@@ -1 +0,0 @@
Fixes an incorrect type hint for `Filter._check_event_relations`.


@@ -1 +0,0 @@
Fix a long-standing bug where an empty room would be created when a user with an insufficient power level tried to upgrade a room.


@@ -1 +0,0 @@
Respect the `@cancellable` flag for `DirectServe{Html,Json}Resource`s.


@@ -1 +0,0 @@
Respect the `@cancellable` flag for `RestServlet`s and `BaseFederationServlet`s.


@@ -1 +0,0 @@
Respect the `@cancellable` flag for `ReplicationEndpoint`s.


@@ -1 +0,0 @@
Add a config option to allow for auto-tuning of caches.


@@ -1 +0,0 @@
Complain if a federation endpoint has the `@cancellable` flag, since some of the wrapper code may not handle cancellation correctly yet.


@@ -1 +0,0 @@
Enable cancellation of `GET /rooms/$room_id/members`, `GET /rooms/$room_id/state` and `GET /rooms/$room_id/state/$event_type/*` requests.


@@ -1 +0,0 @@
Require a body in POST requests to `/rooms/{roomId}/receipt/{receiptType}/{eventId}`, as required by the [Matrix specification](https://spec.matrix.org/v1.2/client-server-api/#post_matrixclientv3roomsroomidreceiptreceipttypeeventid). This breaks compatibility with Element Android 1.2.0 and earlier: users of those clients will be unable to send read receipts.


@@ -1 +0,0 @@
Optimize private read receipt filtering.


@@ -1 +0,0 @@
Fix a bug introduced in Synapse 1.30.0 where empty rooms could be automatically created if a monthly active users limit is set.


@@ -1 +0,0 @@
Fix a typo in the Media Admin API documentation.


@@ -1 +0,0 @@
Add type annotations to increase the number of modules passing `disallow-untyped-defs`.


@@ -1 +0,0 @@
Add some type hints to datastore.


@@ -1 +0,0 @@
Drop the logging level of status messages for the URL preview cache expiry job from INFO to DEBUG.


@@ -1 +0,0 @@
Fix push to dismiss notifications when read on another client. Contributed by @SpiritCroc @ Beeper.


@@ -1 +0,0 @@
Downgrade some OIDC errors to warnings in the logs, to reduce the noise of Sentry reports.


@@ -1 +0,0 @@
Add type annotations to increase the number of modules passing `disallow-untyped-defs`.


@@ -1 +0,0 @@
Update the OpenID Connect example for Keycloak to be compatible with newer versions of Keycloak. Contributed by @nhh.


@@ -1 +0,0 @@
Update configs used by Complement to allow more invites/3PID validations during tests.


@@ -1 +0,0 @@
Tidy up and type-hint the database engine modules.


@@ -1 +0,0 @@
Fix typo in server listener documentation.


@@ -1 +0,0 @@
Fix poor database performance when reading the cache invalidation stream for large servers with lots of workers.


@@ -1 +0,0 @@
Link to the configuration manual from the welcome page of the documentation.


@@ -1 +0,0 @@
Fix typo in 'run_background_tasks_on' option name in configuration manual documentation.


@@ -1 +0,0 @@
Add some type hints to datastore.


@@ -1 +0,0 @@
Add information regarding the `rc_invites` ratelimiting option to the configuration docs.


@@ -1 +0,0 @@
Add documentation for cancellation of request processing.


@@ -1 +0,0 @@
Fix a long-standing bug where the user directory background process would fail to make forward progress if a user included a null codepoint in their display name or avatar.


@@ -1 +0,0 @@
Recommend using docker to run tests against postgres.


@@ -1 +0,0 @@
Tweak the mypy plugin so that `@cached` can accept `on_invalidate=None`.


@@ -1 +0,0 @@
Delete events from the `federation_inbound_events_staging` table when a room is purged through the admin API.


@@ -1 +0,0 @@
Move methods that call `add_push_rule` to the `PushRuleStore` class.


@@ -1 +0,0 @@
Make handling of federation Authorization header (more) compliant with RFC7230.


@@ -1 +0,0 @@
Refactor `resolve_state_groups_for_events` to not pull out full state when no state resolution happens.


@@ -1 +0,0 @@
Give a meaningful error message when a client tries to create a room with an invalid alias localpart.


@@ -1 +0,0 @@
Do not keep going if there are 5 back-to-back background update failures.


@@ -1 +0,0 @@
Fix federation when using the demo scripts.

Some files were not shown because too many files have changed in this diff.