
Compare commits

..

4 Commits

Author           SHA1        Message                                              Date
David Robertson  4d343db081  Get rid of my home dir, whoops                       2021-11-16 16:37:43 +00:00
David Robertson  a1367dcf8c  Require networkx                                     2021-11-16 16:34:33 +00:00
David Robertson  9e361c8550  Changelog                                            2021-11-16 13:52:59 +00:00
David Robertson  51fec1a534  Commit hacky script to visualise store inheritance  2021-11-16 13:51:50 +00:00
                             (commit message body: "Use e.g. with `scripts-dev/storage_inheritance.py DataStore --show`.")
879 changed files with 31899 additions and 68676 deletions
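The last commit above adds a `scripts-dev/storage_inheritance.py` tool (hence the "Require networkx" commit), but the script itself is not reproduced in the truncated diff below. The following is only a rough sketch of the general idea, not the actual script; the module path, the traversal and the use of matplotlib are all assumptions:

#!/usr/bin/env python3
"""Rough sketch: draw the inheritance graph of a Synapse storage class.

This is NOT the script added by the commit above (which is not shown in this
diff); it only illustrates using networkx for class-inheritance visualisation.
"""
import argparse
import importlib

import networkx as nx


def build_graph(cls: type) -> nx.DiGraph:
    """Add an edge from each class to its direct bases, recursively."""
    graph = nx.DiGraph()
    todo = [cls]
    while todo:
        current = todo.pop()
        for base in current.__bases__:
            if not graph.has_edge(current.__name__, base.__name__):
                graph.add_edge(current.__name__, base.__name__)
                todo.append(base)
    return graph


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("class_name", help="e.g. DataStore")
    parser.add_argument("--show", action="store_true")
    args = parser.parse_args()

    # Assumption: DataStore lives in synapse.storage.databases.main.
    module = importlib.import_module("synapse.storage.databases.main")
    graph = build_graph(getattr(module, args.class_name))

    if args.show:
        import matplotlib.pyplot as plt

        nx.draw_networkx(graph, with_labels=True, node_size=300, font_size=8)
        plt.show()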

.ci/latest_deps_build_failed_issue_template.md (deleted)

@@ -1,4 +0,0 @@
---
title: CI run against latest deps is failing
---
See https://github.com/{{env.GITHUB_REPOSITORY}}/actions/runs/{{env.GITHUB_RUN_ID}}

.ci/patch_for_twisted_trunk.sh (new executable file)

@@ -0,0 +1,8 @@
#!/bin/sh
# replaces the dependency on Twisted in `python_dependencies` with trunk.
set -e
cd "$(dirname "$0")"/..
sed -i -e 's#"Twisted.*"#"Twisted @ git+https://github.com/twisted/twisted"#' synapse/python_dependencies.py
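The `sed` expression above rewrites whatever quoted Twisted requirement appears in `synapse/python_dependencies.py` into a git reference to Twisted trunk. A small Python illustration of the same substitution, using a made-up requirement string rather than the real file contents:

import re

# Hypothetical example line; the actual requirement string in
# synapse/python_dependencies.py may differ.
line = '"Twisted[tls]>=18.9.0",'

# Same substitution the sed expression performs: replace the whole quoted
# Twisted requirement with a git reference to Twisted trunk.
patched = re.sub(
    r'"Twisted.*"',
    '"Twisted @ git+https://github.com/twisted/twisted"',
    line,
)
print(patched)  # "Twisted @ git+https://github.com/twisted/twisted",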

.ci/scripts/test_export_data_command.sh

@@ -2,24 +2,29 @@
# Test for the export-data admin command against sqlite and postgres
-# Expects Synapse to have been already installed with `poetry install --extras postgres`.
-# Expects `poetry` to be available on the `PATH`.
set -xe
cd "$(dirname "$0")/../.."
+echo "--- Install dependencies"
+# Install dependencies for this test.
+pip install psycopg2
+# Install Synapse itself. This won't update any libraries.
+pip install -e .
echo "--- Generate the signing key"
# Generate the server's signing key.
-poetry run synapse_homeserver --generate-keys -c .ci/sqlite-config.yaml
+python -m synapse.app.homeserver --generate-keys -c .ci/sqlite-config.yaml
echo "--- Prepare test database"
# Make sure the SQLite3 database is using the latest schema and has no pending background update.
-poetry run update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
+scripts/update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
# Run the export-data command on the sqlite test database
-poetry run python -m synapse.app.admin_cmd -c .ci/sqlite-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
+python -m synapse.app.admin_cmd -c .ci/sqlite-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
--output-directory /tmp/export_data
# Test that the output directory exists and contains the rooms directory
@@ -32,14 +37,14 @@ else
fi
# Create the PostgreSQL database.
-poetry run .ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
+.ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
# Port the SQLite databse to postgres so we can check command works against postgres
echo "+++ Port SQLite3 databse to postgres"
-poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
+scripts/synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
# Run the export-data command on postgres database
-poetry run python -m synapse.app.admin_cmd -c .ci/postgres-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
+python -m synapse.app.admin_cmd -c .ci/postgres-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
--output-directory /tmp/export_data2
# Test that the output directory exists and contains the rooms directory

.ci/scripts/test_old_deps.sh

@@ -1,81 +1,16 @@
#!/usr/bin/env bash
-# this script is run by GitHub Actions in a plain `focal` container; it
-# - installs the minimal system requirements, and poetry;
-# - patches the project definition file to refer to old versions only;
-# - creates a venv with these old versions using poetry; and finally
-# - invokes `trial` to run the tests with old deps.
-# Prevent tzdata from asking for user input
-export DEBIAN_FRONTEND=noninteractive
+# this script is run by GitHub Actions in a plain `bionic` container; it installs the
+# minimal requirements for tox and hands over to the py3-old tox environment.
set -ex
apt-get update
-apt-get install -y \
-    python3 python3-dev python3-pip python3-venv pipx \
-    libxml2-dev libxslt-dev xmlsec1 zlib1g-dev libjpeg-dev libwebp-dev
+apt-get install -y python3 python3-dev python3-pip libxml2-dev libxslt-dev xmlsec1 zlib1g-dev tox
export LANG="C.UTF-8"
# Prevent virtualenv from auto-updating pip to an incompatible version
export VIRTUALENV_NO_DOWNLOAD=1
-# TODO: in the future, we could use an implementation of
-# https://github.com/python-poetry/poetry/issues/3527
-# https://github.com/pypa/pip/issues/8085
-# to select the lowest possible versions, rather than resorting to this sed script.
-# Patch the project definitions in-place:
-# - Replace all lower and tilde bounds with exact bounds
-# - Make the pyopenssl 17.0, which is the oldest version that works with
-#   a `cryptography` compiled against OpenSSL 1.1.
-# - Delete all lines referring to psycopg2 --- so no testing of postgres support.
-# - Omit systemd: we're not logging to journal here.
-# TODO: also replace caret bounds, see https://python-poetry.org/docs/dependency-specification/#version-constraints
-# We don't use these yet, but IIRC they are the default bound used when you `poetry add`.
-# The sed expression 's/\^/==/g' ought to do the trick. But it would also change
-# `python = "^3.7"` to `python = "==3.7", which would mean we fail because olddeps
-# runs on 3.8 (#12343).
-sed -i \
-    -e "s/[~>]=/==/g" \
-    -e "/psycopg2/d" \
-    -e 's/pyOpenSSL = "==16.0.0"/pyOpenSSL = "==17.0.0"/' \
-    -e '/systemd/d' \
-    pyproject.toml
-# Use poetry to do the installation. This ensures that the versions are all mutually
-# compatible (as far the package metadata declares, anyway); pip's package resolver
-# is more lax.
-#
-# Rather than `poetry install --no-dev`, we drop all dev dependencies from the
-# toml file. This means we don't have to ensure compatibility between old deps and
-# dev tools.
-pip install --user toml
-REMOVE_DEV_DEPENDENCIES="
-import toml
-with open('pyproject.toml', 'r') as f:
-    data = toml.loads(f.read())
-del data['tool']['poetry']['dev-dependencies']
-with open('pyproject.toml', 'w') as f:
-    toml.dump(data, f)
-"
-python3 -c "$REMOVE_DEV_DEPENDENCIES"
-pipx install poetry==1.1.12
-~/.local/bin/poetry lock
-echo "::group::Patched pyproject.toml"
-cat pyproject.toml
-echo "::endgroup::"
-echo "::group::Lockfile after patch"
-cat poetry.lock
-echo "::endgroup::"
-~/.local/bin/poetry install -E "all test"
-~/.local/bin/poetry run trial --jobs=2 tests
+exec tox -e py3-old,combine
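The comments in the longer, removed version of this script describe the core trick: textually rewriting `pyproject.toml` so that lower and tilde bounds become exact pins, while psycopg2 and systemd entries are dropped. A rough Python illustration of that rewrite, on made-up requirement lines rather than the real `pyproject.toml`:

import re

# Hypothetical pyproject.toml lines; the actual entries may differ.
lines = [
    'attrs = ">=19.2.0"',
    'bleach = "~=4.1"',
    'psycopg2 = ">=2.8"',
]

patched = []
for line in lines:
    if "psycopg2" in line:
        # The sed script drops psycopg2 entirely (no postgres testing).
        continue
    # Turn lower/tilde bounds into exact pins, as `s/[~>]=/==/g` does.
    patched.append(re.sub(r"[~>]=", "==", line))

print(patched)  # ['attrs = "==19.2.0"', 'bleach = "==4.1"']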

.ci/scripts/test_synapse_port_db.sh

@@ -1,37 +1,41 @@
#!/usr/bin/env bash
#
# Test script for 'synapse_port_db'.
-# - configures synapse and a postgres server.
+# - sets up synapse and deps
# - runs the port script on a prepopulated test sqlite db
# - also runs it against an new sqlite db
-#
-# Expects Synapse to have been already installed with `poetry install --extras postgres`.
-# Expects `poetry` to be available on the `PATH`.
set -xe
cd "$(dirname "$0")/../.."
+echo "--- Install dependencies"
+# Install dependencies for this test.
+pip install psycopg2 coverage coverage-enable-subprocess
+# Install Synapse itself. This won't update any libraries.
+pip install -e .
echo "--- Generate the signing key"
# Generate the server's signing key.
-poetry run synapse_homeserver --generate-keys -c .ci/sqlite-config.yaml
+python -m synapse.app.homeserver --generate-keys -c .ci/sqlite-config.yaml
echo "--- Prepare test database"
# Make sure the SQLite3 database is using the latest schema and has no pending background update.
-poetry run update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
+scripts/update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
# Create the PostgreSQL database.
-poetry run .ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
+.ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
echo "+++ Run synapse_port_db against test database"
-# TODO: this invocation of synapse_port_db (and others below) used to be prepended with `coverage run`,
-# but coverage seems unable to find the entrypoints installed by `pip install -e .`.
-poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
+coverage run scripts/synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
# We should be able to run twice against the same database.
echo "+++ Run synapse_port_db a second time"
-poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
+coverage run scripts/synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
#####
@@ -42,12 +46,12 @@ echo "--- Prepare empty SQLite database"
# we do this by deleting the sqlite db, and then doing the same again.
rm .ci/test_db.db
-poetry run update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
+scripts/update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
# re-create the PostgreSQL database.
-poetry run .ci/scripts/postgres_exec.py \
+.ci/scripts/postgres_exec.py \
"DROP DATABASE synapse" \
"CREATE DATABASE synapse"
echo "+++ Run synapse_port_db against empty database"
-poetry run synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
+coverage run scripts/synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
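Both variants of this script shell out to a small `.ci/scripts/postgres_exec.py` helper to run statements such as `CREATE DATABASE synapse`; the helper itself is not part of this diff. A minimal sketch of that kind of helper, with connection details that are assumptions rather than values taken from the repository:

#!/usr/bin/env python3
"""Run each command-line argument as a SQL statement against a local Postgres."""
import sys

import psycopg2

# Assumed connection settings for the CI Postgres container; the real helper
# may read these from environment variables instead.
conn = psycopg2.connect(
    host="localhost", user="postgres", password="postgres", dbname="postgres"
)
# CREATE/DROP DATABASE cannot run inside a transaction block.
conn.autocommit = True

with conn.cursor() as cursor:
    for statement in sys.argv[1:]:
        cursor.execute(statement)

conn.close()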

.dockerignore

@@ -3,9 +3,11 @@
# things to include
!docker
+!scripts
!synapse
+!MANIFEST.in
!README.rst
-!pyproject.toml
-!poetry.lock
+!setup.py
+!synctl
**/__pycache__

.flake8 (deleted)

@@ -1,11 +0,0 @@
# TODO: incorporate this into pyproject.toml if flake8 supports it in the future.
# See https://github.com/PyCQA/flake8/issues/234
[flake8]
# see https://pycodestyle.readthedocs.io/en/latest/intro.html#error-codes
# for error codes. The ones we ignore are:
# W503: line break before binary operator
# W504: line break after binary operator
# E203: whitespace before ':' (which is contrary to pep8?)
# E731: do not assign a lambda expression, use a def
# E501: Line too long (black enforces this for us)
ignore=W503,W504,E203,E731,E501

@@ -8,7 +8,6 @@
- Use markdown where necessary, mostly for `code blocks`.
- End with either a period (.) or an exclamation mark (!).
- Start with a capital letter.
-- Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry.
* [ ] Pull request includes a [sign off](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#sign-off)
* [ ] [Code style](https://matrix-org.github.io/synapse/latest/code_style.html) is correct
(run the [linters](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))

@@ -5,7 +5,7 @@ name: Build docker images
on:
push:
tags: ["v*"]
-branches: [ master, main, develop ]
+branches: [ master, main ]
workflow_dispatch:
permissions:
@@ -36,22 +36,37 @@ jobs:
- name: Calculate docker image tag
id: set-tag
-uses: docker/metadata-action@master
+run: |
+case "${GITHUB_REF}" in
+refs/heads/master|refs/heads/main)
+tag=latest
+;;
+refs/tags/*)
+tag=${GITHUB_REF#refs/tags/}
+;;
+*)
+tag=${GITHUB_SHA}
+;;
+esac
+echo "::set-output name=tag::$tag"
+# for release builds, we want to get the amd64 image out asap, so first
+# we do an amd64-only build, before following up with a multiarch build.
+- name: Build and push amd64
+uses: docker/build-push-action@v2
+if: "${{ startsWith(github.ref, 'refs/tags/v') }}"
with:
-images: matrixdotorg/synapse
-flavor: |
-latest=false
-tags: |
-type=raw,value=develop,enable=${{ github.ref == 'refs/heads/develop' }}
-type=raw,value=latest,enable=${{ github.ref == 'refs/heads/master' }}
-type=raw,value=latest,enable=${{ github.ref == 'refs/heads/main' }}
-type=pep440,pattern={{raw}}
+push: true
+labels: "gitsha1=${{ github.sha }}"
+tags: "matrixdotorg/synapse:${{ steps.set-tag.outputs.tag }}"
+file: "docker/Dockerfile"
+platforms: linux/amd64
- name: Build and push all platforms
uses: docker/build-push-action@v2
with:
push: true
labels: "gitsha1=${{ github.sha }}"
-tags: "${{ steps.set-tag.outputs.tags }}"
+tags: "matrixdotorg/synapse:${{ steps.set-tag.outputs.tag }}"
file: "docker/Dockerfile"
platforms: linux/amd64,linux/arm64
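The shell `case` block in the set-tag step above is just a mapping from `GITHUB_REF` to a single docker tag. A compact Python restatement of that mapping, with a made-up function name, purely for readability:

def docker_tag(github_ref: str, github_sha: str) -> str:
    """Mirror the shell `case` statement from the set-tag step."""
    if github_ref in ("refs/heads/master", "refs/heads/main"):
        return "latest"
    if github_ref.startswith("refs/tags/"):
        return github_ref[len("refs/tags/"):]
    return github_sha

assert docker_tag("refs/heads/master", "abc123") == "latest"
assert docker_tag("refs/tags/v1.2.3", "abc123") == "v1.2.3"
assert docker_tag("refs/heads/develop", "abc123") == "abc123"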

@@ -22,7 +22,7 @@ jobs:
- name: Setup mdbook
uses: peaceiris/actions-mdbook@4b5ef36b314c2599664ca107bb8c02412548d79d # v1.1.14
with:
-mdbook-version: '0.4.17'
+mdbook-version: '0.4.9'
- name: Build the documentation
# mdbook will only create an index.html if we're including docs/README.md in SUMMARY.md.


@@ -1,159 +0,0 @@
# People who are freshly `pip install`ing from PyPI will pull in the latest versions of
# dependencies which match the broad requirements. Since most CI runs are against
# the locked poetry environment, run specifically against the latest dependencies to
# know if there's an upcoming breaking change.
#
# As an overview this workflow:
# - checks out develop,
# - installs from source, pulling in the dependencies like a fresh `pip install` would, and
# - runs mypy and test suites in that checkout.
#
# Based on the twisted trunk CI job.
name: Latest dependencies
on:
schedule:
- cron: 0 7 * * *
workflow_dispatch:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
mypy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
# The dev dependencies aren't exposed in the wheel metadata (at least with current
# poetry-core versions), so we install with poetry.
- uses: matrix-org/setup-python-poetry@v1
with:
python-version: "3.x"
poetry-version: "1.2.0b1"
extras: "all"
# Dump installed versions for debugging.
- run: poetry run pip list > before.txt
# Upgrade all runtime dependencies only. This is intended to mimic a fresh
# `pip install matrix-synapse[all]` as closely as possible.
- run: poetry update --no-dev
- run: poetry run pip list > after.txt && (diff -u before.txt after.txt || true)
- name: Remove warn_unused_ignores from mypy config
run: sed '/warn_unused_ignores = True/d' -i mypy.ini
- run: poetry run mypy
trial:
runs-on: ubuntu-latest
strategy:
matrix:
include:
- database: "sqlite"
- database: "postgres"
postgres-version: "14"
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- name: Set up PostgreSQL ${{ matrix.postgres-version }}
if: ${{ matrix.postgres-version }}
run: |
docker run -d -p 5432:5432 \
-e POSTGRES_PASSWORD=postgres \
-e POSTGRES_INITDB_ARGS="--lc-collate C --lc-ctype C --encoding UTF8" \
postgres:${{ matrix.postgres-version }}
- uses: actions/setup-python@v2
with:
python-version: "3.x"
- run: pip install .[all,test]
- name: Await PostgreSQL
if: ${{ matrix.postgres-version }}
timeout-minutes: 2
run: until pg_isready -h localhost; do sleep 1; done
- run: python -m twisted.trial --jobs=2 tests
env:
SYNAPSE_POSTGRES: ${{ matrix.database == 'postgres' || '' }}
SYNAPSE_POSTGRES_HOST: localhost
SYNAPSE_POSTGRES_USER: postgres
SYNAPSE_POSTGRES_PASSWORD: postgres
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
sytest:
runs-on: ubuntu-latest
container:
image: matrixdotorg/sytest-synapse:testing
volumes:
- ${{ github.workspace }}:/src
strategy:
fail-fast: false
matrix:
include:
- sytest-tag: focal
- sytest-tag: focal
postgres: postgres
workers: workers
redis: redis
env:
POSTGRES: ${{ matrix.postgres && 1}}
WORKERS: ${{ matrix.workers && 1 }}
REDIS: ${{ matrix.redis && 1 }}
BLACKLIST: ${{ matrix.workers && 'synapse-blacklist-with-workers' }}
steps:
- uses: actions/checkout@v2
- name: Ensure sytest runs `pip install`
# Delete the lockfile so sytest will `pip install` rather than `poetry install`
run: rm /src/poetry.lock
working-directory: /src
- name: Prepare test blacklist
run: cat sytest-blacklist .ci/worker-blacklist > synapse-blacklist-with-workers
- name: Run SyTest
run: /bootstrap.sh synapse
working-directory: /src
- name: Summarise results.tap
if: ${{ always() }}
run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
- name: Upload SyTest logs
uses: actions/upload-artifact@v2
if: ${{ always() }}
with:
name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
path: |
/logs/results.tap
/logs/**/*.log*
# TODO: run complement (as with twisted trunk, see #12473).
# open an issue if the build fails, so we know about it.
open-issue:
if: failure()
needs:
# TODO: should mypy be included here? It feels more brittle than the other two.
- mypy
- trial
- sytest
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: JasonEtco/create-an-issue@5d9504915f79f9cc6d791934b8ef34f2353dd74d # v2.5.0, 2020-12-06
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
update_existing: true
filename: .ci/latest_deps_build_failed_issue_template.md

@@ -7,7 +7,7 @@ on:
# of things breaking (but only build one set of debs)
pull_request:
push:
-branches: ["develop", "release-*"]
+branches: ["develop"]
# we do the full build on tags.
tags: ["v*"]
@@ -31,7 +31,7 @@ jobs:
# if we're running from a tag, get the full list of distros; otherwise just use debian:sid
dists='["debian:sid"]'
if [[ $GITHUB_REF == refs/tags/* ]]; then
-dists=$(scripts-dev/build_debian_packages.py --show-dists-json)
+dists=$(scripts-dev/build_debian_packages --show-dists-json)
fi
echo "::set-output name=distros::$dists"
# map the step outputs to job outputs
@@ -74,7 +74,7 @@ jobs:
# see https://github.com/docker/build-push-action/issues/252
# for the cache magic here
run: |
-./src/scripts-dev/build_debian_packages.py \
+./src/scripts-dev/build_debian_packages \
--docker-build-arg=--cache-from=type=local,src=/tmp/.buildx-cache \
--docker-build-arg=--cache-to=type=local,mode=max,dest=/tmp/.buildx-cache-new \
--docker-build-arg=--progress=plain \
@@ -91,7 +91,17 @@ jobs:
build-sdist:
name: "Build pypi distribution files"
-uses: "matrix-org/backend-meta/.github/workflows/packaging.yml@v1"
+runs-on: ubuntu-latest
+steps:
+- uses: actions/checkout@v2
+- uses: actions/setup-python@v2
+- run: pip install wheel
+- run: |
+python setup.py sdist bdist_wheel
+- uses: actions/upload-artifact@v2
+with:
+name: python-dist
+path: dist/*
# if it's a tag, create a release and attach the artifacts to it
attach-assets:
@@ -112,8 +122,7 @@ jobs:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
files: |
-Sdist/*
-Wheel/*
+python-dist/*
debs.tar.xz
# if it's not already published, keep the release as a draft.
draft: true

@@ -10,19 +10,22 @@ concurrency:
cancel-in-progress: true
jobs:
-check-sampleconfig:
+lint:
runs-on: ubuntu-latest
+strategy:
+matrix:
+toxenv:
+- "check-sampleconfig"
+- "check_codestyle"
+- "check_isort"
+- "mypy"
+- "packaging"
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
-- run: pip install .
-- run: scripts-dev/generate_sample_config.sh --check
-- run: scripts-dev/config-lint.sh
-lint:
-uses: "matrix-org/backend-meta/.github/workflows/python-poetry-ci.yml@v1"
-with:
-typechecking-extras: "all"
+- run: pip install tox
+- run: tox -e ${{ matrix.toxenv }}
lint-crlf:
runs-on: ubuntu-latest
@@ -40,15 +43,29 @@ jobs:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- uses: actions/setup-python@v2
-- run: "pip install 'towncrier>=18.6.0rc1'"
-- run: scripts-dev/check-newsfragment.sh
+- run: pip install tox
+- run: scripts-dev/check-newsfragment
env:
PULL_REQUEST_NUMBER: ${{ github.event.number }}
+lint-sdist:
+runs-on: ubuntu-latest
+steps:
+- uses: actions/checkout@v2
+- uses: actions/setup-python@v2
+with:
+python-version: "3.x"
+- run: pip install wheel
+- run: python setup.py sdist bdist_wheel
+- uses: actions/upload-artifact@v2
+with:
+name: Python Distributions
+path: dist/*
# Dummy step to gate other tests on without repeating the whole list
linting-done:
if: ${{ !cancelled() }} # Run this even if prior jobs were skipped
-needs: [lint, lint-crlf, lint-newsfile, check-sampleconfig]
+needs: [lint, lint-crlf, lint-newsfile, lint-sdist]
runs-on: ubuntu-latest
steps:
- run: "true"
@@ -59,25 +76,25 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
-python-version: ["3.7", "3.8", "3.9", "3.10"]
+python-version: ["3.6", "3.7", "3.8", "3.9", "3.10"]
database: ["sqlite"]
-extras: ["all"]
+toxenv: ["py"]
include:
# Newest Python without optional deps
- python-version: "3.10"
-extras: ""
+toxenv: "py-noextras"
# Oldest Python with PostgreSQL
-- python-version: "3.7"
+- python-version: "3.6"
database: "postgres"
-postgres-version: "10"
-extras: "all"
+postgres-version: "9.6"
+toxenv: "py"
# Newest Python with newest PostgreSQL
- python-version: "3.10"
database: "postgres"
postgres-version: "14"
-extras: "all"
+toxenv: "py"
steps:
- uses: actions/checkout@v2
@@ -89,16 +106,17 @@ jobs:
-e POSTGRES_PASSWORD=postgres \
-e POSTGRES_INITDB_ARGS="--lc-collate C --lc-ctype C --encoding UTF8" \
postgres:${{ matrix.postgres-version }}
-- uses: matrix-org/setup-python-poetry@v1
+- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
-extras: ${{ matrix.extras }}
+- run: pip install tox
- name: Await PostgreSQL
if: ${{ matrix.postgres-version }}
timeout-minutes: 2
run: until pg_isready -h localhost; do sleep 1; done
-- run: poetry run trial --jobs=2 tests
+- run: tox -e ${{ matrix.toxenv }}
env:
+TRIAL_FLAGS: "--jobs=2"
SYNAPSE_POSTGRES: ${{ matrix.database == 'postgres' || '' }}
SYNAPSE_POSTGRES_HOST: localhost
SYNAPSE_POSTGRES_USER: postgres
@@ -117,19 +135,18 @@ jobs:
|| true
trial-olddeps:
-# Note: sqlite only; no postgres
if: ${{ !cancelled() && !failure() }} # Allow previous steps to be skipped, but not fail
needs: linting-done
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Test with old deps
-uses: docker://ubuntu:focal # For old python and sqlite
-# Note: focal seems to be using 3.8, but the oldest is 3.7?
-# See https://github.com/matrix-org/synapse/issues/12343
+uses: docker://ubuntu:bionic # For old python and sqlite
with:
workdir: /github/workspace
entrypoint: .ci/scripts/test_old_deps.sh
+env:
+TRIAL_FLAGS: "--jobs=2"
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
@@ -145,24 +162,23 @@ jobs:
trial-pypy:
# Very slow; only run if the branch name includes 'pypy'
-# Note: sqlite only; no postgres. Completely untested since poetry move.
if: ${{ contains(github.ref, 'pypy') && !failure() && !cancelled() }}
needs: linting-done
runs-on: ubuntu-latest
strategy:
matrix:
-python-version: ["pypy-3.7"]
-extras: ["all"]
+python-version: ["pypy-3.6"]
steps:
- uses: actions/checkout@v2
-# Install libs necessary for PyPy to build binary wheels for dependencies
- run: sudo apt-get -qq install xmlsec1 libxml2-dev libxslt-dev
-- uses: matrix-org/setup-python-poetry@v1
+- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
-extras: ${{ matrix.extras }}
-- run: poetry run trial --jobs=2 tests
+- run: pip install tox
+- run: tox -e py
+env:
+TRIAL_FLAGS: "--jobs=2"
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
@@ -197,15 +213,15 @@ jobs:
fail-fast: false
matrix:
include:
-- sytest-tag: focal
-- sytest-tag: focal
+- sytest-tag: bionic
+- sytest-tag: bionic
postgres: postgres
- sytest-tag: testing
postgres: postgres
-- sytest-tag: focal
+- sytest-tag: bionic
postgres: multi-postgres
workers: workers
@@ -261,10 +277,9 @@ jobs:
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
-- uses: matrix-org/setup-python-poetry@v1
+- uses: actions/setup-python@v2
with:
-python-version: ${{ matrix.python-version }}
-extras: "postgres"
+python-version: "3.9"
- run: .ci/scripts/test_export_data_command.sh
portdb:
@@ -276,8 +291,8 @@ jobs:
strategy:
matrix:
include:
-- python-version: "3.7"
-postgres-version: "10"
+- python-version: "3.6"
+postgres-version: "9.6"
- python-version: "3.10"
postgres-version: "14"
@@ -299,39 +314,33 @@ jobs:
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
-- uses: matrix-org/setup-python-poetry@v1
+- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
-extras: "postgres"
- run: .ci/scripts/test_synapse_port_db.sh
complement:
if: ${{ !failure() && !cancelled() }}
needs: linting-done
runs-on: ubuntu-latest
+container:
+# https://github.com/matrix-org/complement/blob/master/dockerfiles/ComplementCIBuildkite.Dockerfile
+image: matrixdotorg/complement:latest
+env:
+CI: true
+ports:
+- 8448:8448
+volumes:
+- /var/run/docker.sock:/var/run/docker.sock
steps:
-# The path is set via a file given by $GITHUB_PATH. We need both Go 1.17 and GOPATH on the path to run Complement.
-# See https://docs.github.com/en/actions/using-workflows/workflow-commands-for-github-actions#adding-a-system-path
-- name: "Set Go Version"
-run: |
-# Add Go 1.17 to the PATH: see https://github.com/actions/virtual-environments/blob/main/images/linux/Ubuntu2004-Readme.md#environment-variables-2
-echo "$GOROOT_1_17_X64/bin" >> $GITHUB_PATH
-# Add the Go path to the PATH: We need this so we can call gotestfmt
-echo "~/go/bin" >> $GITHUB_PATH
-- name: "Install Complement Dependencies"
-run: |
-sudo apt-get update && sudo apt-get install -y libolm3 libolm-dev
-go get -v github.com/haveyoudebuggedit/gotestfmt/v2/cmd/gotestfmt@latest
- name: Run actions/checkout@v2 for synapse
uses: actions/checkout@v2
with:
path: synapse
# Attempt to check out the same branch of Complement as the PR. If it
-# doesn't exist, fallback to HEAD.
+# doesn't exist, fallback to master.
- name: Checkout complement
shell: bash
run: |
@@ -344,8 +353,8 @@ jobs:
# for pull requests, otherwise GITHUB_REF).
# 2. Attempt to use the base branch, e.g. when merging into release-vX.Y
# (GITHUB_BASE_REF for pull requests).
-# 3. Use the default complement branch ("HEAD").
-for BRANCH_NAME in "$GITHUB_HEAD_REF" "$GITHUB_BASE_REF" "${GITHUB_REF#refs/heads/}" "HEAD"; do
+# 3. Use the default complement branch ("master").
+for BRANCH_NAME in "$GITHUB_HEAD_REF" "$GITHUB_BASE_REF" "${GITHUB_REF#refs/heads/}" "master"; do
# Skip empty branch names and merge commits.
if [[ -z "$BRANCH_NAME" || $BRANCH_NAME =~ ^refs/pull/.* ]]; then
continue
@@ -354,32 +363,55 @@ jobs:
(wget -O - "https://github.com/matrix-org/complement/archive/$BRANCH_NAME.tar.gz" | tar -xz --strip-components=1 -C complement) && break
done
-- run: |
-set -o pipefail
-COMPLEMENT_DIR=`pwd`/complement synapse/scripts-dev/complement.sh -json 2>&1 | gotestfmt
-shell: bash
-name: Run Complement Tests
+# Build initial Synapse image
+- run: docker build -t matrixdotorg/synapse:latest -f docker/Dockerfile .
+working-directory: synapse
+# Build a ready-to-run Synapse image based on the initial image above.
+# This new image includes a config file, keys for signing and TLS, and
+# other settings to make it suitable for testing under Complement.
+- run: docker build -t complement-synapse -f Synapse.Dockerfile .
+working-directory: complement/dockerfiles
+# Run Complement
+- run: go test -v -tags synapse_blacklist,msc2403,msc2946,msc3083 ./tests/...
+env:
+COMPLEMENT_BASE_IMAGE: complement-synapse:latest
+working-directory: complement
# a job which marks all the other jobs as complete, thus allowing PRs to be merged.
tests-done:
if: ${{ always() }}
needs:
-- check-sampleconfig
- lint
- lint-crlf
- lint-newsfile
+- lint-sdist
- trial
- trial-olddeps
- sytest
-- export-data
- portdb
- complement
runs-on: ubuntu-latest
steps:
-- uses: matrix-org/done-action@v2
-with:
-needs: ${{ toJSON(needs) }}
-# The newsfile lint may be skipped on non PR builds
-skippable:
-lint-newsfile
+- name: Set build result
+env:
+NEEDS_CONTEXT: ${{ toJSON(needs) }}
+# the `jq` incantation dumps out a series of "<job> <result>" lines.
+# we set it to an intermediate variable to avoid a pipe, which makes it
+# hard to set $rc.
+run: |
+rc=0
+results=$(jq -r 'to_entries[] | [.key,.value.result] | join(" ")' <<< $NEEDS_CONTEXT)
+while read job result ; do
+# The newsfile lint may be skipped on non PR builds
+if [ $result == "skipped" ] && [ $job == "lint-newsfile" ]; then
+continue
+fi
+if [ "$result" != "success" ]; then
+echo "::set-failed ::Job $job returned $result"
+rc=1
+fi
+done <<< $results
+exit $rc
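The `tests-done` shell/`jq` logic above reduces to: fail unless every needed job succeeded, except that a skipped `lint-newsfile` is tolerated. A short Python restatement of that rule, using an illustrative stand-in for the `${{ toJSON(needs) }}` context:

import json

# Illustrative stand-in for the needs context; real data comes from GitHub Actions.
needs_context = json.loads(
    '{"lint": {"result": "success"},'
    ' "lint-newsfile": {"result": "skipped"},'
    ' "trial": {"result": "success"}}'
)

rc = 0
for job, info in needs_context.items():
    result = info["result"]
    # The newsfile lint may be skipped on non-PR builds.
    if job == "lint-newsfile" and result == "skipped":
        continue
    if result != "success":
        print(f"Job {job} returned {result}")
        rc = 1

raise SystemExit(rc)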

@@ -6,27 +6,16 @@ on:
workflow_dispatch:
-concurrency:
-group: ${{ github.workflow }}-${{ github.ref }}
-cancel-in-progress: true
jobs:
mypy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
-- uses: matrix-org/setup-python-poetry@v1
-with:
-python-version: "3.x"
-extras: "all"
-- run: |
-poetry remove twisted
-poetry add --extras tls git+https://github.com/twisted/twisted.git#trunk
-poetry install --no-interaction --extras "all test"
-- name: Remove warn_unused_ignores from mypy config
-run: sed '/warn_unused_ignores = True/d' -i mypy.ini
-- run: poetry run mypy
+- uses: actions/setup-python@v2
+- run: .ci/patch_for_twisted_trunk.sh
+- run: pip install tox
+- run: tox -e mypy
trial:
runs-on: ubuntu-latest
@@ -34,15 +23,14 @@ jobs:
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
-- uses: matrix-org/setup-python-poetry@v1
+- uses: actions/setup-python@v2
with:
-python-version: "3.x"
-extras: "all test"
-- run: |
-poetry remove twisted
-poetry add --extras tls git+https://github.com/twisted/twisted.git#trunk
-poetry install --no-interaction --extras "all test"
-- run: poetry run trial --jobs 2 tests
+python-version: 3.6
+- run: .ci/patch_for_twisted_trunk.sh
+- run: pip install tox
+- run: tox -e py
+env:
+TRIAL_FLAGS: "--jobs=2"
- name: Dump logs
# Logs are most useful when the command fails, always include them.
@@ -67,23 +55,11 @@ jobs:
steps:
- uses: actions/checkout@v2
- name: Patch dependencies
-# Note: The poetry commands want to create a virtualenv in /src/.venv/,
-# but the sytest-synapse container expects it to be in /venv/.
-# We symlink it before running poetry so that poetry actually
-# ends up installing to `/venv`.
-run: |
-ln -s -T /venv /src/.venv
-poetry remove twisted
-poetry add --extras tls git+https://github.com/twisted/twisted.git#trunk
-poetry install --no-interaction --extras "all test"
+run: .ci/patch_for_twisted_trunk.sh
working-directory: /src
- name: Run SyTest
run: /bootstrap.sh synapse
working-directory: /src
-env:
-# Use offline mode to avoid reinstalling the pinned version of
-# twisted.
-OFFLINE: 1
- name: Summarise results.tap
if: ${{ always() }}
run: /sytest/scripts/tap_to_gha.pl /logs/results.tap

.gitignore

@@ -15,9 +15,6 @@ _trial_temp*/
.DS_Store
__pycache__/
-# We do want the poetry lockfile.
-!poetry.lock
# stuff that is likely to exist when you run a server locally
/*.db
/*.log
@@ -33,9 +30,6 @@ __pycache__/
/media_store/
/uploads
-# For direnv users
-/.envrc
# IDEs
/.idea/
/.ropeproject/
@@ -56,7 +50,3 @@ __pycache__/
# docs
book/
-# complement
-/complement-*
-/master.tar.gz

CHANGES.md (9586 lines changed): file diff suppressed because it is too large.

MANIFEST.in (new file)

@@ -0,0 +1,56 @@
include synctl
include LICENSE
include VERSION
include *.rst
include *.md
include demo/README
include demo/demo.tls.dh
include demo/*.py
include demo/*.sh
include synapse/py.typed
recursive-include synapse/storage *.sql
recursive-include synapse/storage *.sql.postgres
recursive-include synapse/storage *.sql.sqlite
recursive-include synapse/storage *.py
recursive-include synapse/storage *.txt
recursive-include synapse/storage *.md
recursive-include docs *
recursive-include scripts *
recursive-include scripts-dev *
recursive-include synapse *.pyi
recursive-include tests *.py
recursive-include tests *.pem
recursive-include tests *.p8
recursive-include tests *.crt
recursive-include tests *.key
recursive-include synapse/res *
recursive-include synapse/static *.css
recursive-include synapse/static *.gif
recursive-include synapse/static *.html
recursive-include synapse/static *.js
exclude .codecov.yml
exclude .coveragerc
exclude .dockerignore
exclude .editorconfig
exclude Dockerfile
exclude mypy.ini
exclude sytest-blacklist
exclude test_postgresql.sh
include book.toml
include pyproject.toml
recursive-include changelog.d *
prune .circleci
prune .github
prune .ci
prune contrib
prune debian
prune demo/etc
prune docker
prune snap
prune stubs

README.rst

@@ -55,7 +55,7 @@ solutions. The hope is for Matrix to act as the building blocks for a new
generation of fully open and interoperable messaging and VoIP apps for the
internet.
Synapse is a Matrix "homeserver" implementation developed by the matrix.org core
team, written in Python 3/Twisted.
In Matrix, every user runs one or more Matrix clients, which connect through to
@@ -246,7 +246,7 @@ Password reset
==============
Users can reset their password through their client. Alternatively, a server admin
-can reset a users password using the `admin API <docs/admin_api/user_admin_api.md#reset-password>`_
+can reset a users password using the `admin API <docs/admin_api/user_admin_api.rst#reset-password>`_
or by directly editing the database as shown below.
First calculate the hash of the new password::
@@ -293,42 +293,36 @@ directory of your choice::
git clone https://github.com/matrix-org/synapse.git
cd synapse
-Synapse has a number of external dependencies. We maintain a fixed development
-environment using `Poetry <https://python-poetry.org/>`_. First, install poetry. We recommend::
-pip install --user pipx
-pipx install poetry
-as described `here <https://python-poetry.org/docs/#installing-with-pipx>`_.
-(See `poetry's installation docs <https://python-poetry.org/docs/#installation>`_
-for other installation methods.) Then ask poetry to create a virtual environment
-from the project and install Synapse's dependencies::
-poetry install --extras "all test"
+Synapse has a number of external dependencies, that are easiest
+to install using pip and a virtualenv::
+python3 -m venv ./env
+source ./env/bin/activate
+pip install -e ".[all,dev]"
This will run a process of downloading and installing all the needed
-dependencies into a virtual env.
-We recommend using the demo which starts 3 federated instances running on ports `8080` - `8082`::
-poetry run ./demo/start.sh
-(to stop, you can use ``poetry run ./demo/stop.sh``)
-See the `demo documentation <https://matrix-org.github.io/synapse/develop/development/demo.html>`_
-for more information.
+dependencies into a virtual env. If any dependencies fail to install,
+try installing the failing modules individually::
+pip install -e "module-name"
+We recommend using the demo which starts 3 federated instances running on ports `8080` - `8082`
+./demo/start.sh
+(to stop, you can use `./demo/stop.sh`)
If you just want to start a single instance of the app and run it directly::
# Create the homeserver.yaml config once
-poetry run synapse_homeserver \
+python -m synapse.app.homeserver \
--server-name my.domain.name \
--config-path homeserver.yaml \
--generate-config \
--report-stats=[yes|no]
# Start the app
-poetry run synapse_homeserver --config-path homeserver.yaml
+python -m synapse.app.homeserver --config-path homeserver.yaml
Running the unit tests
@@ -337,7 +331,7 @@ Running the unit tests
After getting up and running, you may wish to run Synapse's unit tests to
check that everything is installed correctly::
-poetry run trial tests
+trial tests
This should end with a 'PASSED' result (note that exact numbers will
differ)::

Changelog files (changelog.d/)

New changelog entries added in this comparison (each a small new file; filenames shown where the page preserved them):

- Add a new version of delete room admin API `DELETE /_synapse/admin/v2/rooms/<room_id>` to run it in background. Contributed by @dklimpel.
- Allow the admin [Delete Room API](https://matrix-org.github.io/synapse/latest/admin_api/rooms.html#delete-room-api) to block a room without the need to join it.
- changelog.d/11230.bugfix: Fix a long-standing bug wherein display names or avatar URLs containing null bytes cause an internal server error when stored in the DB.
- Support filtering by relation senders & types per [MSC3440](https://github.com/matrix-org/matrix-doc/pull/3440).
- changelog.d/11242.misc: Split out federated PDU retrieval function into a non-cached version.
- changelog.d/11247.misc: Clean up code relating to to-device messages and sending ephemeral events to application services.
- changelog.d/11278.misc: Fix a small typo in the error response when a relation type other than 'm.annotation' is passed to `GET /rooms/{room_id}/aggregations/{event_id}`.
- changelog.d/11280.misc: Drop unused db tables `room_stats_historical` and `user_stats_historical`.
- changelog.d/11281.doc: Suggest users of the Debian packages add configuration to `/etc/matrix-synapse/conf.d/` to prevent, upon upgrade, being asked to choose between their configuration and the maintainer's.
- changelog.d/11282.misc: Require all files in synapse/ and tests/ to pass mypy unless specifically excluded.
- changelog.d/11285.misc: Require all files in synapse/ and tests/ to pass mypy unless specifically excluded.
- changelog.d/11286.doc: Fix typo in the word `available` and fix HTTP method (should be `GET`) for the `username_available` admin API. Contributed by Stanislav Motylkov.
- changelog.d/11287.misc: Add missing type hints to `synapse.app`.
- changelog.d/11288.bugfix: Fix a long-standing bug where uploading extremely thin images (e.g. 1000x1) would fail. Contributed by @Neeeflix.
- changelog.d/11292.misc: Remove unused parameters on `FederationEventHandler._check_event_auth`.
- changelog.d/11297.misc: Add type hints to `synapse._scripts`.
- changelog.d/11298.doc: Add Single Sign-On, SAML and CAS pages to the documentation.
- changelog.d/11303.misc: Fix an issue which prevented the 'remove deleted devices from device_inbox column' background process from running when updating from a recent Synapse version.
- changelog.d/11307.misc: Add type hints to storage classes.
- changelog.d/11310.misc: Add type hints to storage classes.
- changelog.d/11311.misc: Add type hints to storage classes.
- changelog.d/11312.misc: Add type hints to storage classes.
- changelog.d/11313.misc: Add type hints to storage classes.
- changelog.d/11314.misc: Add type hints to storage classes.
- changelog.d/11316.misc: Add type hints to storage classes.
- changelog.d/11321.misc: Add type hints to `synapse.util`.
- changelog.d/11322.misc: Add type hints to storage classes.
- changelog.d/11323.misc: Improve type annotations in Synapse's test suite.
- changelog.d/11327.misc: Test that room alias deletion works as intended.
- changelog.d/11332.misc: Add type hints to storage classes.
- Support the stable version of [MSC2778](https://github.com/matrix-org/matrix-doc/pull/2778): the `m.login.application_service` login type. Contributed by @tulir.
- changelog.d/11339.misc: Add type hints to storage classes.
- changelog.d/11342.misc: Add type hints to storage classes.
- changelog.d/11357.misc: Add a development script for visualising the storage class inheritance hierarchy.

Changelog entries removed in this comparison (one-line files deleted; filenames not preserved by the page):

- Improve event caching mechanism to avoid having multiple copies of an event in memory at a time.
- Add some type hints to datastore.
- Preparation for faster-room-join work: return subsets of room state which we already have, immediately.
- Measure the time taken in spam-checking callbacks and expose those measurements as metrics.
- Replace string literal instances of stream key types with typed constants.
- Add `@cancellable` decorator, for use on endpoint methods that can be cancelled when clients disconnect.
- Add ability to cancel disconnected requests to `SynapseRequest`.
- Add a `default_power_level_content_override` config option to set default room power levels per room preset.
- Add support for [MSC3787: Allowing knocks to restricted rooms](https://github.com/matrix-org/matrix-spec-proposals/pull/3787).
- Add a helper class for testing request cancellation.
- Send `USER_IP` commands on a different Redis channel, in order to reduce traffic to workers that do not process these commands.
- Synapse will now reload [cache config](https://matrix-org.github.io/synapse/latest/usage/configuration/config_documentation.html#caching) when it receives a [SIGHUP](https://en.wikipedia.org/wiki/SIGHUP) signal.
- Improve documentation of the `synapse.push` module.
- Refactor functions to on `PushRuleEvaluatorForEvent`.
- Preparation for database schema simplifications: stop writing to `event_reference_hashes`.
- Remove code which updates unused database column `application_services_state.last_txn`.
- Fix a bug introduced in Synapse 1.57.0 where `/messages` would throw a 500 error when querying for a non-existent room.
- Add a unique index to `state_group_edges` to prevent duplicates being accidentally introduced and the consequential impact to performance.
- Refactor `EventContext` class.
- Remove an unneeded class in the push code.
- Consolidate parsing of relation information from events.
- Capture the `Deferred` for request cancellation in `_AsyncResource`.
- Fixes an incorrect type hint for `Filter._check_event_relations`.
- Fix a long-standing bug where an empty room would be created when a user with an insufficient power level tried to upgrade a room.
- Respect the `@cancellable` flag for `DirectServe{Html,Json}Resource`s.
- Respect the `@cancellable` flag for `RestServlet`s and `BaseFederationServlet`s.
- Respect the `@cancellable` flag for `ReplicationEndpoint`s.
- Add a config options to allow for auto-tuning of caches.
- Convert namespace class `Codes` into a string enum.
- Complain if a federation endpoint has the `@cancellable` flag, since some of the wrapper code may not handle cancellation correctly yet.
- Enable cancellation of `GET /rooms/$room_id/members`, `GET /rooms/$room_id/state` and `GET /rooms/$room_id/state/$event_type/*` requests.
- Require a body in POST requests to `/rooms/{roomId}/receipt/{receiptType}/{eventId}`, as required by the [Matrix specification](https://spec.matrix.org/v1.2/client-server-api/#post_matrixclientv3roomsroomidreceiptreceipttypeeventid). This breaks compatibility with Element Android 1.2.0 and earlier: users of those clients will be unable to send read receipts.
- Optimize private read receipt filtering.
- Fix a bug introduced in Synapse 1.30.0 where empty rooms could be automatically created if a monthly active users limit is set.
- Fix a typo in the Media Admin API documentation.
- Add type annotations to increase the number of modules passing `disallow-untyped-defs`.
- Add some type hints to datastore.
- Drop the logging level of status messages for the URL preview cache expiry job from INFO to DEBUG.
- Fix push to dismiss notifications when read on another client. Contributed by @SpiritCroc @ Beeper.
- Downgrade some OIDC errors to warnings in the logs, to reduce the noise of Sentry reports.
- Add type annotations to increase the number of modules passing `disallow-untyped-defs`.
- Update the OpenID Connect example for Keycloak to be compatible with newer versions of Keycloak. Contributed by @nhh.
- Update configs used by Complement to allow more invites/3PID validations during tests.
- Tidy up and type-hint the database engine modules.
- Fix typo in server listener documentation.
- Fix poor database performance when reading the cache invalidation stream for large servers with lots of workers.
- Link to the configuration manual from the welcome page of the documentation.
- Fix typo in 'run_background_tasks_on' option name in configuration manual documentation.
Some files were not shown because too many files have changed in this diff.