
Compare commits

..

1 Commits

Author SHA1 Message Date
Action Bot
3c10dacc36 Version picker added for v1.41 docs 2023-12-11 14:52:48 +00:00
763 changed files with 20043 additions and 49437 deletions

View File

@@ -1,8 +0,0 @@
#!/bin/sh
# replaces the dependency on Twisted in `python_dependencies` with trunk.
set -e
cd "$(dirname "$0")"/..
sed -i -e 's#"Twisted.*"#"Twisted @ git+https://github.com/twisted/twisted"#' synapse/python_dependencies.py

View File

@@ -1,57 +0,0 @@
#!/usr/bin/env bash
# Test for the export-data admin command against sqlite and postgres
set -xe
cd "$(dirname "$0")/../.."
echo "--- Install dependencies"
# Install dependencies for this test.
pip install psycopg2
# Install Synapse itself. This won't update any libraries.
pip install -e .
echo "--- Generate the signing key"
# Generate the server's signing key.
python -m synapse.app.homeserver --generate-keys -c .ci/sqlite-config.yaml
echo "--- Prepare test database"
# Make sure the SQLite3 database is using the latest schema and has no pending background update.
scripts/update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
# Run the export-data command on the sqlite test database
python -m synapse.app.admin_cmd -c .ci/sqlite-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
--output-directory /tmp/export_data
# Test that the output directory exists and contains the rooms directory
dir="/tmp/export_data/rooms"
if [ -d "$dir" ]; then
echo "Command successful, this test passes"
else
echo "No output directories found, the command fails against a sqlite database."
exit 1
fi
# Create the PostgreSQL database.
.ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
# Port the SQLite database to postgres so we can check the command works against postgres
echo "+++ Port SQLite3 database to postgres"
scripts/synapse_port_db --sqlite-database .ci/test_db.db --postgres-config .ci/postgres-config.yaml
# Run the export-data command on postgres database
python -m synapse.app.admin_cmd -c .ci/postgres-config.yaml export-data @anon-20191002_181700-832:localhost:8800 \
--output-directory /tmp/export_data2
# Test that the output directory exists and contains the rooms directory
dir2="/tmp/export_data2/rooms"
if [ -d "$dir2" ]; then
echo "Command successful, this test passes"
else
echo "No output directories found, the command fails against a postgres database."
exit 1
fi

View File

@@ -7,7 +7,7 @@
set -xe
cd "$(dirname "$0")/../.."
cd `dirname $0`/../..
echo "--- Install dependencies"
@@ -25,7 +25,7 @@ python -m synapse.app.homeserver --generate-keys -c .ci/sqlite-config.yaml
echo "--- Prepare test database"
# Make sure the SQLite3 database is using the latest schema and has no pending background update.
scripts/update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
scripts-dev/update_database --database-config .ci/sqlite-config.yaml
# Create the PostgreSQL database.
.ci/scripts/postgres_exec.py "CREATE DATABASE synapse"
@@ -46,7 +46,7 @@ echo "--- Prepare empty SQLite database"
# we do this by deleting the sqlite db, and then doing the same again.
rm .ci/test_db.db
scripts/update_synapse_database --database-config .ci/sqlite-config.yaml --run-background-updates
scripts-dev/update_database --database-config .ci/sqlite-config.yaml
# re-create the PostgreSQL database.
.ci/scripts/postgres_exec.py \

View File

@@ -1,4 +0,0 @@
---
title: CI run against Twisted trunk is failing
---
See https://github.com/{{env.GITHUB_REPOSITORY}}/actions/runs/{{env.GITHUB_RUN_ID}}

View File

@@ -1,2 +1,10 @@
# This file serves as a blacklist for SyTest tests that we expect will fail in
# Synapse when run under worker mode. For more details, see sytest-blacklist.
Can re-join room if re-invited
# new failures as of https://github.com/matrix-org/sytest/pull/732
Device list doesn't change if remote server is down
# https://buildkite.com/matrix-dot-org/synapse/builds/6134#6f67bf47-e234-474d-80e8-c6e1868b15c5
Server correctly handles incoming m.device_list_update

2
.github/CODEOWNERS vendored
View File

@@ -1,2 +0,0 @@
# Automatically request reviews from the synapse-core team when a pull request comes in.
* @matrix-org/synapse-core

View File

@@ -1,14 +1,12 @@
### Pull Request Checklist
<!-- Please read https://matrix-org.github.io/synapse/latest/development/contributing_guide.html before submitting your pull request -->
<!-- Please read CONTRIBUTING.md before submitting your pull request -->
* [ ] Pull request is based on the develop branch
* [ ] Pull request includes a [changelog file](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#changelog). The entry should:
* [ ] Pull request includes a [changelog file](https://github.com/matrix-org/synapse/blob/master/CONTRIBUTING.md#changelog). The entry should:
- Be a short description of your change which makes sense to users. "Fixed a bug that prevented receiving messages from other servers." instead of "Moved X method from `EventStore` to `EventWorkerStore`.".
- Use markdown where necessary, mostly for `code blocks`.
- End with either a period (.) or an exclamation mark (!).
- Start with a capital letter.
- Feel free to credit yourself, by adding a sentence "Contributed by @github_username." or "Contributed by [Your Name]." to the end of the entry.
* [ ] Pull request includes a [sign off](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#sign-off)
* [ ] [Code style](https://matrix-org.github.io/synapse/latest/code_style.html) is correct
(run the [linters](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html#run-the-linters))
* [ ] Pull request includes a [sign off](https://github.com/matrix-org/synapse/blob/master/CONTRIBUTING.md#sign-off)
* [ ] Code style is correct (run the [linters](https://github.com/matrix-org/synapse/blob/master/CONTRIBUTING.md#code-style))

View File

@@ -5,7 +5,7 @@ name: Build docker images
on:
push:
tags: ["v*"]
branches: [ master, main, develop ]
branches: [ master, main ]
workflow_dispatch:
permissions:
@@ -38,9 +38,6 @@ jobs:
id: set-tag
run: |
case "${GITHUB_REF}" in
refs/heads/develop)
tag=develop
;;
refs/heads/master|refs/heads/main)
tag=latest
;;

View File

@@ -61,5 +61,6 @@ jobs:
uses: peaceiris/actions-gh-pages@068dc23d9710f1ba62e86896f84735d869951305 # v3.8.0
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
keep_files: true
publish_dir: ./book
destination_dir: ./${{ steps.vars.outputs.branch-version }}

View File

@@ -76,25 +76,22 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.7", "3.8", "3.9", "3.10"]
python-version: ["3.6", "3.7", "3.8", "3.9"]
database: ["sqlite"]
toxenv: ["py"]
include:
# Newest Python without optional deps
- python-version: "3.10"
toxenv: "py-noextras"
- python-version: "3.9"
toxenv: "py-noextras,combine"
# Oldest Python with PostgreSQL
- python-version: "3.7"
- python-version: "3.6"
database: "postgres"
postgres-version: "10"
toxenv: "py"
postgres-version: "9.6"
# Newest Python with newest PostgreSQL
- python-version: "3.10"
# Newest Python with PostgreSQL
- python-version: "3.9"
database: "postgres"
postgres-version: "14"
toxenv: "py"
postgres-version: "13"
steps:
- uses: actions/checkout@v2
@@ -114,7 +111,7 @@ jobs:
if: ${{ matrix.postgres-version }}
timeout-minutes: 2
run: until pg_isready -h localhost; do sleep 1; done
- run: tox -e ${{ matrix.toxenv }}
- run: tox -e py,combine
env:
TRIAL_FLAGS: "--jobs=2"
SYNAPSE_POSTGRES: ${{ matrix.database == 'postgres' || '' }}
@@ -122,8 +119,6 @@ jobs:
SYNAPSE_POSTGRES_USER: postgres
SYNAPSE_POSTGRES_PASSWORD: postgres
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
@@ -148,8 +143,6 @@ jobs:
env:
TRIAL_FLAGS: "--jobs=2"
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
@@ -167,7 +160,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["pypy-3.7"]
python-version: ["pypy-3.6"]
steps:
- uses: actions/checkout@v2
@@ -176,12 +169,10 @@ jobs:
with:
python-version: ${{ matrix.python-version }}
- run: pip install tox
- run: tox -e py
- run: tox -e py,combine
env:
TRIAL_FLAGS: "--jobs=2"
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
@@ -201,7 +192,6 @@ jobs:
volumes:
- ${{ github.workspace }}:/src
env:
SYTEST_BRANCH: ${{ github.head_ref }}
POSTGRES: ${{ matrix.postgres && 1}}
MULTI_POSTGRES: ${{ (matrix.postgres == 'multi-postgres') && 1}}
WORKERS: ${{ matrix.workers && 1 }}
@@ -253,35 +243,6 @@ jobs:
/logs/results.tap
/logs/**/*.log*
export-data:
if: ${{ !failure() && !cancelled() }} # Allow previous steps to be skipped, but not fail
needs: [linting-done, portdb]
runs-on: ubuntu-latest
env:
TOP: ${{ github.workspace }}
services:
postgres:
image: postgres
ports:
- 5432:5432
env:
POSTGRES_PASSWORD: "postgres"
POSTGRES_INITDB_ARGS: "--lc-collate C --lc-ctype C --encoding UTF8"
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- uses: actions/setup-python@v2
with:
python-version: "3.9"
- run: .ci/scripts/test_export_data_command.sh
portdb:
if: ${{ !failure() && !cancelled() }} # Allow previous steps to be skipped, but not fail
needs: linting-done
@@ -291,11 +252,11 @@ jobs:
strategy:
matrix:
include:
- python-version: "3.7"
postgres-version: "10"
- python-version: "3.6"
postgres-version: "9.6"
- python-version: "3.10"
postgres-version: "14"
- python-version: "3.9"
postgres-version: "13"
services:
postgres:
@@ -366,8 +327,6 @@ jobs:
# Build initial Synapse image
- run: docker build -t matrixdotorg/synapse:latest -f docker/Dockerfile .
working-directory: synapse
env:
DOCKER_BUILDKIT: 1
# Build a ready-to-run Synapse image based on the initial image above.
# This new image includes a config file, keys for signing and TLS, and
@@ -376,8 +335,7 @@ jobs:
working-directory: complement/dockerfiles
# Run Complement
- run: set -o pipefail && go test -v -json -tags synapse_blacklist,msc2403 ./tests/... 2>&1 | gotestfmt
shell: bash
- run: go test -v -tags synapse_blacklist,msc2403,msc2946,msc3083 ./tests/...
env:
COMPLEMENT_BASE_IMAGE: complement-synapse:latest
working-directory: complement

View File

@@ -1,92 +0,0 @@
name: Twisted Trunk
on:
schedule:
- cron: 0 8 * * *
workflow_dispatch:
jobs:
mypy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- run: .ci/patch_for_twisted_trunk.sh
- run: pip install tox
- run: tox -e mypy
trial:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- run: sudo apt-get -qq install xmlsec1
- uses: actions/setup-python@v2
with:
python-version: 3.6
- run: .ci/patch_for_twisted_trunk.sh
- run: pip install tox
- run: tox -e py
env:
TRIAL_FLAGS: "--jobs=2"
- name: Dump logs
# Logs are most useful when the command fails, always include them.
if: ${{ always() }}
# Note: Dumps to workflow logs instead of using actions/upload-artifact
# This keeps logs colocated with failing jobs
# It also ignores find's exit code; this is a best effort affair
run: >-
find _trial_temp -name '*.log'
-exec echo "::group::{}" \;
-exec cat {} \;
-exec echo "::endgroup::" \;
|| true
sytest:
runs-on: ubuntu-latest
container:
image: matrixdotorg/sytest-synapse:buster
volumes:
- ${{ github.workspace }}:/src
steps:
- uses: actions/checkout@v2
- name: Patch dependencies
run: .ci/patch_for_twisted_trunk.sh
working-directory: /src
- name: Run SyTest
run: /bootstrap.sh synapse
working-directory: /src
- name: Summarise results.tap
if: ${{ always() }}
run: /sytest/scripts/tap_to_gha.pl /logs/results.tap
- name: Upload SyTest logs
uses: actions/upload-artifact@v2
if: ${{ always() }}
with:
name: Sytest Logs - ${{ job.status }} - (${{ join(matrix.*, ', ') }})
path: |
/logs/results.tap
/logs/**/*.log*
# open an issue if the build fails, so we know about it.
open-issue:
if: failure()
needs:
- mypy
- trial
- sytest
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: JasonEtco/create-an-issue@5d9504915f79f9cc6d791934b8ef34f2353dd74d # v2.5.0, 2020-12-06
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
update_existing: true
filename: .ci/twisted_trunk_build_failed_issue_template.md

5
.gitignore vendored
View File

@@ -40,7 +40,6 @@ __pycache__/
/.coverage*
/.mypy_cache/
/.tox
/.tox-pg-container
/build/
/coverage.*
/dist/
@@ -50,7 +49,3 @@ __pycache__/
# docs
book/
# complement
/complement-master
/master.tar.gz

File diff suppressed because it is too large

View File

@@ -1,3 +1,443 @@
# Welcome to Synapse
Welcome to Synapse
Please see the [contributors' guide](https://matrix-org.github.io/synapse/latest/development/contributing_guide.html) in our rendered documentation.
This document aims to get you started with contributing to this repo!
- [1. Who can contribute to Synapse?](#1-who-can-contribute-to-synapse)
- [2. What do I need?](#2-what-do-i-need)
- [3. Get the source.](#3-get-the-source)
- [4. Install the dependencies](#4-install-the-dependencies)
* [Under Unix (macOS, Linux, BSD, ...)](#under-unix-macos-linux-bsd-)
* [Under Windows](#under-windows)
- [5. Get in touch.](#5-get-in-touch)
- [6. Pick an issue.](#6-pick-an-issue)
- [7. Turn coffee and documentation into code and documentation!](#7-turn-coffee-and-documentation-into-code-and-documentation)
- [8. Test, test, test!](#8-test-test-test)
* [Run the linters.](#run-the-linters)
* [Run the unit tests.](#run-the-unit-tests-twisted-trial)
* [Run the integration tests (SyTest).](#run-the-integration-tests-sytest)
* [Run the integration tests (Complement).](#run-the-integration-tests-complement)
- [9. Submit your patch.](#9-submit-your-patch)
* [Changelog](#changelog)
+ [How do I know what to call the changelog file before I create the PR?](#how-do-i-know-what-to-call-the-changelog-file-before-i-create-the-pr)
+ [Debian changelog](#debian-changelog)
* [Sign off](#sign-off)
- [10. Turn feedback into better code.](#10-turn-feedback-into-better-code)
- [11. Find a new issue.](#11-find-a-new-issue)
- [Notes for maintainers on merging PRs etc](#notes-for-maintainers-on-merging-prs-etc)
- [Conclusion](#conclusion)
# 1. Who can contribute to Synapse?
Everyone is welcome to contribute code to [matrix.org
projects](https://github.com/matrix-org), provided that they are willing to
license their contributions under the same license as the project itself. We
follow a simple 'inbound=outbound' model for contributions: the act of
submitting an 'inbound' contribution means that the contributor agrees to
license the code under the same terms as the project's overall 'outbound'
license - in our case, this is almost always Apache Software License v2 (see
[LICENSE](LICENSE)).
# 2. What do I need?
The code of Synapse is written in Python 3. To do pretty much anything, you'll need [a recent version of Python 3](https://wiki.python.org/moin/BeginnersGuide/Download).
The source code of Synapse is hosted on GitHub. You will also need [a recent version of git](https://github.com/git-guides/install-git).
For some tests, you will need [a recent version of Docker](https://docs.docker.com/get-docker/).
# 3. Get the source.
The preferred and easiest way to contribute changes is to fork the relevant
project on GitHub, and then [create a pull request](
https://help.github.com/articles/using-pull-requests/) to ask us to pull your
changes into our repo.
Please base your changes on the `develop` branch.
```sh
git clone git@github.com:YOUR_GITHUB_USER_NAME/synapse.git
git checkout develop
```
If you need help getting started with git, this is beyond the scope of the document, but you
can find many good git tutorials on the web.
# 4. Install the dependencies
## Under Unix (macOS, Linux, BSD, ...)
Once you have installed Python 3 and added the source, please open a terminal and
set up a *virtualenv*, as follows:
```sh
cd path/where/you/have/cloned/the/repository
python3 -m venv ./env
source ./env/bin/activate
pip install -e ".[all,lint,mypy,test]"
pip install tox
```
This will install the developer dependencies for the project.
## Under Windows
TBD
# 5. Get in touch.
Join our developer community on Matrix: #synapse-dev:matrix.org !
# 6. Pick an issue.
Fix your favorite problem or perhaps find a [Good First Issue](https://github.com/matrix-org/synapse/issues?q=is%3Aopen+is%3Aissue+label%3A%22Good+First+Issue%22)
to work on.
# 7. Turn coffee and documentation into code and documentation!
Synapse's code style is documented [here](docs/code_style.md). Please follow
it, including the conventions for the [sample configuration
file](docs/code_style.md#configuration-file-format).
There is a growing amount of documentation located in the [docs](docs)
directory. This documentation is intended primarily for sysadmins running their
own Synapse instance, as well as developers interacting externally with
Synapse. [docs/dev](docs/dev) exists primarily to house documentation for
Synapse developers. [docs/admin_api](docs/admin_api) houses documentation
regarding Synapse's Admin API, which is used mostly by sysadmins and external
service developers.
If you add new files to either of these folders, please use [GitHub-Flavoured
Markdown](https://guides.github.com/features/mastering-markdown/).
Some documentation also exists in [Synapse's GitHub
Wiki](https://github.com/matrix-org/synapse/wiki), although this is primarily
contributed to by community authors.
# 8. Test, test, test!
<a name="test-test-test"></a>
While you're developing and before submitting a patch, you'll
want to test your code.
## Run the linters.
The linters look at your code and do two things:
- ensure that your code follows the coding style adopted by the project;
- catch a number of errors in your code.
They're pretty fast, don't hesitate!
```sh
source ./env/bin/activate
./scripts-dev/lint.sh
```
Note that this script *will modify your files* to fix styling errors.
Make sure that you have saved all your files.
If you wish to restrict the linters to only the files changed since the last commit
(much faster!), you can instead run:
```sh
source ./env/bin/activate
./scripts-dev/lint.sh -d
```
Or if you know exactly which files you wish to lint, you can instead run:
```sh
source ./env/bin/activate
./scripts-dev/lint.sh path/to/file1.py path/to/file2.py path/to/folder
```
## Run the unit tests (Twisted trial).
The unit tests run parts of Synapse, including your changes, to see if anything
was broken. They are slower than the linters but will typically catch more errors.
```sh
source ./env/bin/activate
trial tests
```
If you wish to only run *some* unit tests, you may specify
another module instead of `tests` - or a test class or a method:
```sh
source ./env/bin/activate
trial tests.rest.admin.test_room tests.handlers.test_admin.ExfiltrateData.test_invite
```
If your tests fail, you may wish to look at the logs (the default log level is `ERROR`):
```sh
less _trial_temp/test.log
```
To increase the log level for the tests, set `SYNAPSE_TEST_LOG_LEVEL`:
```sh
SYNAPSE_TEST_LOG_LEVEL=DEBUG trial tests
```
## Run the integration tests ([Sytest](https://github.com/matrix-org/sytest)).
The integration tests are a more comprehensive suite of tests. They
run a full version of Synapse, including your changes, to check if
anything was broken. They are slower than the unit tests but will
typically catch more errors.
The following command will let you run the integration test with the most common
configuration:
```sh
$ docker run --rm -it -v /path/where/you/have/cloned/the/repository\:/src:ro -v /path/to/where/you/want/logs\:/logs matrixdotorg/sytest-synapse:buster
```
This configuration should generally cover your needs. For more details about other configurations, see [documentation in the SyTest repo](https://github.com/matrix-org/sytest/blob/develop/docker/README.md).
## Run the integration tests ([Complement](https://github.com/matrix-org/complement)).
[Complement](https://github.com/matrix-org/complement) is a suite of black box tests that can be run on any homeserver implementation. It can also be thought of as end-to-end (e2e) tests.
It's often nice to develop on Synapse and write Complement tests at the same time.
Here is how to run your local Synapse checkout against your local Complement checkout.
(check out [`complement`](https://github.com/matrix-org/complement) alongside your `synapse` checkout)
```sh
COMPLEMENT_DIR=../complement ./scripts-dev/complement.sh
```
To run a specific test file, you can pass the test name at the end of the command. The name passed comes from the naming structure in your Complement tests. If you're unsure of the name, you can do a full run and copy it from the test output:
```sh
COMPLEMENT_DIR=../complement ./scripts-dev/complement.sh TestBackfillingHistory
```
To run a specific test, you can specify the whole name structure:
```sh
COMPLEMENT_DIR=../complement ./scripts-dev/complement.sh TestBackfillingHistory/parallel/Backfilled_historical_events_resolve_with_proper_state_in_correct_order
```
### Access database for homeserver after Complement test runs.
If you're curious what the database looks like after you run some tests, here are some steps to get you going in Synapse:
1. In your Complement test comment out `defer deployment.Destroy(t)` and replace with `defer time.Sleep(2 * time.Hour)` to keep the homeserver running after the tests complete
1. Start the Complement tests
1. Find the name of the container, `docker ps -f name=complement_` (this will filter for just the Complement-related Docker containers)
1. Access the container replacing the name with what you found in the previous step: `docker exec -it complement_1_hs_with_application_service.hs1_2 /bin/bash`
1. Install sqlite (database driver), `apt-get update && apt-get install -y sqlite3`
1. Then run `sqlite3` and open the database with `.open /conf/homeserver.db` (this db path comes from the Synapse homeserver.yaml); these steps are sketched below
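Putting those container steps together, here is a minimal sketch (the container name is just the example from the list above and will differ depending on which Complement test you ran):
```sh
# Find the homeserver container started by the Complement run
docker ps -f name=complement_
# Open a shell in it (replace the name with the one you found)
docker exec -it complement_1_hs_with_application_service.hs1_2 /bin/bash
# ...then, inside the container, install the sqlite client and open the database
apt-get update && apt-get install -y sqlite3
sqlite3 /conf/homeserver.db
```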
# 9. Submit your patch.
Once you're happy with your patch, it's time to prepare a Pull Request.
To prepare a Pull Request, please:
1. verify that [all the tests pass](#test-test-test), including the coding style;
2. [sign off](#sign-off) your contribution;
3. `git push` your commit to your fork of Synapse;
4. on GitHub, [create the Pull Request](https://docs.github.com/en/github/collaborating-with-issues-and-pull-requests/creating-a-pull-request);
5. add a [changelog entry](#changelog) and push it to your Pull Request;
6. for most contributors, that's all - however, if you are a member of the organization `matrix-org`, on GitHub, please request a review from `matrix.org / Synapse Core`.
7. if you need to update your PR, please avoid rebasing and just add new commits to your branch.
## Changelog
All changes, even minor ones, need a corresponding changelog / newsfragment
entry. These are managed by [Towncrier](https://github.com/hawkowl/towncrier).
To create a changelog entry, make a new file in the `changelog.d` directory named
in the format of `PRnumber.type`. The type can be one of the following:
* `feature`
* `bugfix`
* `docker` (for updates to the Docker image)
* `doc` (for updates to the documentation)
* `removal` (also used for deprecations)
* `misc` (for internal-only changes)
This file will become part of our [changelog](
https://github.com/matrix-org/synapse/blob/master/CHANGES.md) at the next
release, so the content of the file should be a short description of your
change in the same style as the rest of the changelog. The file can contain Markdown
formatting, and should end with a full stop (.) or an exclamation mark (!) for
consistency.
Adding credits to the changelog is encouraged; we value your
contributions and would like to have you shouted out in the release notes!
For example, a fix in PR #1234 would have its changelog entry in
`changelog.d/1234.bugfix`, and contain content like:
> The security levels of Florbs are now validated when received
> via the `/federation/florb` endpoint. Contributed by Jane Matrix.
If there are multiple pull requests involved in a single bugfix/feature/etc,
then the content for each `changelog.d` file should be the same. Towncrier will
merge the matching files together into a single changelog entry when we come to
release.
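As a minimal sketch, creating that entry (for the hypothetical PR #1234 above) is just a matter of writing a one-line file:
```sh
# The filename is <PR number>.<type>; the content is the user-facing description.
echo 'The security levels of Florbs are now validated when received via the `/federation/florb` endpoint. Contributed by Jane Matrix.' > changelog.d/1234.bugfix
```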
### How do I know what to call the changelog file before I create the PR?
Obviously, you don't know if you should call your newsfile
`1234.bugfix` or `5678.bugfix` until you create the PR, which leads to a
chicken-and-egg problem.
There are two options for solving this:
1. Open the PR without a changelog file, see what number you got, and *then*
add the changelog file to your branch (see [Updating your pull
request](#updating-your-pull-request)), or:
1. Look at the [list of all
issues/PRs](https://github.com/matrix-org/synapse/issues?q=), add one to the
highest number you see, and quickly open the PR before somebody else claims
your number.
[This
script](https://github.com/richvdh/scripts/blob/master/next_github_number.sh)
might be helpful if you find yourself doing this a lot.
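If you go with the second option, a rough sketch of finding the next number (an illustration using the GitHub API, not the linked script; it assumes `curl` and `jq` are installed) is:
```sh
# Ask GitHub for the most recently created issue/PR and add one to its number.
curl -s "https://api.github.com/repos/matrix-org/synapse/issues?state=all&per_page=1" \
  | jq '.[0].number + 1'
```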
Sorry, we know it's a bit fiddly, but it's *really* helpful for us when we come
to put together a release!
### Debian changelog
Changes which affect the debian packaging files (in `debian`) are an
exception to the rule that all changes require a `changelog.d` file.
In this case, you will need to add an entry to the debian changelog for the
next release. For this, run the following command:
```
dch
```
This will make up a new version number (if there isn't already an unreleased
version in flight), and open an editor where you can add a new changelog entry.
(Our release process will ensure that the version number and maintainer name is
corrected for the release.)
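For reference, `dch` opens a new stanza roughly like the following in your editor (the version number, name, and date here are placeholders):
```
matrix-synapse-py3 (1.50.2) UNRELEASED; urgency=medium

  * Describe your packaging change here.

 -- Your Name <your@email.example.org>  Tue, 18 Jan 2022 17:00:00 +0000
```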
If your change affects both the debian packaging *and* files outside the debian
directory, you will need both a regular newsfragment *and* an entry in the
debian changelog. (Though typically such changes should be submitted as two
separate pull requests.)
## Sign off
In order to have a concrete record that your contribution is intentional
and you agree to license it under the same terms as the project's license, we've adopted the
same lightweight approach that the Linux Kernel
[submitting patches process](
https://www.kernel.org/doc/html/latest/process/submitting-patches.html#sign-your-work-the-developer-s-certificate-of-origin),
[Docker](https://github.com/docker/docker/blob/master/CONTRIBUTING.md), and many other
projects use: the DCO (Developer Certificate of Origin:
http://developercertificate.org/). This is a simple declaration that you wrote
the contribution or otherwise have the right to contribute it to Matrix:
```
Developer Certificate of Origin
Version 1.1
Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
660 York Street, Suite 102,
San Francisco, CA 94110 USA
Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.
Developer's Certificate of Origin 1.1
By making a contribution to this project, I certify that:
(a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or
(b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or
(c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.
(d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.
```
If you agree to this for your contribution, then all that's needed is to
include the line in your commit or pull request comment:
```
Signed-off-by: Your Name <your@email.example.org>
```
We accept contributions under a legally identifiable name, such as
your name on government documentation or common-law names (names
claimed by legitimate usage or repute). Unfortunately, we cannot
accept anonymous contributions at this time.
Git allows you to add this signoff automatically when using the `-s`
flag to `git commit`, which uses the name and email set in your
`user.name` and `user.email` git configs.
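As a minimal sketch (using the example name and address from above):
```sh
git config user.name "Your Name"
git config user.email "your@email.example.org"
# -s appends the Signed-off-by: trailer built from the config above
git commit -s -m "Fix a bug in the florb handler."
# The commit message will end with:
#   Signed-off-by: Your Name <your@email.example.org>
```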
# 10. Turn feedback into better code.
Once the Pull Request is opened, you will see a few things:
1. our automated CI (Continuous Integration) pipeline will run (again) the linters, the unit tests, the integration tests and more;
2. one or more of the developers will take a look at your Pull Request and offer feedback.
From this point, you should:
1. Look at the results of the CI pipeline.
- If there is any error, fix the error.
2. If a developer has requested changes, make these changes and let us know if it is ready for a developer to review again.
3. Create a new commit with the changes.
- Please do NOT overwrite the history. New commits make the reviewer's life easier.
- Push these commits to your Pull Request.
4. Back to 1.
Once both the CI and the developers are happy, the patch will be merged into Synapse and released shortly!
# 11. Find a new issue.
By now, you know the drill!
# Notes for maintainers on merging PRs etc
There are some notes for those with commit access to the project on how we
manage git [here](docs/development/git.md).
# Conclusion
That's it! Matrix is a very open and collaborative project as you might expect
given our obsession with open communication. If we're going to successfully
matrix together all the fragmented communication technologies out there we are
reliant on contributions and collaboration from the community to do so. So
please get involved - and we hope you have as much fun hacking on Matrix as we
do!

View File

@@ -8,7 +8,6 @@ include demo/demo.tls.dh
include demo/*.py
include demo/*.sh
include synapse/py.typed
recursive-include synapse/storage *.sql
recursive-include synapse/storage *.sql.postgres
recursive-include synapse/storage *.sql.sqlite

View File

@@ -1,6 +1,6 @@
=========================================================================
Synapse |support| |development| |documentation| |license| |pypi| |python|
=========================================================================
=========================================================
Synapse |support| |development| |license| |pypi| |python|
=========================================================
.. contents::
@@ -55,8 +55,11 @@ solutions. The hope is for Matrix to act as the building blocks for a new
generation of fully open and interoperable messaging and VoIP apps for the
internet.
Synapse is a Matrix "homeserver" implementation developed by the matrix.org core
team, written in Python 3/Twisted.
Synapse is a reference "homeserver" implementation of Matrix from the core
development team at matrix.org, written in Python/Twisted. It is intended to
showcase the concept of Matrix and let folks see the spec in the context of a
codebase and let you run your own homeserver and generally help bootstrap the
ecosystem.
In Matrix, every user runs one or more Matrix clients, which connect through to
a Matrix homeserver. The homeserver stores all their personal chat history and
@@ -82,14 +85,9 @@ For support installing or managing Synapse, please join |room|_ (from a matrix.o
account if necessary) and ask questions there. We do not use GitHub issues for
support requests, only for bug reports and feature requests.
Synapse's documentation is `nicely rendered on GitHub Pages <https://matrix-org.github.io/synapse>`_,
with its source available in |docs|_.
.. |room| replace:: ``#synapse:matrix.org``
.. _room: https://matrix.to/#/#synapse:matrix.org
.. |docs| replace:: ``docs``
.. _docs: docs
Synapse Installation
====================
@@ -265,27 +263,11 @@ Then update the ``users`` table in the database::
Synapse Development
===================
The best place to get started is our
`guide for contributors <https://matrix-org.github.io/synapse/latest/development/contributing_guide.html>`_.
This is part of our larger `documentation <https://matrix-org.github.io/synapse/latest>`_, which includes
information for synapse developers as well as synapse administrators.
Developers might be particularly interested in:
* `Synapse's database schema <https://matrix-org.github.io/synapse/latest/development/database_schema.html>`_,
* `notes on Synapse's implementation details <https://matrix-org.github.io/synapse/latest/development/internal_documentation/index.html>`_, and
* `how we use git <https://matrix-org.github.io/synapse/latest/development/git.html>`_.
Alongside all that, join our developer community on Matrix:
`#synapse-dev:matrix.org <https://matrix.to/#/#synapse-dev:matrix.org>`_, featuring real humans!
Quick start
-----------
Join our developer community on Matrix: `#synapse-dev:matrix.org <https://matrix.to/#/#synapse-dev:matrix.org>`_
Before setting up a development environment for synapse, make sure you have the
system dependencies (such as the python header files) installed - see
`Platform-specific prerequisites <https://matrix-org.github.io/synapse/latest/setup/installation.html#platform-specific-prerequisites>`_.
`Installing from source <https://matrix-org.github.io/synapse/latest/setup/installation.html#installing-from-source>`_.
To check out a synapse for development, clone the git repo into a working
directory of your choice::
@@ -298,7 +280,7 @@ to install using pip and a virtualenv::
python3 -m venv ./env
source ./env/bin/activate
pip install -e ".[all,dev]"
pip install -e ".[all,test]"
This will run a process of downloading and installing all the needed
dependencies into a virtual env. If any dependencies fail to install,
@@ -326,7 +308,7 @@ If you just want to start a single instance of the app and run it directly::
Running the unit tests
----------------------
======================
After getting up and running, you may wish to run Synapse's unit tests to
check that everything is installed correctly::
@@ -345,7 +327,7 @@ to see the logging output, see the `CONTRIBUTING doc <CONTRIBUTING.md#run-the-un
Running the Integration Tests
-----------------------------
=============================
Synapse is accompanied by `SyTest <https://github.com/matrix-org/sytest>`_,
a Matrix homeserver integration testing suite, which uses HTTP requests to
@@ -463,10 +445,6 @@ This is normally caused by a misconfiguration in your reverse-proxy. See
:alt: (discuss development on #synapse-dev:matrix.org)
:target: https://matrix.to/#/#synapse-dev:matrix.org
.. |documentation| image:: https://img.shields.io/badge/documentation-%E2%9C%93-success
:alt: (Rendered documentation on GitHub Pages)
:target: https://matrix-org.github.io/synapse/latest/
.. |license| image:: https://img.shields.io/github/license/matrix-org/synapse
:alt: (check license in LICENSE file)
:target: LICENSE

View File

@@ -34,6 +34,14 @@ additional-css = [
"docs/website_files/table-of-contents.css",
"docs/website_files/remove-nav-buttons.css",
"docs/website_files/indent-section-headers.css",
"docs/website_files/version-picker.css",
]
additional-js = ["docs/website_files/table-of-contents.js"]
theme = "docs/website_files/theme"
additional-js = [
"docs/website_files/table-of-contents.js",
"docs/website_files/version-picker.js",
"docs/website_files/version.js",
]
theme = "docs/website_files/theme"
[preprocessor.schema_versions]
command = "./scripts-dev/schema_versions.py"

1
changelog.d/10713.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix a regression introduced in Synapse 1.41 which broke email transmission on systems using older versions of the Twisted library.

1
changelog.d/10723.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix unauthorised exposure of room metadata to communities.

View File

@@ -1,2 +0,0 @@
Fix a long-standing issue which could cause Synapse to incorrectly accept data in the unsigned field of events
received over federation.

View File

@@ -1 +0,0 @@
Add `track_puppeted_user_ips` config flag to record client IP addresses against puppeted users, and include the puppeted users in monthly active user counts.

View File

@@ -1 +0,0 @@
Remove the `"password_hash"` field from the response dictionaries of the [Users Admin API](https://matrix-org.github.io/synapse/latest/admin_api/user_admin_api.html).

View File

@@ -1 +0,0 @@
Include whether the requesting user has participated in a thread when generating a summary for [MSC3440](https://github.com/matrix-org/matrix-doc/pull/3440).

View File

@@ -1 +0,0 @@
Fix a performance regression in `/sync` handling, introduced in 1.49.0.

View File

@@ -1 +0,0 @@
Fix a long-standing bug where Synapse wouldn't cache a response indicating that a remote user has no devices.

View File

@@ -1 +0,0 @@
Fix an error when trying to get the federation status of a destination server, even if no error has occurred. This admin API was newly introduced in Synapse 1.49.0.

View File

@@ -1 +0,0 @@
Avoid database access in the JSON serialization process.

View File

@@ -1 +0,0 @@
Include the bundled aggregations in the `/sync` response, per [MSC2675](https://github.com/matrix-org/matrix-doc/pull/2675).

View File

@@ -1 +0,0 @@
Fix `/_matrix/client/v1/room/{roomId}/hierarchy` endpoint returning incorrect fields which have been present since Synapse 1.49.0.

View File

@@ -1 +0,0 @@
Fix preview of some gif URLs (like tenor.com). Contributed by Philippe Daouadi.

View File

@@ -1 +0,0 @@
Return an `M_FORBIDDEN` error code instead of `M_UNKNOWN` when a spam checker module prevents a user from creating a room.

View File

@@ -1 +0,0 @@
Add a flag to the `synapse_review_recent_signups` script to ignore and filter appservice users.

View File

@@ -1 +0,0 @@
Remove the unstable `/send_relation` endpoint.

View File

@@ -1 +0,0 @@
Run `pyupgrade --py37-plus --keep-percent-format` on Synapse.

View File

@@ -1 +0,0 @@
Warn against using a Let's Encrypt certificate for TLS/DTLS TURN server client connections, and suggest using ZeroSSL certificate instead. This bypasses client-side connectivity errors caused by WebRTC libraries that reject Let's Encrypt certificates. Contributed by @AndrewFerr.

View File

@@ -1 +0,0 @@
Use buildkit's cache feature to speed up docker builds.

View File

@@ -1 +0,0 @@
Use `auto_attribs` and native type hints for attrs classes.

View File

@@ -1 +0,0 @@
Remove debug logging for #4422, which has been closed since Synapse 0.99.

View File

@@ -1 +0,0 @@
Fix a bug where only the first 50 rooms from a space were returned from the `/hierarchy` API. This has existed since the introduction of the API in Synapse v1.41.0.

View File

@@ -1 +0,0 @@
Remove fallback code for Python 2.

View File

@@ -1 +0,0 @@
Add a test for [an edge case](https://github.com/matrix-org/synapse/pull/11532#discussion_r769104461) in the `/sync` logic.

View File

@@ -1 +0,0 @@
Add the option to write sqlite test dbs to disk when running tests.

View File

@@ -1 +0,0 @@
Improve Complement test output for GitHub Actions.

View File

@@ -1 +0,0 @@
Fix a bug introduced in Synapse v1.18.0 where password reset and address validation emails would not be sent if their subject was configured to use the 'app' template variable. Contributed by @br4nnigan.

View File

@@ -1 +0,0 @@
Fix a typechecker problem related to our (ab)use of `nacl.signing.SigningKey`s.

View File

@@ -1 +0,0 @@
Document the new `SYNAPSE_TEST_PERSIST_SQLITE_DB` environment variable in the contributing guide.

View File

@@ -1 +0,0 @@
Fix docstring on `add_account_data_for_user`.

View File

@@ -1 +0,0 @@
Complement environment variable name change and update `.gitignore`.

View File

@@ -1 +0,0 @@
Simplify calculation of prometheus metrics for garbage collection.

View File

@@ -1 +0,0 @@
Improve accuracy of `python_twisted_reactor_tick_time` prometheus metric.

View File

@@ -1 +0,0 @@
Remove `python_twisted_reactor_pending_calls` prometheus metric.

View File

@@ -1 +0,0 @@
Document that the minimum supported PostgreSQL version is now 10.

View File

@@ -1 +0,0 @@
Fix typo in demo docs: differnt.

View File

@@ -1 +0,0 @@
Make the list rooms admin api sort stable. Contributed by Daniël Sonck.

View File

@@ -1 +0,0 @@
Update room spec url in config files.

View File

@@ -1 +0,0 @@
Mention python3-venv and libpq-dev dependencies in contribution guide.

View File

@@ -1 +0,0 @@
Minor efficiency improvements when inserting many values into the database.

View File

@@ -1 +0,0 @@
Invite PR authors to give themselves credit in the changelog.

View File

@@ -1 +0,0 @@
Fix a bug introduced in Synapse v1.18.0 where password reset and address validation emails would not be sent if their subject was configured to use the 'app' template variable. Contributed by @br4nnigan.

View File

@@ -1 +0,0 @@
Add `track_puppeted_user_ips` config flag to record client IP addresses against puppeted users, and include the puppeted users in monthly active user counts.

View File

@@ -1 +0,0 @@
Update documentation for configuring login with facebook.

View File

@@ -1 +0,0 @@
Add `track_puppeted_user_ips` config flag to record client IP addresses against puppeted users, and include the puppeted users in monthly active user counts.

View File

@@ -1 +0,0 @@
Remove `log_function` utility function and its uses.

View File

@@ -1 +0,0 @@
Use `auto_attribs` and native type hints for attrs classes.

View File

@@ -14,7 +14,6 @@ services:
# failure
restart: unless-stopped
# See the readme for a full documentation of the environment settings
# NOTE: You must edit homeserver.yaml to use postgres, it defaults to sqlite
environment:
- SYNAPSE_CONFIG_PATH=/data/homeserver.yaml
volumes:

View File

@@ -6785,7 +6785,7 @@
"expr": "rate(synapse_util_caches_cache:evicted_size{instance=\"$instance\",job=~\"$job\",index=~\"$index\"}[$bucket_size])",
"format": "time_series",
"intervalFactor": 1,
"legendFormat": "{{name}} ({{reason}}) {{job}}-{{index}}",
"legendFormat": "{{name}} {{job}}-{{index}}",
"refId": "A"
}
],
@@ -10888,5 +10888,5 @@
"timezone": "",
"title": "Synapse",
"uid": "000000012",
"version": 100
"version": 99
}

View File

@@ -92,6 +92,22 @@ new PromConsole.Graph({
})
</script>
<h3>Pending calls per tick</h3>
<div id="reactor_pending_calls"></div>
<script>
new PromConsole.Graph({
node: document.querySelector("#reactor_pending_calls"),
expr: "rate(python_twisted_reactor_pending_calls_sum[30s]) / rate(python_twisted_reactor_pending_calls_count[30s])",
name: "[[job]]-[[index]]",
min: 0,
renderer: "line",
height: 150,
yAxisFormatter: PromConsole.NumberFormatter.humanize,
yHoverFormatter: PromConsole.NumberFormatter.humanize,
yTitle: "Pending Calls"
})
</script>
<h1>Storage</h1>
<h3>Queries</h3>

View File

@@ -84,9 +84,7 @@ AUTH="Authorization: Bearer $TOKEN"
###################################################################################################
# finally start pruning the room:
###################################################################################################
# this will really delete local events, so the messages in the room really
# disappear unless they are restored by remote federation. This is because
# we pass {"delete_local_events":true} to the curl invocation below.
POSTDATA='{"delete_local_events":"true"}' # this will really delete local events, so the messages in the room really disappear unless they are restored by remote federation
for ROOM in "${ROOMS_ARRAY[@]}"; do
echo "########################################### $(date) ################# "
@@ -106,7 +104,7 @@ for ROOM in "${ROOMS_ARRAY[@]}"; do
SLEEP=2
set -x
# call purge
OUT=$(curl --header "$AUTH" -s -d '{"delete_local_events":true}' POST "$API_URL/admin/purge_history/$ROOM/$EVENT_ID")
OUT=$(curl --header "$AUTH" -s -d $POSTDATA POST "$API_URL/admin/purge_history/$ROOM/$EVENT_ID")
PURGE_ID=$(echo "$OUT" |grep purge_id|cut -d'"' -f4 )
if [ "$PURGE_ID" == "" ]; then
# probably the history purge is already in progress for $ROOM

View File

@@ -15,7 +15,7 @@ export DH_VIRTUALENV_INSTALL_ROOT=/opt/venvs
# python won't look in the right directory. At least this way, the error will
# be a *bit* more obvious.
#
SNAKE=$(readlink -e /usr/bin/python3)
SNAKE=`readlink -e /usr/bin/python3`
# try to set the CFLAGS so any compiled C extensions are compiled with the most
# generic as possible x64 instructions, so that compiling it on a new Intel chip
@@ -24,7 +24,7 @@ SNAKE=$(readlink -e /usr/bin/python3)
# TODO: add similar things for non-amd64, or figure out a more generic way to
# do this.
case $(dpkg-architecture -q DEB_HOST_ARCH) in
case `dpkg-architecture -q DEB_HOST_ARCH` in
amd64)
export CFLAGS=-march=x86-64
;;
@@ -40,7 +40,6 @@ dh_virtualenv \
--upgrade-pip \
--preinstall="lxml" \
--preinstall="mock" \
--preinstall="wheel" \
--extra-pip-arg="--no-cache-dir" \
--extra-pip-arg="--compile" \
--extras="all,systemd,test"
@@ -57,8 +56,8 @@ case "$DEB_BUILD_OPTIONS" in
*)
# Copy tests to a temporary directory so that we can put them on the
# PYTHONPATH without putting the uninstalled synapse on the pythonpath.
tmpdir=$(mktemp -d)
trap 'rm -r $tmpdir' EXIT
tmpdir=`mktemp -d`
trap "rm -r $tmpdir" EXIT
cp -r tests "$tmpdir"
@@ -99,7 +98,7 @@ esac
--output-file="${PACKAGE_BUILD_DIR}/etc/matrix-synapse/log.yaml"
# add a dependency on the right version of python to substvars.
PYPKG=$(basename "$SNAKE")
PYPKG=`basename $SNAKE`
echo "synapse:pydepends=$PYPKG" >> debian/matrix-synapse-py3.substvars

196
debian/changelog vendored
View File

@@ -1,199 +1,3 @@
matrix-synapse-py3 (1.50.1) stable; urgency=medium
* New synapse release 1.50.1.
-- Synapse Packaging team <packages@matrix.org> Tue, 18 Jan 2022 16:06:26 +0000
matrix-synapse-py3 (1.50.0) stable; urgency=medium
* New synapse release 1.50.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 18 Jan 2022 10:40:38 +0000
matrix-synapse-py3 (1.50.0~rc2) stable; urgency=medium
* New synapse release 1.50.0~rc2.
-- Synapse Packaging team <packages@matrix.org> Fri, 14 Jan 2022 11:18:06 +0000
matrix-synapse-py3 (1.50.0~rc1) stable; urgency=medium
* New synapse release 1.50.0~rc1.
-- Synapse Packaging team <packages@matrix.org> Wed, 05 Jan 2022 12:36:17 +0000
matrix-synapse-py3 (1.49.2) stable; urgency=medium
* New synapse release 1.49.2.
-- Synapse Packaging team <packages@matrix.org> Tue, 21 Dec 2021 17:31:03 +0000
matrix-synapse-py3 (1.49.1) stable; urgency=medium
* New synapse release 1.49.1.
-- Synapse Packaging team <packages@matrix.org> Tue, 21 Dec 2021 11:07:30 +0000
matrix-synapse-py3 (1.49.0) stable; urgency=medium
* New synapse release 1.49.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 14 Dec 2021 12:39:46 +0000
matrix-synapse-py3 (1.49.0~rc1) stable; urgency=medium
* New synapse release 1.49.0~rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 07 Dec 2021 13:52:21 +0000
matrix-synapse-py3 (1.48.0) stable; urgency=medium
* New synapse release 1.48.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 30 Nov 2021 11:24:15 +0000
matrix-synapse-py3 (1.48.0~rc1) stable; urgency=medium
* New synapse release 1.48.0~rc1.
-- Synapse Packaging team <packages@matrix.org> Thu, 25 Nov 2021 15:56:03 +0000
matrix-synapse-py3 (1.47.1) stable; urgency=medium
* New synapse release 1.47.1.
-- Synapse Packaging team <packages@matrix.org> Fri, 19 Nov 2021 13:44:32 +0000
matrix-synapse-py3 (1.47.0) stable; urgency=medium
* New synapse release 1.47.0.
-- Synapse Packaging team <packages@matrix.org> Wed, 17 Nov 2021 13:09:43 +0000
matrix-synapse-py3 (1.47.0~rc3) stable; urgency=medium
* New synapse release 1.47.0~rc3.
-- Synapse Packaging team <packages@matrix.org> Tue, 16 Nov 2021 14:32:47 +0000
matrix-synapse-py3 (1.47.0~rc2) stable; urgency=medium
[ Dan Callahan ]
* Update scripts to pass Shellcheck lints.
* Remove unused Vagrant scripts from debian/ directory.
* Allow building Debian packages for any architecture, not just amd64.
* Preinstall the "wheel" package when building virtualenvs.
* Do not error if /etc/default/matrix-synapse is missing.
[ Synapse Packaging team ]
* New synapse release 1.47.0~rc2.
-- Synapse Packaging team <packages@matrix.org> Wed, 10 Nov 2021 09:41:01 +0000
matrix-synapse-py3 (1.46.0) stable; urgency=medium
[ Richard van der Hoff ]
* Compress debs with xz, to fix incompatibility of impish debs with reprepro.
[ Synapse Packaging team ]
* New synapse release 1.46.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 02 Nov 2021 13:22:53 +0000
matrix-synapse-py3 (1.46.0~rc1) stable; urgency=medium
* New synapse release 1.46.0~rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 26 Oct 2021 14:04:04 +0100
matrix-synapse-py3 (1.45.1) stable; urgency=medium
* New synapse release 1.45.1.
-- Synapse Packaging team <packages@matrix.org> Wed, 20 Oct 2021 11:58:27 +0100
matrix-synapse-py3 (1.45.0) stable; urgency=medium
* New synapse release 1.45.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 19 Oct 2021 11:18:53 +0100
matrix-synapse-py3 (1.45.0~rc2) stable; urgency=medium
* New synapse release 1.45.0~rc2.
-- Synapse Packaging team <packages@matrix.org> Thu, 14 Oct 2021 10:58:24 +0100
matrix-synapse-py3 (1.45.0~rc1) stable; urgency=medium
[ Nick @ Beeper ]
* Include an `update_synapse_database` script in the distribution.
[ Synapse Packaging team ]
* New synapse release 1.45.0~rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 12 Oct 2021 10:46:27 +0100
matrix-synapse-py3 (1.44.0) stable; urgency=medium
* New synapse release 1.44.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 05 Oct 2021 13:43:57 +0100
matrix-synapse-py3 (1.44.0~rc3) stable; urgency=medium
* New synapse release 1.44.0~rc3.
-- Synapse Packaging team <packages@matrix.org> Mon, 04 Oct 2021 14:57:22 +0100
matrix-synapse-py3 (1.44.0~rc2) stable; urgency=medium
* New synapse release 1.44.0~rc2.
-- Synapse Packaging team <packages@matrix.org> Thu, 30 Sep 2021 12:39:10 +0100
matrix-synapse-py3 (1.44.0~rc1) stable; urgency=medium
* New synapse release 1.44.0~rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 28 Sep 2021 13:41:28 +0100
matrix-synapse-py3 (1.43.0) stable; urgency=medium
* New synapse release 1.43.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 21 Sep 2021 11:49:05 +0100
matrix-synapse-py3 (1.43.0~rc2) stable; urgency=medium
* New synapse release 1.43.0~rc2.
-- Synapse Packaging team <packages@matrix.org> Fri, 17 Sep 2021 10:43:21 +0100
matrix-synapse-py3 (1.43.0~rc1) stable; urgency=medium
* New synapse release 1.43.0~rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 14 Sep 2021 11:39:46 +0100
matrix-synapse-py3 (1.42.0) stable; urgency=medium
* New synapse release 1.42.0.
-- Synapse Packaging team <packages@matrix.org> Tue, 07 Sep 2021 16:19:09 +0100
matrix-synapse-py3 (1.42.0~rc2) stable; urgency=medium
* New synapse release 1.42.0~rc2.
-- Synapse Packaging team <packages@matrix.org> Mon, 06 Sep 2021 15:25:13 +0100
matrix-synapse-py3 (1.42.0~rc1) stable; urgency=medium
* New synapse release 1.42.0rc1.
-- Synapse Packaging team <packages@matrix.org> Wed, 01 Sep 2021 11:37:48 +0100
matrix-synapse-py3 (1.41.1) stable; urgency=high
* New synapse release 1.41.1.

2
debian/control vendored
View File

@@ -19,7 +19,7 @@ Standards-Version: 3.9.8
Homepage: https://github.com/matrix-org/synapse
Package: matrix-synapse-py3
Architecture: any
Architecture: amd64
Provides: matrix-synapse
Conflicts:
matrix-synapse (<< 0.34.0.1-0matrix2),

View File

@@ -2,7 +2,6 @@
set -e
# shellcheck disable=SC1091
. /usr/share/debconf/confmodule
# try to update the debconf db according to whatever is in the config files

View File

@@ -3,4 +3,3 @@ opt/venvs/matrix-synapse/bin/register_new_matrix_user usr/bin/register_new_matri
opt/venvs/matrix-synapse/bin/synapse_port_db usr/bin/synapse_port_db
opt/venvs/matrix-synapse/bin/synapse_review_recent_signups usr/bin/synapse_review_recent_signups
opt/venvs/matrix-synapse/bin/synctl usr/bin/synctl
opt/venvs/matrix-synapse/bin/update_synapse_database usr/bin/update_synapse_database

View File

@@ -1,6 +1,5 @@
#!/bin/sh -e
# shellcheck disable=SC1091
. /usr/share/debconf/confmodule
CONFIGFILE_SERVERNAME="/etc/matrix-synapse/conf.d/server_name.yaml"

View File

@@ -5,7 +5,7 @@ Description=Synapse Matrix homeserver
Type=notify
User=matrix-synapse
WorkingDirectory=/var/lib/matrix-synapse
EnvironmentFile=-/etc/default/matrix-synapse
EnvironmentFile=/etc/default/matrix-synapse
ExecStartPre=/opt/venvs/matrix-synapse/bin/python -m synapse.app.homeserver --config-path=/etc/matrix-synapse/homeserver.yaml --config-path=/etc/matrix-synapse/conf.d/ --generate-keys
ExecStart=/opt/venvs/matrix-synapse/bin/python -m synapse.app.homeserver --config-path=/etc/matrix-synapse/homeserver.yaml --config-path=/etc/matrix-synapse/conf.d/
ExecReload=/bin/kill -HUP $MAINPID

6
debian/rules vendored
View File

@@ -51,11 +51,5 @@ override_dh_shlibdeps:
override_dh_virtualenv:
./debian/build_virtualenv
override_dh_builddeb:
# force the compression to xzip, to stop dpkg-deb on impish defaulting to zstd
# (which requires reprepro 5.3.0-1.3, which is currently only in 'experimental' in Debian:
# https://metadata.ftp-master.debian.org/changelogs/main/r/reprepro/reprepro_5.3.0-1.3_changelog)
dh_builddeb -- -Zxz
%:
dh $@ --with python-virtualenv

2
debian/test/.gitignore vendored Normal file
View File

@@ -0,0 +1,2 @@
.vagrant
*.log

23
debian/test/provision.sh vendored Normal file
View File

@@ -0,0 +1,23 @@
#!/bin/bash
#
# provisioning script for vagrant boxes for testing the matrix-synapse debs.
#
# Will install the most recent matrix-synapse-py3 deb for this platform from
# the /debs directory.
set -e
apt-get update
apt-get install -y lsb-release
deb=`ls /debs/matrix-synapse-py3_*+$(lsb_release -cs)*.deb | sort | tail -n1`
debconf-set-selections <<EOF
matrix-synapse matrix-synapse/report-stats boolean false
matrix-synapse matrix-synapse/server-name string localhost:18448
EOF
dpkg -i "$deb"
sed -i -e '/port: 8...$/{s/8448/18448/; s/8008/18008/}' -e '$aregistration_shared_secret: secret' /etc/matrix-synapse/homeserver.yaml
systemctl restart matrix-synapse

13
debian/test/stretch/Vagrantfile vendored Normal file
View File

@@ -0,0 +1,13 @@
# -*- mode: ruby -*-
# vi: set ft=ruby :
ver = `cd ../../..; dpkg-parsechangelog -S Version`.strip()
Vagrant.configure("2") do |config|
config.vm.box = "debian/stretch64"
config.vm.synced_folder ".", "/vagrant", disabled: true
config.vm.synced_folder "../../../../debs", "/debs", type: "nfs"
config.vm.provision "shell", path: "../provision.sh"
end

10
debian/test/xenial/Vagrantfile vendored Normal file
View File

@@ -0,0 +1,10 @@
# -*- mode: ruby -*-
# vi: set ft=ruby :
Vagrant.configure("2") do |config|
config.vm.box = "ubuntu/xenial64"
config.vm.synced_folder ".", "/vagrant", disabled: true
config.vm.synced_folder "../../../../debs", "/debs"
config.vm.provision "shell", path: "../provision.sh"
end

View File

@@ -22,5 +22,5 @@ Logs and sqlitedb will be stored in demo/808{0,1,2}.{log,db}
Also note that when joining a public room on a different HS via "#foo:bar.net", then you are (in the current impl) joining a room with room_id "foo". This means that it won't work if your HS already has a room with that name.
Also note that when joining a public room on a differnt HS via "#foo:bar.net", then you are (in the current impl) joining a room with room_id "foo". This means that it won't work if your HS already has a room with that name.

View File

@@ -6,14 +6,14 @@ DIR="$( cd "$( dirname "$0" )" && pwd )"
PID_FILE="$DIR/servers.pid"
if [ -f "$PID_FILE" ]; then
if [ -f $PID_FILE ]; then
echo "servers.pid exists!"
exit 1
fi
for port in 8080 8081 8082; do
rm -rf "${DIR:?}/$port"
rm -rf "$DIR/media_store.$port"
rm -rf $DIR/$port
rm -rf $DIR/media_store.$port
done
rm -rf "${DIR:?}/etc"
rm -rf $DIR/etc

View File

@@ -4,22 +4,21 @@ DIR="$( cd "$( dirname "$0" )" && pwd )"
CWD=$(pwd)
cd "$DIR/.." || exit
cd "$DIR/.."
mkdir -p demo/etc
PYTHONPATH=$(readlink -f "$(pwd)")
export PYTHONPATH
export PYTHONPATH=$(readlink -f $(pwd))
echo "$PYTHONPATH"
echo $PYTHONPATH
for port in 8080 8081 8082; do
echo "Starting server on port $port... "
https_port=$((port + 400))
mkdir -p demo/$port
pushd demo/$port || exit
pushd demo/$port
#rm $DIR/etc/$port.config
python3 -m synapse.app.homeserver \
@@ -28,78 +27,75 @@ for port in 8080 8081 8082; do
--config-path "$DIR/etc/$port.config" \
--report-stats no
if ! grep -F "Customisation made by demo/start.sh" -q "$DIR/etc/$port.config"; then
if ! grep -F "Customisation made by demo/start.sh" -q $DIR/etc/$port.config; then
printf '\n\n# Customisation made by demo/start.sh\n' >> $DIR/etc/$port.config
echo "public_baseurl: http://localhost:$port/" >> $DIR/etc/$port.config
echo 'enable_registration: true' >> $DIR/etc/$port.config
# Warning, this heredoc depends on the interaction of tabs and spaces. Please don't
# accidentaly bork me with your fancy settings.
listeners=$(cat <<-PORTLISTENERS
# Configure server to listen on both $https_port and $port
# This overides some of the default settings above
listeners:
- port: $https_port
type: http
tls: true
resources:
- names: [client, federation]
- port: $port
tls: false
bind_addresses: ['::1', '127.0.0.1']
type: http
x_forwarded: true
resources:
- names: [client, federation]
compress: false
PORTLISTENERS
)
echo "${listeners}" >> $DIR/etc/$port.config
# Disable tls for the servers
printf '\n\n# Disable tls on the servers.' >> $DIR/etc/$port.config
echo '# DO NOT USE IN PRODUCTION' >> $DIR/etc/$port.config
echo 'use_insecure_ssl_client_just_for_testing_do_not_use: true' >> $DIR/etc/$port.config
echo 'federation_verify_certificates: false' >> $DIR/etc/$port.config
# Set tls paths
echo "tls_certificate_path: \"$DIR/etc/localhost:$https_port.tls.crt\"" >> $DIR/etc/$port.config
echo "tls_private_key_path: \"$DIR/etc/localhost:$https_port.tls.key\"" >> $DIR/etc/$port.config
# Generate tls keys
openssl req -x509 -newkey rsa:4096 -keyout "$DIR/etc/localhost:$https_port.tls.key" -out "$DIR/etc/localhost:$https_port.tls.crt" -days 365 -nodes -subj "/O=matrix"
openssl req -x509 -newkey rsa:4096 -keyout $DIR/etc/localhost\:$https_port.tls.key -out $DIR/etc/localhost\:$https_port.tls.crt -days 365 -nodes -subj "/O=matrix"
# Regenerate configuration
{
printf '\n\n# Customisation made by demo/start.sh\n'
echo "public_baseurl: http://localhost:$port/"
echo 'enable_registration: true'
# Ignore keys from the trusted keys server
echo '# Ignore keys from the trusted keys server' >> $DIR/etc/$port.config
echo 'trusted_key_servers:' >> $DIR/etc/$port.config
echo ' - server_name: "matrix.org"' >> $DIR/etc/$port.config
echo ' accept_keys_insecurely: true' >> $DIR/etc/$port.config
# Warning, this heredoc depends on the interaction of tabs and spaces.
# Please don't accidentaly bork me with your fancy settings.
listeners=$(cat <<-PORTLISTENERS
# Configure server to listen on both $https_port and $port
# This overides some of the default settings above
listeners:
- port: $https_port
type: http
tls: true
resources:
- names: [client, federation]
- port: $port
tls: false
bind_addresses: ['::1', '127.0.0.1']
type: http
x_forwarded: true
resources:
- names: [client, federation]
compress: false
PORTLISTENERS
)
echo "${listeners}"
# Disable tls for the servers
printf '\n\n# Disable tls on the servers.'
echo '# DO NOT USE IN PRODUCTION'
echo 'use_insecure_ssl_client_just_for_testing_do_not_use: true'
echo 'federation_verify_certificates: false'
# Set tls paths
echo "tls_certificate_path: \"$DIR/etc/localhost:$https_port.tls.crt\""
echo "tls_private_key_path: \"$DIR/etc/localhost:$https_port.tls.key\""
# Ignore keys from the trusted keys server
echo '# Ignore keys from the trusted keys server'
echo 'trusted_key_servers:'
echo ' - server_name: "matrix.org"'
echo ' accept_keys_insecurely: true'
# Reduce the blacklist
blacklist=$(cat <<-BLACK
# Set the blacklist so that it doesn't include 127.0.0.1, ::1
federation_ip_range_blacklist:
- '10.0.0.0/8'
- '172.16.0.0/12'
- '192.168.0.0/16'
- '100.64.0.0/10'
- '169.254.0.0/16'
- 'fe80::/64'
- 'fc00::/7'
BLACK
)
echo "${blacklist}"
} >> "$DIR/etc/$port.config"
# Reduce the blacklist
blacklist=$(cat <<-BLACK
# Set the blacklist so that it doesn't include 127.0.0.1, ::1
federation_ip_range_blacklist:
- '10.0.0.0/8'
- '172.16.0.0/12'
- '192.168.0.0/16'
- '100.64.0.0/10'
- '169.254.0.0/16'
- 'fe80::/64'
- 'fc00::/7'
BLACK
)
echo "${blacklist}" >> $DIR/etc/$port.config
fi
# Check script parameters
if [ $# -eq 1 ]; then
if [ "$1" = "--no-rate-limit" ]; then
if [ $1 = "--no-rate-limit" ]; then
# Disable any rate limiting
ratelimiting=$(cat <<-RC
@@ -141,22 +137,22 @@ for port in 8080 8081 8082; do
burst_count: 1000
RC
)
echo "${ratelimiting}" >> "$DIR/etc/$port.config"
echo "${ratelimiting}" >> $DIR/etc/$port.config
fi
fi
if ! grep -F "full_twisted_stacktraces" -q "$DIR/etc/$port.config"; then
echo "full_twisted_stacktraces: true" >> "$DIR/etc/$port.config"
if ! grep -F "full_twisted_stacktraces" -q $DIR/etc/$port.config; then
echo "full_twisted_stacktraces: true" >> $DIR/etc/$port.config
fi
if ! grep -F "report_stats" -q "$DIR/etc/$port.config" ; then
echo "report_stats: false" >> "$DIR/etc/$port.config"
if ! grep -F "report_stats" -q $DIR/etc/$port.config ; then
echo "report_stats: false" >> $DIR/etc/$port.config
fi
python3 -m synapse.app.homeserver \
--config-path "$DIR/etc/$port.config" \
-D \
popd || exit
popd
done
cd "$CWD" || exit
cd "$CWD"


@@ -8,7 +8,7 @@ for pid_file in $FILES; do
pid=$(cat "$pid_file")
if [[ $pid ]]; then
echo "Killing $pid_file with $pid"
kill "$pid"
kill $pid
fi
done


@@ -1,17 +1,14 @@
# Dockerfile to build the matrixdotorg/synapse docker images.
#
# Note that it uses features which are only available in BuildKit - see
# https://docs.docker.com/go/buildkit/ for more information.
#
# To build the image, run `docker build` command from the root of the
# synapse repository:
#
# DOCKER_BUILDKIT=1 docker build -f docker/Dockerfile .
# docker build -f docker/Dockerfile .
#
# There is an optional PYTHON_VERSION build argument which sets the
# version of python to build against: for example:
#
# DOCKER_BUILDKIT=1 docker build -f docker/Dockerfile --build-arg PYTHON_VERSION=3.9 .
# docker build -f docker/Dockerfile --build-arg PYTHON_VERSION=3.6 .
#
ARG PYTHON_VERSION=3.8
@@ -22,16 +19,7 @@ ARG PYTHON_VERSION=3.8
FROM docker.io/python:${PYTHON_VERSION}-slim as builder
# install the OS build deps
#
# RUN --mount is specific to buildkit and is documented at
# https://github.com/moby/buildkit/blob/master/frontend/dockerfile/docs/syntax.md#build-mounts-run---mount.
# Here we use it to set up a cache for apt, to improve rebuild speeds on
# slow connections.
#
RUN \
--mount=type=cache,target=/var/cache/apt,sharing=locked \
--mount=type=cache,target=/var/lib/apt,sharing=locked \
apt-get update && apt-get install -y \
RUN apt-get update && apt-get install -y \
build-essential \
libffi-dev \
libjpeg-dev \
@@ -56,8 +44,7 @@ COPY synapse/python_dependencies.py /synapse/synapse/python_dependencies.py
# used while you develop on the source
#
# This is aiming at installing the `install_requires` and `extras_require` from `setup.py`
RUN --mount=type=cache,target=/root/.cache/pip \
pip install --prefix="/install" --no-warn-script-location \
RUN pip install --prefix="/install" --no-warn-script-location \
/synapse[all]
# Copy over the rest of the project
@@ -79,10 +66,7 @@ LABEL org.opencontainers.image.documentation='https://github.com/matrix-org/syna
LABEL org.opencontainers.image.source='https://github.com/matrix-org/synapse.git'
LABEL org.opencontainers.image.licenses='Apache-2.0'
RUN \
--mount=type=cache,target=/var/cache/apt,sharing=locked \
--mount=type=cache,target=/var/lib/apt,sharing=locked \
apt-get update && apt-get install -y \
RUN apt-get update && apt-get install -y \
curl \
gosu \
libjpeg62-turbo \


@@ -16,7 +16,7 @@ ARG distro=""
### Stage 0: build a dh-virtualenv
###
# This is only really needed on focal, since other distributions we
# This is only really needed on bionic and focal, since other distributions we
# care about have a recent version of dh-virtualenv by default. Unfortunately,
# it looks like focal is going to be with us for a while.
#
@@ -36,8 +36,9 @@ RUN env DEBIAN_FRONTEND=noninteractive apt-get install \
wget
# fetch and unpack the package
# TODO: Upgrade to 1.2.2 once bionic is dropped (1.2.2 requires debhelper 12; bionic has only 11)
RUN mkdir /dh-virtualenv
RUN wget -q -O /dh-virtualenv.tar.gz https://github.com/spotify/dh-virtualenv/archive/refs/tags/1.2.2.tar.gz
RUN wget -q -O /dh-virtualenv.tar.gz https://github.com/spotify/dh-virtualenv/archive/ac6e1b1.tar.gz
RUN tar -xv --strip-components=1 -C /dh-virtualenv -f /dh-virtualenv.tar.gz
# install its build deps. We do another apt-cache-update here, because we might
@@ -46,9 +47,8 @@ RUN apt-get update -qq -o Acquire::Languages=none \
&& cd /dh-virtualenv \
&& env DEBIAN_FRONTEND=noninteractive mk-build-deps -ri -t "apt-get -y --no-install-recommends"
# Build it. Note that building the docs doesn't work due to differences in
# Sphinx APIs across versions/distros.
RUN cd /dh-virtualenv && DEB_BUILD_OPTIONS=nodoc dpkg-buildpackage -us -uc -b
# build it
RUN cd /dh-virtualenv && dpkg-buildpackage -us -uc -b
###
### Stage 1
@@ -85,12 +85,12 @@ RUN apt-get update -qq -o Acquire::Languages=none \
libpq-dev \
xmlsec1
COPY --from=builder /dh-virtualenv_1.2.2-1_all.deb /
COPY --from=builder /dh-virtualenv_1.2~dev-1_all.deb /
# install dhvirtualenv. Update the apt cache again first, in case we got a
# cached cache from docker the first time.
RUN apt-get update -qq -o Acquire::Languages=none \
&& apt-get install -yq /dh-virtualenv_1.2.2-1_all.deb
&& apt-get install -yq /dh-virtualenv_1.2~dev-1_all.deb
WORKDIR /synapse/source
ENTRYPOINT ["bash","/synapse/source/docker/build_debian.sh"]


@@ -1,6 +1,6 @@
# Use the Sytest image that comes with a lot of the build dependencies
# pre-installed
FROM matrixdotorg/sytest:bionic
FROM matrixdotorg/sytest:latest
# The Sytest image doesn't come with python, so install that
RUN apt-get update && apt-get -qq install -y python3 python3-dev python3-pip
@@ -8,23 +8,5 @@ RUN apt-get update && apt-get -qq install -y python3 python3-dev python3-pip
# We need tox to run the tests in run_pg_tests.sh
RUN python3 -m pip install tox
# Initialise the db
RUN su -c '/usr/lib/postgresql/10/bin/initdb -D /var/lib/postgresql/data -E "UTF-8" --lc-collate="C.UTF-8" --lc-ctype="C.UTF-8" --username=postgres' postgres
# Add a user with our UID and GID so that files get created on the host owned
# by us, not root.
ARG UID
ARG GID
RUN groupadd --gid $GID user
RUN useradd --uid $UID --gid $GID --groups sudo --no-create-home user
# Ensure we can start postgres by sudo-ing as the postgres user.
RUN apt-get update && apt-get -qq install -y sudo
RUN echo "user ALL=(ALL) NOPASSWD: ALL" >> /etc/sudoers
ADD run_pg_tests.sh /run_pg_tests.sh
# Use the "exec form" of ENTRYPOINT (https://docs.docker.com/engine/reference/builder/#entrypoint)
# so that we can `docker run` this container and pass arguments to pg_tests.sh
ENTRYPOINT ["/run_pg_tests.sh"]
USER user
ADD run_pg_tests.sh /pg_tests.sh
ENTRYPOINT /pg_tests.sh
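A minimal sketch of how this test image might be used, under the variant of the Dockerfile that declares the `UID`/`GID` build args; the file path, image tag and mount point are assumptions, not stated in this diff. Building with your own UID/GID keeps files written into the mounted checkout owned by you, and `run_pg_tests.sh` expects the source tree at `/src`.
```
# illustrative only: path, tag and mount point are assumptions
docker build -f docker/Dockerfile-pgtests -t synapse-pg-tests \
    --build-arg UID=$(id -u) --build-arg GID=$(id -g) .
docker run --rm -v "$(pwd):/src" synapse-pg-tests
```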


@@ -21,6 +21,3 @@ VOLUME ["/data"]
# files to run the desired worker configuration. Will start supervisord.
COPY ./docker/configure_workers_and_start.py /configure_workers_and_start.py
ENTRYPOINT ["/configure_workers_and_start.py"]
HEALTHCHECK --start-period=5s --interval=15s --timeout=5s \
CMD /bin/sh /healthcheck.sh


@@ -65,12 +65,7 @@ The following environment variables are supported in `generate` mode:
* `SYNAPSE_DATA_DIR`: where the generated config will put persistent data
such as the database and media store. Defaults to `/data`.
* `UID`, `GID`: the user id and group id to use for creating the data
directories. If unset, and no user is set via `docker run --user`, defaults
to `991`, `991`.
## Postgres
By default the config will use SQLite. See the [docs on using Postgres](https://github.com/matrix-org/synapse/blob/develop/docs/postgres.md) for more info on how to use Postgres. Until this section is improved [this issue](https://github.com/matrix-org/synapse/issues/8304) may provide useful information.
directories. Defaults to `991`, `991`.
## Running synapse
@@ -102,9 +97,7 @@ The following environment variables are supported in `run` mode:
`<SYNAPSE_CONFIG_DIR>/homeserver.yaml`.
* `SYNAPSE_WORKER`: module to execute, used when running synapse with workers.
Defaults to `synapse.app.homeserver`, which is suitable for non-worker mode.
* `UID`, `GID`: the user and group id to run Synapse as. If unset, and no user
is set via `docker run --user`, defaults to `991`, `991`. Note that this user
must have permission to read the config files, and write to the data directories.
* `UID`, `GID`: the user and group id to run Synapse as. Defaults to `991`, `991`.
* `TZ`: the [timezone](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones) the container will run with. Defaults to `UTC`.
For more complex setups (e.g. for workers) you can also pass your args directly to synapse using `run` mode. For example like this:
@@ -193,7 +186,7 @@ point to another Dockerfile.
## Disabling the healthcheck
If you are using a non-standard port or tls inside docker you can disable the healthcheck
whilst running the above `docker run` commands.
whilst running the above `docker run` commands.
```
--no-healthcheck
@@ -219,7 +212,7 @@ If you wish to point the healthcheck at a different port with docker command, ad
## Setting the healthcheck in docker-compose file
You can add the following to set a custom healthcheck in a docker compose file.
You will need docker-compose version >2.1 for this to work.
You will need docker-compose version >2.1 for this to work.
```
healthcheck:
@@ -233,5 +226,4 @@ healthcheck:
## Using jemalloc
Jemalloc is embedded in the image and will be used instead of the default allocator.
You can read about jemalloc by reading the Synapse
[README](https://github.com/matrix-org/synapse/blob/HEAD/README.rst#help-synapse-is-slow-and-eats-all-my-ram-cpu).
You can read about jemalloc by reading the Synapse [README](../README.rst).
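To put the `UID`/`GID` variables described above into context, a run might look roughly like the following; the image tag, host path and port mapping are placeholders rather than prescriptions from this README.
```
# sketch only: run the container as uid/gid 991 and mount a host directory as /data
docker run -d --name synapse \
    -v /path/to/synapse/data:/data \
    -e UID=991 -e GID=991 \
    -p 8008:8008 \
    matrixdotorg/synapse:latest
```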


@@ -5,7 +5,7 @@
set -ex
# Get the codename from distro env
DIST=$(cut -d ':' -f2 <<< "${distro:?}")
DIST=`cut -d ':' -f2 <<< $distro`
# we get a read-only copy of the source: make a writeable copy
cp -aT /synapse/source /synapse/build
@@ -17,7 +17,7 @@ cd /synapse/build
# Section to determine which "component" it should go into (see
# https://manpages.debian.org/stretch/reprepro/reprepro.1.en.html#GUESSING)
DEB_VERSION=$(dpkg-parsechangelog -SVersion)
DEB_VERSION=`dpkg-parsechangelog -SVersion`
case $DEB_VERSION in
*~rc*|*~a*|*~b*|*~c*)
sed -ie '/^Section:/c\Section: prerelease' debian/control
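To make the component guessing concrete: a release-candidate version such as `1.49.0~rc1` matches the first pattern and is routed to the `prerelease` section, while a final release falls through to the default. A toy illustration (version strings made up):
```
# toy example of the version -> component mapping used above
for DEB_VERSION in "1.49.0~rc1" "1.49.0"; do
    case $DEB_VERSION in
        *~rc*|*~a*|*~b*|*~c*) echo "$DEB_VERSION -> prerelease" ;;
        *)                    echo "$DEB_VERSION -> default component" ;;
    esac
done
```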


@@ -1,6 +0,0 @@
#!/bin/sh
# This healthcheck script is designed to return OK when every
# host involved returns OK
{%- for healthcheck_url in healthcheck_urls %}
curl -fSs {{ healthcheck_url }} || exit 1
{%- endfor %}
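For context, once `configure_workers_and_start.py` (further down) has collected one `/health` URL for the main process plus one per worker, the rendered output of this template ends up looking something like this; the worker port is illustrative.
```
#!/bin/sh
# illustrative rendering: main process on 8080 plus a single worker on a made-up port
curl -fSs http://localhost:8080/health || exit 1
curl -fSs http://localhost:18009/health || exit 1
```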


@@ -148,6 +148,14 @@ bcrypt_rounds: 12
allow_guest_access: {{ "True" if SYNAPSE_ALLOW_GUEST else "False" }}
enable_group_creation: true
# The list of identity servers trusted to verify third party
# identifiers by this server.
#
# Also defines the ID server which will be called when an account is
# deactivated (one will be picked arbitrarily).
trusted_third_party_id_servers:
- matrix.org
- vector.im
## Metrics ###


@@ -48,7 +48,7 @@ WORKERS_CONFIG = {
"app": "synapse.app.user_dir",
"listener_resources": ["client"],
"endpoint_patterns": [
"^/_matrix/client/(api/v1|r0|v3|unstable)/user_directory/search$"
"^/_matrix/client/(api/v1|r0|unstable)/user_directory/search$"
],
"shared_extra_conf": {"update_user_directory": False},
"worker_extra_conf": "",
@@ -85,10 +85,10 @@ WORKERS_CONFIG = {
"app": "synapse.app.generic_worker",
"listener_resources": ["client"],
"endpoint_patterns": [
"^/_matrix/client/(v2_alpha|r0|v3)/sync$",
"^/_matrix/client/(api/v1|v2_alpha|r0|v3)/events$",
"^/_matrix/client/(api/v1|r0|v3)/initialSync$",
"^/_matrix/client/(api/v1|r0|v3)/rooms/[^/]+/initialSync$",
"^/_matrix/client/(v2_alpha|r0)/sync$",
"^/_matrix/client/(api/v1|v2_alpha|r0)/events$",
"^/_matrix/client/(api/v1|r0)/initialSync$",
"^/_matrix/client/(api/v1|r0)/rooms/[^/]+/initialSync$",
],
"shared_extra_conf": {},
"worker_extra_conf": "",
@@ -146,11 +146,11 @@ WORKERS_CONFIG = {
"app": "synapse.app.generic_worker",
"listener_resources": ["client"],
"endpoint_patterns": [
"^/_matrix/client/(api/v1|r0|v3|unstable)/rooms/.*/redact",
"^/_matrix/client/(api/v1|r0|v3|unstable)/rooms/.*/send",
"^/_matrix/client/(api/v1|r0|v3|unstable)/rooms/.*/(join|invite|leave|ban|unban|kick)$",
"^/_matrix/client/(api/v1|r0|v3|unstable)/join/",
"^/_matrix/client/(api/v1|r0|v3|unstable)/profile/",
"^/_matrix/client/(api/v1|r0|unstable)/rooms/.*/redact",
"^/_matrix/client/(api/v1|r0|unstable)/rooms/.*/send",
"^/_matrix/client/(api/v1|r0|unstable)/rooms/.*/(join|invite|leave|ban|unban|kick)$",
"^/_matrix/client/(api/v1|r0|unstable)/join/",
"^/_matrix/client/(api/v1|r0|unstable)/profile/",
],
"shared_extra_conf": {},
"worker_extra_conf": "",
@@ -158,11 +158,11 @@ WORKERS_CONFIG = {
"frontend_proxy": {
"app": "synapse.app.frontend_proxy",
"listener_resources": ["client", "replication"],
"endpoint_patterns": ["^/_matrix/client/(api/v1|r0|v3|unstable)/keys/upload"],
"endpoint_patterns": ["^/_matrix/client/(api/v1|r0|unstable)/keys/upload"],
"shared_extra_conf": {},
"worker_extra_conf": (
"worker_main_http_uri: http://127.0.0.1:%d"
% (MAIN_PROCESS_HTTP_LISTENER_PORT,)
% (MAIN_PROCESS_HTTP_LISTENER_PORT,),
),
},
}
@@ -474,16 +474,10 @@ def generate_worker_files(environ, config_path: str, data_dir: str):
# Determine the load-balancing upstreams to configure
nginx_upstream_config = ""
# At the same time, prepare a list of internal endpoints to healthcheck
# starting with the main process which exists even if no workers do.
healthcheck_urls = ["http://localhost:8080/health"]
for upstream_worker_type, upstream_worker_ports in nginx_upstreams.items():
body = ""
for port in upstream_worker_ports:
body += " server localhost:%d;\n" % (port,)
healthcheck_urls.append("http://localhost:%d/health" % (port,))
# Add to the list of configured upstreams
nginx_upstream_config += NGINX_UPSTREAM_CONFIG_BLOCK.format(
@@ -516,13 +510,6 @@ def generate_worker_files(environ, config_path: str, data_dir: str):
worker_config=supervisord_config,
)
# healthcheck config
convert(
"/conf/healthcheck.sh.j2",
"/healthcheck.sh",
healthcheck_urls=healthcheck_urls,
)
# Ensure the logging directory exists
log_dir = data_dir + "/logs"
if not os.path.exists(log_dir):
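As a quick aside on the `endpoint_patterns` above: they are ordinary regular expressions matched against the request path by the generated routing config, so any ERE-capable tool can be used to sanity-check one. The paths below are hypothetical.
```
# sanity-check an endpoint pattern with grep -E (illustration only)
pattern='^/_matrix/client/(v2_alpha|r0|v3)/sync$'
echo '/_matrix/client/v3/sync'  | grep -qE "$pattern" && echo "handled by the sync worker"
echo '/_matrix/client/r0/login' | grep -qE "$pattern" || echo "not matched; stays on the main process"
```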


@@ -10,10 +10,11 @@ set -e
# Set PGUSER so Synapse's tests know what user to connect to the database with
export PGUSER=postgres
# Start the database
sudo -u postgres /usr/lib/postgresql/10/bin/pg_ctl -w -D /var/lib/postgresql/data start
# Initialise & start the database
su -c '/usr/lib/postgresql/9.6/bin/initdb -D /var/lib/postgresql/data -E "UTF-8" --lc-collate="en_US.UTF-8" --lc-ctype="en_US.UTF-8" --username=postgres' postgres
su -c '/usr/lib/postgresql/9.6/bin/pg_ctl -w -D /var/lib/postgresql/data start' postgres
# Run the tests
cd /src
export TRIAL_FLAGS="-j 4"
tox --workdir=./.tox-pg-container -e py36-postgres "$@"
tox --workdir=/tmp -e py35-postgres


@@ -120,7 +120,6 @@ def generate_config_from_template(config_dir, config_path, environ, ownership):
]
if ownership is not None:
log(f"Setting ownership on /data to {ownership}")
subprocess.check_output(["chown", "-R", ownership, "/data"])
args = ["gosu", ownership] + args
@@ -145,18 +144,12 @@ def run_generate_config(environ, ownership):
config_path = environ.get("SYNAPSE_CONFIG_PATH", config_dir + "/homeserver.yaml")
data_dir = environ.get("SYNAPSE_DATA_DIR", "/data")
if ownership is not None:
# make sure that synapse has perms to write to the data dir.
log(f"Setting ownership on {data_dir} to {ownership}")
subprocess.check_output(["chown", ownership, data_dir])
# create a suitable log config from our template
log_config_file = "%s/%s.log.config" % (config_dir, server_name)
if not os.path.exists(log_config_file):
log("Creating log config %s" % (log_config_file,))
convert("/conf/log.config", log_config_file, environ)
# generate the main config file, and a signing key.
args = [
"python",
"-m",
@@ -175,23 +168,29 @@ def run_generate_config(environ, ownership):
"--open-private-ports",
]
# log("running %s" % (args, ))
os.execv("/usr/local/bin/python", args)
if ownership is not None:
# make sure that synapse has perms to write to the data dir.
subprocess.check_output(["chown", ownership, data_dir])
args = ["gosu", ownership] + args
os.execv("/usr/sbin/gosu", args)
else:
os.execv("/usr/local/bin/python", args)
def main(args, environ):
mode = args[1] if len(args) > 1 else "run"
# if we were given an explicit user to switch to, do so
ownership = None
if "UID" in environ:
desired_uid = int(environ["UID"])
desired_gid = int(environ.get("GID", "991"))
ownership = f"{desired_uid}:{desired_gid}"
elif os.getuid() == 0:
# otherwise, if we are running as root, use user 991
ownership = "991:991"
desired_uid = int(environ.get("UID", "991"))
desired_gid = int(environ.get("GID", "991"))
synapse_worker = environ.get("SYNAPSE_WORKER", "synapse.app.homeserver")
if (desired_uid == os.getuid()) and (desired_gid == os.getgid()):
ownership = None
else:
ownership = "{}:{}".format(desired_uid, desired_gid)
if ownership is None:
log("Will not perform chmod/gosu as UserID already matches request")
# In generate mode, generate a configuration and missing keys, then exit
if mode == "generate":


@@ -15,12 +15,12 @@ in `homeserver.yaml`, to the list of authorized domains. If you have not set
1. Agree to the terms of service and submit.
1. Copy your site key and secret key and add them to your `homeserver.yaml`
configuration file
```yaml
```
recaptcha_public_key: YOUR_SITE_KEY
recaptcha_private_key: YOUR_SECRET_KEY
```
1. Enable the CAPTCHA for new registrations
```yaml
```
enable_registration_captcha: true
```
1. Go to the settings page for the CAPTCHA you just created


@@ -3,7 +3,7 @@
## Historical Note
This document was originally written to guide server admins through the upgrade
path towards Synapse 1.0. Specifically,
[MSC1711](https://github.com/matrix-org/matrix-doc/blob/main/proposals/1711-x509-for-federation.md)
[MSC1711](https://github.com/matrix-org/matrix-doc/blob/master/proposals/1711-x509-for-federation.md)
required that all servers present valid TLS certificates on their federation
API. Admins were encouraged to achieve compliance from version 0.99.0 (released
in February 2019) ahead of version 1.0 (released June 2019) enforcing the
@@ -282,7 +282,7 @@ coffin of the Perspectives project (which was already pretty dead). So, the
Spec Core Team decided that a better approach would be to mandate valid TLS
certificates for federation alongside the rest of the Web. More details can be
found in
[MSC1711](https://github.com/matrix-org/matrix-doc/blob/main/proposals/1711-x509-for-federation.md#background-the-failure-of-the-perspectives-approach).
[MSC1711](https://github.com/matrix-org/matrix-doc/blob/master/proposals/1711-x509-for-federation.md#background-the-failure-of-the-perspectives-approach).
This results in a breaking change, which is disruptive, but absolutely critical
for the security model. However, the existence of Let's Encrypt as a trivial


@@ -6,9 +6,9 @@ Please update any links to point to the new website instead.
## About
This directory currently holds a series of markdown files documenting how to install, use
and develop Synapse. The documentation is readable directly from this repository, but it is
recommended to instead browse through the [website](https://matrix-org.github.io/synapse) for
easier discoverability.
and develop Synapse, the reference Matrix homeserver. The documentation is readable directly
from this repository, but it is recommended to instead browse through the
[website](https://matrix-org.github.io/synapse) for easier discoverability.
## Adding to the documentation
@@ -50,10 +50,8 @@ build the documentation with:
mdbook build
```
The rendered contents will be outputted to a new `book/` directory at the root of the repository. Please note that
index.html is not built by default, it is created by copying over the file `welcome_and_overview.html` to `index.html`
during deployment. Thus, when running `mdbook serve` locally the book will initially show a 404 in place of the index
due to the above. Do not be alarmed!
The rendered contents will be outputted to a new `book/` directory at the root of the repository. You can
browse the book by opening `book/index.html` in a web browser.
You can also have mdbook host the docs on a local webserver with hot-reload functionality via:
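For reference, the hot-reload command being described is the standard mdbook one:
```
mdbook serve
```
This typically serves the rendered book on http://localhost:3000 and rebuilds it whenever the source files change.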


@@ -23,58 +23,47 @@
- [Structured Logging](structured_logging.md)
- [Templates](templates.md)
- [User Authentication](usage/configuration/user_authentication/README.md)
- [Single-Sign On](usage/configuration/user_authentication/single_sign_on/README.md)
- [Single-Sign On]()
- [OpenID Connect](openid.md)
- [SAML](usage/configuration/user_authentication/single_sign_on/saml.md)
- [CAS](usage/configuration/user_authentication/single_sign_on/cas.md)
- [SAML]()
- [CAS]()
- [SSO Mapping Providers](sso_mapping_providers.md)
- [Password Auth Providers](password_auth_providers.md)
- [JSON Web Tokens](jwt.md)
- [Refresh Tokens](usage/configuration/user_authentication/refresh_tokens.md)
- [Registration Captcha](CAPTCHA_SETUP.md)
- [Application Services](application_services.md)
- [Server Notices](server_notices.md)
- [Consent Tracking](consent_tracking.md)
- [URL Previews](development/url_previews.md)
- [URL Previews](url_previews.md)
- [User Directory](user_directory.md)
- [Message Retention Policies](message_retention_policies.md)
- [Pluggable Modules](modules/index.md)
- [Writing a module](modules/writing_a_module.md)
- [Spam checker callbacks](modules/spam_checker_callbacks.md)
- [Third-party rules callbacks](modules/third_party_rules_callbacks.md)
- [Presence router callbacks](modules/presence_router_callbacks.md)
- [Account validity callbacks](modules/account_validity_callbacks.md)
- [Password auth provider callbacks](modules/password_auth_provider_callbacks.md)
- [Background update controller callbacks](modules/background_update_controller_callbacks.md)
- [Porting a legacy module to the new interface](modules/porting_legacy_module.md)
- [Pluggable Modules](modules.md)
- [Third Party Rules]()
- [Spam Checker](spam_checker.md)
- [Presence Router](presence_router_module.md)
- [Media Storage Providers]()
- [Workers](workers.md)
- [Using `synctl` with Workers](synctl_workers.md)
- [Systemd](systemd-with-workers/README.md)
- [Administration](usage/administration/README.md)
- [Admin API](usage/administration/admin_api/README.md)
- [Account Validity](admin_api/account_validity.md)
- [Background Updates](usage/administration/admin_api/background_updates.md)
- [Delete Group](admin_api/delete_group.md)
- [Event Reports](admin_api/event_reports.md)
- [Media](admin_api/media_admin_api.md)
- [Purge History](admin_api/purge_history_api.md)
- [Purge Rooms](admin_api/purge_room.md)
- [Register Users](admin_api/register_api.md)
- [Registration Tokens](usage/administration/admin_api/registration_tokens.md)
- [Manipulate Room Membership](admin_api/room_membership.md)
- [Rooms](admin_api/rooms.md)
- [Server Notices](admin_api/server_notices.md)
- [Shutdown Room](admin_api/shutdown_room.md)
- [Statistics](admin_api/statistics.md)
- [Users](admin_api/user_admin_api.md)
- [Server Version](admin_api/version_api.md)
- [Federation](usage/administration/admin_api/federation.md)
- [Manhole](manhole.md)
- [Monitoring](metrics-howto.md)
- [Understanding Synapse Through Grafana Graphs](usage/administration/understanding_synapse_through_grafana_graphs.md)
- [Useful SQL for Admins](usage/administration/useful_sql_for_admins.md)
- [Database Maintenance Tools](usage/administration/database_maintenance_tools.md)
- [State Groups](usage/administration/state_groups.md)
- [Request log format](usage/administration/request_log.md)
- [Admin FAQ](usage/administration/admin_faq.md)
- [Scripts]()
# Development
@@ -84,7 +73,6 @@
- [Testing]()
- [OpenTracing](opentracing.md)
- [Database Schemas](development/database_schema.md)
- [Experimental features](development/experimental_features.md)
- [Synapse Architecture]()
- [Log Contexts](log_contexts.md)
- [Replication](replication.md)
@@ -102,4 +90,3 @@
# Other
- [Dependency Deprecation Policy](deprecation_policy.md)
- [Running Synapse on a Single-Board Computer](other/running_synapse_on_single_board_computers.md)


@@ -99,7 +99,7 @@ server admin: see [Admin API](../usage/administration/admin_api).
It returns a JSON body like the following:
```json
```jsonc
{
"event_id": "$bNUFCwGzWca1meCGkjp-zwslF-GfVcXukvRLI1_FaVY",
"event_json": {
@@ -132,7 +132,7 @@ It returns a JSON body like the following:
},
"type": "m.room.message",
"unsigned": {
"age_ts": 1592291711430
"age_ts": 1592291711430,
}
},
"id": <report_id>,

Some files were not shown because too many files have changed in this diff.