
Compare commits


2 Commits

Author SHA1 Message Date

Travis Ralston
177f2b838c changelog 2019-05-22 19:16:02 -06:00

Travis Ralston
f9d7d3aa89 Remove m.relates_to from events if the client set it to null 2019-05-22 19:14:10 -06:00

It appears the Python code only checks whether the key exists in the dictionary, not whether it holds a useful value. This means that when clients submit (valid) requests with `m.relates_to: null` and Synapse later reads it, it hits a None reference error on access.

This is the easier route than guarding all the places where it could be None.
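
The fix described here strips a null `m.relates_to` from the event content up front, rather than guarding every place it is later read. A minimal sketch of that approach, using a hypothetical helper name (this is not the actual Synapse patch):

```
def strip_null_relates_to(event_dict):
    """Drop m.relates_to from an event's content if a client sent it as null.

    A bare `"m.relates_to" in content` check only proves the key exists, not
    that it holds a useful value, so a later access such as
    content["m.relates_to"]["rel_type"] would fail on None. Removing the key
    up front avoids guarding every read site.
    """
    content = event_dict.get("content", {})
    if "m.relates_to" in content and content["m.relates_to"] is None:
        del content["m.relates_to"]
    return event_dict


# A (valid) client request that sets m.relates_to to null:
event = {"type": "m.room.message", "content": {"body": "hi", "m.relates_to": None}}
strip_null_relates_to(event)
assert "m.relates_to" not in event["content"]
```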
158 changed files with 1451 additions and 3821 deletions

View File

@@ -1,16 +1,8 @@
Synapse 0.99.5.2 (2019-05-30)
=============================
Bugfixes
--------
- Fix bug where we leaked extremities when we soft failed events, leading to performance degradation. ([\#5274](https://github.com/matrix-org/synapse/issues/5274), [\#5278](https://github.com/matrix-org/synapse/issues/5278), [\#5291](https://github.com/matrix-org/synapse/issues/5291))
Synapse 0.99.5.1 (2019-05-22)
=============================
0.99.5.1 supersedes 0.99.5 due to malformed debian changelog - no functional changes.
No significant changes.
Synapse 0.99.5 (2019-05-22)
===========================

View File

@@ -1 +0,0 @@
Synapse will now serve the experimental "room complexity" API endpoint.

View File

@@ -1 +0,0 @@
Add experimental support for relations (aka reactions and edits).

View File

@@ -1 +0,0 @@
Ability to configure default room version.

View File

@@ -1 +0,0 @@
The base classes for the v1 and v2_alpha REST APIs have been unified.

View File

@@ -1 +0,0 @@
Simplifications and comments in do_auth.

View File

@@ -1 +0,0 @@
Fix appservice timestamp massaging.

View File

@@ -1 +0,0 @@
Rewrite store_server_verify_key to store several keys at once.

View File

@@ -1 +0,0 @@
Remove unused VerifyKey.expired and .time_added fields.

View File

@@ -1 +0,0 @@
Simplify Keyring.process_v2_response.

View File

@@ -1 +0,0 @@
Store key validity time in the storage layer.

changelog.d/5239.bugfix Normal file
View File

@@ -0,0 +1 @@
Fix 500 Internal Server Error when sending an event with `m.relates_to: null`.

View File

@@ -1 +0,0 @@
Refactor synapse.crypto.keyring to use a KeyFetcher interface.

View File

@@ -1 +0,0 @@
Ability to configure default room version.

View File

@@ -1 +0,0 @@
Simplification to Keyring.wait_for_previous_lookups.

View File

@@ -1 +0,0 @@
Ensure that server_keys fetched via a notary server are correctly signed.

View File

@@ -1 +0,0 @@
Show the correct error when logging out and access token is missing.

View File

@@ -1 +0,0 @@
Fix error code when there is an invalid parameter on /_matrix/client/r0/publicRooms

View File

@@ -1 +0,0 @@
Fix error when downloading thumbnail with missing width/height parameter.

View File

@@ -1 +0,0 @@
Synapse now more efficiently collates room statistics.

View File

@@ -1 +0,0 @@
Fix schema update for account validity.

View File

@@ -1 +0,0 @@
Fix bug where we leaked extremities when we soft failed events, leading to performance degradation.

View File

@@ -1 +0,0 @@
Fix "db txn 'update_presence' from sentinel context" log messages.

View File

@@ -1 +0,0 @@
Allow configuring a range for the account validity startup job.

View File

@@ -1 +0,0 @@
Fix dropped logcontexts during high outbound traffic.

View File

@@ -1 +0,0 @@
Fix bug where we leaked extremities when we soft failed events, leading to performance degradation.

View File

@@ -1 +0,0 @@
Fix docs on resetting the user directory.

View File

@@ -1 +0,0 @@
Specify the type of reCAPTCHA key to use.

View File

@@ -1 +0,0 @@
CAS login will now hit the r0 API, not the deprecated v1 one.

View File

@@ -1 +0,0 @@
Remove spurious debug from MatrixFederationHttpClient.get_json.

View File

@@ -1 +0,0 @@
Improve logging for logcontext leaks.

View File

@@ -1 +0,0 @@
Fix bug where we leaked extremities when we soft failed events, leading to performance degradation.

View File

@@ -1 +0,0 @@
Fix a bug where it is not possible to get events in the federation format with the request `GET /_matrix/client/r0/rooms/{roomId}/messages`.

View File

@@ -1 +0,0 @@
Fix performance problems with the rooms stats background update.

View File

@@ -1 +0,0 @@
Refactor keyring.VerifyKeyRequest to use attr.s.

View File

@@ -1 +0,0 @@
Rewrite get_server_verify_keys, again.

View File

@@ -1 +0,0 @@
Fix noisy 'no key for server' logs.

View File

@@ -1 +0,0 @@
Clarify that the admin change password API logs the user out.

View File

@@ -1 +0,0 @@
Fix bug where a notary server would sometimes forget old keys.

View File

@@ -1 +0,0 @@
Prevent users from setting huge displaynames and avatar URLs.

View File

@@ -1 +0,0 @@
Ensure that we have an up-to-date copy of the signing key when validating incoming federation requests.

View File

@@ -1 +0,0 @@
Synapse now more efficiently collates room statistics.

View File

@@ -1 +0,0 @@
The base classes for the v1 and v2_alpha REST APIs have been unified.

View File

@@ -1 +0,0 @@
Improve docstrings on MatrixFederationClient.

View File

@@ -1 +0,0 @@
Fix various problems which made the signing-key notary server time out for some requests.

View File

@@ -1 +0,0 @@
Fix bug which would make certain operations (such as room joins) block for 20 minutes while attempting to fetch verification keys.

View File

@@ -1 +0,0 @@
Fix a bug where we could rapidly mark a server as unreachable even though it was only down for a few minutes.

View File

@@ -1 +0,0 @@
Fix a bug where account validity renewal emails could only be sent when email notifs were enabled.

View File

@@ -1 +0,0 @@
Add ability to perform password reset via email without trusting the identity server.

debian/changelog vendored
View File

@@ -1,9 +1,3 @@
matrix-synapse-py3 (0.99.5.2) stable; urgency=medium
* New synapse release 0.99.5.2.
-- Synapse Packaging team <packages@matrix.org> Thu, 30 May 2019 16:28:07 +0100
matrix-synapse-py3 (0.99.5.1) stable; urgency=medium
* New synapse release 0.99.5.1.

View File

@@ -7,7 +7,6 @@ Requires a public/private key pair from:
https://developers.google.com/recaptcha/
Must be a reCAPTCHA v2 key using the "I'm not a robot" Checkbox option
Setting ReCaptcha Keys
----------------------

View File

@@ -69,7 +69,7 @@ An empty body may be passed for backwards compatibility.
Reset password
==============
Changes the password of another user. This will automatically log the user out of all their devices.
Changes the password of another user.
The api is::

View File

@@ -83,16 +83,6 @@ pid_file: DATADIR/homeserver.pid
#
#restrict_public_rooms_to_local_users: true
# The default room version for newly created rooms.
#
# Known room versions are listed here:
# https://matrix.org/docs/spec/#complete-list-of-room-versions
#
# For example, for room version 1, default_room_version should be set
# to "1".
#
#default_room_version: "1"
# The GC threshold parameters to pass to `gc.set_threshold`, if defined
#
#gc_thresholds: [700, 10, 10]
@@ -763,9 +753,7 @@ uploads_path: "DATADIR/uploads"
# This means that, if a validity period is set, and Synapse is restarted (it will
# then derive an expiration date from the current validity period), and some time
# after that the validity period changes and Synapse is restarted, the users'
# expiration dates won't be updated unless their account is manually renewed. This
# date will be randomly selected within a range [now + period - d ; now + period],
# where d is equal to 10% of the validity period.
# expiration dates won't be updated unless their account is manually renewed.
#
#account_validity:
# enabled: True
@@ -1018,8 +1006,10 @@ password_config:
# Enable sending emails for password resets, notification events or
# account expiry notices
# Enable sending emails for notification events or expiry notices
# Defining a custom URL for Riot is only needed if email notifications
# should contain links to a self-hosted installation of Riot; when set
# the "app_name" setting is ignored.
#
# If your SMTP server requires authentication, the optional smtp_user &
# smtp_pass variables should be used
@@ -1027,64 +1017,22 @@ password_config:
#email:
# enable_notifs: false
# smtp_host: "localhost"
# smtp_port: 25 # SSL: 465, STARTTLS: 587
# smtp_port: 25
# smtp_user: "exampleusername"
# smtp_pass: "examplepassword"
# require_transport_security: False
# notif_from: "Your Friendly %(app)s Home Server <noreply@example.com>"
# app_name: Matrix
#
# # Enable email notifications by default
# notif_for_new_users: True
#
# # Defining a custom URL for Riot is only needed if email notifications
# # should contain links to a self-hosted installation of Riot; when set
# # the "app_name" setting is ignored
# riot_base_url: "http://localhost/riot"
#
# # Enable sending password reset emails via the configured, trusted
# # identity servers
# #
# # IMPORTANT! This will give a malicious or overtaken identity server
# # the ability to reset passwords for your users! Make absolutely sure
# # that you want to do this! It is strongly recommended that password
# # reset emails be sent by the homeserver instead
# #
# # If this option is set to false and SMTP options have not been
# # configured, resetting user passwords via email will be disabled
# #trust_identity_server_for_password_resets: false
#
# # Configure the time that a validation email or text message code
# # will expire after sending
# #
# # This is currently used for password resets
# #validation_token_lifetime: 1h
#
# # Template directory. All template files should be stored within this
# # directory
# #
# # if template_dir is unset, uses the example templates that are part of
# # the Synapse distribution.
# #template_dir: res/templates
#
# # Templates for email notifications
# #
# notif_template_html: notif_mail.html
# notif_template_text: notif_mail.txt
#
# # Templates for account expiry notices
# #
# # Templates for account expiry notices.
# expiry_template_html: notice_expiry.html
# expiry_template_text: notice_expiry.txt
#
# # Templates for password reset emails sent by the homeserver
# #
# #password_reset_template_html: password_reset.html
# #password_reset_template_text: password_reset.txt
#
# # Templates for password reset success and failure pages that a user
# # will see after attempting to reset their password
# #
# #password_reset_template_success_html: password_reset_success.html
# #password_reset_template_failure_html: password_reset_failure.html
# notif_for_new_users: True
# riot_base_url: "http://localhost/riot"
#password_providers:
@@ -1145,9 +1093,9 @@ password_config:
#
# 'search_all_users' defines whether to search all users visible to your HS
# when searching the user directory, rather than limiting to users visible
# in public rooms. Defaults to false. If you set it True, you'll have to
# rebuild the user_directory search indexes, see
# https://github.com/matrix-org/synapse/blob/master/docs/user_directory.md
# in public rooms. Defaults to false. If you set it True, you'll have to run
# UPDATE user_directory_stream_pos SET stream_id = NULL;
# on your database to tell it to rebuild the user_directory search indexes.
#
#user_directory:
# enabled: true

View File

@@ -7,7 +7,11 @@ who are present in a publicly viewable room present on the server.
The directory info is stored in various tables, which can (typically after
DB corruption) get stale or out of sync. If this happens, for now the
solution to fix it is to execute the SQL here
https://github.com/matrix-org/synapse/blob/master/synapse/storage/schema/delta/53/user_dir_populate.sql
and then restart synapse. This should then start a background task to
quickest solution to fix it is:
```
UPDATE user_directory_stream_pos SET stream_id = NULL;
```
and restart the synapse, which should then start a background task to
flush the current tables and regenerate the directory.
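
For an SQLite-backed homeserver, one quick way to apply that statement is sketched below; the database path is an assumption, and a Postgres install would run the same SQL via psql instead:

```
import sqlite3

# Path is an example; point this at your homeserver's actual database file.
conn = sqlite3.connect("homeserver.db")
conn.execute("UPDATE user_directory_stream_pos SET stream_id = NULL;")
conn.commit()
conn.close()
```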

View File

@@ -20,7 +20,9 @@ class CallVisitor(ast.NodeVisitor):
else:
return
if name == "client_patterns":
if name == "client_path_patterns":
PATTERNS_V1.append(node.args[0].s)
elif name == "client_v2_patterns":
PATTERNS_V2.append(node.args[0].s)

View File

@@ -27,4 +27,4 @@ try:
except ImportError:
pass
__version__ = "0.99.5.2"
__version__ = "0.99.5.1"

View File

@@ -339,15 +339,6 @@ class UnsupportedRoomVersionError(SynapseError):
)
class ThreepidValidationError(SynapseError):
"""An error raised when there was a problem authorising an event."""
def __init__(self, *args, **kwargs):
if "errcode" not in kwargs:
kwargs["errcode"] = Codes.FORBIDDEN
super(ThreepidValidationError, self).__init__(*args, **kwargs)
class IncompatibleRoomVersionError(SynapseError):
"""A server is trying to join a room whose version it does not support.

View File

@@ -85,6 +85,10 @@ class RoomVersions(object):
)
# the version we will give rooms which are created on this server
DEFAULT_ROOM_VERSION = RoomVersions.V1
KNOWN_ROOM_VERSIONS = {
v.identifier: v for v in (
RoomVersions.V1,

View File

@@ -26,7 +26,6 @@ CLIENT_API_PREFIX = "/_matrix/client"
FEDERATION_PREFIX = "/_matrix/federation"
FEDERATION_V1_PREFIX = FEDERATION_PREFIX + "/v1"
FEDERATION_V2_PREFIX = FEDERATION_PREFIX + "/v2"
FEDERATION_UNSTABLE_PREFIX = FEDERATION_PREFIX + "/unstable"
STATIC_PREFIX = "/_matrix/static"
WEB_CLIENT_PREFIX = "/_matrix/client"
CONTENT_REPO_PREFIX = "/_matrix/content"

View File

@@ -344,21 +344,15 @@ class _LimitedHostnameResolver(object):
def resolveHostName(self, resolutionReceiver, hostName, portNumber=0,
addressTypes=None, transportSemantics='TCP'):
# Note this is happening deep within the reactor, so we don't need to
# worry about log contexts.
# We need this function to return `resolutionReceiver` so we do all the
# actual logic involving deferreds in a separate function.
# even though this is happening within the depths of twisted, we need to drop
# our logcontext before starting _resolve, otherwise: (a) _resolve will drop
# the logcontext if it returns an incomplete deferred; (b) _resolve will
# call the resolutionReceiver *with* a logcontext, which it won't be expecting.
with PreserveLoggingContext():
self._resolve(
resolutionReceiver,
hostName,
portNumber,
addressTypes,
transportSemantics,
)
self._resolve(
resolutionReceiver, hostName, portNumber,
addressTypes, transportSemantics,
)
return resolutionReceiver

View File

@@ -37,7 +37,8 @@ from synapse.replication.slave.storage.client_ips import SlavedClientIpStore
from synapse.replication.slave.storage.devices import SlavedDeviceStore
from synapse.replication.slave.storage.registration import SlavedRegistrationStore
from synapse.replication.tcp.client import ReplicationClientHandler
from synapse.rest.client.v2_alpha._base import client_patterns
from synapse.rest.client.v1.base import ClientV1RestServlet, client_path_patterns
from synapse.rest.client.v2_alpha._base import client_v2_patterns
from synapse.server import HomeServer
from synapse.storage.engines import create_engine
from synapse.util.httpresourcetree import create_resource_tree
@@ -48,11 +49,11 @@ from synapse.util.versionstring import get_version_string
logger = logging.getLogger("synapse.app.frontend_proxy")
class PresenceStatusStubServlet(RestServlet):
PATTERNS = client_patterns("/presence/(?P<user_id>[^/]*)/status")
class PresenceStatusStubServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/presence/(?P<user_id>[^/]*)/status")
def __init__(self, hs):
super(PresenceStatusStubServlet, self).__init__()
super(PresenceStatusStubServlet, self).__init__(hs)
self.http_client = hs.get_simple_http_client()
self.auth = hs.get_auth()
self.main_uri = hs.config.worker_main_http_uri
@@ -83,7 +84,7 @@ class PresenceStatusStubServlet(RestServlet):
class KeyUploadServlet(RestServlet):
PATTERNS = client_patterns("/keys/upload(/(?P<device_id>[^/]+))?$")
PATTERNS = client_v2_patterns("/keys/upload(/(?P<device_id>[^/]+))?$")
def __init__(self, hs):
"""

View File

@@ -176,7 +176,6 @@ class SynapseHomeServer(HomeServer):
resources.update({
"/_matrix/client/api/v1": client_resource,
"/_synapse/password_reset": client_resource,
"/_matrix/client/r0": client_resource,
"/_matrix/client/unstable": client_resource,
"/_matrix/client/v2_alpha": client_resource,

View File

@@ -1,7 +1,5 @@
# -*- coding: utf-8 -*-
# Copyright 2015-2016 OpenMarket Ltd
# Copyright 2017-2018 New Vector Ltd
# Copyright 2019 The Matrix.org Foundation C.I.C.
# Copyright 2015, 2016 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -31,76 +29,12 @@ logger = logging.getLogger(__name__)
class EmailConfig(Config):
def read_config(self, config):
# TODO: We should separate better the email configuration from the notification
# and account validity config.
self.email_enable_notifs = False
email_config = config.get("email", {})
self.email_smtp_host = email_config.get("smtp_host", None)
self.email_smtp_port = email_config.get("smtp_port", None)
self.email_smtp_user = email_config.get("smtp_user", None)
self.email_smtp_pass = email_config.get("smtp_pass", None)
self.require_transport_security = email_config.get(
"require_transport_security", False
)
if "app_name" in email_config:
self.email_app_name = email_config["app_name"]
else:
self.email_app_name = "Matrix"
# TODO: Rename notif_from to something more generic, or have a separate
# from for password resets, message notifications, etc?
# Currently the email section is a bit bogged down with settings for
# multiple functions. Would be good to split it out into separate
# sections and only put the common ones under email:
self.email_notif_from = email_config.get("notif_from", None)
if self.email_notif_from is not None:
# make sure it's valid
parsed = email.utils.parseaddr(self.email_notif_from)
if parsed[1] == '':
raise RuntimeError("Invalid notif_from address")
template_dir = email_config.get("template_dir")
# we need an absolute path, because we change directory after starting (and
# we don't yet know what auxiliary templates like mail.css we will need).
# (Note that loading as package_resources with jinja.PackageLoader doesn't
# work for the same reason.)
if not template_dir:
template_dir = pkg_resources.resource_filename(
'synapse', 'res/templates'
)
self.email_template_dir = os.path.abspath(template_dir)
self.email_enable_notifs = email_config.get("enable_notifs", False)
account_validity_renewal_enabled = config.get(
"account_validity", {},
).get("renew_at")
email_trust_identity_server_for_password_resets = email_config.get(
"trust_identity_server_for_password_resets", False,
)
self.email_password_reset_behaviour = (
"remote" if email_trust_identity_server_for_password_resets else "local"
)
if self.email_password_reset_behaviour == "local" and email_config == {}:
logger.warn(
"User password resets have been disabled due to lack of email config"
)
self.email_password_reset_behaviour = "off"
# Get lifetime of a validation token in milliseconds
self.email_validation_token_lifetime = self.parse_duration(
email_config.get("validation_token_lifetime", "1h")
)
if (
self.email_enable_notifs
or account_validity_renewal_enabled
or self.email_password_reset_behaviour == "local"
):
if self.email_enable_notifs:
# make sure we can import the required deps
import jinja2
import bleach
@@ -108,68 +42,6 @@ class EmailConfig(Config):
jinja2
bleach
if self.email_password_reset_behaviour == "local":
required = [
"smtp_host",
"smtp_port",
"notif_from",
]
missing = []
for k in required:
if k not in email_config:
missing.append(k)
if (len(missing) > 0):
raise RuntimeError(
"email.password_reset_behaviour is set to 'local' "
"but required keys are missing: %s" %
(", ".join(["email." + k for k in missing]),)
)
# Templates for password reset emails
self.email_password_reset_template_html = email_config.get(
"password_reset_template_html", "password_reset.html",
)
self.email_password_reset_template_text = email_config.get(
"password_reset_template_text", "password_reset.txt",
)
self.email_password_reset_failure_template = email_config.get(
"password_reset_failure_template", "password_reset_failure.html",
)
# This template does not support any replaceable variables, so we will
# read it from the disk once during setup
email_password_reset_success_template = email_config.get(
"password_reset_success_template", "password_reset_success.html",
)
# Check templates exist
for f in [self.email_password_reset_template_html,
self.email_password_reset_template_text,
self.email_password_reset_failure_template,
email_password_reset_success_template]:
p = os.path.join(self.email_template_dir, f)
if not os.path.isfile(p):
raise ConfigError("Unable to find template file %s" % (p, ))
# Retrieve content of web templates
filepath = os.path.join(
self.email_template_dir,
email_password_reset_success_template,
)
self.email_password_reset_success_html_content = self.read_file(
filepath,
"email.password_reset_template_success_html",
)
if config.get("public_baseurl") is None:
raise RuntimeError(
"email.password_reset_behaviour is set to 'local' but no "
"public_baseurl is set. This is necessary to generate password "
"reset links"
)
if self.email_enable_notifs:
required = [
"smtp_host",
"smtp_port",
@@ -194,22 +66,11 @@ class EmailConfig(Config):
"email.enable_notifs is True but no public_baseurl is set"
)
self.email_smtp_host = email_config["smtp_host"]
self.email_smtp_port = email_config["smtp_port"]
self.email_notif_from = email_config["notif_from"]
self.email_notif_template_html = email_config["notif_template_html"]
self.email_notif_template_text = email_config["notif_template_text"]
for f in self.email_notif_template_text, self.email_notif_template_html:
p = os.path.join(self.email_template_dir, f)
if not os.path.isfile(p):
raise ConfigError("Unable to find email template file %s" % (p, ))
self.email_notif_for_new_users = email_config.get(
"notif_for_new_users", True
)
self.email_riot_base_url = email_config.get(
"riot_base_url", None
)
if account_validity_renewal_enabled:
self.email_expiry_template_html = email_config.get(
"expiry_template_html", "notice_expiry.html",
)
@@ -217,15 +78,58 @@ class EmailConfig(Config):
"expiry_template_text", "notice_expiry.txt",
)
for f in self.email_expiry_template_text, self.email_expiry_template_html:
p = os.path.join(self.email_template_dir, f)
template_dir = email_config.get("template_dir")
# we need an absolute path, because we change directory after starting (and
# we don't yet know what auxiliary templates like mail.css we will need).
# (Note that loading as package_resources with jinja.PackageLoader doesn't
# work for the same reason.)
if not template_dir:
template_dir = pkg_resources.resource_filename(
'synapse', 'res/templates'
)
template_dir = os.path.abspath(template_dir)
for f in self.email_notif_template_text, self.email_notif_template_html:
p = os.path.join(template_dir, f)
if not os.path.isfile(p):
raise ConfigError("Unable to find email template file %s" % (p, ))
self.email_template_dir = template_dir
self.email_notif_for_new_users = email_config.get(
"notif_for_new_users", True
)
self.email_riot_base_url = email_config.get(
"riot_base_url", None
)
self.email_smtp_user = email_config.get(
"smtp_user", None
)
self.email_smtp_pass = email_config.get(
"smtp_pass", None
)
self.require_transport_security = email_config.get(
"require_transport_security", False
)
if "app_name" in email_config:
self.email_app_name = email_config["app_name"]
else:
self.email_app_name = "Matrix"
# make sure it's valid
parsed = email.utils.parseaddr(self.email_notif_from)
if parsed[1] == '':
raise RuntimeError("Invalid notif_from address")
else:
self.email_enable_notifs = False
# Not much point setting defaults for the rest: it would be an
# error for them to be used.
def default_config(self, config_dir_path, server_name, **kwargs):
return """
# Enable sending emails for password resets, notification events or
# account expiry notices
# Enable sending emails for notification events or expiry notices
# Defining a custom URL for Riot is only needed if email notifications
# should contain links to a self-hosted installation of Riot; when set
# the "app_name" setting is ignored.
#
# If your SMTP server requires authentication, the optional smtp_user &
# smtp_pass variables should be used
@@ -233,62 +137,20 @@ class EmailConfig(Config):
#email:
# enable_notifs: false
# smtp_host: "localhost"
# smtp_port: 25 # SSL: 465, STARTTLS: 587
# smtp_port: 25
# smtp_user: "exampleusername"
# smtp_pass: "examplepassword"
# require_transport_security: False
# notif_from: "Your Friendly %(app)s Home Server <noreply@example.com>"
# app_name: Matrix
#
# # Enable email notifications by default
# notif_for_new_users: True
#
# # Defining a custom URL for Riot is only needed if email notifications
# # should contain links to a self-hosted installation of Riot; when set
# # the "app_name" setting is ignored
# riot_base_url: "http://localhost/riot"
#
# # Enable sending password reset emails via the configured, trusted
# # identity servers
# #
# # IMPORTANT! This will give a malicious or overtaken identity server
# # the ability to reset passwords for your users! Make absolutely sure
# # that you want to do this! It is strongly recommended that password
# # reset emails be sent by the homeserver instead
# #
# # If this option is set to false and SMTP options have not been
# # configured, resetting user passwords via email will be disabled
# #trust_identity_server_for_password_resets: false
#
# # Configure the time that a validation email or text message code
# # will expire after sending
# #
# # This is currently used for password resets
# #validation_token_lifetime: 1h
#
# # Template directory. All template files should be stored within this
# # directory
# #
# # if template_dir is unset, uses the example templates that are part of
# # the Synapse distribution.
# #template_dir: res/templates
#
# # Templates for email notifications
# #
# notif_template_html: notif_mail.html
# notif_template_text: notif_mail.txt
#
# # Templates for account expiry notices
# #
# # Templates for account expiry notices.
# expiry_template_html: notice_expiry.html
# expiry_template_text: notice_expiry.txt
#
# # Templates for password reset emails sent by the homeserver
# #
# #password_reset_template_html: password_reset.html
# #password_reset_template_text: password_reset.txt
#
# # Templates for password reset success and failure pages that a user
# # will see after attempting to reset their password
# #
# #password_reset_template_success_html: password_reset_success.html
# #password_reset_template_failure_html: password_reset_failure.html
# notif_for_new_users: True
# riot_base_url: "http://localhost/riot"
"""

View File

@@ -39,8 +39,6 @@ class AccountValidityConfig(Config):
else:
self.renew_email_subject = "Renew your %(app)s account"
self.startup_job_max_delta = self.period * 10. / 100.
if self.renew_by_email_enabled and "public_baseurl" not in synapse_config:
raise ConfigError("Can't send renewal emails without 'public_baseurl'")
@@ -131,9 +129,7 @@ class RegistrationConfig(Config):
# This means that, if a validity period is set, and Synapse is restarted (it will
# then derive an expiration date from the current validity period), and some time
# after that the validity period changes and Synapse is restarted, the users'
# expiration dates won't be updated unless their account is manually renewed. This
# date will be randomly selected within a range [now + period - d ; now + period],
# where d is equal to 10%% of the validity period.
# expiration dates won't be updated unless their account is manually renewed.
#
#account_validity:
# enabled: True
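
The fuller version of the comment above describes how the startup job spreads expiration dates: each date is picked uniformly in [now + period - d, now + period], with d equal to 10% of the validity period. A standalone sketch of that calculation (illustrative only, not the code Synapse runs):

```
import random
import time

def pick_expiration_ts(period_ms, max_delta_ratio=0.10):
    """Pick an expiration timestamp in [now + period - d, now + period].

    d is 10% of the validity period, mirroring the range described above, so
    that expiration dates assigned at startup don't all land at once.
    """
    now_ms = int(time.time() * 1000)
    d = int(period_ms * max_delta_ratio)
    return random.randint(now_ms + period_ms - d, now_ms + period_ms)

# e.g. a 30-day validity period
print(pick_expiration_ts(30 * 24 * 3600 * 1000))
```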

View File

@@ -20,7 +20,6 @@ import os.path
from netaddr import IPSet
from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
from synapse.http.endpoint import parse_and_validate_server_name
from synapse.python_dependencies import DependencyException, check_requirements
@@ -36,8 +35,6 @@ logger = logging.Logger(__name__)
# in the list.
DEFAULT_BIND_ADDRESSES = ['::', '0.0.0.0']
DEFAULT_ROOM_VERSION = "1"
class ServerConfig(Config):
@@ -91,22 +88,6 @@ class ServerConfig(Config):
"restrict_public_rooms_to_local_users", False,
)
default_room_version = config.get(
"default_room_version", DEFAULT_ROOM_VERSION,
)
# Ensure room version is a str
default_room_version = str(default_room_version)
if default_room_version not in KNOWN_ROOM_VERSIONS:
raise ConfigError(
"Unknown default_room_version: %s, known room versions: %s" %
(default_room_version, list(KNOWN_ROOM_VERSIONS.keys()))
)
# Get the actual room version object rather than just the identifier
self.default_room_version = KNOWN_ROOM_VERSIONS[default_room_version]
# whether to enable search. If disabled, new entries will not be inserted
# into the search tables and they will not be indexed. Users will receive
# errors when attempting to search for messages.
@@ -329,10 +310,6 @@ class ServerConfig(Config):
unsecure_port = 8008
pid_file = os.path.join(data_dir_path, "homeserver.pid")
# Bring DEFAULT_ROOM_VERSION into the local-scope for use in the
# default config string
default_room_version = DEFAULT_ROOM_VERSION
return """\
## Server ##
@@ -407,16 +384,6 @@ class ServerConfig(Config):
#
#restrict_public_rooms_to_local_users: true
# The default room version for newly created rooms.
#
# Known room versions are listed here:
# https://matrix.org/docs/spec/#complete-list-of-room-versions
#
# For example, for room version 1, default_room_version should be set
# to "1".
#
#default_room_version: "%(default_room_version)s"
# The GC threshold parameters to pass to `gc.set_threshold`, if defined
#
#gc_thresholds: [700, 10, 10]

View File

@@ -107,7 +107,7 @@ class TlsConfig(Config):
certs = []
for ca_file in custom_ca_list:
logger.debug("Reading custom CA certificate file: %s", ca_file)
content = self.read_file(ca_file, "federation_custom_ca_list")
content = self.read_file(ca_file)
# Parse the CA certificates
try:

View File

@@ -43,9 +43,9 @@ class UserDirectoryConfig(Config):
#
# 'search_all_users' defines whether to search all users visible to your HS
# when searching the user directory, rather than limiting to users visible
# in public rooms. Defaults to false. If you set it True, you'll have to
# rebuild the user_directory search indexes, see
# https://github.com/matrix-org/synapse/blob/master/docs/user_directory.md
# in public rooms. Defaults to false. If you set it True, you'll have to run
# UPDATE user_directory_stream_pos SET stream_id = NULL;
# on your database to tell it to rebuild the user_directory search indexes.
#
#user_directory:
# enabled: true

File diff suppressed because it is too large

View File

@@ -76,7 +76,6 @@ class EventBuilder(object):
# someone tries to get them when they don't exist.
_state_key = attr.ib(default=None)
_redacts = attr.ib(default=None)
_origin_server_ts = attr.ib(default=None)
internal_metadata = attr.ib(default=attr.Factory(lambda: _EventInternalMetadata({})))
@@ -143,9 +142,6 @@ class EventBuilder(object):
if self._redacts is not None:
event_dict["redacts"] = self._redacts
if self._origin_server_ts is not None:
event_dict["origin_server_ts"] = self._origin_server_ts
defer.returnValue(
create_local_event_from_event_dict(
clock=self._clock,
@@ -213,7 +209,6 @@ class EventBuilderFactory(object):
content=key_values.get("content", {}),
unsigned=key_values.get("unsigned", {}),
redacts=key_values.get("redacts", None),
origin_server_ts=key_values.get("origin_server_ts", None),
)
@@ -250,7 +245,7 @@ def create_local_event_from_event_dict(clock, hostname, signing_key,
event_dict["event_id"] = _create_event_id(clock, hostname)
event_dict["origin"] = hostname
event_dict.setdefault("origin_server_ts", time_now)
event_dict["origin_server_ts"] = time_now
event_dict.setdefault("unsigned", {})
age = event_dict["unsigned"].pop("age", 0)

View File

@@ -330,13 +330,12 @@ class EventClientSerializer(object):
)
@defer.inlineCallbacks
def serialize_event(self, event, time_now, bundle_aggregations=True, **kwargs):
def serialize_event(self, event, time_now, **kwargs):
"""Serializes a single event.
Args:
event (EventBase)
time_now (int): The current time in milliseconds
bundle_aggregations (bool): Whether to bundle in related events
**kwargs: Arguments to pass to `serialize_event`
Returns:
@@ -351,7 +350,7 @@ class EventClientSerializer(object):
# If MSC1849 is enabled then we need to look if there are any relations
# we need to bundle in with the event
if self.experimental_msc1849_support_enabled and bundle_aggregations:
if self.experimental_msc1849_support_enabled:
annotations = yield self.store.get_aggregation_groups_for_event(
event_id,
)

View File

@@ -265,7 +265,7 @@ def _check_sigs_on_pdus(keyring, room_version, pdus):
]
more_deferreds = keyring.verify_json_objects_for_server([
(p.sender_domain, p.redacted_pdu_json, 0)
(p.sender_domain, p.redacted_pdu_json)
for p in pdus_to_check_sender
])
@@ -298,7 +298,7 @@ def _check_sigs_on_pdus(keyring, room_version, pdus):
]
more_deferreds = keyring.verify_json_objects_for_server([
(get_domain_from_id(p.pdu.event_id), p.redacted_pdu_json, 0)
(get_domain_from_id(p.pdu.event_id), p.redacted_pdu_json)
for p in pdus_to_check_event_id
])

View File

@@ -23,11 +23,7 @@ from twisted.internet import defer
import synapse
from synapse.api.errors import Codes, FederationDeniedError, SynapseError
from synapse.api.room_versions import RoomVersions
from synapse.api.urls import (
FEDERATION_UNSTABLE_PREFIX,
FEDERATION_V1_PREFIX,
FEDERATION_V2_PREFIX,
)
from synapse.api.urls import FEDERATION_V1_PREFIX, FEDERATION_V2_PREFIX
from synapse.http.endpoint import parse_and_validate_server_name
from synapse.http.server import JsonResource
from synapse.http.servlet import (
@@ -94,7 +90,6 @@ class NoAuthenticationError(AuthenticationError):
class Authenticator(object):
def __init__(self, hs):
self._clock = hs.get_clock()
self.keyring = hs.get_keyring()
self.server_name = hs.hostname
self.store = hs.get_datastore()
@@ -103,7 +98,6 @@ class Authenticator(object):
# A method just so we can pass 'self' as the authenticator to the Servlets
@defer.inlineCallbacks
def authenticate_request(self, request, content):
now = self._clock.time_msec()
json_request = {
"method": request.method.decode('ascii'),
"uri": request.uri.decode('ascii'),
@@ -140,7 +134,7 @@ class Authenticator(object):
401, "Missing Authorization headers", Codes.UNAUTHORIZED,
)
yield self.keyring.verify_json_for_server(origin, json_request, now)
yield self.keyring.verify_json_for_server(origin, json_request)
logger.info("Request from %s", origin)
request.authenticated_entity = origin
@@ -1310,30 +1304,6 @@ class FederationGroupsSettingJoinPolicyServlet(BaseFederationServlet):
defer.returnValue((200, new_content))
class RoomComplexityServlet(BaseFederationServlet):
"""
Indicates to other servers how complex (and therefore likely
resource-intensive) a public room this server knows about is.
"""
PATH = "/rooms/(?P<room_id>[^/]*)/complexity"
PREFIX = FEDERATION_UNSTABLE_PREFIX
@defer.inlineCallbacks
def on_GET(self, origin, content, query, room_id):
store = self.handler.hs.get_datastore()
is_public = yield store.is_room_world_readable_or_publicly_joinable(
room_id
)
if not is_public:
raise SynapseError(404, "Room not found", errcode=Codes.INVALID_PARAM)
complexity = yield store.get_room_complexity(room_id)
defer.returnValue((200, complexity))
FEDERATION_SERVLET_CLASSES = (
FederationSendServlet,
FederationEventServlet,
@@ -1357,7 +1327,6 @@ FEDERATION_SERVLET_CLASSES = (
FederationThirdPartyInviteExchangeServlet,
On3pidBindServlet,
FederationVersionServlet,
RoomComplexityServlet,
)
OPENID_SERVLET_CLASSES = (

View File

@@ -97,11 +97,10 @@ class GroupAttestationSigning(object):
# TODO: We also want to check that *new* attestations that people give
# us to store are valid for at least a little while.
now = self.clock.time_msec()
if valid_until_ms < now:
if valid_until_ms < self.clock.time_msec():
raise SynapseError(400, "Attestation expired")
yield self.keyring.verify_json_for_server(server_name, attestation, now)
yield self.keyring.verify_json_for_server(server_name, attestation)
def create_attestation(self, group_id, user_id):
"""Create an attestation for the group_id and user_id with default

View File

@@ -162,7 +162,7 @@ class AuthHandler(BaseHandler):
defer.returnValue(params)
@defer.inlineCallbacks
def check_auth(self, flows, clientdict, clientip, password_servlet=False):
def check_auth(self, flows, clientdict, clientip):
"""
Takes a dictionary sent by the client in the login / registration
protocol and handles the User-Interactive Auth flow.
@@ -186,16 +186,6 @@ class AuthHandler(BaseHandler):
clientip (str): The IP address of the client.
password_servlet (bool): Whether the request originated from
PasswordRestServlet.
XXX: This is a temporary hack to distinguish between checking
for threepid validations locally (in the case of password
resets) and using the identity server (in the case of binding
a 3PID during registration). Once we start using the
homeserver for both tasks, this distinction will no longer be
necessary.
Returns:
defer.Deferred[dict, dict, str]: a deferred tuple of
(creds, params, session_id).
@@ -251,9 +241,7 @@ class AuthHandler(BaseHandler):
if 'type' in authdict:
login_type = authdict['type']
try:
result = yield self._check_auth_dict(
authdict, clientip, password_servlet=password_servlet,
)
result = yield self._check_auth_dict(authdict, clientip)
if result:
creds[login_type] = result
self._save_session(session)
@@ -363,7 +351,7 @@ class AuthHandler(BaseHandler):
return sess.setdefault('serverdict', {}).get(key, default)
@defer.inlineCallbacks
def _check_auth_dict(self, authdict, clientip, password_servlet=False):
def _check_auth_dict(self, authdict, clientip):
"""Attempt to validate the auth dict provided by a client
Args:
@@ -381,13 +369,7 @@ class AuthHandler(BaseHandler):
login_type = authdict['type']
checker = self.checkers.get(login_type)
if checker is not None:
# XXX: Temporary workaround for having Synapse handle password resets
# See AuthHandler.check_auth for further details
res = yield checker(
authdict,
clientip=clientip,
password_servlet=password_servlet,
)
res = yield checker(authdict, clientip)
defer.returnValue(res)
# build a v1-login-style dict out of the authdict and fall back to the
@@ -401,7 +383,7 @@ class AuthHandler(BaseHandler):
defer.returnValue(canonical_id)
@defer.inlineCallbacks
def _check_recaptcha(self, authdict, clientip, **kwargs):
def _check_recaptcha(self, authdict, clientip):
try:
user_response = authdict["response"]
except KeyError:
@@ -447,20 +429,20 @@ class AuthHandler(BaseHandler):
defer.returnValue(True)
raise LoginError(401, "", errcode=Codes.UNAUTHORIZED)
def _check_email_identity(self, authdict, **kwargs):
return self._check_threepid('email', authdict, **kwargs)
def _check_email_identity(self, authdict, _):
return self._check_threepid('email', authdict)
def _check_msisdn(self, authdict, **kwargs):
def _check_msisdn(self, authdict, _):
return self._check_threepid('msisdn', authdict)
def _check_dummy_auth(self, authdict, **kwargs):
def _check_dummy_auth(self, authdict, _):
return defer.succeed(True)
def _check_terms_auth(self, authdict, **kwargs):
def _check_terms_auth(self, authdict, _):
return defer.succeed(True)
@defer.inlineCallbacks
def _check_threepid(self, medium, authdict, password_servlet=False, **kwargs):
def _check_threepid(self, medium, authdict):
if 'threepid_creds' not in authdict:
raise LoginError(400, "Missing threepid_creds", Codes.MISSING_PARAM)
@@ -469,29 +451,7 @@ class AuthHandler(BaseHandler):
identity_handler = self.hs.get_handlers().identity_handler
logger.info("Getting validated threepid. threepidcreds: %r", (threepid_creds,))
if (
not password_servlet
or self.hs.config.email_password_reset_behaviour == "remote"
):
threepid = yield identity_handler.threepid_from_creds(threepid_creds)
elif self.hs.config.email_password_reset_behaviour == "local":
row = yield self.store.get_threepid_validation_session(
medium,
threepid_creds["client_secret"],
sid=threepid_creds["sid"],
)
threepid = {
"medium": row["medium"],
"address": row["address"],
"validated_at": row["validated_at"],
} if row else None
if row:
# Valid threepid returned, delete from the db
yield self.store.delete_threepid_session(threepid_creds["sid"])
else:
raise SynapseError(400, "Password resets are not enabled on this homeserver")
threepid = yield identity_handler.threepid_from_creds(threepid_creds)
if not threepid:
raise LoginError(401, "", errcode=Codes.UNAUTHORIZED)

View File

@@ -122,9 +122,6 @@ class EventStreamHandler(BaseHandler):
chunks = yield self._event_serializer.serialize_events(
events, time_now, as_client_event=as_client_event,
# We don't bundle "live" events, as otherwise clients
# will end up double counting annotations.
bundle_aggregations=False,
)
chunk = {

View File

@@ -2013,44 +2013,15 @@ class FederationHandler(BaseHandler):
Args:
origin (str):
event (synapse.events.EventBase):
event (synapse.events.FrozenEvent):
context (synapse.events.snapshot.EventContext):
auth_events (dict[(str, str)->synapse.events.EventBase]):
Map from (event_type, state_key) to event
What we expect the event's auth_events to be, based on the event's
position in the dag. I think? maybe??
Also NB that this function adds entries to it.
Returns:
defer.Deferred[None]
"""
room_version = yield self.store.get_room_version(event.room_id)
yield self._update_auth_events_and_context_for_auth(
origin, event, context, auth_events
)
try:
self.auth.check(room_version, event, auth_events=auth_events)
except AuthError as e:
logger.warn("Failed auth resolution for %r because %s", event, e)
raise e
@defer.inlineCallbacks
def _update_auth_events_and_context_for_auth(
self, origin, event, context, auth_events
):
"""Helper for do_auth. See there for docs.
Args:
origin (str):
event (synapse.events.EventBase):
context (synapse.events.snapshot.EventContext):
auth_events (dict[(str, str)->synapse.events.EventBase]):
auth_events (dict[(str, str)->str]):
Returns:
defer.Deferred[None]
"""
# Check if we have all the auth events.
current_state = set(e.event_id for e in auth_events.values())
event_auth_events = set(event.auth_event_ids())
if event.is_state():
@@ -2058,21 +2029,11 @@ class FederationHandler(BaseHandler):
else:
event_key = None
# if the event's auth_events refers to events which are not in our
# calculated auth_events, we need to fetch those events from somewhere.
#
# we start by fetching them from the store, and then try calling /event_auth/.
missing_auth = event_auth_events.difference(
e.event_id for e in auth_events.values()
)
if missing_auth:
if event_auth_events - current_state:
# TODO: can we use store.have_seen_events here instead?
have_events = yield self.store.get_seen_events_with_rejections(
missing_auth
event_auth_events - current_state
)
logger.debug("Got events %s from store", have_events)
missing_auth.difference_update(have_events.keys())
else:
have_events = {}
@@ -2081,12 +2042,13 @@ class FederationHandler(BaseHandler):
for e in auth_events.values()
})
seen_events = set(have_events.keys())
missing_auth = event_auth_events - seen_events - current_state
if missing_auth:
logger.info("Missing auth: %s", missing_auth)
# If we don't have all the auth events, we need to get them.
logger.info(
"auth_events contains unknown events: %s",
missing_auth,
)
try:
remote_auth_chain = yield self.federation_client.get_event_auth(
origin, event.room_id, event.event_id
@@ -2127,168 +2089,145 @@ class FederationHandler(BaseHandler):
have_events = yield self.store.get_seen_events_with_rejections(
event.auth_event_ids()
)
seen_events = set(have_events.keys())
except Exception:
# FIXME:
logger.exception("Failed to get auth chain")
if event.internal_metadata.is_outlier():
logger.info("Skipping auth_event fetch for outlier")
return
# FIXME: Assumes we have and stored all the state for all the
# prev_events
different_auth = event_auth_events.difference(
e.event_id for e in auth_events.values()
)
if not different_auth:
return
logger.info(
"auth_events refers to events which are not in our calculated auth "
"chain: %s",
different_auth,
)
current_state = set(e.event_id for e in auth_events.values())
different_auth = event_auth_events - current_state
room_version = yield self.store.get_room_version(event.room_id)
different_events = yield logcontext.make_deferred_yieldable(
defer.gatherResults([
logcontext.run_in_background(
self.store.get_event,
d,
allow_none=True,
allow_rejected=False,
if different_auth and not event.internal_metadata.is_outlier():
# Do auth conflict res.
logger.info("Different auth: %s", different_auth)
different_events = yield logcontext.make_deferred_yieldable(
defer.gatherResults([
logcontext.run_in_background(
self.store.get_event,
d,
allow_none=True,
allow_rejected=False,
)
for d in different_auth
if d in have_events and not have_events[d]
], consumeErrors=True)
).addErrback(unwrapFirstError)
if different_events:
local_view = dict(auth_events)
remote_view = dict(auth_events)
remote_view.update({
(d.type, d.state_key): d for d in different_events if d
})
new_state = yield self.state_handler.resolve_events(
room_version,
[list(local_view.values()), list(remote_view.values())],
event
)
for d in different_auth
if d in have_events and not have_events[d]
], consumeErrors=True)
).addErrback(unwrapFirstError)
if different_events:
local_view = dict(auth_events)
remote_view = dict(auth_events)
remote_view.update({
(d.type, d.state_key): d for d in different_events if d
})
auth_events.update(new_state)
new_state = yield self.state_handler.resolve_events(
room_version,
[list(local_view.values()), list(remote_view.values())],
event
)
current_state = set(e.event_id for e in auth_events.values())
different_auth = event_auth_events - current_state
logger.info(
"After state res: updating auth_events with new state %s",
{
(d.type, d.state_key): d.event_id for d in new_state.values()
if auth_events.get((d.type, d.state_key)) != d
},
)
yield self._update_context_for_auth_events(
event, context, auth_events, event_key,
)
auth_events.update(new_state)
if different_auth and not event.internal_metadata.is_outlier():
logger.info("Different auth after resolution: %s", different_auth)
different_auth = event_auth_events.difference(
e.event_id for e in auth_events.values()
)
# Only do auth resolution if we have something new to say.
# We can't prove an auth failure.
do_resolution = False
yield self._update_context_for_auth_events(
event, context, auth_events, event_key,
)
provable = [
RejectedReason.NOT_ANCESTOR, RejectedReason.NOT_ANCESTOR,
]
if not different_auth:
# we're done
return
for e_id in different_auth:
if e_id in have_events:
if have_events[e_id] in provable:
do_resolution = True
break
logger.info(
"auth_events still refers to events which are not in the calculated auth "
"chain after state resolution: %s",
different_auth,
)
# Only do auth resolution if we have something new to say.
# We can't prove an auth failure.
do_resolution = False
for e_id in different_auth:
if e_id in have_events:
if have_events[e_id] == RejectedReason.NOT_ANCESTOR:
do_resolution = True
break
if not do_resolution:
logger.info(
"Skipping auth resolution due to lack of provable rejection reasons"
)
return
logger.info("Doing auth resolution")
prev_state_ids = yield context.get_prev_state_ids(self.store)
# 1. Get what we think is the auth chain.
auth_ids = yield self.auth.compute_auth_events(
event, prev_state_ids
)
local_auth_chain = yield self.store.get_auth_chain(
auth_ids, include_given=True
)
try:
# 2. Get remote difference.
result = yield self.federation_client.query_auth(
origin,
event.room_id,
event.event_id,
local_auth_chain,
)
seen_remotes = yield self.store.have_seen_events(
[e.event_id for e in result["auth_chain"]]
)
# 3. Process any remote auth chain events we haven't seen.
for ev in result["auth_chain"]:
if ev.event_id in seen_remotes:
continue
if ev.event_id == event.event_id:
continue
if do_resolution:
prev_state_ids = yield context.get_prev_state_ids(self.store)
# 1. Get what we think is the auth chain.
auth_ids = yield self.auth.compute_auth_events(
event, prev_state_ids
)
local_auth_chain = yield self.store.get_auth_chain(
auth_ids, include_given=True
)
try:
auth_ids = ev.auth_event_ids()
auth = {
(e.type, e.state_key): e
for e in result["auth_chain"]
if e.event_id in auth_ids
or event.type == EventTypes.Create
}
ev.internal_metadata.outlier = True
logger.debug(
"do_auth %s different_auth: %s",
event.event_id, e.event_id
# 2. Get remote difference.
result = yield self.federation_client.query_auth(
origin,
event.room_id,
event.event_id,
local_auth_chain,
)
yield self._handle_new_event(
origin, ev, auth_events=auth
seen_remotes = yield self.store.have_seen_events(
[e.event_id for e in result["auth_chain"]]
)
if ev.event_id in event_auth_events:
auth_events[(ev.type, ev.state_key)] = ev
except AuthError:
pass
# 3. Process any remote auth chain events we haven't seen.
for ev in result["auth_chain"]:
if ev.event_id in seen_remotes:
continue
except Exception:
# FIXME:
logger.exception("Failed to query auth chain")
if ev.event_id == event.event_id:
continue
# 4. Look at rejects and their proofs.
# TODO.
try:
auth_ids = ev.auth_event_ids()
auth = {
(e.type, e.state_key): e
for e in result["auth_chain"]
if e.event_id in auth_ids
or event.type == EventTypes.Create
}
ev.internal_metadata.outlier = True
yield self._update_context_for_auth_events(
event, context, auth_events, event_key,
)
logger.debug(
"do_auth %s different_auth: %s",
event.event_id, e.event_id
)
yield self._handle_new_event(
origin, ev, auth_events=auth
)
if ev.event_id in event_auth_events:
auth_events[(ev.type, ev.state_key)] = ev
except AuthError:
pass
except Exception:
# FIXME:
logger.exception("Failed to query auth chain")
# 4. Look at rejects and their proofs.
# TODO.
yield self._update_context_for_auth_events(
event, context, auth_events, event_key,
)
try:
self.auth.check(room_version, event, auth_events=auth_events)
except AuthError as e:
logger.warn("Failed auth resolution for %r because %s", event, e)
raise e
@defer.inlineCallbacks
def _update_context_for_auth_events(self, event, context, auth_events,

View File

@@ -247,14 +247,7 @@ class IdentityHandler(BaseHandler):
defer.returnValue(changed)
@defer.inlineCallbacks
def requestEmailToken(
self,
id_server,
email,
client_secret,
send_attempt,
next_link=None,
):
def requestEmailToken(self, id_server, email, client_secret, send_attempt, **kwargs):
if not self._should_trust_id_server(id_server):
raise SynapseError(
400, "Untrusted ID server '%s'" % id_server,
@@ -266,9 +259,7 @@ class IdentityHandler(BaseHandler):
'client_secret': client_secret,
'send_attempt': send_attempt,
}
if next_link:
params.update({'next_link': next_link})
params.update(kwargs)
try:
data = yield self.http_client.post_json_get_json(

View File

@@ -166,9 +166,6 @@ class MessageHandler(object):
now = self.clock.time_msec()
events = yield self._event_serializer.serialize_events(
room_state.values(), now,
# We don't bother bundling aggregations in when asked for state
# events, as clients won't use them.
bundle_aggregations=False,
)
defer.returnValue(events)

View File

@@ -182,27 +182,17 @@ class PresenceHandler(object):
# Start a LoopingCall in 30s that fires every 5s.
# The initial delay is to allow disconnected clients a chance to
# reconnect before we treat them as offline.
def run_timeout_handler():
return run_as_background_process(
"handle_presence_timeouts", self._handle_timeouts
)
self.clock.call_later(
30,
self.clock.looping_call,
run_timeout_handler,
self._handle_timeouts,
5000,
)
def run_persister():
return run_as_background_process(
"persist_presence_changes", self._persist_unpersisted_changes
)
self.clock.call_later(
60,
self.clock.looping_call,
run_persister,
self._persist_unpersisted_changes,
60 * 1000,
)
@@ -239,7 +229,6 @@ class PresenceHandler(object):
)
if self.unpersisted_users_changes:
yield self.store.update_presence([
self.user_to_current_state[user_id]
for user_id in self.unpersisted_users_changes
@@ -251,18 +240,30 @@ class PresenceHandler(object):
"""We periodically persist the unpersisted changes, as otherwise they
may stack up and slow down shutdown times.
"""
logger.info(
"Performing _persist_unpersisted_changes. Persisting %d unpersisted changes",
len(self.unpersisted_users_changes)
)
unpersisted = self.unpersisted_users_changes
self.unpersisted_users_changes = set()
if unpersisted:
logger.info(
"Persisting %d upersisted presence updates", len(unpersisted)
)
yield self.store.update_presence([
self.user_to_current_state[user_id]
for user_id in unpersisted
])
logger.info("Finished _persist_unpersisted_changes")
@defer.inlineCallbacks
def _update_states_and_catch_exception(self, new_states):
try:
res = yield self._update_states(new_states)
defer.returnValue(res)
except Exception:
logger.exception("Error updating presence")
@defer.inlineCallbacks
def _update_states(self, new_states):
"""Updates presence of users. Sets the appropriate timeouts. Pokes
@@ -337,41 +338,45 @@ class PresenceHandler(object):
logger.info("Handling presence timeouts")
now = self.clock.time_msec()
# Fetch the list of users that *may* have timed out. Things may have
# changed since the timeout was set, so we won't necessarily have to
# take any action.
users_to_check = set(self.wheel_timer.fetch(now))
try:
with Measure(self.clock, "presence_handle_timeouts"):
# Fetch the list of users that *may* have timed out. Things may have
# changed since the timeout was set, so we won't necessarily have to
# take any action.
users_to_check = set(self.wheel_timer.fetch(now))
# Check whether the lists of syncing processes from an external
# process have expired.
expired_process_ids = [
process_id for process_id, last_update
in self.external_process_last_updated_ms.items()
if now - last_update > EXTERNAL_PROCESS_EXPIRY
]
for process_id in expired_process_ids:
users_to_check.update(
self.external_process_last_updated_ms.pop(process_id, ())
)
self.external_process_last_update.pop(process_id)
# Check whether the lists of syncing processes from an external
# process have expired.
expired_process_ids = [
process_id for process_id, last_update
in self.external_process_last_updated_ms.items()
if now - last_update > EXTERNAL_PROCESS_EXPIRY
]
for process_id in expired_process_ids:
users_to_check.update(
self.external_process_last_updated_ms.pop(process_id, ())
)
self.external_process_last_update.pop(process_id)
states = [
self.user_to_current_state.get(
user_id, UserPresenceState.default(user_id)
)
for user_id in users_to_check
]
states = [
self.user_to_current_state.get(
user_id, UserPresenceState.default(user_id)
)
for user_id in users_to_check
]
timers_fired_counter.inc(len(states))
timers_fired_counter.inc(len(states))
changes = handle_timeouts(
states,
is_mine_fn=self.is_mine_id,
syncing_user_ids=self.get_currently_syncing_users(),
now=now,
)
changes = handle_timeouts(
states,
is_mine_fn=self.is_mine_id,
syncing_user_ids=self.get_currently_syncing_users(),
now=now,
)
return self._update_states(changes)
run_in_background(self._update_states_and_catch_exception, changes)
except Exception:
logger.exception("Exception in _handle_timeouts loop")
@defer.inlineCallbacks
def bump_presence_active_time(self, user):

View File

@@ -31,9 +31,6 @@ from ._base import BaseHandler
logger = logging.getLogger(__name__)
MAX_DISPLAYNAME_LEN = 100
MAX_AVATAR_URL_LEN = 1000
class BaseProfileHandler(BaseHandler):
"""Handles fetching and updating user profile information.
@@ -165,11 +162,6 @@ class BaseProfileHandler(BaseHandler):
if not by_admin and target_user != requester.user:
raise AuthError(400, "Cannot set another user's displayname")
if len(new_displayname) > MAX_DISPLAYNAME_LEN:
raise SynapseError(
400, "Displayname is too long (max %i)" % (MAX_DISPLAYNAME_LEN, ),
)
if new_displayname == '':
new_displayname = None
@@ -225,11 +217,6 @@ class BaseProfileHandler(BaseHandler):
if not by_admin and target_user != requester.user:
raise AuthError(400, "Cannot set another user's avatar_url")
if len(new_avatar_url) > MAX_AVATAR_URL_LEN:
raise SynapseError(
400, "Avatar URL is too long (max %i)" % (MAX_AVATAR_URL_LEN, ),
)
yield self.store.set_profile_avatar_url(
target_user.localpart, new_avatar_url
)

View File

@@ -531,8 +531,6 @@ class RegistrationHandler(BaseHandler):
A tuple of (user_id, access_token).
Raises:
RegistrationError if there was a problem registering.
NB this is only used in tests. TODO: move it to the test package!
"""
if localpart is None:
raise SynapseError(400, "Request must include user id")

View File

@@ -27,7 +27,7 @@ from twisted.internet import defer
from synapse.api.constants import EventTypes, JoinRules, RoomCreationPreset
from synapse.api.errors import AuthError, Codes, NotFoundError, StoreError, SynapseError
from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
from synapse.api.room_versions import DEFAULT_ROOM_VERSION, KNOWN_ROOM_VERSIONS
from synapse.storage.state import StateFilter
from synapse.types import RoomAlias, RoomID, RoomStreamToken, StreamToken, UserID
from synapse.util import stringutils
@@ -70,7 +70,6 @@ class RoomCreationHandler(BaseHandler):
self.spam_checker = hs.get_spam_checker()
self.event_creation_handler = hs.get_event_creation_handler()
self.room_member_handler = hs.get_room_member_handler()
self.config = hs.config
# linearizer to stop two upgrades happening at once
self._upgrade_linearizer = Linearizer("room_upgrade_linearizer")
@@ -476,11 +475,7 @@ class RoomCreationHandler(BaseHandler):
if ratelimit:
yield self.ratelimit(requester)
room_version = config.get(
"room_version",
self.config.default_room_version.identifier,
)
room_version = config.get("room_version", DEFAULT_ROOM_VERSION.identifier)
if not isinstance(room_version, string_types):
raise SynapseError(
400,
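The `room_version` read above is taken from the body of the client's createRoom request, falling back to the server's default when absent. A minimal standalone sketch of that selection logic (the constants below are illustrative only, not Synapse's actual values):

KNOWN_ROOM_VERSIONS = {"1", "2", "3", "4"}  # illustrative set only
DEFAULT_ROOM_VERSION = "4"                  # illustrative default only

def choose_room_version(create_room_config):
    # Take the client-supplied "room_version" if present, otherwise the
    # default, and reject anything that is not a known version string.
    room_version = create_room_config.get("room_version", DEFAULT_ROOM_VERSION)
    if not isinstance(room_version, str):
        raise ValueError("room_version must be a string")
    if room_version not in KNOWN_ROOM_VERSIONS:
        raise ValueError("Unknown room version: %r" % (room_version,))
    return room_version

# choose_room_version({"room_version": "3"}) -> "3"
# choose_room_version({})                    -> "4"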

View File

@@ -285,24 +285,7 @@ class MatrixFederationHttpClient(object):
request (MatrixFederationRequest): details of request to be sent
timeout (int|None): number of milliseconds to wait for the response headers
(including connecting to the server), *for each attempt*.
60s by default.
long_retries (bool): whether to use the long retry algorithm.
The regular retry algorithm makes 4 attempts, with intervals
[0.5s, 1s, 2s].
The long retry algorithm makes 11 attempts, with intervals
[4s, 16s, 60s, 60s, ...]
Both algorithms add -20%/+40% jitter to the retry intervals.
Note that the above intervals are *in addition* to the time spent
waiting for the request to complete (up to `timeout` ms).
NB: the long retry algorithm takes over 20 minutes to complete, with
a default timeout of 60s!
(including connecting to the server). 60s by default.
ignore_backoff (bool): true to ignore the historical backoff data
and try the request anyway.
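To make the retry schedule described in the docstring above concrete, here is a rough sketch (not the actual Synapse implementation) of the two interval sequences with the -20%/+40% jitter applied:

import random

def retry_intervals(long_retries):
    # 4 attempts -> 3 sleeps for the regular algorithm,
    # 11 attempts -> 10 sleeps for the long algorithm.
    base = ([4, 16] + [60] * 8) if long_retries else [0.5, 1, 2]
    # each interval is scaled by a random factor in [0.8, 1.4]
    return [i * random.uniform(0.8, 1.4) for i in base]

# sum(retry_intervals(True)) is roughly 7-12 minutes of sleeping; on top of
# that, each of the 11 attempts may wait up to `timeout` (60s by default),
# which is how the long algorithm can exceed 20 minutes in total.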
@@ -583,14 +566,10 @@ class MatrixFederationHttpClient(object):
the request body. This will be encoded as JSON.
json_data_callback (callable): A callable returning the dict to
use as the request body.
long_retries (bool): whether to use the long retry algorithm. See
docs on _send_request for details.
timeout (int|None): number of milliseconds to wait for the response headers
(including connecting to the server), *for each attempt*.
self._default_timeout (60s) by default.
long_retries (bool): A boolean that indicates whether we should
retry for a short or long time.
timeout (int): How long (in ms) to try the destination before
giving up. None indicates no timeout.
ignore_backoff (bool): true to ignore the historical backoff data
and try the request anyway.
backoff_on_404 (bool): True if we should count a 404 response as
@@ -648,22 +627,15 @@ class MatrixFederationHttpClient(object):
Args:
destination (str): The remote server to send the HTTP request
to.
path (str): The HTTP path.
data (dict): A dict containing the data that will be used as
the request body. This will be encoded as JSON.
long_retries (bool): whether to use the long retry algorithm. See
docs on _send_request for details.
timeout (int|None): number of milliseconds to wait for the response headers
(including connecting to the server), *for each attempt*.
self._default_timeout (60s) by default.
long_retries (bool): A boolean that indicates whether we should
retry for a short or long time.
timeout (int): How long (in ms) to try the destination before
giving up. None indicates no timeout.
ignore_backoff (bool): true to ignore the historical backoff data and
try the request anyway.
args (dict): query params
Returns:
Deferred[dict|list]: Succeeds when we get a 2xx HTTP response. The
@@ -714,19 +686,14 @@ class MatrixFederationHttpClient(object):
Args:
destination (str): The remote server to send the HTTP request
to.
path (str): The HTTP path.
args (dict|None): A dictionary used to create query strings, defaults to
None.
timeout (int|None): number of milliseconds to wait for the response headers
(including connecting to the server), *for each attempt*.
self._default_timeout (60s) by default.
timeout (int): How long (in ms) to try the destination before
giving up. None indicates no timeout, and that the request will
be retried.
ignore_backoff (bool): true to ignore the historical backoff data
and try the request anyway.
try_trailing_slash_on_400 (bool): True if on a 400 M_UNRECOGNIZED
response we should try appending a trailing slash to the end of
the request. Workaround for #3622 in Synapse <= v0.99.3.
@@ -744,6 +711,10 @@ class MatrixFederationHttpClient(object):
RequestSendFailed: If there were problems connecting to the
remote, due to e.g. DNS failures, connection timeouts etc.
"""
logger.debug("get_json args: %s", args)
logger.debug("Query bytes: %s Retry DNS: %s", args, retry_on_dns_fail)
request = MatrixFederationRequest(
method="GET",
destination=destination,
@@ -775,18 +746,12 @@ class MatrixFederationHttpClient(object):
destination (str): The remote server to send the HTTP request
to.
path (str): The HTTP path.
long_retries (bool): whether to use the long retry algorithm. See
docs on _send_request for details.
timeout (int|None): number of milliseconds to wait for the response headers
(including connecting to the server), *for each attempt*.
self._default_timeout (60s) by default.
long_retries (bool): A boolean that indicates whether we should
retry for a short or long time.
timeout (int): How long (in ms) to try the destination before
giving up. None indicates no timeout.
ignore_backoff (bool): true to ignore the historical backoff data and
try the request anyway.
args (dict): query params
Returns:
Deferred[dict|list]: Succeeds when we get a 2xx HTTP response. The
result will be the decoded JSON body.

View File

@@ -55,7 +55,7 @@ def parse_integer_from_args(args, name, default=None, required=False):
return int(args[name][0])
except Exception:
message = "Query parameter %r must be an integer" % (name,)
raise SynapseError(400, message, errcode=Codes.INVALID_PARAM)
raise SynapseError(400, message)
else:
if required:
message = "Missing integer query parameter %r" % (name,)

View File

@@ -80,10 +80,10 @@ ALLOWED_ATTRS = {
class Mailer(object):
def __init__(self, hs, app_name, template_html, template_text):
def __init__(self, hs, app_name, notif_template_html, notif_template_text):
self.hs = hs
self.template_html = template_html
self.template_text = template_text
self.notif_template_html = notif_template_html
self.notif_template_text = notif_template_text
self.sendmail = self.hs.get_sendmail()
self.store = self.hs.get_datastore()
@@ -93,49 +93,22 @@ class Mailer(object):
logger.info("Created Mailer for app_name %s" % app_name)
@defer.inlineCallbacks
def send_password_reset_mail(
self,
email_address,
token,
client_secret,
sid,
):
"""Send an email with a password reset link to a user
Args:
email_address (str): Email address we're sending the password
reset to
token (str): Unique token generated by the server to verify
password reset email was received
client_secret (str): Unique token generated by the client to
group together multiple email sending attempts
sid (str): The generated session ID
"""
if email.utils.parseaddr(email_address)[1] == '':
raise RuntimeError("Invalid 'to' email address")
link = (
self.hs.config.public_baseurl +
"_synapse/password_reset/email/submit_token"
"?token=%s&client_secret=%s&sid=%s" %
(token, client_secret, sid)
)
template_vars = {
"link": link,
}
yield self.send_email(
email_address,
"[%s] Password Reset Email" % self.hs.config.server_name,
template_vars,
)
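As a hypothetical illustration of the link built above (all values below are made up), with `public_baseurl` set to "https://matrix.example.com/" the reset email would point at something like:

public_baseurl = "https://matrix.example.com/"       # hypothetical
token, client_secret, sid = "abcdef", "monkey", "1"  # hypothetical
link = (
    public_baseurl +
    "_synapse/password_reset/email/submit_token"
    "?token=%s&client_secret=%s&sid=%s" %
    (token, client_secret, sid)
)
# -> https://matrix.example.com/_synapse/password_reset/email/submit_token?token=abcdef&client_secret=monkey&sid=1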
@defer.inlineCallbacks
def send_notification_mail(self, app_id, user_id, email_address,
push_actions, reason):
"""Send email regarding a user's room notifications"""
try:
from_string = self.hs.config.email_notif_from % {
"app": self.app_name
}
except TypeError:
from_string = self.hs.config.email_notif_from
raw_from = email.utils.parseaddr(from_string)[1]
raw_to = email.utils.parseaddr(email_address)[1]
if raw_to == '':
raise RuntimeError("Invalid 'to' address")
rooms_in_order = deduped_ordered_list(
[pa['room_id'] for pa in push_actions]
)
@@ -203,36 +176,14 @@ class Mailer(object):
"reason": reason,
}
yield self.send_email(
email_address,
"[%s] %s" % (self.app_name, summary_text),
template_vars,
)
@defer.inlineCallbacks
def send_email(self, email_address, subject, template_vars):
"""Send an email with the given information and template text"""
try:
from_string = self.hs.config.email_notif_from % {
"app": self.app_name
}
except TypeError:
from_string = self.hs.config.email_notif_from
raw_from = email.utils.parseaddr(from_string)[1]
raw_to = email.utils.parseaddr(email_address)[1]
if raw_to == '':
raise RuntimeError("Invalid 'to' address")
html_text = self.template_html.render(**template_vars)
html_text = self.notif_template_html.render(**template_vars)
html_part = MIMEText(html_text, "html", "utf8")
plain_text = self.template_text.render(**template_vars)
plain_text = self.notif_template_text.render(**template_vars)
text_part = MIMEText(plain_text, "plain", "utf8")
multipart_msg = MIMEMultipart('alternative')
multipart_msg['Subject'] = subject
multipart_msg['Subject'] = "[%s] %s" % (self.app_name, summary_text)
multipart_msg['From'] = from_string
multipart_msg['To'] = email_address
multipart_msg['Date'] = email.utils.formatdate()

View File

@@ -70,8 +70,8 @@ class PusherFactory(object):
mailer = Mailer(
hs=self.hs,
app_name=app_name,
template_html=self.notif_template_html,
template_text=self.notif_template_text,
notif_template_html=self.notif_template_html,
notif_template_text=self.notif_template_text,
)
self.mailers[app_name] = mailer
return EmailPusher(self.hs, pusherdict, mailer)

View File

@@ -77,7 +77,7 @@ REQUIREMENTS = [
]
CONDITIONAL_REQUIREMENTS = {
"email": ["Jinja2>=2.9", "bleach>=1.4.2"],
"email.enable_notifs": ["Jinja2>=2.9", "bleach>=1.4.2"],
"matrix-synapse-ldap3": ["matrix-synapse-ldap3>=0.1"],
# we use execute_batch, which arrived in psycopg 2.7.

View File

@@ -1,9 +0,0 @@
<html>
<body>
<p>A password reset request has been received for your Matrix account. If this was you, please click the link below to confirm resetting your password:</p>
<a href="{{ link }}">{{ link }}</a>
<p>If this was not you, please disregard this email and contact your server administrator. Thank you.</p>
</body>
</html>

View File

@@ -1,7 +0,0 @@
A password reset request has been received for your Matrix account. If this
was you, please click the link below to confirm resetting your password:
{{ link }}
If this was not you, please disregard this email and contact your server
administrator. Thank you.

View File

@@ -1,6 +0,0 @@
<html>
<head></head>
<body>
<p>{{ failure_reason }}. Your password has not been reset.</p>
</body>
</html>

View File

@@ -1,6 +0,0 @@
<html>
<head></head>
<body>
<p>Your password was successfully reset. You may now close this window.</p>
</body>
</html>

View File

@@ -822,16 +822,10 @@ class AdminRestResource(JsonResource):
def __init__(self, hs):
JsonResource.__init__(self, hs, canonical_json=False)
register_servlets(hs, self)
def register_servlets(hs, http_server):
"""
Register all the admin servlets.
"""
register_servlets_for_client_rest_resource(hs, http_server)
SendServerNoticeServlet(hs).register(http_server)
VersionServlet(hs).register(http_server)
register_servlets_for_client_rest_resource(hs, self)
SendServerNoticeServlet(hs).register(self)
VersionServlet(hs).register(self)
def register_servlets_for_client_rest_resource(hs, http_server):

View File

@@ -0,0 +1,65 @@
# -*- coding: utf-8 -*-
# Copyright 2014-2016 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""This module contains base REST classes for constructing client v1 servlets.
"""
import logging
import re
from synapse.api.urls import CLIENT_API_PREFIX
from synapse.http.servlet import RestServlet
from synapse.rest.client.transactions import HttpTransactionCache
logger = logging.getLogger(__name__)
def client_path_patterns(path_regex, releases=(0,), include_in_unstable=True):
"""Creates a regex compiled client path with the correct client path
prefix.
Args:
path_regex (str): The regex string to match. This should NOT have a ^
as this will be prefixed.
Returns:
list[SRE_Pattern]: the compiled path patterns
"""
patterns = [re.compile("^" + CLIENT_API_PREFIX + "/api/v1" + path_regex)]
if include_in_unstable:
unstable_prefix = CLIENT_API_PREFIX + "/unstable"
patterns.append(re.compile("^" + unstable_prefix + path_regex))
for release in releases:
new_prefix = CLIENT_API_PREFIX + "/r%d" % (release,)
patterns.append(re.compile("^" + new_prefix + path_regex))
return patterns
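A hypothetical usage of the helper above, assuming CLIENT_API_PREFIX is "/_matrix/client" and the default releases=(0,):

patterns = client_path_patterns("/logout$")
# [p.pattern for p in patterns] would then be:
#   ['^/_matrix/client/api/v1/logout$',
#    '^/_matrix/client/unstable/logout$',
#    '^/_matrix/client/r0/logout$']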
class ClientV1RestServlet(RestServlet):
"""A base Synapse REST Servlet for the client version 1 API.
"""
# This subclass was presumably created to allow the auth for the v1
# protocol version to be different; however, that behaviour was removed,
# so it may no longer be necessary.
def __init__(self, hs):
"""
Args:
hs (synapse.server.HomeServer):
"""
self.hs = hs
self.builder_factory = hs.get_event_builder_factory()
self.auth = hs.get_auth()
self.txns = HttpTransactionCache(hs)
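A minimal hypothetical subclass, sketching how ClientV1RestServlet and client_path_patterns were typically combined (not taken from Synapse itself; assumes `from twisted.internet import defer` is in scope):

class WhoAmIRestServlet(ClientV1RestServlet):
    PATTERNS = client_path_patterns("/account/whoami$")

    @defer.inlineCallbacks
    def on_GET(self, request):
        # self.auth is provided by ClientV1RestServlet.__init__ above
        requester = yield self.auth.get_user_by_req(request)
        defer.returnValue((200, {"user_id": requester.user.to_string()}))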

View File

@@ -19,10 +19,11 @@ import logging
from twisted.internet import defer
from synapse.api.errors import AuthError, Codes, NotFoundError, SynapseError
from synapse.http.servlet import RestServlet, parse_json_object_from_request
from synapse.rest.client.v2_alpha._base import client_patterns
from synapse.http.servlet import parse_json_object_from_request
from synapse.types import RoomAlias
from .base import ClientV1RestServlet, client_path_patterns
logger = logging.getLogger(__name__)
@@ -32,14 +33,13 @@ def register_servlets(hs, http_server):
ClientAppserviceDirectoryListServer(hs).register(http_server)
class ClientDirectoryServer(RestServlet):
PATTERNS = client_patterns("/directory/room/(?P<room_alias>[^/]*)$", v1=True)
class ClientDirectoryServer(ClientV1RestServlet):
PATTERNS = client_path_patterns("/directory/room/(?P<room_alias>[^/]*)$")
def __init__(self, hs):
super(ClientDirectoryServer, self).__init__()
super(ClientDirectoryServer, self).__init__(hs)
self.store = hs.get_datastore()
self.handlers = hs.get_handlers()
self.auth = hs.get_auth()
@defer.inlineCallbacks
def on_GET(self, request, room_alias):
@@ -120,14 +120,13 @@ class ClientDirectoryServer(RestServlet):
defer.returnValue((200, {}))
class ClientDirectoryListServer(RestServlet):
PATTERNS = client_patterns("/directory/list/room/(?P<room_id>[^/]*)$", v1=True)
class ClientDirectoryListServer(ClientV1RestServlet):
PATTERNS = client_path_patterns("/directory/list/room/(?P<room_id>[^/]*)$")
def __init__(self, hs):
super(ClientDirectoryListServer, self).__init__()
super(ClientDirectoryListServer, self).__init__(hs)
self.store = hs.get_datastore()
self.handlers = hs.get_handlers()
self.auth = hs.get_auth()
@defer.inlineCallbacks
def on_GET(self, request, room_id):
@@ -163,16 +162,15 @@ class ClientDirectoryListServer(RestServlet):
defer.returnValue((200, {}))
class ClientAppserviceDirectoryListServer(RestServlet):
PATTERNS = client_patterns(
"/directory/list/appservice/(?P<network_id>[^/]*)/(?P<room_id>[^/]*)$", v1=True
class ClientAppserviceDirectoryListServer(ClientV1RestServlet):
PATTERNS = client_path_patterns(
"/directory/list/appservice/(?P<network_id>[^/]*)/(?P<room_id>[^/]*)$"
)
def __init__(self, hs):
super(ClientAppserviceDirectoryListServer, self).__init__()
super(ClientAppserviceDirectoryListServer, self).__init__(hs)
self.store = hs.get_datastore()
self.handlers = hs.get_handlers()
self.auth = hs.get_auth()
def on_PUT(self, request, network_id, room_id):
content = parse_json_object_from_request(request)

View File

@@ -19,22 +19,21 @@ import logging
from twisted.internet import defer
from synapse.api.errors import SynapseError
from synapse.http.servlet import RestServlet
from synapse.rest.client.v2_alpha._base import client_patterns
from synapse.streams.config import PaginationConfig
from .base import ClientV1RestServlet, client_path_patterns
logger = logging.getLogger(__name__)
class EventStreamRestServlet(RestServlet):
PATTERNS = client_patterns("/events$", v1=True)
class EventStreamRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/events$")
DEFAULT_LONGPOLL_TIME_MS = 30000
def __init__(self, hs):
super(EventStreamRestServlet, self).__init__()
super(EventStreamRestServlet, self).__init__(hs)
self.event_stream_handler = hs.get_event_stream_handler()
self.auth = hs.get_auth()
@defer.inlineCallbacks
def on_GET(self, request):
@@ -77,11 +76,11 @@ class EventStreamRestServlet(RestServlet):
# TODO: Unit test gets, with and without auth, with different kinds of events.
class EventRestServlet(RestServlet):
PATTERNS = client_patterns("/events/(?P<event_id>[^/]*)$", v1=True)
class EventRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/events/(?P<event_id>[^/]*)$")
def __init__(self, hs):
super(EventRestServlet, self).__init__()
super(EventRestServlet, self).__init__(hs)
self.clock = hs.get_clock()
self.event_handler = hs.get_event_handler()
self._event_serializer = hs.get_event_client_serializer()

View File

@@ -15,19 +15,19 @@
from twisted.internet import defer
from synapse.http.servlet import RestServlet, parse_boolean
from synapse.rest.client.v2_alpha._base import client_patterns
from synapse.http.servlet import parse_boolean
from synapse.streams.config import PaginationConfig
from .base import ClientV1RestServlet, client_path_patterns
# TODO: Needs unit testing
class InitialSyncRestServlet(RestServlet):
PATTERNS = client_patterns("/initialSync$", v1=True)
class InitialSyncRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/initialSync$")
def __init__(self, hs):
super(InitialSyncRestServlet, self).__init__()
super(InitialSyncRestServlet, self).__init__(hs)
self.initial_sync_handler = hs.get_initial_sync_handler()
self.auth = hs.get_auth()
@defer.inlineCallbacks
def on_GET(self, request):

View File

@@ -29,11 +29,12 @@ from synapse.http.servlet import (
parse_json_object_from_request,
parse_string,
)
from synapse.rest.client.v2_alpha._base import client_patterns
from synapse.rest.well_known import WellKnownBuilder
from synapse.types import UserID, map_username_to_mxid_localpart
from synapse.util.msisdn import phone_number_to_msisdn
from .base import ClientV1RestServlet, client_path_patterns
logger = logging.getLogger(__name__)
@@ -80,16 +81,15 @@ def login_id_thirdparty_from_phone(identifier):
}
class LoginRestServlet(RestServlet):
PATTERNS = client_patterns("/login$", v1=True)
class LoginRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/login$")
CAS_TYPE = "m.login.cas"
SSO_TYPE = "m.login.sso"
TOKEN_TYPE = "m.login.token"
JWT_TYPE = "m.login.jwt"
def __init__(self, hs):
super(LoginRestServlet, self).__init__()
self.hs = hs
super(LoginRestServlet, self).__init__(hs)
self.jwt_enabled = hs.config.jwt_enabled
self.jwt_secret = hs.config.jwt_secret
self.jwt_algorithm = hs.config.jwt_algorithm
@@ -371,7 +371,7 @@ class LoginRestServlet(RestServlet):
class CasRedirectServlet(RestServlet):
PATTERNS = client_patterns("/login/(cas|sso)/redirect", v1=True)
PATTERNS = client_path_patterns("/login/(cas|sso)/redirect")
def __init__(self, hs):
super(CasRedirectServlet, self).__init__()
@@ -386,7 +386,7 @@ class CasRedirectServlet(RestServlet):
b"redirectUrl": args[b"redirectUrl"][0]
}).encode('ascii')
hs_redirect_url = (self.cas_service_url +
b"/_matrix/client/r0/login/cas/ticket")
b"/_matrix/client/api/v1/login/cas/ticket")
service_param = urllib.parse.urlencode({
b"service": b"%s?%s" % (hs_redirect_url, client_redirect_url_param)
}).encode('ascii')
@@ -394,27 +394,27 @@ class CasRedirectServlet(RestServlet):
finish_request(request)
class CasTicketServlet(RestServlet):
PATTERNS = client_patterns("/login/cas/ticket", v1=True)
class CasTicketServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/login/cas/ticket", releases=())
def __init__(self, hs):
super(CasTicketServlet, self).__init__()
super(CasTicketServlet, self).__init__(hs)
self.cas_server_url = hs.config.cas_server_url
self.cas_service_url = hs.config.cas_service_url
self.cas_required_attributes = hs.config.cas_required_attributes
self._sso_auth_handler = SSOAuthHandler(hs)
self._http_client = hs.get_simple_http_client()
@defer.inlineCallbacks
def on_GET(self, request):
client_redirect_url = parse_string(request, "redirectUrl", required=True)
http_client = self.hs.get_simple_http_client()
uri = self.cas_server_url + "/proxyValidate"
args = {
"ticket": parse_string(request, "ticket", required=True),
"service": self.cas_service_url
}
try:
body = yield self._http_client.get_raw(uri, args)
body = yield http_client.get_raw(uri, args)
except PartialDownloadError as pde:
# Twisted raises this error if the connection is closed,
# even if that's being used old-http style to signal end-of-data

View File

@@ -17,18 +17,19 @@ import logging
from twisted.internet import defer
from synapse.http.servlet import RestServlet
from synapse.rest.client.v2_alpha._base import client_patterns
from synapse.api.errors import AuthError
from .base import ClientV1RestServlet, client_path_patterns
logger = logging.getLogger(__name__)
class LogoutRestServlet(RestServlet):
PATTERNS = client_patterns("/logout$", v1=True)
class LogoutRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/logout$")
def __init__(self, hs):
super(LogoutRestServlet, self).__init__()
self.auth = hs.get_auth()
super(LogoutRestServlet, self).__init__(hs)
self._auth = hs.get_auth()
self._auth_handler = hs.get_auth_handler()
self._device_handler = hs.get_device_handler()
@@ -37,25 +38,32 @@ class LogoutRestServlet(RestServlet):
@defer.inlineCallbacks
def on_POST(self, request):
requester = yield self.auth.get_user_by_req(request)
if requester.device_id is None:
# the access token wasn't associated with a device.
# Just delete the access token
access_token = self.auth.get_access_token_from_request(request)
yield self._auth_handler.delete_access_token(access_token)
try:
requester = yield self.auth.get_user_by_req(request)
except AuthError:
# this implies the access token has already been deleted.
defer.returnValue((401, {
"errcode": "M_UNKNOWN_TOKEN",
"error": "Access Token unknown or expired"
}))
else:
yield self._device_handler.delete_device(
requester.user.to_string(), requester.device_id)
if requester.device_id is None:
# the access token wasn't associated with a device.
# Just delete the access token
access_token = self._auth.get_access_token_from_request(request)
yield self._auth_handler.delete_access_token(access_token)
else:
yield self._device_handler.delete_device(
requester.user.to_string(), requester.device_id)
defer.returnValue((200, {}))
class LogoutAllRestServlet(RestServlet):
PATTERNS = client_patterns("/logout/all$", v1=True)
class LogoutAllRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/logout/all$")
def __init__(self, hs):
super(LogoutAllRestServlet, self).__init__()
super(LogoutAllRestServlet, self).__init__(hs)
self.auth = hs.get_auth()
self._auth_handler = hs.get_auth_handler()
self._device_handler = hs.get_device_handler()

View File

@@ -23,22 +23,21 @@ from twisted.internet import defer
from synapse.api.errors import AuthError, SynapseError
from synapse.handlers.presence import format_user_presence_state
from synapse.http.servlet import RestServlet, parse_json_object_from_request
from synapse.rest.client.v2_alpha._base import client_patterns
from synapse.http.servlet import parse_json_object_from_request
from synapse.types import UserID
from .base import ClientV1RestServlet, client_path_patterns
logger = logging.getLogger(__name__)
class PresenceStatusRestServlet(RestServlet):
PATTERNS = client_patterns("/presence/(?P<user_id>[^/]*)/status", v1=True)
class PresenceStatusRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/presence/(?P<user_id>[^/]*)/status")
def __init__(self, hs):
super(PresenceStatusRestServlet, self).__init__()
self.hs = hs
super(PresenceStatusRestServlet, self).__init__(hs)
self.presence_handler = hs.get_presence_handler()
self.clock = hs.get_clock()
self.auth = hs.get_auth()
@defer.inlineCallbacks
def on_GET(self, request, user_id):

View File

@@ -16,19 +16,18 @@
""" This module contains REST servlets to do with profile: /profile/<paths> """
from twisted.internet import defer
from synapse.http.servlet import RestServlet, parse_json_object_from_request
from synapse.rest.client.v2_alpha._base import client_patterns
from synapse.http.servlet import parse_json_object_from_request
from synapse.types import UserID
from .base import ClientV1RestServlet, client_path_patterns
class ProfileDisplaynameRestServlet(RestServlet):
PATTERNS = client_patterns("/profile/(?P<user_id>[^/]*)/displayname", v1=True)
class ProfileDisplaynameRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/profile/(?P<user_id>[^/]*)/displayname")
def __init__(self, hs):
super(ProfileDisplaynameRestServlet, self).__init__()
self.hs = hs
super(ProfileDisplaynameRestServlet, self).__init__(hs)
self.profile_handler = hs.get_profile_handler()
self.auth = hs.get_auth()
@defer.inlineCallbacks
def on_GET(self, request, user_id):
@@ -72,14 +71,12 @@ class ProfileDisplaynameRestServlet(RestServlet):
return (200, {})
class ProfileAvatarURLRestServlet(RestServlet):
PATTERNS = client_patterns("/profile/(?P<user_id>[^/]*)/avatar_url", v1=True)
class ProfileAvatarURLRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/profile/(?P<user_id>[^/]*)/avatar_url")
def __init__(self, hs):
super(ProfileAvatarURLRestServlet, self).__init__()
self.hs = hs
super(ProfileAvatarURLRestServlet, self).__init__(hs)
self.profile_handler = hs.get_profile_handler()
self.auth = hs.get_auth()
@defer.inlineCallbacks
def on_GET(self, request, user_id):
@@ -122,14 +119,12 @@ class ProfileAvatarURLRestServlet(RestServlet):
return (200, {})
class ProfileRestServlet(RestServlet):
PATTERNS = client_patterns("/profile/(?P<user_id>[^/]*)", v1=True)
class ProfileRestServlet(ClientV1RestServlet):
PATTERNS = client_path_patterns("/profile/(?P<user_id>[^/]*)")
def __init__(self, hs):
super(ProfileRestServlet, self).__init__()
self.hs = hs
super(ProfileRestServlet, self).__init__(hs)
self.profile_handler = hs.get_profile_handler()
self.auth = hs.get_auth()
@defer.inlineCallbacks
def on_GET(self, request, user_id):

Some files were not shown because too many files have changed in this diff.