
Merge branch 'matrix-org-hotfixes' into rav/frontend_proxy_auth_header_matrix_hotfixes

This commit is contained in:
Richard van der Hoff
2017-10-27 11:07:44 +01:00
81 changed files with 517 additions and 279 deletions
+50
@@ -1,3 +1,53 @@
Changes in synapse v0.24.1 (2017-10-24)
=======================================

Bug fixes:

* Fix updating group profiles over federation (PR #2567)


Changes in synapse v0.24.0 (2017-10-23)
=======================================

No changes since v0.24.0-rc1


Changes in synapse v0.24.0-rc1 (2017-10-19)
===========================================

Features:

* Add Group Server (PR #2352, #2363, #2374, #2377, #2378, #2382, #2410, #2426,
  #2430, #2454, #2471, #2472, #2544)
* Add support for channel notifications (PR #2501)
* Add basic implementation of backup media store (PR #2538)
* Add config option to auto-join new users to rooms (PR #2545)

Changes:

* Make the spam checker a module (PR #2474)
* Delete expired url cache data (PR #2478)
* Ignore incoming events for rooms that we have left (PR #2490)
* Allow spam checker to reject invites too (PR #2492)
* Add room creation checks to spam checker (PR #2495)
* Spam checking: add the invitee to user_may_invite (PR #2502)
* Process events from federation for different rooms in parallel (PR #2520)
* Allow error strings from spam checker (PR #2531)
* Improve error handling for missing files in config (PR #2551)

Bug fixes:

* Fix handling SERVFAILs when doing AAAA lookups for federation (PR #2477)
* Fix incompatibility with newer versions of ujson (PR #2483) Thanks to
  @jeremycline!
* Fix notification keywords that start/end with non-word chars (PR #2500)
* Fix stack overflow and logcontexts from linearizer (PR #2532)
* Fix 500 error when fields missing from power_levels event (PR #2552)
* Fix 500 error when we get an error handling a PDU (PR #2553)


Changes in synapse v0.23.1 (2017-10-02)
=======================================
+3 -1
@@ -823,7 +823,9 @@ spidering 'internal' URLs on your network. At the very least we recommend that
your loopback and RFC1918 IP addresses are blacklisted.
This also requires the optional lxml and netaddr python dependencies to be
installed. This in turn requires the libxml2 library to be available - on
Debian/Ubuntu this means ``apt-get install libxml2-dev``, or equivalent for
your OS.
Password reset
+2
@@ -4,6 +4,8 @@ Purge History API
The purge history API allows server admins to purge historic events from their
database, reclaiming disk space.
**NB!** This will not delete local events (locally sent message content etc.) from the database, but it will remove a lot of the metadata about them and dramatically reduce the on-disk space usage.
Depending on the amount of history being purged a call to the API may take
several minutes or longer. During this period users will not be able to
paginate further back in the room from the point being purged from.
+105 -38
@@ -1,52 +1,119 @@
Basically, PEP8

- Everything should comply with PEP8. Code should pass
  ``pep8 --max-line-length=100`` without any warnings.

- **Indenting**:

  - NEVER tabs. 4 spaces to indent.

  - follow PEP8; either hanging indent or multiline-visual indent depending
    on the size and shape of the arguments and what makes more sense to the
    author. In other words, both this::

      print("I am a fish %s" % "moo")

    and this::

      print("I am a fish %s" %
            "moo")

    and this::

      print(
          "I am a fish %s" %
          "moo",
      )

    ...are valid, although given each one takes up 2x more vertical space than
    the previous, it's up to the author's discretion as to which layout makes
    most sense for their function invocation. (e.g. if they want to add
    comments per-argument, or put expressions in the arguments, or group
    related arguments together, or want to deliberately extend or preserve
    vertical/horizontal space)

- **Line length**:

  Max line length is 79 chars (with flexibility to overflow by a "few chars" if
  the overflowing content is not semantically significant and avoids an
  explosion of vertical whitespace).

  Use parentheses instead of ``\`` for line continuation where ever possible
  (which is pretty much everywhere).

- **Naming**:

  - Use camel case for class and type names
  - Use underscores for functions and variables.

- Use double quotes ``"foo"`` rather than single quotes ``'foo'``.

- **Blank lines**:

  - There should be max a single new line between:

    - statements
    - functions in a class

  - There should be two new lines between:

    - definitions in a module (e.g., between different classes)

- **Whitespace**:

  There should be spaces where spaces should be and not where there shouldn't
  be:

  - a single space after a comma
  - a single space before and after for '=' when used as assignment
  - no spaces before and after for '=' for default values and keyword arguments.

- **Comments**: should follow the `google code style
  <http://google.github.io/styleguide/pyguide.html?showone=Comments#Comments>`_.
  This is so that we can generate documentation with `sphinx
  <http://sphinxcontrib-napoleon.readthedocs.org/en/latest/>`_. See the
  `examples
  <http://sphinxcontrib-napoleon.readthedocs.io/en/latest/example_google.html>`_
  in the sphinx documentation.

- **Imports**:

  - Prefer to import classes and functions than packages or modules.

    Example::

      from synapse.types import UserID
      ...
      user_id = UserID(local, server)

    is preferred over::

      from synapse import types
      ...
      user_id = types.UserID(local, server)

    (or any other variant).

    This goes against the advice in the Google style guide, but it means that
    errors in the name are caught early (at import time).

  - Multiple imports from the same package can be combined onto one line::

      from synapse.types import GroupID, RoomID, UserID

    An effort should be made to keep the individual imports in alphabetical
    order.

    If the list becomes long, wrap it with parentheses and split it over
    multiple lines.

  - As per `PEP-8 <https://www.python.org/dev/peps/pep-0008/#imports>`_,
    imports should be grouped in the following order, with a blank line between
    each group:

    1. standard library imports
    2. related third party imports
    3. local application/library specific imports

  - Imports within each group should be sorted alphabetically by module name.

  - Avoid wildcard imports (``from synapse.types import *``) and relative
    imports (``from .types import UserID``).
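The conventions above can be illustrated in one short, hypothetical module sketch (the class and function names below are invented for illustration, not taken from synapse):

```python
import logging


# module-level logger, named after the module, as is conventional
logger = logging.getLogger(__name__)


class RoomSummary(object):
    """A camel-cased class name; attributes use underscores."""

    def __init__(self, room_id, joined_members=0):
        # no spaces around '=' for the keyword default above;
        # spaces around '=' for these assignments
        self.room_id = room_id
        self.joined_members = joined_members


def describe_room(summary, separator=", "):
    # double quotes, 4-space indents, a single space after each comma
    return separator.join([
        "room_id=%s" % (summary.room_id,),
        "joined_members=%d" % (summary.joined_members,),
    ])
```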
+1
@@ -112,6 +112,7 @@ class Store(object):
_simple_update_one = SQLBaseStore.__dict__["_simple_update_one"]
_simple_update_one_txn = SQLBaseStore.__dict__["_simple_update_one_txn"]
_simple_update_txn = SQLBaseStore.__dict__["_simple_update_txn"]
def runInteraction(self, desc, func, *args, **kwargs):
def r(conn):
+1 -1
@@ -16,4 +16,4 @@
""" This is a reference implementation of a Matrix home server.
"""
__version__ = "0.23.1"
__version__ = "0.24.1"
+1 -1
@@ -19,7 +19,7 @@ import sys
try:
import affinity
except:
except Exception:
affinity = None
from daemonize import Daemonize
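Many hunks in this commit replace bare ``except:`` clauses with ``except Exception:``. The distinction matters because a bare ``except`` also swallows ``SystemExit`` and ``KeyboardInterrupt``, which ``except Exception`` correctly lets propagate. A minimal sketch of the optional-import pattern above (function names are illustrative):

```python
def load_optional(loader):
    # mirror the `import affinity` fallback: return None on ordinary
    # failures, but let interpreter-exit exceptions escape
    try:
        return loader()
    except Exception:
        return None


def broken_loader():
    # stands in for an `import` that fails
    raise ImportError("no such module")
```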
+8 -2
@@ -80,8 +80,7 @@ class PresenceStatusStubServlet(ClientV1RestServlet):
class KeyUploadServlet(RestServlet):
PATTERNS = client_v2_patterns("/keys/upload(/(?P<device_id>[^/]+))?$",
releases=())
PATTERNS = client_v2_patterns("/keys/upload(/(?P<device_id>[^/]+))?$")
def __init__(self, hs):
"""
@@ -119,9 +118,16 @@ class KeyUploadServlet(RestServlet):
if body:
# They're actually trying to upload something, proxy to main synapse.
# Pass through the auth headers, if any, in case the access token
# is there.
auth_headers = request.requestHeaders.getRawHeaders("Authorization", [])
headers = {
"Authorization": auth_headers,
}
result = yield self.http_client.post_json_get_json(
self.main_uri + request.uri,
body,
headers=headers,
)
defer.returnValue((200, result))
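The change above forwards the client's ``Authorization`` header when the worker proxies the key upload to the main synapse process. The header-plumbing step can be sketched in isolation; here a plain dict stands in for the Twisted request headers object, and an empty list means "no header present":

```python
def build_proxy_headers(raw_headers):
    # pass through the auth headers, if any, in case the access
    # token is there (dict stand-in for request.requestHeaders)
    auth_headers = raw_headers.get("Authorization", [])
    return {
        "Authorization": auth_headers,
    }
```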
+7 -3
@@ -18,6 +18,7 @@ from synapse.api.constants import ThirdPartyEntityKind
from synapse.api.errors import CodeMessageException
from synapse.http.client import SimpleHttpClient
from synapse.events.utils import serialize_event
from synapse.util.logcontext import preserve_fn, make_deferred_yieldable
from synapse.util.caches.response_cache import ResponseCache
from synapse.types import ThirdPartyInstanceID
@@ -192,9 +193,12 @@ class ApplicationServiceApi(SimpleHttpClient):
defer.returnValue(None)
key = (service.id, protocol)
return self.protocol_meta_cache.get(key) or (
self.protocol_meta_cache.set(key, _get())
)
result = self.protocol_meta_cache.get(key)
if not result:
result = self.protocol_meta_cache.set(
key, preserve_fn(_get)()
)
return make_deferred_yieldable(result)
@defer.inlineCallbacks
def push_bulk(self, service, events, txn_id=None):
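The get/set pattern above (also used in the federation-server and room-list hunks in this commit) can be shown with a simplified, synchronous stand-in for ``ResponseCache``. The real code additionally wraps the computation in ``preserve_fn`` and returns the cached deferred through ``make_deferred_yieldable`` to keep log contexts straight; that Twisted machinery is elided here:

```python
class ResponseCache(object):
    """Simplified, synchronous stand-in for synapse's ResponseCache."""

    def __init__(self):
        self._cache = {}

    def get(self, key):
        return self._cache.get(key)

    def set(self, key, value):
        # cache the value and hand it back, mirroring the real API
        self._cache[key] = value
        return value


def get_protocol_metadata(cache, key, compute):
    # on a cache miss, run the computation once and cache the result
    result = cache.get(key)
    if not result:
        result = cache.set(key, compute())
    return result
```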
+1 -1
@@ -123,7 +123,7 @@ class _ServiceQueuer(object):
with Measure(self.clock, "servicequeuer.send"):
try:
yield self.txn_ctrl.send(service, events)
except:
except Exception:
logger.exception("AS request failed")
finally:
self.requests_in_flight.discard(service.id)
+5 -1
@@ -148,8 +148,8 @@ def setup_logging(config, use_worker_options=False):
"%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(request)s"
" - %(message)s"
)
if log_config is None:
level = logging.INFO
level_for_storage = logging.INFO
if config.verbosity:
@@ -176,6 +176,10 @@ def setup_logging(config, use_worker_options=False):
logger.info("Opened new log file due to SIGHUP")
else:
handler = logging.StreamHandler()
def sighup(signum, stack):
pass
handler.setFormatter(formatter)
handler.addFilter(LoggingContextFilter(request=""))
+1 -1
@@ -303,7 +303,7 @@ def read_gc_thresholds(thresholds):
return (
int(thresholds[0]), int(thresholds[1]), int(thresholds[2]),
)
except:
except Exception:
raise ConfigError(
"Value of `gc_threshold` must be a list of three integers if set"
)
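The surrounding ``read_gc_thresholds`` function can be sketched end-to-end to show why the narrowed ``except Exception:`` is sufficient here (a short list triggers the ``assert``, a non-integer triggers ``int()``, and both land in the same handler). This version raises a plain ``ValueError`` where synapse raises its own ``ConfigError``:

```python
def read_gc_thresholds(thresholds):
    """Reads the three integer thresholds for garbage collection."""
    if thresholds is None:
        return None
    try:
        assert len(thresholds) == 3
        return (
            int(thresholds[0]), int(thresholds[1]), int(thresholds[2]),
        )
    except Exception:
        raise ValueError(
            "Value of `gc_threshold` must be a list of three integers if set"
        )
```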
+6
@@ -109,6 +109,12 @@ class TlsConfig(Config):
# key. It may be necessary to publish the fingerprints of a new
# certificate and wait until the "valid_until_ts" of the previous key
# responses have passed before deploying it.
#
# You can calculate a fingerprint from a given TLS listener via:
# openssl s_client -connect $host:$port < /dev/null 2> /dev/null |
# openssl x509 -outform DER | openssl sha256 -binary | base64 | tr -d '='
# or by checking matrix.org/federationtester/api/report?server_name=$host
#
tls_fingerprints: []
# tls_fingerprints: [{"sha256": "<base64_encoded_sha256_fingerprint>"}]
""" % locals()
+1 -1
@@ -34,7 +34,7 @@ class ServerContextFactory(ssl.ContextFactory):
try:
_ecCurve = _OpenSSLECCurve(_defaultCurveName)
_ecCurve.addECKeyToContext(context)
except:
except Exception:
logger.exception("Failed to enable elliptic curve for TLS")
context.set_options(SSL.OP_NO_SSLv2 | SSL.OP_NO_SSLv3)
context.use_certificate_chain_file(config.tls_certificate_file)
+1 -1
@@ -43,7 +43,7 @@ def check_event_content_hash(event, hash_algorithm=hashlib.sha256):
message_hash_base64 = event.hashes[name]
try:
message_hash_bytes = decode_base64(message_hash_base64)
except:
except Exception:
raise SynapseError(
400,
"Invalid base64: %s" % (message_hash_base64,),
+1 -1
@@ -759,7 +759,7 @@ def _handle_key_deferred(verify_request):
))
try:
verify_signed_json(json_object, server_name, verify_key)
except:
except Exception:
raise SynapseError(
401,
"Invalid signature for server %s with key %s:%s" % (
+2 -2
@@ -443,12 +443,12 @@ def _check_power_levels(event, auth_events):
for k, v in user_list.items():
try:
UserID.from_string(k)
except:
except Exception:
raise SynapseError(400, "Not a valid user_id: %s" % (k,))
try:
int(v)
except:
except Exception:
raise SynapseError(400, "Not a valid power level: %s" % (v,))
key = (event.type, event.state_key, )
+1 -1
@@ -55,7 +55,7 @@ class EventBuilderFactory(object):
local_part = str(int(self.clock.time())) + i + random_string(5)
e_id = EventID.create(local_part, self.hostname)
e_id = EventID(local_part, self.hostname)
return e_id.to_string()
+1 -1
@@ -22,7 +22,7 @@ class SpamChecker(object):
config = None
try:
module, config = hs.config.spam_checker
except:
except Exception:
pass
if module is not None:
+5 -3
@@ -18,6 +18,7 @@ from .federation_base import FederationBase
from .units import Transaction, Edu
from synapse.util import async
from synapse.util.logcontext import make_deferred_yieldable, preserve_fn
from synapse.util.logutils import log_function
from synapse.util.caches.response_cache import ResponseCache
from synapse.events import FrozenEvent
@@ -253,12 +254,13 @@ class FederationServer(FederationBase):
result = self._state_resp_cache.get((room_id, event_id))
if not result:
with (yield self._server_linearizer.queue((origin, room_id))):
resp = yield self._state_resp_cache.set(
d = self._state_resp_cache.set(
(room_id, event_id),
self._on_context_state_request_compute(room_id, event_id)
preserve_fn(self._on_context_state_request_compute)(room_id, event_id)
)
resp = yield make_deferred_yieldable(d)
else:
resp = yield result
resp = yield make_deferred_yieldable(result)
defer.returnValue((200, resp))
+20
@@ -485,6 +485,26 @@ class TransportLayerClient(object):
ignore_backoff=True,
)
@log_function
def update_group_profile(self, destination, group_id, requester_user_id, content):
"""Update a remote group profile
Args:
destination (str)
group_id (str)
requester_user_id (str)
content (dict): The new profile of the group
"""
path = PREFIX + "/groups/%s/profile" % (group_id,)
return self.client.post_json(
destination=destination,
path=path,
args={"requester_user_id": requester_user_id},
data=content,
ignore_backoff=True,
)
@log_function
def get_group_summary(self, destination, group_id, requester_user_id):
"""Get a group summary
+16 -16
@@ -112,7 +112,7 @@ class Authenticator(object):
key = strip_quotes(param_dict["key"])
sig = strip_quotes(param_dict["sig"])
return (origin, key, sig)
except:
except Exception:
raise AuthenticationError(
400, "Malformed Authorization header", Codes.UNAUTHORIZED
)
@@ -177,7 +177,7 @@ class BaseFederationServlet(object):
if self.REQUIRE_AUTH:
logger.exception("authenticate_request failed")
raise
except:
except Exception:
logger.exception("authenticate_request failed")
raise
@@ -270,7 +270,7 @@ class FederationSendServlet(BaseFederationServlet):
code, response = yield self.handler.on_incoming_transaction(
transaction_data
)
except:
except Exception:
logger.exception("on_incoming_transaction failed")
raise
@@ -610,7 +610,7 @@ class FederationVersionServlet(BaseFederationServlet):
class FederationGroupsProfileServlet(BaseFederationServlet):
"""Get the basic profile of a group on behalf of a user
"""Get/set the basic profile of a group on behalf of a user
"""
PATH = "/groups/(?P<group_id>[^/]*)/profile$"
@@ -626,6 +626,18 @@ class FederationGroupsProfileServlet(BaseFederationServlet):
defer.returnValue((200, new_content))
@defer.inlineCallbacks
def on_POST(self, origin, content, query, group_id):
requester_user_id = parse_string_from_args(query, "requester_user_id")
if get_domain_from_id(requester_user_id) != origin:
raise SynapseError(403, "requester_user_id doesn't match origin")
new_content = yield self.handler.update_group_profile(
group_id, requester_user_id, content
)
defer.returnValue((200, new_content))
class FederationGroupsSummaryServlet(BaseFederationServlet):
PATH = "/groups/(?P<group_id>[^/]*)/summary$"
@@ -642,18 +654,6 @@ class FederationGroupsSummaryServlet(BaseFederationServlet):
defer.returnValue((200, new_content))
@defer.inlineCallbacks
def on_POST(self, origin, content, query, group_id):
requester_user_id = parse_string_from_args(query, "requester_user_id")
if get_domain_from_id(requester_user_id) != origin:
raise SynapseError(403, "requester_user_id doesn't match origin")
new_content = yield self.handler.update_group_profile(
group_id, requester_user_id, content
)
defer.returnValue((200, new_content))
class FederationGroupsRoomsServlet(BaseFederationServlet):
"""Get the rooms in a group on behalf of a user
+8 -24
@@ -13,14 +13,11 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from twisted.internet import defer
import logging
from synapse.api.errors import SynapseError
from synapse.types import UserID, get_domain_from_id, RoomID, GroupID
import logging
import urllib
from synapse.types import GroupID, RoomID, UserID, get_domain_from_id
from twisted.internet import defer
logger = logging.getLogger(__name__)
@@ -698,9 +695,11 @@ class GroupsServerHandler(object):
def create_group(self, group_id, user_id, content):
group = yield self.check_group_is_ours(group_id)
_validate_group_id(group_id)
logger.info("Attempting to create group with ID: %r", group_id)
# parsing the id into a GroupID validates it.
group_id_obj = GroupID.from_string(group_id)
if group:
raise SynapseError(400, "Group already exists")
@@ -710,7 +709,7 @@ class GroupsServerHandler(object):
raise SynapseError(
403, "Only server admin can create group on this server",
)
localpart = GroupID.from_string(group_id).localpart
localpart = group_id_obj.localpart
if not localpart.startswith(self.hs.config.group_creation_prefix):
raise SynapseError(
400,
@@ -786,18 +785,3 @@ def _parse_visibility_from_contents(content):
is_public = True
return is_public
def _validate_group_id(group_id):
"""Validates the group ID is valid for creation on this home server
"""
localpart = GroupID.from_string(group_id).localpart
if localpart.lower() != localpart:
raise SynapseError(400, "Group ID must be lower case")
if urllib.quote(localpart.encode('utf-8')) != localpart:
raise SynapseError(
400,
"Group ID can only contain characters a-z, 0-9, or '_-./'",
)
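The deleted ``_validate_group_id`` helper (its checks now happen as a side effect of ``GroupID.from_string``) boiled down to two tests on the localpart. A Python 3 translation of the removed code (the original used the Python 2 ``urllib.quote``):

```python
import urllib.parse


def validate_group_localpart(localpart):
    """Validates a group localpart, as the removed helper did."""
    if localpart.lower() != localpart:
        raise ValueError("Group ID must be lower case")

    # URL-quoting is a no-op only for characters that need no encoding
    if urllib.parse.quote(localpart.encode("utf-8")) != localpart:
        raise ValueError(
            "Group ID can only contain characters a-z, 0-9, or '_-./'"
        )
```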
+1 -1
@@ -267,7 +267,7 @@ class AuthHandler(BaseHandler):
user_id = authdict["user"]
password = authdict["password"]
if not user_id.startswith('@'):
user_id = UserID.create(user_id, self.hs.hostname).to_string()
user_id = UserID(user_id, self.hs.hostname).to_string()
return self._check_password(user_id, password)
+8 -8
@@ -227,7 +227,7 @@ class FederationHandler(BaseHandler):
state, auth_chain = yield self.replication_layer.get_state_for_room(
origin, pdu.room_id, pdu.event_id,
)
except:
except Exception:
logger.exception("Failed to get state for event: %s", pdu.event_id)
yield self._process_received_pdu(
@@ -461,7 +461,7 @@ class FederationHandler(BaseHandler):
def check_match(id):
try:
return server_name == get_domain_from_id(id)
except:
except Exception:
return False
# Parses mapping `event_id -> (type, state_key) -> state event_id`
@@ -499,7 +499,7 @@ class FederationHandler(BaseHandler):
continue
try:
domain = get_domain_from_id(ev.state_key)
except:
except Exception:
continue
if domain != server_name:
@@ -738,7 +738,7 @@ class FederationHandler(BaseHandler):
joined_domains[dom] = min(d, old_d)
else:
joined_domains[dom] = d
except:
except Exception:
pass
return sorted(joined_domains.items(), key=lambda d: d[1])
@@ -940,7 +940,7 @@ class FederationHandler(BaseHandler):
room_creator_user_id="",
is_public=False
)
except:
except Exception:
# FIXME
pass
@@ -1775,7 +1775,7 @@ class FederationHandler(BaseHandler):
[e_id for e_id, _ in event.auth_events]
)
seen_events = set(have_events.keys())
except:
except Exception:
# FIXME:
logger.exception("Failed to get auth chain")
@@ -1899,7 +1899,7 @@ class FederationHandler(BaseHandler):
except AuthError:
pass
except:
except Exception:
# FIXME:
logger.exception("Failed to query auth chain")
@@ -1966,7 +1966,7 @@ class FederationHandler(BaseHandler):
def get_next(it, opt=None):
try:
return it.next()
except:
except Exception:
return opt
current_local = get_next(local_iter)
+1 -1
@@ -214,7 +214,7 @@ class InitialSyncHandler(BaseHandler):
})
d["account_data"] = account_data_events
except:
except Exception:
logger.exception("Failed to get snapshot")
yield concurrently_execute(handle_room, room_list, 10)
+1 -1
@@ -563,7 +563,7 @@ class MessageHandler(BaseHandler):
try:
dump = ujson.dumps(unfreeze(event.content))
ujson.loads(dump)
except:
except Exception:
logger.exception("Failed to encode content: %r", event.content)
raise
+1 -1
@@ -364,7 +364,7 @@ class PresenceHandler(object):
)
preserve_fn(self._update_states)(changes)
except:
except Exception:
logger.exception("Exception in _handle_timeouts loop")
@defer.inlineCallbacks
+3 -3
@@ -118,7 +118,7 @@ class ProfileHandler(BaseHandler):
logger.exception("Failed to get displayname")
raise
except:
except Exception:
logger.exception("Failed to get displayname")
else:
defer.returnValue(result["displayname"])
@@ -165,7 +165,7 @@ class ProfileHandler(BaseHandler):
if e.code != 404:
logger.exception("Failed to get avatar_url")
raise
except:
except Exception:
logger.exception("Failed to get avatar_url")
defer.returnValue(result["avatar_url"])
@@ -266,7 +266,7 @@ class ProfileHandler(BaseHandler):
},
ignore_backoff=True,
)
except:
except Exception:
logger.exception("Failed to get avatar_url")
yield self.store.update_remote_profile_cache(
+7 -10
@@ -15,7 +15,6 @@
"""Contains functions for registering clients."""
import logging
import urllib
from twisted.internet import defer
@@ -23,6 +22,7 @@ from synapse.api.errors import (
AuthError, Codes, SynapseError, RegistrationError, InvalidCaptchaError
)
from synapse.http.client import CaptchaServerHttpClient
from synapse import types
from synapse.types import UserID
from synapse.util.async import run_on_reactor
from ._base import BaseHandler
@@ -46,12 +46,10 @@ class RegistrationHandler(BaseHandler):
@defer.inlineCallbacks
def check_username(self, localpart, guest_access_token=None,
assigned_user_id=None):
yield run_on_reactor()
if urllib.quote(localpart.encode('utf-8')) != localpart:
if types.contains_invalid_mxid_characters(localpart):
raise SynapseError(
400,
"User ID can only contain characters a-z, 0-9, or '_-./'",
"User ID can only contain characters a-z, 0-9, or '=_-./'",
Codes.INVALID_USERNAME
)
@@ -81,7 +79,7 @@ class RegistrationHandler(BaseHandler):
"A different user ID has already been registered for this session",
)
yield self.check_user_id_not_appservice_exclusive(user_id)
self.check_user_id_not_appservice_exclusive(user_id)
users = yield self.store.get_users_by_id_case_insensitive(user_id)
if users:
@@ -254,11 +252,10 @@ class RegistrationHandler(BaseHandler):
"""
Registers email_id as SAML2 Based Auth.
"""
if urllib.quote(localpart) != localpart:
if types.contains_invalid_mxid_characters(localpart):
raise SynapseError(
400,
"User ID must only contain characters which do not"
" require URL encoding."
"User ID can only contain characters a-z, 0-9, or '=_-./'",
)
user = UserID(localpart, self.hs.hostname)
user_id = user.to_string()
@@ -292,7 +289,7 @@ class RegistrationHandler(BaseHandler):
try:
identity_handler = self.hs.get_handlers().identity_handler
threepid = yield identity_handler.threepid_from_creds(c)
except:
except Exception:
logger.exception("Couldn't validate 3pid")
raise RegistrationError(400, "Couldn't validate 3pid")
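The registration hunk replaces an inline ``urllib.quote`` comparison with ``types.contains_invalid_mxid_characters``. A hedged reconstruction of that check, with the allowed set taken from the error message in the diff (a-z, 0-9, and ``'=_-./'``) rather than from the real ``synapse/types.py``:

```python
import string

# characters permitted in an mxid localpart, per the error message above
ALLOWED_MXID_CHARACTERS = set(
    string.ascii_lowercase + string.digits + "=_-./"
)


def contains_invalid_mxid_characters(localpart):
    """Returns True if the localpart contains any disallowed characters."""
    return any(c not in ALLOWED_MXID_CHARACTERS for c in localpart)
```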
+3 -3
@@ -91,7 +91,7 @@ class RoomCreationHandler(BaseHandler):
if wchar in config["room_alias_name"]:
raise SynapseError(400, "Invalid characters in room alias")
room_alias = RoomAlias.create(
room_alias = RoomAlias(
config["room_alias_name"],
self.hs.hostname,
)
@@ -108,7 +108,7 @@ class RoomCreationHandler(BaseHandler):
for i in invite_list:
try:
UserID.from_string(i)
except:
except Exception:
raise SynapseError(400, "Invalid user_id: %s" % (i,))
invite_3pid_list = config.get("invite_3pid", [])
@@ -123,7 +123,7 @@ class RoomCreationHandler(BaseHandler):
while attempts < 5:
try:
random_string = stringutils.random_string(18)
gen_room_id = RoomID.create(
gen_room_id = RoomID(
random_string,
self.hs.hostname,
)
+7 -2
@@ -20,6 +20,7 @@ from ._base import BaseHandler
from synapse.api.constants import (
EventTypes, JoinRules,
)
from synapse.util.logcontext import make_deferred_yieldable, preserve_fn
from synapse.util.async import concurrently_execute
from synapse.util.caches.descriptors import cachedInlineCallbacks
from synapse.util.caches.response_cache import ResponseCache
@@ -70,6 +71,7 @@ class RoomListHandler(BaseHandler):
if search_filter:
# We explicitly don't bother caching searches or requests for
# appservice specific lists.
logger.info("Bypassing cache as search request.")
return self._get_public_room_list(
limit, since_token, search_filter, network_tuple=network_tuple,
)
@@ -77,13 +79,16 @@ class RoomListHandler(BaseHandler):
key = (limit, since_token, network_tuple)
result = self.response_cache.get(key)
if not result:
logger.info("No cached result, calculating one.")
result = self.response_cache.set(
key,
self._get_public_room_list(
preserve_fn(self._get_public_room_list)(
limit, since_token, network_tuple=network_tuple
)
)
return result
else:
logger.info("Using cached deferred result.")
return make_deferred_yieldable(result)
@defer.inlineCallbacks
def _get_public_room_list(self, limit=None, since_token=None,
+1 -1
@@ -61,7 +61,7 @@ class SearchHandler(BaseHandler):
assert batch_group is not None
assert batch_group_key is not None
assert batch_token is not None
except:
except Exception:
raise SynapseError(400, "Invalid batch")
try:
+3 -3
@@ -15,7 +15,7 @@
from synapse.api.constants import Membership, EventTypes
from synapse.util.async import concurrently_execute
from synapse.util.logcontext import LoggingContext
from synapse.util.logcontext import LoggingContext, make_deferred_yieldable, preserve_fn
from synapse.util.metrics import Measure, measure_func
from synapse.util.caches.response_cache import ResponseCache
from synapse.push.clientformat import format_push_rules_for_user
@@ -184,11 +184,11 @@ class SyncHandler(object):
if not result:
result = self.response_cache.set(
sync_config.request_key,
self._wait_for_sync_for_user(
preserve_fn(self._wait_for_sync_for_user)(
sync_config, since_token, timeout, full_state
)
)
return result
return make_deferred_yieldable(result)
@defer.inlineCallbacks
def _wait_for_sync_for_user(self, sync_config, since_token, timeout,
+89 -38
@@ -18,7 +18,7 @@ from OpenSSL.SSL import VERIFY_NONE
from synapse.api.errors import (
CodeMessageException, MatrixCodeMessageException, SynapseError, Codes,
)
from synapse.util.logcontext import preserve_context_over_fn
from synapse.util.logcontext import make_deferred_yieldable
from synapse.util import logcontext
import synapse.metrics
from synapse.http.endpoint import SpiderEndpoint
@@ -114,43 +114,73 @@ class SimpleHttpClient(object):
raise e
@defer.inlineCallbacks
def post_urlencoded_get_json(self, uri, args={}):
def post_urlencoded_get_json(self, uri, args={}, headers=None):
"""
Args:
uri (str):
args (dict[str, str|List[str]]): query params
headers (dict[str, List[str]]|None): If not None, a map from
header name to a list of values for that header
Returns:
Deferred[object]: parsed json
"""
# TODO: Do we ever want to log message contents?
logger.debug("post_urlencoded_get_json args: %s", args)
query_bytes = urllib.urlencode(encode_urlencode_args(args), True)
actual_headers = {
b"Content-Type": [b"application/x-www-form-urlencoded"],
b"User-Agent": [self.user_agent],
}
if headers:
actual_headers.update(headers)
response = yield self.request(
"POST",
uri.encode("ascii"),
headers=Headers({
b"Content-Type": [b"application/x-www-form-urlencoded"],
b"User-Agent": [self.user_agent],
}),
headers=Headers(actual_headers),
bodyProducer=FileBodyProducer(StringIO(query_bytes))
)
body = yield preserve_context_over_fn(readBody, response)
body = yield make_deferred_yieldable(readBody(response))
defer.returnValue(json.loads(body))
@defer.inlineCallbacks
def post_json_get_json(self, uri, post_json):
def post_json_get_json(self, uri, post_json, headers=None):
"""
Args:
uri (str):
post_json (object):
headers (dict[str, List[str]]|None): If not None, a map from
header name to a list of values for that header
Returns:
Deferred[object]: parsed json
"""
json_str = encode_canonical_json(post_json)
logger.debug("HTTP POST %s -> %s", json_str, uri)
actual_headers = {
b"Content-Type": [b"application/json"],
b"User-Agent": [self.user_agent],
}
if headers:
actual_headers.update(headers)
response = yield self.request(
"POST",
uri.encode("ascii"),
headers=Headers({
b"Content-Type": [b"application/json"],
b"User-Agent": [self.user_agent],
}),
headers=Headers(actual_headers),
bodyProducer=FileBodyProducer(StringIO(json_str))
)
body = yield preserve_context_over_fn(readBody, response)
body = yield make_deferred_yieldable(readBody(response))
if 200 <= response.code < 300:
defer.returnValue(json.loads(body))
@@ -160,7 +190,7 @@ class SimpleHttpClient(object):
defer.returnValue(json.loads(body))
@defer.inlineCallbacks
def get_json(self, uri, args={}):
def get_json(self, uri, args={}, headers=None):
""" Gets some json from the given URI.
Args:
@@ -169,6 +199,8 @@ class SimpleHttpClient(object):
None.
**Note**: The value of each key is assumed to be an iterable
and *not* a string.
headers (dict[str, List[str]]|None): If not None, a map from
header name to a list of values for that header
Returns:
Deferred: Succeeds when we get *any* 2xx HTTP response, with the
HTTP body as JSON.
@@ -177,13 +209,13 @@ class SimpleHttpClient(object):
error message.
"""
try:
body = yield self.get_raw(uri, args)
body = yield self.get_raw(uri, args, headers=headers)
defer.returnValue(json.loads(body))
except CodeMessageException as e:
raise self._exceptionFromFailedRequest(e.code, e.msg)
@defer.inlineCallbacks
def put_json(self, uri, json_body, args={}):
def put_json(self, uri, json_body, args={}, headers=None):
""" Puts some json to the given URI.
Args:
@@ -193,6 +225,8 @@ class SimpleHttpClient(object):
None.
**Note**: The value of each key is assumed to be an iterable
and *not* a string.
headers (dict[str, List[str]]|None): If not None, a map from
header name to a list of values for that header
Returns:
Deferred: Succeeds when we get *any* 2xx HTTP response, with the
HTTP body as JSON.
@@ -205,17 +239,21 @@ class SimpleHttpClient(object):
json_str = encode_canonical_json(json_body)
actual_headers = {
b"Content-Type": [b"application/json"],
b"User-Agent": [self.user_agent],
}
if headers:
actual_headers.update(headers)
response = yield self.request(
"PUT",
uri.encode("ascii"),
headers=Headers({
b"User-Agent": [self.user_agent],
"Content-Type": ["application/json"]
}),
headers=Headers(actual_headers),
bodyProducer=FileBodyProducer(StringIO(json_str))
)
body = yield preserve_context_over_fn(readBody, response)
body = yield make_deferred_yieldable(readBody(response))
if 200 <= response.code < 300:
defer.returnValue(json.loads(body))
@@ -226,7 +264,7 @@ class SimpleHttpClient(object):
raise CodeMessageException(response.code, body)
@defer.inlineCallbacks
def get_raw(self, uri, args={}):
def get_raw(self, uri, args={}, headers=None):
""" Gets raw text from the given URI.
Args:
@@ -235,6 +273,8 @@ class SimpleHttpClient(object):
None.
**Note**: The value of each key is assumed to be an iterable
and *not* a string.
headers (dict[str, List[str]]|None): If not None, a map from
header name to a list of values for that header
Returns:
Deferred: Succeeds when we get *any* 2xx HTTP response, with the
HTTP body as text.
@@ -246,15 +286,19 @@ class SimpleHttpClient(object):
query_bytes = urllib.urlencode(args, True)
uri = "%s?%s" % (uri, query_bytes)
actual_headers = {
b"User-Agent": [self.user_agent],
}
if headers:
actual_headers.update(headers)
response = yield self.request(
"GET",
uri.encode("ascii"),
headers=Headers({
b"User-Agent": [self.user_agent],
})
headers=Headers(actual_headers),
)
body = yield preserve_context_over_fn(readBody, response)
body = yield make_deferred_yieldable(readBody(response))
if 200 <= response.code < 300:
defer.returnValue(body)
@@ -274,27 +318,33 @@ class SimpleHttpClient(object):
# The two should be factored out.
@defer.inlineCallbacks
def get_file(self, url, output_stream, max_size=None):
def get_file(self, url, output_stream, max_size=None, headers=None):
"""GETs a file from a given URL
Args:
url (str): The URL to GET
output_stream (file): File to write the response body to.
headers (dict[str, List[str]]|None): If not None, a map from
header name to a list of values for that header
Returns:
A (int,dict,string,int) tuple of the file length, dict of the response
headers, absolute URI of the response and HTTP response code.
"""
actual_headers = {
b"User-Agent": [self.user_agent],
}
if headers:
actual_headers.update(headers)
response = yield self.request(
"GET",
url.encode("ascii"),
headers=Headers({
b"User-Agent": [self.user_agent],
})
headers=Headers(actual_headers),
)
headers = dict(response.headers.getAllRawHeaders())
resp_headers = dict(response.headers.getAllRawHeaders())
if 'Content-Length' in headers and headers['Content-Length'] > max_size:
if 'Content-Length' in resp_headers and resp_headers['Content-Length'] > max_size:
logger.warn("Requested URL is too large > %r bytes" % (self.max_size,))
raise SynapseError(
502,
@@ -315,10 +365,9 @@ class SimpleHttpClient(object):
# straight back in again
try:
length = yield preserve_context_over_fn(
_readBodyToFile,
response, output_stream, max_size
)
length = yield make_deferred_yieldable(_readBodyToFile(
response, output_stream, max_size,
))
except Exception as e:
logger.exception("Failed to download body")
raise SynapseError(
@@ -327,7 +376,9 @@ class SimpleHttpClient(object):
Codes.UNKNOWN,
)
defer.returnValue((length, headers, response.request.absoluteURI, response.code))
defer.returnValue(
(length, resp_headers, response.request.absoluteURI, response.code),
)
# XXX: FIXME: This is horribly copy-pasted from matrixfederationclient.
@@ -395,7 +446,7 @@ class CaptchaServerHttpClient(SimpleHttpClient):
)
try:
body = yield preserve_context_over_fn(readBody, response)
body = yield make_deferred_yieldable(readBody(response))
defer.returnValue(body)
except PartialDownloadError as e:
# twisted dislikes google's response, no content length.
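The hunks above give `get_json`, `put_json`, `get_raw` and `get_file` an optional `headers` argument that is merged over the client's defaults. A minimal standalone sketch of that merging pattern (the names here are illustrative, not Synapse's actual attributes):

```python
# Sketch of the header-merging pattern introduced above: start from the
# client's default headers, then let caller-supplied headers override them.
DEFAULT_HEADERS = {
    b"User-Agent": [b"ExampleClient/0.1"],  # hypothetical default UA
}


def build_headers(extra_headers=None):
    """Return the default headers, overridden by any caller-supplied ones."""
    actual_headers = dict(DEFAULT_HEADERS)  # copy, so defaults stay untouched
    if extra_headers:
        actual_headers.update(extra_headers)
    return actual_headers
```

Caller-supplied entries win on key collisions, which is what lets the frontend proxy inject e.g. an `Authorization` header without disturbing the other defaults.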
+1 -1
@@ -550,7 +550,7 @@ class MatrixFederationHttpClient(object):
length = yield _readBodyToFile(
response, output_stream, max_size
)
except:
except Exception:
logger.exception("Failed to download body")
raise
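The many `except:` → `except Exception:` changes in this branch share one rationale: a bare `except` also traps `BaseException` subclasses such as `KeyboardInterrupt` and `SystemExit`, which should normally propagate. A small sketch of the difference:

```python
# Why `except Exception:` is preferred over a bare `except:`.
def careful(fn):
    try:
        return fn()
    except Exception:  # catches ordinary errors only
        return "handled"


def careless(fn):
    try:
        return fn()
    except:  # noqa: E722 -- also swallows KeyboardInterrupt, SystemExit, ...
        return "handled"
```

With `careful`, a `KeyboardInterrupt` raised inside `fn` escapes to the caller; with `careless`, it is silently converted into "handled".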
+1 -1
@@ -130,7 +130,7 @@ def wrap_request_handler(request_handler, include_metrics=False):
pretty_print=_request_user_agent_is_curl(request),
version_string=self.version_string,
)
except:
except Exception:
logger.exception(
"Failed handle request %s.%s on %r: %r",
request_handler.__module__,
+3 -3
@@ -48,7 +48,7 @@ def parse_integer_from_args(args, name, default=None, required=False):
if name in args:
try:
return int(args[name][0])
except:
except Exception:
message = "Query parameter %r must be an integer" % (name,)
raise SynapseError(400, message)
else:
@@ -88,7 +88,7 @@ def parse_boolean_from_args(args, name, default=None, required=False):
"true": True,
"false": False,
}[args[name][0]]
except:
except Exception:
message = (
"Boolean query parameter %r must be one of"
" ['true', 'false']"
@@ -162,7 +162,7 @@ def parse_json_value_from_request(request):
"""
try:
content_bytes = request.content.read()
except:
except Exception:
raise SynapseError(400, "Error reading JSON content.")
try:
+1 -1
@@ -67,7 +67,7 @@ class SynapseRequest(Request):
ru_utime, ru_stime = context.get_resource_usage()
db_txn_count = context.db_txn_count
db_txn_duration = context.db_txn_duration
except:
except Exception:
ru_utime, ru_stime = (0, 0)
db_txn_count, db_txn_duration = (0, 0)
+1 -1
@@ -289,7 +289,7 @@ class Notifier(object):
for user_stream in user_streams:
try:
user_stream.notify(stream_key, new_token, time_now_ms)
except:
except Exception:
logger.exception("Failed to notify listener")
self.notify_replication()
+1 -1
@@ -121,7 +121,7 @@ class EmailPusher(object):
starting_max_ordering = self.max_stream_ordering
try:
yield self._unsafe_process()
except:
except Exception:
logger.exception("Exception processing notifs")
if self.max_stream_ordering == starting_max_ordering:
break
+3 -3
@@ -131,7 +131,7 @@ class HttpPusher(object):
starting_max_ordering = self.max_stream_ordering
try:
yield self._unsafe_process()
except:
except Exception:
logger.exception("Exception processing notifs")
if self.max_stream_ordering == starting_max_ordering:
break
@@ -314,7 +314,7 @@ class HttpPusher(object):
defer.returnValue([])
try:
resp = yield self.http_client.post_json_get_json(self.url, notification_dict)
except:
except Exception:
logger.warn("Failed to push %s ", self.url)
defer.returnValue(False)
rejected = []
@@ -345,7 +345,7 @@ class HttpPusher(object):
}
try:
resp = yield self.http_client.post_json_get_json(self.url, d)
except:
except Exception:
logger.exception("Failed to push %s ", self.url)
defer.returnValue(False)
rejected = []
+1 -1
@@ -27,7 +27,7 @@ logger = logging.getLogger(__name__)
try:
from synapse.push.emailpusher import EmailPusher
from synapse.push.mailer import Mailer, load_jinja2_templates
except:
except Exception:
pass
+3 -3
@@ -137,7 +137,7 @@ class PusherPool:
)
yield preserve_context_over_deferred(defer.gatherResults(deferreds))
except:
except Exception:
logger.exception("Exception in pusher on_new_notifications")
@defer.inlineCallbacks
@@ -162,7 +162,7 @@ class PusherPool:
)
yield preserve_context_over_deferred(defer.gatherResults(deferreds))
except:
except Exception:
logger.exception("Exception in pusher on_new_receipts")
@defer.inlineCallbacks
@@ -188,7 +188,7 @@ class PusherPool:
for pusherdict in pushers:
try:
p = self.pusher_factory.create_pusher(pusherdict)
except:
except Exception:
logger.exception("Couldn't start a pusher: caught Exception")
continue
if p:
+1 -1
@@ -162,7 +162,7 @@ class ReplicationStreamer(object):
)
try:
updates, current_token = yield stream.get_updates()
except:
except Exception:
logger.info("Failed to handle stream %s", stream.NAME)
raise
+1 -1
@@ -93,7 +93,7 @@ class ClientDirectoryServer(ClientV1RestServlet):
)
except SynapseError as e:
raise e
except:
except Exception:
logger.exception("Failed to create association")
raise
except AuthError:
+3 -3
@@ -211,7 +211,7 @@ class LoginRestServlet(ClientV1RestServlet):
user_id = identifier["user"]
if not user_id.startswith('@'):
user_id = UserID.create(
user_id = UserID(
user_id, self.hs.hostname
).to_string()
@@ -278,7 +278,7 @@ class LoginRestServlet(ClientV1RestServlet):
if user is None:
raise LoginError(401, "Invalid JWT", errcode=Codes.UNAUTHORIZED)
user_id = UserID.create(user, self.hs.hostname).to_string()
user_id = UserID(user, self.hs.hostname).to_string()
auth_handler = self.auth_handler
registered_user_id = yield auth_handler.check_user_exists(user_id)
if registered_user_id:
@@ -444,7 +444,7 @@ class CasTicketServlet(ClientV1RestServlet):
if required_value != actual_value:
raise LoginError(401, "Unauthorized", errcode=Codes.UNAUTHORIZED)
user_id = UserID.create(user, self.hs.hostname).to_string()
user_id = UserID(user, self.hs.hostname).to_string()
auth_handler = self.auth_handler
registered_user_id = yield auth_handler.check_user_exists(user_id)
if not registered_user_id:
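The `UserID.create(...)` → `UserID(...)` changes above work because `DomainSpecificString` is a namedtuple, so its constructor already takes `(localpart, domain)` and the `.create()` classmethod was a redundant alias. An illustrative reduction (not Synapse's full class):

```python
from collections import namedtuple


# DomainSpecificString is namedtuple-based, so the class constructor itself
# accepts (localpart, domain); no separate .create() factory is needed.
class UserID(namedtuple("UserID", ("localpart", "domain"))):
    SIGIL = "@"

    def to_string(self):
        return "%s%s:%s" % (self.SIGIL, self.localpart, self.domain)
```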
+1 -1
@@ -78,7 +78,7 @@ class PresenceStatusRestServlet(ClientV1RestServlet):
raise KeyError()
except SynapseError as e:
raise e
except:
except Exception:
raise SynapseError(400, "Unable to parse state")
# yield self.presence_handler.set_state(user, state)
+2 -2
@@ -52,7 +52,7 @@ class ProfileDisplaynameRestServlet(ClientV1RestServlet):
try:
new_name = content["displayname"]
except:
except Exception:
defer.returnValue((400, "Unable to parse name"))
yield self.profile_handler.set_displayname(
@@ -94,7 +94,7 @@ class ProfileAvatarURLRestServlet(ClientV1RestServlet):
content = parse_json_object_from_request(request)
try:
new_name = content["avatar_url"]
except:
except Exception:
defer.returnValue((400, "Unable to parse name"))
yield self.profile_handler.set_avatar_url(
+3 -3
@@ -238,7 +238,7 @@ class JoinRoomAliasServlet(ClientV1RestServlet):
try:
content = parse_json_object_from_request(request)
except:
except Exception:
# Turns out we used to ignore the body entirely, and some clients
# cheekily send invalid bodies.
content = {}
@@ -247,7 +247,7 @@ class JoinRoomAliasServlet(ClientV1RestServlet):
room_id = room_identifier
try:
remote_room_hosts = request.args["server_name"]
except:
except Exception:
remote_room_hosts = None
elif RoomAlias.is_valid(room_identifier):
handler = self.handlers.room_member_handler
@@ -588,7 +588,7 @@ class RoomMembershipRestServlet(ClientV1RestServlet):
try:
content = parse_json_object_from_request(request)
except:
except Exception:
# Turns out we used to ignore the body entirely, and some clients
# cheekily send invalid bodies.
content = {}
+3 -4
@@ -25,7 +25,7 @@ logger = logging.getLogger(__name__)
class DevicesRestServlet(servlet.RestServlet):
PATTERNS = client_v2_patterns("/devices$", releases=[], v2_alpha=False)
PATTERNS = client_v2_patterns("/devices$", v2_alpha=False)
def __init__(self, hs):
"""
@@ -51,7 +51,7 @@ class DeleteDevicesRestServlet(servlet.RestServlet):
API for bulk deletion of devices. Accepts a JSON object with a devices
key which lists the device_ids to delete. Requires user interactive auth.
"""
PATTERNS = client_v2_patterns("/delete_devices", releases=[], v2_alpha=False)
PATTERNS = client_v2_patterns("/delete_devices", v2_alpha=False)
def __init__(self, hs):
super(DeleteDevicesRestServlet, self).__init__()
@@ -93,8 +93,7 @@ class DeleteDevicesRestServlet(servlet.RestServlet):
class DeviceRestServlet(servlet.RestServlet):
PATTERNS = client_v2_patterns("/devices/(?P<device_id>[^/]*)$",
releases=[], v2_alpha=False)
PATTERNS = client_v2_patterns("/devices/(?P<device_id>[^/]*)$", v2_alpha=False)
def __init__(self, hs):
"""
+1 -1
@@ -50,7 +50,7 @@ class GetFilterRestServlet(RestServlet):
try:
filter_id = int(filter_id)
except:
except Exception:
raise SynapseError(400, "Invalid filter_id")
try:
+1 -1
@@ -412,7 +412,7 @@ class GroupCreateServlet(RestServlet):
# TODO: Create group on remote server
content = parse_json_object_from_request(request)
localpart = content.pop("localpart")
group_id = GroupID.create(localpart, self.server_name).to_string()
group_id = GroupID(localpart, self.server_name).to_string()
result = yield self.groups_handler.create_group(group_id, user_id, content)
+4 -14
@@ -53,8 +53,7 @@ class KeyUploadServlet(RestServlet):
},
}
"""
PATTERNS = client_v2_patterns("/keys/upload(/(?P<device_id>[^/]+))?$",
releases=())
PATTERNS = client_v2_patterns("/keys/upload(/(?P<device_id>[^/]+))?$")
def __init__(self, hs):
"""
@@ -128,10 +127,7 @@ class KeyQueryServlet(RestServlet):
} } } } } }
"""
PATTERNS = client_v2_patterns(
"/keys/query$",
releases=()
)
PATTERNS = client_v2_patterns("/keys/query$")
def __init__(self, hs):
"""
@@ -160,10 +156,7 @@ class KeyChangesServlet(RestServlet):
200 OK
{ "changed": ["@foo:example.com"] }
"""
PATTERNS = client_v2_patterns(
"/keys/changes$",
releases=()
)
PATTERNS = client_v2_patterns("/keys/changes$")
def __init__(self, hs):
"""
@@ -213,10 +206,7 @@ class OneTimeKeyServlet(RestServlet):
} } } }
"""
PATTERNS = client_v2_patterns(
"/keys/claim$",
releases=()
)
PATTERNS = client_v2_patterns("/keys/claim$")
def __init__(self, hs):
super(OneTimeKeyServlet, self).__init__()
@@ -30,7 +30,7 @@ logger = logging.getLogger(__name__)
class NotificationsServlet(RestServlet):
PATTERNS = client_v2_patterns("/notifications$", releases=())
PATTERNS = client_v2_patterns("/notifications$")
def __init__(self, hs):
super(NotificationsServlet, self).__init__()
+1 -1
@@ -29,7 +29,7 @@ logger = logging.getLogger(__name__)
class SendToDeviceRestServlet(servlet.RestServlet):
PATTERNS = client_v2_patterns(
"/sendToDevice/(?P<message_type>[^/]*)/(?P<txn_id>[^/]*)$",
releases=[], v2_alpha=False
v2_alpha=False
)
def __init__(self, hs):
+1 -1
@@ -125,7 +125,7 @@ class SyncRestServlet(RestServlet):
filter_object = json.loads(filter_id)
set_timeline_upper_limit(filter_object,
self.hs.config.filter_timeline_limit)
except:
except Exception:
raise SynapseError(400, "Invalid filter JSON")
self.filtering.check_valid_filter(filter_object)
filter = FilterCollection(filter_object)
+4 -7
@@ -26,7 +26,7 @@ logger = logging.getLogger(__name__)
class ThirdPartyProtocolsServlet(RestServlet):
PATTERNS = client_v2_patterns("/thirdparty/protocols", releases=())
PATTERNS = client_v2_patterns("/thirdparty/protocols")
def __init__(self, hs):
super(ThirdPartyProtocolsServlet, self).__init__()
@@ -43,8 +43,7 @@ class ThirdPartyProtocolsServlet(RestServlet):
class ThirdPartyProtocolServlet(RestServlet):
PATTERNS = client_v2_patterns("/thirdparty/protocol/(?P<protocol>[^/]+)$",
releases=())
PATTERNS = client_v2_patterns("/thirdparty/protocol/(?P<protocol>[^/]+)$")
def __init__(self, hs):
super(ThirdPartyProtocolServlet, self).__init__()
@@ -66,8 +65,7 @@ class ThirdPartyProtocolServlet(RestServlet):
class ThirdPartyUserServlet(RestServlet):
PATTERNS = client_v2_patterns("/thirdparty/user(/(?P<protocol>[^/]+))?$",
releases=())
PATTERNS = client_v2_patterns("/thirdparty/user(/(?P<protocol>[^/]+))?$")
def __init__(self, hs):
super(ThirdPartyUserServlet, self).__init__()
@@ -90,8 +88,7 @@ class ThirdPartyUserServlet(RestServlet):
class ThirdPartyLocationServlet(RestServlet):
PATTERNS = client_v2_patterns("/thirdparty/location(/(?P<protocol>[^/]+))?$",
releases=())
PATTERNS = client_v2_patterns("/thirdparty/location(/(?P<protocol>[^/]+))?$")
def __init__(self, hs):
super(ThirdPartyLocationServlet, self).__init__()
@@ -65,7 +65,7 @@ class UserDirectorySearchRestServlet(RestServlet):
try:
search_term = body["search_term"]
except:
except Exception:
raise SynapseError(400, "`search_term` is required field")
results = yield self.user_directory_handler.search_users(
+1 -1
@@ -213,7 +213,7 @@ class RemoteKey(Resource):
)
except KeyLookupError as e:
logger.info("Failed to fetch key: %s", e)
except:
except Exception:
logger.exception("Failed to get key for %r", server_name)
yield self.query_keys(
request, query, query_remote_on_cache_miss=False
+5 -2
@@ -17,6 +17,7 @@ from synapse.http.server import respond_with_json, finish_request
from synapse.api.errors import (
cs_error, Codes, SynapseError
)
from synapse.util import logcontext
from twisted.internet import defer
from twisted.protocols.basic import FileSender
@@ -44,7 +45,7 @@ def parse_media_id(request):
except UnicodeDecodeError:
pass
return server_name, media_id, file_name
except:
except Exception:
raise SynapseError(
404,
"Invalid media id token %r" % (request.postpath,),
@@ -103,7 +104,9 @@ def respond_with_file(request, media_type, file_path,
)
with open(file_path, "rb") as f:
yield FileSender().beginFileTransfer(f, request)
yield logcontext.make_deferred_yieldable(
FileSender().beginFileTransfer(f, request)
)
finish_request(request)
else:
+1 -1
@@ -310,7 +310,7 @@ class MediaRepository(object):
media_length=length,
filesystem_id=file_id,
)
except:
except Exception:
os.remove(fname)
raise
@@ -367,7 +367,7 @@ class PreviewUrlResource(Resource):
dirs = self.filepaths.url_cache_filepath_dirs_to_delete(media_id)
for dir in dirs:
os.rmdir(dir)
except:
except Exception:
pass
yield self.store.delete_url_cache(removed_media)
@@ -397,7 +397,7 @@ class PreviewUrlResource(Resource):
dirs = self.filepaths.url_cache_filepath_dirs_to_delete(media_id)
for dir in dirs:
os.rmdir(dir)
except:
except Exception:
pass
thumbnail_dir = self.filepaths.url_cache_thumbnail_directory(media_id)
@@ -415,7 +415,7 @@ class PreviewUrlResource(Resource):
dirs = self.filepaths.url_cache_thumbnail_dirs_to_delete(media_id)
for dir in dirs:
os.rmdir(dir)
except:
except Exception:
pass
yield self.store.delete_url_cache_media(removed_media)
+1 -1
@@ -560,7 +560,7 @@ def _resolve_with_state(unconflicted_state_ids, conflicted_state_ds, auth_event_
resolved_state = _resolve_state_events(
conflicted_state, auth_events
)
except:
except Exception:
logger.exception("Failed to resolve state")
raise
+1 -1
@@ -103,7 +103,7 @@ class LoggingTransaction(object):
"[SQL values] {%s} %r",
self.name, args[0]
)
except:
except Exception:
# Don't let logging failures stop SQL from working
pass
+1 -1
@@ -98,7 +98,7 @@ class BackgroundUpdateStore(SQLBaseStore):
result = yield self.do_next_background_update(
self.BACKGROUND_UPDATE_DURATION_MS
)
except:
except Exception:
logger.exception("Error doing update")
else:
if result is None:
+1 -1
@@ -1480,7 +1480,7 @@ class EventsStore(SQLBaseStore):
for i in ids
if i in res
])
except:
except Exception:
logger.exception("Failed to callback")
with PreserveLoggingContext():
reactor.callFromThread(fire, event_list, row_dict)
+1 -1
@@ -66,7 +66,7 @@ def prepare_database(db_conn, database_engine, config):
cur.close()
db_conn.commit()
except:
except Exception:
db_conn.rollback()
raise
+1 -1
@@ -636,7 +636,7 @@ class RoomMemberStore(SQLBaseStore):
room_id = row["room_id"]
try:
content = json.loads(row["content"])
except:
except Exception:
continue
display_name = content.get("displayname", None)
+1 -1
@@ -22,7 +22,7 @@ def run_create(cur, database_engine, *args, **kwargs):
# NULL indicates user was not registered by an appservice.
try:
cur.execute("ALTER TABLE users ADD COLUMN appservice_id TEXT")
except:
except Exception:
# Maybe we already added the column? Hope so...
pass
+2 -2
@@ -81,7 +81,7 @@ class SearchStore(BackgroundUpdateStore):
etype = row["type"]
try:
content = json.loads(row["content"])
except:
except Exception:
continue
if etype == "m.room.message":
@@ -407,7 +407,7 @@ class SearchStore(BackgroundUpdateStore):
origin_server_ts, stream = pagination_token.split(",")
origin_server_ts = int(origin_server_ts)
stream = int(stream)
except:
except Exception:
raise SynapseError(400, "Invalid pagination token")
clauses.append(
+3 -3
@@ -80,13 +80,13 @@ class PaginationConfig(object):
from_tok = None # For backwards compat.
elif from_tok:
from_tok = StreamToken.from_string(from_tok)
except:
except Exception:
raise SynapseError(400, "'from' parameter is invalid")
try:
if to_tok:
to_tok = StreamToken.from_string(to_tok)
except:
except Exception:
raise SynapseError(400, "'to' parameter is invalid")
limit = get_param("limit", None)
@@ -98,7 +98,7 @@ class PaginationConfig(object):
try:
return PaginationConfig(from_tok, to_tok, direction, limit)
except:
except Exception:
logger.exception("Failed to create pagination config")
raise SynapseError(400, "Invalid request.")
+37 -8
@@ -12,6 +12,7 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import string
from synapse.api.errors import SynapseError
@@ -126,15 +127,11 @@ class DomainSpecificString(
try:
cls.from_string(s)
return True
except:
except Exception:
return False
__str__ = to_string
@classmethod
def create(cls, localpart, domain,):
return cls(localpart=localpart, domain=domain)
class UserID(DomainSpecificString):
"""Structure representing a user ID."""
@@ -160,6 +157,38 @@ class GroupID(DomainSpecificString):
"""Structure representing a group ID."""
SIGIL = "+"
@classmethod
def from_string(cls, s):
group_id = super(GroupID, cls).from_string(s)
if not group_id.localpart:
raise SynapseError(
400,
"Group ID cannot be empty",
)
if contains_invalid_mxid_characters(group_id.localpart):
raise SynapseError(
400,
"Group ID can only contain characters a-z, 0-9, or '=_-./'",
)
return group_id
mxid_localpart_allowed_characters = set("_-./=" + string.ascii_lowercase + string.digits)
def contains_invalid_mxid_characters(localpart):
"""Check for characters not allowed in an mxid or groupid localpart
Args:
localpart (basestring): the localpart to be checked
Returns:
bool: True if there are any naughty characters
"""
return any(c not in mxid_localpart_allowed_characters for c in localpart)
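The whitelist check added above is self-contained enough to run on its own; a localpart is valid iff every character is a lowercase ASCII letter, a digit, or one of `_-./=`:

```python
import string

# Allowed characters for an mxid/group-id localpart, as in the hunk above.
mxid_localpart_allowed_characters = set(
    "_-./=" + string.ascii_lowercase + string.digits
)


def contains_invalid_mxid_characters(localpart):
    """True if any character falls outside the allowed set."""
    return any(c not in mxid_localpart_allowed_characters for c in localpart)
```

Note that uppercase letters are deliberately excluded, which is why the test case below feeds `GroupID.from_string` localparts containing characters like `A` and `£` and expects a 400.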
class StreamToken(
namedtuple("Token", (
@@ -184,7 +213,7 @@ class StreamToken(
# i.e. old token from before receipt_key
keys.append("0")
return cls(*keys)
except:
except Exception:
raise SynapseError(400, "Invalid Token")
def to_string(self):
@@ -270,7 +299,7 @@ class RoomStreamToken(namedtuple("_StreamToken", "topological stream")):
if string[0] == 't':
parts = string[1:].split('-', 1)
return cls(topological=int(parts[0]), stream=int(parts[1]))
except:
except Exception:
pass
raise SynapseError(400, "Invalid token %r" % (string,))
@@ -279,7 +308,7 @@ class RoomStreamToken(namedtuple("_StreamToken", "topological stream")):
try:
if string[0] == 's':
return cls(topological=None, stream=int(string[1:]))
except:
except Exception:
pass
raise SynapseError(400, "Invalid token %r" % (string,))
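The two `RoomStreamToken` hunks above parse tokens of the form `t<topological>-<stream>` or `s<stream>`, rejecting everything else. A standalone sketch of that format (using `ValueError` in place of `SynapseError`):

```python
# Sketch of the RoomStreamToken string formats handled above:
# "t<topological>-<stream>" or "s<stream>"; anything else is rejected.
def parse_room_stream_token(s):
    try:
        if s[0] == 't':
            topological, stream = s[1:].split('-', 1)
            return (int(topological), int(stream))
        if s[0] == 's':
            return (None, int(s[1:]))
    except Exception:
        pass
    raise ValueError("Invalid token %r" % (s,))
```

The `except Exception: pass` mirrors the original structure: any parse failure (empty string, non-numeric parts) falls through to the single rejection point.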
+9 -9
@@ -59,9 +59,9 @@ class Clock(object):
f(function): The function to call repeatedly.
msec(float): How long to wait between calls in milliseconds.
"""
l = task.LoopingCall(f)
l.start(msec / 1000.0, now=False)
return l
call = task.LoopingCall(f)
call.start(msec / 1000.0, now=False)
return call
def call_later(self, delay, callback, *args, **kwargs):
"""Call something later
@@ -82,7 +82,7 @@ class Clock(object):
def cancel_call_later(self, timer, ignore_errs=False):
try:
timer.cancel()
except:
except Exception:
if not ignore_errs:
raise
@@ -97,12 +97,12 @@ class Clock(object):
try:
ret_deferred.errback(e)
except:
except Exception:
pass
try:
given_deferred.cancel()
except:
except Exception:
pass
timer = None
@@ -110,7 +110,7 @@ class Clock(object):
def cancel(res):
try:
self.cancel_call_later(timer)
except:
except Exception:
pass
return res
@@ -119,7 +119,7 @@ class Clock(object):
def success(res):
try:
ret_deferred.callback(res)
except:
except Exception:
pass
return res
@@ -127,7 +127,7 @@ class Clock(object):
def err(res):
try:
ret_deferred.errback(res)
except:
except Exception:
pass
given_deferred.addCallbacks(callback=success, errback=err)
+3 -3
@@ -73,7 +73,7 @@ class ObservableDeferred(object):
try:
# TODO: Handle errors here.
self._observers.pop().callback(r)
except:
except Exception:
pass
return r
@@ -83,7 +83,7 @@ class ObservableDeferred(object):
try:
# TODO: Handle errors here.
self._observers.pop().errback(f)
except:
except Exception:
pass
if consumeErrors:
@@ -205,7 +205,7 @@ class Linearizer(object):
try:
with PreserveLoggingContext():
yield current_defer
except:
except Exception:
logger.exception("Unexpected exception in Linearizer")
logger.info("Acquired linearizer lock %r for key %r", self.name,
+1 -1
@@ -42,7 +42,7 @@ try:
def get_thread_resource_usage():
return resource.getrusage(RUSAGE_THREAD)
except:
except Exception:
# If the system doesn't support resource.getrusage(RUSAGE_THREAD) then we
# won't track resource usage by returning None.
def get_thread_resource_usage():
+1 -1
@@ -189,7 +189,7 @@ class RetryDestinationLimiter(object):
yield self.store.set_destination_retry_timings(
self.destination, retry_last_ts, self.retry_interval
)
except:
except Exception:
logger.exception(
"Failed to store set_destination_retry_timings",
)
+1 -4
@@ -91,7 +91,4 @@ class WheelTimer(object):
return ret
def __len__(self):
l = 0
for entry in self.entries:
l += len(entry.queue)
return l
return sum(len(entry.queue) for entry in self.entries)
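The `__len__` rewrite above replaces an accumulator loop (with the lint-unfriendly name `l`) by an equivalent generator expression. A minimal sketch, with `Entry` standing in for `WheelTimer`'s internal entry type:

```python
# Stand-in for WheelTimer's internal entry type.
class Entry(object):
    def __init__(self, queue):
        self.queue = queue


def total_queued(entries):
    # Equivalent to:  l = 0; for entry in entries: l += len(entry.queue)
    return sum(len(entry.queue) for entry in entries)
```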
+1 -1
@@ -65,7 +65,7 @@ class ApplicationServiceStoreTestCase(unittest.TestCase):
for f in self.as_yaml_files:
try:
os.remove(f)
except:
except Exception:
pass
def _add_appservice(self, as_token, id, url, hs_token, sender):
+23 -1
@@ -17,7 +17,7 @@ from tests import unittest
from synapse.api.errors import SynapseError
from synapse.server import HomeServer
from synapse.types import UserID, RoomAlias
from synapse.types import UserID, RoomAlias, GroupID
mock_homeserver = HomeServer(hostname="my.domain")
@@ -60,3 +60,25 @@ class RoomAliasTestCase(unittest.TestCase):
room = RoomAlias("channel", "my.domain")
self.assertEquals(room.to_string(), "#channel:my.domain")
class GroupIDTestCase(unittest.TestCase):
def test_parse(self):
group_id = GroupID.from_string("+group/=_-.123:my.domain")
self.assertEqual("group/=_-.123", group_id.localpart)
self.assertEqual("my.domain", group_id.domain)
def test_validate(self):
bad_ids = [
"$badsigil:domain",
"+:empty",
] + [
"+group" + c + ":domain" for c in "A%?æ£"
]
for id_string in bad_ids:
try:
GroupID.from_string(id_string)
self.fail("Parsing '%s' should raise exception" % id_string)
except SynapseError as exc:
self.assertEqual(400, exc.code)
self.assertEqual("M_UNKNOWN", exc.errcode)
+4 -4
@@ -184,7 +184,7 @@ class MockHttpResource(HttpServer):
mock_request.args = urlparse.parse_qs(path.split('?')[1])
mock_request.path = path.split('?')[0]
path = mock_request.path
except:
except Exception:
pass
for (method, pattern, func) in self.callbacks:
@@ -364,13 +364,13 @@ class MemoryDataStore(object):
return {
"name": self.tokens_to_users[token],
}
except:
except Exception:
raise StoreError(400, "User does not exist.")
def get_room(self, room_id):
try:
return self.rooms[room_id]
except:
except Exception:
return None
def store_room(self, room_id, room_creator_user_id, is_public):
@@ -499,7 +499,7 @@ class DeferredMockCallable(object):
for _, _, d in self.expectations:
try:
d.errback(failure)
except:
except Exception:
pass
raise failure