Merge branch 'develop' of github.com:matrix-org/synapse into postgres

This commit is contained in:
Erik Johnston 2015-04-27 14:41:40 +01:00
commit 6f8e2d517e
24 changed files with 641 additions and 390 deletions

View File

@ -86,7 +86,7 @@ Homeserver Installation
=======================
System requirements:
- POSIX-compliant system (tested on Linux & OSX)
- POSIX-compliant system (tested on Linux & OS X)
- Python 2.7
Synapse is written in Python, but some of the libraries it uses are written in
@ -128,6 +128,15 @@ To set up your homeserver, run (in your virtualenv, as before)::
Substituting your host and domain name as appropriate.
This will generate you a config file that you can then customise, but it will
also generate a set of keys for you. These keys will allow your Home Server to
identify itself to other Home Servers, so don't lose or delete them. It would be
wise to back them up somewhere safe. If, for whatever reason, you do need to
change your Home Server's keys, you may find that other Home Servers have the
old key cached. If you update the signing key, you should change the name of the
key in the ``<server name>.signing.key`` file (the second word, which by default
is 'auto') to something different.
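For illustration only (the key ID and base64 value below are made up; the real
contents are generated for you), a signing key file is a single line of the
form::

    ed25519 auto 2lonYOk6SnogA9XzdHkVXy2GYqHeN1cUTJVOQc9F0Gw

so renaming the key simply means editing that second word (e.g. to ``key2``)
before restarting your Home Server.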
By default, registration of new users is disabled. You can either enable
registration in the config by specifying ``enable_registration: true``
(it is then recommended to also set up CAPTCHA), or
@ -367,10 +376,6 @@ SRV record, as that is the name other machines will expect it to have::
You may additionally want to pass one or more "-v" options, in order to
increase the verbosity of logging output; at least for initial testing.
For the initial alpha release, the homeserver is not speaking TLS for
either client-server or server-server traffic for ease of debugging. We have
also not spent any time yet getting the homeserver to run behind loadbalancers.
Running a Demo Federation of Homeservers
----------------------------------------
@ -433,7 +438,7 @@ track 3PID logins and publish end-user public keys.
It's currently early days for identity servers as Matrix is not yet using 3PIDs
as the primary means of identity and E2E encryption is not complete. As such,
we are running a single identity server (http://matrix.org:8090) at the current
we are running a single identity server (https://matrix.org) at the current
time.

View File

@ -1,3 +1,37 @@
Upgrading to v0.x.x
===================
Application services have had a breaking API change in this version.
They can no longer register themselves with a home server using the AS HTTP API. This
decision was made because a compromised application service with free rein to register
any regex in effect grants full read/write access to the home server if a regex of ``.*``
is used. An attack where a compromised AS re-registers itself with ``.*`` was deemed too
big of a security risk to ignore, and so the ability to register with the HS remotely has
been removed.
It has been replaced by specifying a list of application service registrations in
``homeserver.yaml``::
app_service_config_files: ["registration-01.yaml", "registration-02.yaml"]
Where ``registration-01.yaml`` looks like::
url: <String> # e.g. "https://my.application.service.com"
as_token: <String>
hs_token: <String>
sender_localpart: <String> # This is a new field which denotes the user_id localpart when using the AS token
namespaces:
  users:
    - exclusive: <Boolean>
      regex: <String> # e.g. "@prefix_.*"
  aliases:
    - exclusive: <Boolean>
      regex: <String>
  rooms:
    - exclusive: <Boolean>
      regex: <String>
Upgrading to v0.8.0
===================

docs/metrics-howto.rst Normal file
View File

@ -0,0 +1,50 @@
How to monitor Synapse metrics using Prometheus
===============================================
1: Install prometheus:
Follow instructions at http://prometheus.io/docs/introduction/install/
2: Enable synapse metrics:
Simply setting a (local) port number will enable it. Pick a port;
prometheus itself defaults to 9090, so starting just above that for
locally monitored services seems reasonable, e.g. 9092:
Add to homeserver.yaml
metrics_port: 9092
Restart synapse
3: Check out synapse-prometheus-config
https://github.com/matrix-org/synapse-prometheus-config
4: Add ``synapse.html`` and ``synapse.rules``
The ``.html`` file needs to appear in prometheus's ``consoles`` directory,
and the ``.rules`` file needs to be invoked somewhere in the main config
file. A symlink to each from the git checkout into the prometheus directory
may be the easiest way to ensure ``git pull`` keeps them up to date.
5: Add a prometheus target for synapse
This is easiest if prometheus runs on the same machine as synapse, as it can
then just use localhost::
global: {
  rule_file: "synapse.rules"
}
job: {
  name: "synapse"
  target_group: {
    target: "http://localhost:9092/"
  }
}
6: Start prometheus::
./prometheus -config.file=prometheus.conf
7: Wait a few seconds for it to start and perform the first scrape,
then visit the console:
http://server-where-prometheus-runs:9090/consoles/synapse.html

View File

@ -183,18 +183,10 @@ class Auth(object):
else:
join_rule = JoinRules.INVITE
user_level = self._get_power_level_from_event_state(
event,
event.user_id,
auth_events,
)
user_level = self._get_user_power_level(event.user_id, auth_events)
ban_level, kick_level, redact_level = (
self._get_ops_level_from_event_state(
event,
auth_events,
)
)
# FIXME (erikj): What should we do here as the default?
ban_level = self._get_named_level(auth_events, "ban", 50)
logger.debug(
"is_membership_change_allowed: %s",
@ -210,11 +202,6 @@ class Auth(object):
}
)
if ban_level:
ban_level = int(ban_level)
else:
ban_level = 50 # FIXME (erikj): What should we do here?
if Membership.JOIN != membership:
# JOIN is the only action you can perform if you're not in the room
if not caller_in_room: # caller isn't joined
@ -259,10 +246,7 @@ class Auth(object):
403, "You cannot unban user &s." % (target_user_id,)
)
elif target_user_id != event.user_id:
if kick_level:
kick_level = int(kick_level)
else:
kick_level = 50 # FIXME (erikj): What should we do here?
kick_level = self._get_named_level(auth_events, "kick", 50)
if user_level < kick_level:
raise AuthError(
@ -276,34 +260,42 @@ class Auth(object):
return True
def _get_power_level_from_event_state(self, event, user_id, auth_events):
def _get_power_level_event(self, auth_events):
key = (EventTypes.PowerLevels, "", )
power_level_event = auth_events.get(key)
level = None
return auth_events.get(key)
def _get_user_power_level(self, user_id, auth_events):
power_level_event = self._get_power_level_event(auth_events)
if power_level_event:
level = power_level_event.content.get("users", {}).get(user_id)
if not level:
level = power_level_event.content.get("users_default", 0)
if level is None:
return 0
else:
return int(level)
else:
key = (EventTypes.Create, "", )
create_event = auth_events.get(key)
if (create_event is not None and
create_event.content["creator"] == user_id):
return 100
else:
return 0
return level
def _get_named_level(self, auth_events, name, default):
power_level_event = self._get_power_level_event(auth_events)
def _get_ops_level_from_event_state(self, event, auth_events):
key = (EventTypes.PowerLevels, "", )
power_level_event = auth_events.get(key)
if not power_level_event:
return default
if power_level_event:
return (
power_level_event.content.get("ban", 50),
power_level_event.content.get("kick", 50),
power_level_event.content.get("redact", 50),
)
return None, None, None,
level = power_level_event.content.get(name, None)
if level is not None:
return int(level)
else:
return default
@defer.inlineCallbacks
def get_user_by_req(self, request):
@ -498,16 +490,7 @@ class Auth(object):
else:
send_level = 0
user_level = self._get_power_level_from_event_state(
event,
event.user_id,
auth_events,
)
if user_level:
user_level = int(user_level)
else:
user_level = 0
user_level = self._get_user_power_level(event.user_id, auth_events)
if user_level < send_level:
raise AuthError(
@ -539,16 +522,9 @@ class Auth(object):
return True
def _check_redaction(self, event, auth_events):
user_level = self._get_power_level_from_event_state(
event,
event.user_id,
auth_events,
)
user_level = self._get_user_power_level(event.user_id, auth_events)
_, _, redact_level = self._get_ops_level_from_event_state(
event,
auth_events,
)
redact_level = self._get_named_level(auth_events, "redact", 50)
if user_level < redact_level:
raise AuthError(
@ -576,11 +552,7 @@ class Auth(object):
if not current_state:
return
user_level = self._get_power_level_from_event_state(
event,
event.user_id,
auth_events,
)
user_level = self._get_user_power_level(event.user_id, auth_events)
# Check other levels:
levels_to_check = [

View File

@ -35,8 +35,8 @@ class Codes(object):
LIMIT_EXCEEDED = "M_LIMIT_EXCEEDED"
CAPTCHA_NEEDED = "M_CAPTCHA_NEEDED"
CAPTCHA_INVALID = "M_CAPTCHA_INVALID"
MISSING_PARAM = "M_MISSING_PARAM",
TOO_LARGE = "M_TOO_LARGE",
MISSING_PARAM = "M_MISSING_PARAM"
TOO_LARGE = "M_TOO_LARGE"
EXCLUSIVE = "M_EXCLUSIVE"

View File

@ -53,6 +53,7 @@ class RegistrationConfig(Config):
@classmethod
def generate_config(cls, args, config_dir_path):
super(RegistrationConfig, cls).generate_config(args, config_dir_path)
if args.enable_registration is None:
args.enable_registration = False

View File

@ -24,6 +24,8 @@ from synapse.api.errors import SynapseError, Codes
from synapse.util.retryutils import get_retry_limiter
from synapse.util.async import create_observer
from OpenSSL import crypto
import logging
@ -38,6 +40,8 @@ class Keyring(object):
self.clock = hs.get_clock()
self.hs = hs
self.key_downloads = {}
@defer.inlineCallbacks
def verify_json_for_server(self, server_name, json_object):
logger.debug("Verifying for %s", server_name)
@ -97,6 +101,22 @@ class Keyring(object):
defer.returnValue(cached[0])
return
download = self.key_downloads.get(server_name)
if download is None:
download = self._get_server_verify_key_impl(server_name, key_ids)
self.key_downloads[server_name] = download
@download.addBoth
def callback(ret):
del self.key_downloads[server_name]
return ret
r = yield create_observer(download)
defer.returnValue(r)
@defer.inlineCallbacks
def _get_server_verify_key_impl(self, server_name, key_ids):
# Try to fetch the key from the remote server.
limiter = yield get_retry_limiter(

View File

@ -36,6 +36,9 @@ metrics = synapse.metrics.get_metrics_for(__name__)
# Don't bother bumping "last active" time if it differs by less than 60 seconds
LAST_ACTIVE_GRANULARITY = 60*1000
# Keep no more than this number of offline serial revisions
MAX_OFFLINE_SERIALS = 1000
# TODO(paul): Maybe there's one of these I can steal from somewhere
def partition(l, func):
@ -135,6 +138,9 @@ class PresenceHandler(BaseHandler):
self._remote_sendmap = {}
# map remote users to sets of local users who're interested in them
self._remote_recvmap = {}
# list of (serial, set of user_ids) tuples, ordered by serial, latest
# first
self._remote_offline_serials = []
# map any user to a UserPresenceCache
self._user_cachemap = {}
@ -714,8 +720,24 @@ class PresenceHandler(BaseHandler):
statuscache=statuscache,
)
user_id = user.to_string()
if state["presence"] == PresenceState.OFFLINE:
self._remote_offline_serials.insert(
0,
(self._user_cachemap_latest_serial, set([user_id]))
)
while len(self._remote_offline_serials) > MAX_OFFLINE_SERIALS:
self._remote_offline_serials.pop() # remove the oldest
del self._user_cachemap[user]
else:
# Remove the user from remote_offline_serials now that they're
# no longer offline
for idx, elem in enumerate(self._remote_offline_serials):
(_, user_ids) = elem
user_ids.discard(user_id)
if not user_ids:
self._remote_offline_serials.pop(idx)
for poll in content.get("poll", []):
user = UserID.from_string(poll)
@ -836,6 +858,8 @@ class PresenceEventSource(object):
presence = self.hs.get_handlers().presence_handler
cachemap = presence._user_cachemap
clock = self.clock
latest_serial = None
updates = []
# TODO(paul): use a DeferredList ? How to limit concurrency.
@ -845,18 +869,31 @@ class PresenceEventSource(object):
if cached.serial <= from_key:
continue
if (yield self.is_visible(observer_user, observed_user)):
updates.append((observed_user, cached))
if not (yield self.is_visible(observer_user, observed_user)):
continue
if latest_serial is None or cached.serial > latest_serial:
latest_serial = cached.serial
updates.append(cached.make_event(user=observed_user, clock=clock))
# TODO(paul): limit
for serial, user_ids in presence._remote_offline_serials:
if serial < from_key:
break
for u in user_ids:
updates.append({
"type": "m.presence",
"content": {"user_id": u, "presence": PresenceState.OFFLINE},
})
# TODO(paul): For the v2 API we want to tell the client their from_key
# is too old if we fell off the end of the _remote_offline_serials
# list, and get them to invalidate+resync. In v1 we have no such
# concept so this is a best-effort result.
if updates:
clock = self.clock
latest_serial = max([x[1].serial for x in updates])
data = [x[1].make_event(user=x[0], clock=clock) for x in updates]
defer.returnValue((data, latest_serial))
defer.returnValue((updates, latest_serial))
else:
defer.returnValue(([], presence._user_cachemap_latest_serial))

View File

@ -124,7 +124,7 @@ class RoomCreationHandler(BaseHandler):
msg_handler = self.hs.get_handlers().message_handler
for event in creation_events:
yield msg_handler.create_and_send_event(event)
yield msg_handler.create_and_send_event(event, ratelimit=False)
if "name" in config:
name = config["name"]
@ -134,7 +134,7 @@ class RoomCreationHandler(BaseHandler):
"sender": user_id,
"state_key": "",
"content": {"name": name},
})
}, ratelimit=False)
if "topic" in config:
topic = config["topic"]
@ -144,7 +144,7 @@ class RoomCreationHandler(BaseHandler):
"sender": user_id,
"state_key": "",
"content": {"topic": topic},
})
}, ratelimit=False)
for invitee in invite_list:
yield msg_handler.create_and_send_event({
@ -153,7 +153,7 @@ class RoomCreationHandler(BaseHandler):
"room_id": room_id,
"sender": user_id,
"content": {"membership": Membership.INVITE},
})
}, ratelimit=False)
result = {"room_id": room_id}

View File

@ -51,6 +51,80 @@ response_timer = metrics.register_distribution(
labels=["method", "servlet"]
)
_next_request_id = 0
def request_handler(request_handler):
"""Wraps a method that acts as a request handler with the necessary logging
and exception handling.
The method must have a signature of "handle_foo(self, request)". The
argument "self" must have "version_string" and "clock" attributes. The
argument "request" must be a twisted HTTP request.
The method must return a deferred. If the deferred succeeds we assume that
a response has been sent. If the deferred fails with a SynapseError we use
it to send a JSON response with the appropriate HTTP response code. If the
deferred fails with any other type of error we send a 500 response.
We insert a unique request-id into the logging context for this request and
log the response and duration for this request.
"""
@defer.inlineCallbacks
def wrapped_request_handler(self, request):
global _next_request_id
request_id = "%s-%s" % (request.method, _next_request_id)
_next_request_id += 1
with LoggingContext(request_id) as request_context:
request_context.request = request_id
code = None
start = self.clock.time_msec()
try:
logger.info(
"Received request: %s %s",
request.method, request.path
)
yield request_handler(self, request)
code = request.code
except CodeMessageException as e:
code = e.code
if isinstance(e, SynapseError):
logger.info(
"%s SynapseError: %s - %s", request, code, e.msg
)
else:
logger.exception(e)
outgoing_responses_counter.inc(request.method, str(code))
respond_with_json(
request, code, cs_exception(e), send_cors=True,
pretty_print=_request_user_agent_is_curl(request),
version_string=self.version_string,
)
except:
code = 500
logger.exception(
"Failed handle request %s.%s on %r: %r",
request_handler.__module__,
request_handler.__name__,
self,
request
)
respond_with_json(
request,
500,
{"error": "Internal server error"},
send_cors=True
)
finally:
code = str(code) if code else "-"
end = self.clock.time_msec()
logger.info(
"Processed request: %dms %s %s %s",
end-start, code, request.method, request.path
)
return wrapped_request_handler
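A minimal usage sketch (a hypothetical resource, not part of this change) of
the contract the decorator describes above: the instance carries ``clock`` and
``version_string`` attributes, and the wrapped method is responsible for
sending its own response.

    from twisted.internet import defer
    from twisted.web import resource, server

    from synapse.http.server import request_handler, respond_with_json


    class PingResource(resource.Resource):
        # Hypothetical resource, shown only to illustrate the decorator.
        isLeaf = True

        def __init__(self, hs):
            resource.Resource.__init__(self)
            self.clock = hs.get_clock()
            self.version_string = hs.version_string

        def render_GET(self, request):
            self._async_render_GET(request)
            return server.NOT_DONE_YET

        @request_handler
        @defer.inlineCallbacks
        def _async_render_GET(self, request):
            # Any exception raised here is turned into a JSON error response
            # and logged by the decorator; on success we must reply ourselves.
            yield defer.succeed(None)
            respond_with_json(request, 200, {"pong": True}, send_cors=True)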
class HttpServer(object):
""" Interface for registering callbacks on a HTTP server
@ -115,102 +189,56 @@ class JsonResource(HttpServer, resource.Resource):
def render(self, request):
""" This get's called by twisted every time someone sends us a request.
"""
self._async_render_with_logging_context(request)
self._async_render(request)
return server.NOT_DONE_YET
_request_id = 0
@defer.inlineCallbacks
def _async_render_with_logging_context(self, request):
request_id = "%s-%s" % (request.method, JsonResource._request_id)
JsonResource._request_id += 1
with LoggingContext(request_id) as request_context:
request_context.request = request_id
yield self._async_render(request)
@request_handler
@defer.inlineCallbacks
def _async_render(self, request):
""" This get's called by twisted every time someone sends us a request.
This checks if anyone has registered a callback for that method and
path.
"""
code = None
start = self.clock.time_msec()
try:
# Just say yes to OPTIONS.
if request.method == "OPTIONS":
self._send_response(request, 200, {})
return
if request.method == "OPTIONS":
self._send_response(request, 200, {})
return
# Loop through all the registered callbacks to check if the method
# and path regex match
for path_entry in self.path_regexs.get(request.method, []):
m = path_entry.pattern.match(request.path)
if not m:
continue
# Loop through all the registered callbacks to check if the method
# and path regex match
for path_entry in self.path_regexs.get(request.method, []):
m = path_entry.pattern.match(request.path)
if not m:
continue
# We found a match! Trigger callback and then return the
# returned response. We pass both the request and any
# matched groups from the regex to the callback.
# We found a match! Trigger callback and then return the
# returned response. We pass both the request and any
# matched groups from the regex to the callback.
callback = path_entry.callback
callback = path_entry.callback
servlet_instance = getattr(callback, "__self__", None)
if servlet_instance is not None:
servlet_classname = servlet_instance.__class__.__name__
else:
servlet_classname = "%r" % callback
incoming_requests_counter.inc(request.method, servlet_classname)
args = [
urllib.unquote(u).decode("UTF-8") for u in m.groups()
]
logger.info(
"Received request: %s %s",
request.method, request.path
)
code, response = yield callback(request, *args)
self._send_response(request, code, response)
response_timer.inc_by(
self.clock.time_msec() - start, request.method, servlet_classname
)
return
# Huh. No one wanted to handle that? Fiiiiiine. Send 400.
raise UnrecognizedRequestError()
except CodeMessageException as e:
if isinstance(e, SynapseError):
logger.info("%s SynapseError: %s - %s", request, e.code, e.msg)
servlet_instance = getattr(callback, "__self__", None)
if servlet_instance is not None:
servlet_classname = servlet_instance.__class__.__name__
else:
logger.exception(e)
servlet_classname = "%r" % callback
incoming_requests_counter.inc(request.method, servlet_classname)
code = e.code
self._send_response(
request,
code,
cs_exception(e),
response_code_message=e.response_code_message
)
except Exception as e:
logger.exception(e)
self._send_response(
request,
500,
{"error": "Internal server error"}
)
finally:
code = str(code) if code else "-"
args = [
urllib.unquote(u).decode("UTF-8") for u in m.groups()
]
end = self.clock.time_msec()
logger.info(
"Processed request: %dms %s %s %s",
end-start, code, request.method, request.path
code, response = yield callback(request, *args)
self._send_response(request, code, response)
response_timer.inc_by(
self.clock.time_msec() - start, request.method, servlet_classname
)
return
# Huh. No one wanted to handle that? Fiiiiiine. Send 400.
raise UnrecognizedRequestError()
def _send_response(self, request, code, response_json_object,
response_code_message=None):
# could alternatively use request.notifyFinish() and flip a flag when
@ -229,20 +257,10 @@ class JsonResource(HttpServer, resource.Resource):
request, code, response_json_object,
send_cors=True,
response_code_message=response_code_message,
pretty_print=self._request_user_agent_is_curl,
pretty_print=_request_user_agent_is_curl(request),
version_string=self.version_string,
)
@staticmethod
def _request_user_agent_is_curl(request):
user_agents = request.requestHeaders.getRawHeaders(
"User-Agent", default=[]
)
for user_agent in user_agents:
if "curl" in user_agent:
return True
return False
class RootRedirect(resource.Resource):
"""Redirects the root '/' path to another path."""
@ -263,8 +281,8 @@ class RootRedirect(resource.Resource):
def respond_with_json(request, code, json_object, send_cors=False,
response_code_message=None, pretty_print=False,
version_string=""):
if not pretty_print:
json_bytes = encode_pretty_printed_json(json_object)
if pretty_print:
json_bytes = encode_pretty_printed_json(json_object) + "\n"
else:
json_bytes = encode_canonical_json(json_object)
@ -304,3 +322,13 @@ def respond_with_json_bytes(request, code, json_bytes, send_cors=False,
request.write(json_bytes)
request.finish()
return NOT_DONE_YET
def _request_user_agent_is_curl(request):
user_agents = request.requestHeaders.getRawHeaders(
"User-Agent", default=[]
)
for user_agent in user_agents:
if "curl" in user_agent:
return True
return False

View File

@ -23,6 +23,61 @@ import logging
logger = logging.getLogger(__name__)
def parse_integer(request, name, default=None, required=False):
if name in request.args:
try:
return int(request.args[name][0])
except:
message = "Query parameter %r must be an integer" % (name,)
raise SynapseError(400, message)
else:
if required:
message = "Missing integer query parameter %r" % (name,)
raise SynapseError(400, message)
else:
return default
def parse_boolean(request, name, default=None, required=False):
if name in request.args:
try:
return {
"true": True,
"false": False,
}[request.args[name][0]]
except:
message = (
"Boolean query parameter %r must be one of"
" ['true', 'false']"
) % (name,)
raise SynapseError(400, message)
else:
if required:
message = "Missing boolean query parameter %r" % (name,)
raise SynapseError(400, message)
else:
return default
def parse_string(request, name, default=None, required=False,
allowed_values=None, param_type="string"):
if name in request.args:
value = request.args[name][0]
if allowed_values is not None and value not in allowed_values:
message = "Query parameter %r must be one of [%s]" % (
name, ", ".join(repr(v) for v in allowed_values)
)
raise SynapseError(400, message)
else:
return value
else:
if required:
message = "Missing %s query parameter %r" % (param_type, name)
raise SynapseError(400, message)
else:
return default
class RestServlet(object):
""" A Synapse REST Servlet.
@ -56,58 +111,3 @@ class RestServlet(object):
http_server.register_path(method, pattern, method_handler)
else:
raise NotImplementedError("RestServlet must register something.")
@staticmethod
def parse_integer(request, name, default=None, required=False):
if name in request.args:
try:
return int(request.args[name][0])
except:
message = "Query parameter %r must be an integer" % (name,)
raise SynapseError(400, message)
else:
if required:
message = "Missing integer query parameter %r" % (name,)
raise SynapseError(400, message)
else:
return default
@staticmethod
def parse_boolean(request, name, default=None, required=False):
if name in request.args:
try:
return {
"true": True,
"false": False,
}[request.args[name][0]]
except:
message = (
"Boolean query parameter %r must be one of"
" ['true', 'false']"
) % (name,)
raise SynapseError(400, message)
else:
if required:
message = "Missing boolean query parameter %r" % (name,)
raise SynapseError(400, message)
else:
return default
@staticmethod
def parse_string(request, name, default=None, required=False,
allowed_values=None, param_type="string"):
if name in request.args:
value = request.args[name][0]
if allowed_values is not None and value not in allowed_values:
message = "Query parameter %r must be one of [%s]" % (
name, ", ".join(repr(v) for v in allowed_values)
)
raise SynapseError(message)
else:
return value
else:
if required:
message = "Missing %s query parameter %r" % (param_type, name)
raise SynapseError(400, message)
else:
return default

View File

@ -1,3 +1,17 @@
# Copyright 2015 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from synapse.push.rulekinds import PRIORITY_CLASS_MAP, PRIORITY_CLASS_INVERSE_MAP
@ -112,7 +126,25 @@ def make_base_prepend_override_rules():
def make_base_append_override_rules():
return [
{
'rule_id': 'global/override/.m.rule.call',
'rule_id': 'global/override/.m.rule.suppress_notices',
'conditions': [
{
'kind': 'event_match',
'key': 'content.msgtype',
'pattern': 'm.notice',
}
],
'actions': [
'dont_notify',
]
}
]
def make_base_append_underride_rules(user):
return [
{
'rule_id': 'global/underride/.m.rule.call',
'conditions': [
{
'kind': 'event_match',
@ -131,19 +163,6 @@ def make_base_append_override_rules():
}
]
},
{
'rule_id': 'global/override/.m.rule.suppress_notices',
'conditions': [
{
'kind': 'event_match',
'key': 'content.msgtype',
'pattern': 'm.notice',
}
],
'actions': [
'dont_notify',
]
},
{
'rule_id': 'global/override/.m.rule.contains_display_name',
'conditions': [
@ -162,7 +181,7 @@ def make_base_append_override_rules():
]
},
{
'rule_id': 'global/override/.m.rule.room_one_to_one',
'rule_id': 'global/underride/.m.rule.room_one_to_one',
'conditions': [
{
'kind': 'room_member_count',
@ -179,12 +198,7 @@ def make_base_append_override_rules():
'value': False
}
]
}
]
def make_base_append_underride_rules(user):
return [
},
{
'rule_id': 'global/underride/.m.rule.invite_for_me',
'conditions': [

View File

@ -1,3 +1,17 @@
# Copyright 2015 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
PRIORITY_CLASS_MAP = {
'underride': 1,
'sender': 2,

View File

@ -1,10 +1,24 @@
# Copyright 2015 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import logging
from distutils.version import LooseVersion
logger = logging.getLogger(__name__)
REQUIREMENTS = {
"syutil>=0.0.4": ["syutil"],
"syutil>=0.0.5": ["syutil"],
"Twisted==14.0.2": ["twisted==14.0.2"],
"service_identity>=1.0.0": ["service_identity>=1.0.0"],
"pyopenssl>=0.14": ["OpenSSL>=0.14"],
@ -43,8 +57,8 @@ DEPENDENCY_LINKS = [
),
github_link(
project="matrix-org/syutil",
version="v0.0.4",
egg="syutil-0.0.4",
version="v0.0.5",
egg="syutil-0.0.5",
),
github_link(
project="matrix-org/matrix-angular-sdk",

View File

@ -15,7 +15,9 @@
from twisted.internet import defer
from synapse.http.servlet import RestServlet
from synapse.http.servlet import (
RestServlet, parse_string, parse_integer, parse_boolean
)
from synapse.handlers.sync import SyncConfig
from synapse.types import StreamToken
from synapse.events.utils import (
@ -87,20 +89,20 @@ class SyncRestServlet(RestServlet):
def on_GET(self, request):
user, client = yield self.auth.get_user_by_req(request)
timeout = self.parse_integer(request, "timeout", default=0)
limit = self.parse_integer(request, "limit", required=True)
gap = self.parse_boolean(request, "gap", default=True)
sort = self.parse_string(
timeout = parse_integer(request, "timeout", default=0)
limit = parse_integer(request, "limit", required=True)
gap = parse_boolean(request, "gap", default=True)
sort = parse_string(
request, "sort", default="timeline,asc",
allowed_values=self.ALLOWED_SORT
)
since = self.parse_string(request, "since")
set_presence = self.parse_string(
since = parse_string(request, "since")
set_presence = parse_string(
request, "set_presence", default="online",
allowed_values=self.ALLOWED_PRESENCE
)
backfill = self.parse_boolean(request, "backfill", default=False)
filter_id = self.parse_string(request, "filter", default=None)
backfill = parse_boolean(request, "backfill", default=False)
filter_id = parse_string(request, "filter", default=None)
logger.info(
"/sync: user=%r, timeout=%r, limit=%r, gap=%r, sort=%r, since=%r,"

View File

@ -18,13 +18,15 @@ from .thumbnailer import Thumbnailer
from synapse.http.server import respond_with_json
from synapse.util.stringutils import random_string
from synapse.api.errors import (
cs_exception, CodeMessageException, cs_error, Codes, SynapseError
cs_error, Codes, SynapseError
)
from twisted.internet import defer
from twisted.web.resource import Resource
from twisted.protocols.basic import FileSender
from synapse.util.async import create_observer
import os
import logging
@ -32,6 +34,18 @@ import logging
logger = logging.getLogger(__name__)
def parse_media_id(request):
try:
server_name, media_id = request.postpath
return (server_name, media_id)
except:
raise SynapseError(
404,
"Invalid media id token %r" % (request.postpath,),
Codes.UNKNOWN,
)
class BaseMediaResource(Resource):
isLeaf = True
@ -45,74 +59,9 @@ class BaseMediaResource(Resource):
self.max_upload_size = hs.config.max_upload_size
self.max_image_pixels = hs.config.max_image_pixels
self.filepaths = filepaths
self.version_string = hs.version_string
self.downloads = {}
@staticmethod
def catch_errors(request_handler):
@defer.inlineCallbacks
def wrapped_request_handler(self, request):
try:
yield request_handler(self, request)
except CodeMessageException as e:
logger.info("Responding with error: %r", e)
respond_with_json(
request, e.code, cs_exception(e), send_cors=True
)
except:
logger.exception(
"Failed handle request %s.%s on %r",
request_handler.__module__,
request_handler.__name__,
self,
)
respond_with_json(
request,
500,
{"error": "Internal server error"},
send_cors=True
)
return wrapped_request_handler
@staticmethod
def _parse_media_id(request):
try:
server_name, media_id = request.postpath
return (server_name, media_id)
except:
raise SynapseError(
404,
"Invalid media id token %r" % (request.postpath,),
Codes.UNKNOWN,
)
@staticmethod
def _parse_integer(request, arg_name, default=None):
try:
if default is None:
return int(request.args[arg_name][0])
else:
return int(request.args.get(arg_name, [default])[0])
except:
raise SynapseError(
400,
"Missing integer argument %r" % (arg_name,),
Codes.UNKNOWN,
)
@staticmethod
def _parse_string(request, arg_name, default=None):
try:
if default is None:
return request.args[arg_name][0]
else:
return request.args.get(arg_name, [default])[0]
except:
raise SynapseError(
400,
"Missing string argument %r" % (arg_name,),
Codes.UNKNOWN,
)
def _respond_404(self, request):
respond_with_json(
request, 404,
@ -140,7 +89,7 @@ class BaseMediaResource(Resource):
def callback(media_info):
del self.downloads[key]
return media_info
return download
return create_observer(download)
@defer.inlineCallbacks
def _get_remote_media_impl(self, server_name, media_id):

View File

@ -13,7 +13,8 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from .base_resource import BaseMediaResource
from .base_resource import BaseMediaResource, parse_media_id
from synapse.http.server import request_handler
from twisted.web.server import NOT_DONE_YET
from twisted.internet import defer
@ -28,15 +29,10 @@ class DownloadResource(BaseMediaResource):
self._async_render_GET(request)
return NOT_DONE_YET
@BaseMediaResource.catch_errors
@request_handler
@defer.inlineCallbacks
def _async_render_GET(self, request):
try:
server_name, media_id = request.postpath
except:
self._respond_404(request)
return
server_name, media_id = parse_media_id(request)
if server_name == self.server_name:
yield self._respond_local_file(request, media_id)
else:

View File

@ -1,3 +1,17 @@
# Copyright 2015 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from pydenticon import Generator
from twisted.web.resource import Resource

View File

@ -14,7 +14,9 @@
# limitations under the License.
from .base_resource import BaseMediaResource
from .base_resource import BaseMediaResource, parse_media_id
from synapse.http.servlet import parse_string, parse_integer
from synapse.http.server import request_handler
from twisted.web.server import NOT_DONE_YET
from twisted.internet import defer
@ -31,14 +33,14 @@ class ThumbnailResource(BaseMediaResource):
self._async_render_GET(request)
return NOT_DONE_YET
@BaseMediaResource.catch_errors
@request_handler
@defer.inlineCallbacks
def _async_render_GET(self, request):
server_name, media_id = self._parse_media_id(request)
width = self._parse_integer(request, "width")
height = self._parse_integer(request, "height")
method = self._parse_string(request, "method", "scale")
m_type = self._parse_string(request, "type", "image/png")
server_name, media_id = parse_media_id(request)
width = parse_integer(request, "width")
height = parse_integer(request, "height")
method = parse_string(request, "method", "scale")
m_type = parse_string(request, "type", "image/png")
if server_name == self.server_name:
yield self._respond_local_thumbnail(

View File

@ -13,12 +13,10 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from synapse.http.server import respond_with_json
from synapse.http.server import respond_with_json, request_handler
from synapse.util.stringutils import random_string
from synapse.api.errors import (
cs_exception, SynapseError, CodeMessageException
)
from synapse.api.errors import SynapseError
from twisted.web.server import NOT_DONE_YET
from twisted.internet import defer
@ -69,53 +67,42 @@ class UploadResource(BaseMediaResource):
defer.returnValue("mxc://%s/%s" % (self.server_name, media_id))
@request_handler
@defer.inlineCallbacks
def _async_render_POST(self, request):
try:
auth_user, client = yield self.auth.get_user_by_req(request)
# TODO: The checks here are a bit late. The content will have
# already been uploaded to a tmp file at this point
content_length = request.getHeader("Content-Length")
if content_length is None:
raise SynapseError(
msg="Request must specify a Content-Length", code=400
)
if int(content_length) > self.max_upload_size:
raise SynapseError(
msg="Upload request body is too large",
code=413,
)
headers = request.requestHeaders
if headers.hasHeader("Content-Type"):
media_type = headers.getRawHeaders("Content-Type")[0]
else:
raise SynapseError(
msg="Upload request missing 'Content-Type'",
code=400,
)
# if headers.hasHeader("Content-Disposition"):
# disposition = headers.getRawHeaders("Content-Disposition")[0]
# TODO(markjh): parse content-dispostion
content_uri = yield self.create_content(
media_type, None, request.content.read(),
content_length, auth_user
auth_user, client = yield self.auth.get_user_by_req(request)
# TODO: The checks here are a bit late. The content will have
# already been uploaded to a tmp file at this point
content_length = request.getHeader("Content-Length")
if content_length is None:
raise SynapseError(
msg="Request must specify a Content-Length", code=400
)
if int(content_length) > self.max_upload_size:
raise SynapseError(
msg="Upload request body is too large",
code=413,
)
respond_with_json(
request, 200, {"content_uri": content_uri}, send_cors=True
)
except CodeMessageException as e:
logger.exception(e)
respond_with_json(request, e.code, cs_exception(e), send_cors=True)
except:
logger.exception("Failed to store file")
respond_with_json(
request,
500,
{"error": "Internal server error"},
send_cors=True
headers = request.requestHeaders
if headers.hasHeader("Content-Type"):
media_type = headers.getRawHeaders("Content-Type")[0]
else:
raise SynapseError(
msg="Upload request missing 'Content-Type'",
code=400,
)
# if headers.hasHeader("Content-Disposition"):
# disposition = headers.getRawHeaders("Content-Disposition")[0]
# TODO(markjh): parse content-disposition
content_uri = yield self.create_content(
media_type, None, request.content.read(),
content_length, auth_user
)
respond_with_json(
request, 200, {"content_uri": content_uri}, send_cors=True
)

View File

@ -1,3 +1,17 @@
# Copyright 2015 OpenMarket Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import json
import logging

View File

@ -1,3 +1,17 @@
/* Copyright 2015 OpenMarket Ltd
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
CREATE TABLE IF NOT EXISTS push_rules_enable (
id INTEGER PRIMARY KEY AUTOINCREMENT,
user_name VARCHAR(150) NOT NULL,

View File

@ -32,3 +32,22 @@ def run_on_reactor():
iteration of the main loop
"""
return sleep(0)
def create_observer(deferred):
"""Creates a deferred that observes the result or failure of the given
deferred *without* affecting the given deferred.
"""
d = defer.Deferred()
def callback(r):
d.callback(r)
return r
def errback(f):
d.errback(f)
return f
deferred.addCallbacks(callback, errback)
return d
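A minimal sketch (hypothetical class and helper names) of the de-duplication
pattern this helper enables, mirroring how Keyring and BaseMediaResource use
it: the first caller starts the real work, and later callers observe the
shared deferred without being able to poison it.

    from synapse.util.async import create_observer


    class Deduplicator(object):
        # Hypothetical example, not part of this change.
        def __init__(self, do_fetch):
            self._do_fetch = do_fetch      # a callable returning a deferred
            self._in_flight = {}

        def fetch(self, key):
            download = self._in_flight.get(key)
            if download is None:
                download = self._do_fetch(key)
                self._in_flight[key] = download

                @download.addBoth
                def cleanup(result):
                    # Forget the in-flight deferred once it resolves, then
                    # pass the result (or failure) straight through.
                    del self._in_flight[key]
                    return result

            # Each caller gets its own deferred mirroring the download's
            # outcome, so an errback added by one caller cannot affect others.
            return create_observer(download)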

View File

@ -883,6 +883,71 @@ class PresencePushTestCase(MockedDatastorePresenceTestCase):
state
)
@defer.inlineCallbacks
def test_recv_remote_offline(self):
""" Various tests relating to SYN-261 """
potato_set = self.handler._remote_recvmap.setdefault(self.u_potato,
set())
potato_set.add(self.u_apple)
self.room_members = [self.u_banana, self.u_potato]
self.assertEquals(self.event_source.get_current_key(), 0)
yield self.mock_federation_resource.trigger("PUT",
"/_matrix/federation/v1/send/1000000/",
_make_edu_json("elsewhere", "m.presence",
content={
"push": [
{"user_id": "@potato:remote",
"presence": "offline"},
],
}
)
)
self.assertEquals(self.event_source.get_current_key(), 1)
(events, _) = yield self.event_source.get_new_events_for_user(
self.u_apple, 0, None
)
self.assertEquals(events,
[
{"type": "m.presence",
"content": {
"user_id": "@potato:remote",
"presence": OFFLINE,
}}
]
)
yield self.mock_federation_resource.trigger("PUT",
"/_matrix/federation/v1/send/1000001/",
_make_edu_json("elsewhere", "m.presence",
content={
"push": [
{"user_id": "@potato:remote",
"presence": "online"},
],
}
)
)
self.assertEquals(self.event_source.get_current_key(), 2)
(events, _) = yield self.event_source.get_new_events_for_user(
self.u_apple, 0, None
)
self.assertEquals(events,
[
{"type": "m.presence",
"content": {
"user_id": "@potato:remote",
"presence": ONLINE,
}}
]
)
@defer.inlineCallbacks
def test_join_room_local(self):
self.room_members = [self.u_apple, self.u_banana]