In `MatrixFederationHttpClient._send_request()`, we make an HTTP request
using an `Agent`, wrap that request in a timeout and await the resulting
`Deferred`. On its own, the `Agent` performing the HTTP request
correctly stashes and restores the logging context while waiting.
The addition of the timeout introduces a path where the logging context
is not restored when execution resumes.
To address this, we wrap the timeout `Deferred` in a
`make_deferred_yieldable()` to stash the logging context and restore it
on completion of the `await`. However, this is not sufficient, since by
the time we construct the timeout `Deferred`, the `Agent` has already
stashed and cleared the logging context when using
`make_deferred_yieldable()` to produce its `Deferred` for the request.
Hence, we wrap the `Agent` request in a `run_in_background()` to "fork"
and preserve the logging context so that we can stash and restore it
when `await`ing the timeout `Deferred`.
This approach is similar to the one used with `defer.gatherResults`.
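A rough sketch of the resulting pattern (argument names and the timeout value are simplified for illustration; the real code is a method on `MatrixFederationHttpClient`):

```python
from synapse.logging.context import make_deferred_yieldable, run_in_background
from synapse.util.async_helpers import timeout_deferred


async def send_request_sketch(agent, reactor, method, url, headers, body, timeout=60):
    # "Fork" the logging context: the Agent's own stash/restore now happens on
    # the forked context, leaving ours intact for the code below.
    request_deferred = run_in_background(agent.request, method, url, headers, body)

    # Wrap the forked request in a timeout.
    request_deferred = timeout_deferred(request_deferred, timeout=timeout, reactor=reactor)

    # Stash our logging context while waiting, and restore it when the timeout
    # Deferred completes (or fails with a timeout).
    return await make_deferred_yieldable(request_deferred)
```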
Note that the code is still not fully correct. When a timeout occurs,
the request keeps running in the background (existing behaviour which
has nothing to do with the new call to `run_in_background`) and may
restart the logging context after it has finished.
Mostly this involves decorating a few Deferred declarations with extra type hints. We wrap the types in quotes to avoid runtime errors when running against older versions of Twisted that don't have generics on Deferred.
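For example, a return annotation written as a string (the function name here is made up) stays valid on older Twisted releases:

```python
from twisted.internet import defer


# The quoted annotation is only evaluated by mypy; at runtime it is left as a
# string, so it does not break on Twisted versions where Deferred cannot be
# subscripted.
def lookup_displayname(user_id: str) -> "defer.Deferred[str]":
    return defer.succeed("Alice")
```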
* switch from `types.CoroutineType` to `typing.Coroutine`
these should be semantically identical, and since `defer.ensureDeferred` is
defined to take a `typing.Coroutine`, this will keep mypy happy (see the
sketch after this list)
* Fix some annotations on inlineCallbacks functions
* changelog
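As a hedged illustration (not code from the PR) of why `typing.Coroutine` lines up with `defer.ensureDeferred`:

```python
from typing import Any, Coroutine

from twisted.internet import defer


async def get_displayname(user_id: str) -> str:
    return "Alice"


# typing.Coroutine matches the parameter type that defer.ensureDeferred is
# declared to accept, so mypy accepts this without a cast.
coro: Coroutine[Any, Any, str] = get_displayname("@alice:example.com")
d: "defer.Deferred[str]" = defer.ensureDeferred(coro)
```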
Improves type hints for:
* parse_{boolean,integer}
* parse_{boolean,integer}_from_args
* parse_json_{value,object}_from_request
And fixes any incorrect calls that resulted from unknown types.
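A hedged usage sketch of the helpers in question (the handler and parameter names are made up):

```python
from synapse.http.servlet import parse_boolean, parse_integer


def on_GET_sketch(request):
    # Returns the query parameter as an int, or None when it is absent and no
    # default is supplied.
    limit = parse_integer(request, "limit")
    # Returns a bool, falling back to the supplied default when the parameter
    # is absent.
    include_all = parse_boolean(request, "all", default=False)
    return limit, include_all
```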
* Drop Origin & Accept from Access-Control-Allow-Headers value
This change drops the Origin and Accept header names from the value of the
Access-Control-Allow-Headers response header sent by Synapse. Per the CORS
protocol, it’s not necessary or useful to include those header names.
Details:
Per the spec at https://fetch.spec.whatwg.org/#forbidden-header-name, Origin
is a “forbidden header name”: it is set by the browser itself, and frontend
JavaScript code is never allowed to set it.
So the value of Access-Control-Allow-Headers isn’t relevant to Origin, or
in general to other headers set by the browser itself: the browser never
consults the Access-Control-Allow-Headers value to confirm that it’s OK
for a request to include an Origin header.
And per the spec at https://fetch.spec.whatwg.org/#cors-safelisted-request-header,
Accept is a “CORS-safelisted request-header”, which means that browsers
allow requests to contain the Accept header regardless of whether the
Access-Control-Allow-Headers value contains "Accept".
So it’s unnecessary for the Access-Control-Allow-Headers value to explicitly
include Accept. Browsers will not perform a CORS preflight for requests
containing an Accept request header.
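For illustration only (the exact remaining header names are whatever Synapse actually sends), the change amounts to dropping Origin and Accept from the value set via Twisted's `Request.setHeader`:

```python
from twisted.web.server import Request


def set_cors_headers_sketch(request: Request) -> None:
    # Previously the value also listed Origin and Accept; per the CORS spec,
    # neither name needs to appear here.
    request.setHeader(
        b"Access-Control-Allow-Headers",
        b"X-Requested-With, Content-Type, Authorization",
    )
```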
Related: https://github.com/matrix-org/matrix-doc/pull/3225
Signed-off-by: Michael[tm] Smith <mike@w3.org>
Add 'federation_ip_range_whitelist'. This allows backwards compatibility if 'federation_ip_range_blacklist' is set; otherwise 'ip_range_whitelist' will be used for federation servers.
Signed-off-by: Michael Kutzner 1mikure@gmail.com
Instead of parsing the full response to `/send_join` into Python objects (which can be huge for large rooms) and *then* parsing that into events, we instead use ijson to stream parse the response directly into `EventBase` objects.
Part of #9744
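A minimal sketch of the idea, assuming a file-like object over the raw JSON body and a `make_event_from_dict`-style event factory supplied by the caller (the real code uses ijson's coroutine API to feed chunks as they arrive):

```python
import ijson


def iter_state_events(response_file, room_version, make_event_from_dict):
    """Stream-parse the "state" array of a /send_join response.

    `response_file` is a file-like object over the raw JSON body; events are
    yielded one at a time rather than materialising the whole response.
    """
    for event_dict in ijson.items(response_file, "state.item"):
        yield make_event_from_dict(event_dict, room_version)
```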
Removes all redundant `# -*- coding: utf-8 -*-` lines from files, as Python 3 reads source code as UTF-8 by default.
`Signed-off-by: Jonathan de Jong <jonathan@automatia.nl>`
### Changes proposed in this PR
- Add support for the `no_proxy` and `NO_PROXY` environment variables
- Internally rely on urllib's [`proxy_bypass_environment`](bdb941be42/Lib/urllib/request.py (L2519))
- Extract env variables using urllib's `getproxies`/[`getproxies_environment`](bdb941be42/Lib/urllib/request.py (L2488)) which supports lowercase + uppercase, preferring lowercase, except for `HTTP_PROXY` in a CGI environment
This does contain behaviour changes for consumers, so to call them out explicitly:
- `no_proxy`/`NO_PROXY` is now respected
- lowercase `https_proxy` is now allowed and takes precedence over `HTTPS_PROXY`
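A hedged sketch of how the stdlib helpers combine to answer "should this host bypass the proxy?" (this mirrors the approach rather than the exact `ProxyAgent` code):

```python
from urllib.request import getproxies_environment, proxy_bypass_environment


def should_bypass_proxy(host: str) -> bool:
    # getproxies_environment() reads both lowercase and uppercase variables,
    # preferring lowercase; proxy_bypass_environment() then applies the
    # no_proxy/NO_PROXY rules to the given host.
    proxies = getproxies_environment()
    return proxy_bypass_environment(host, proxies)
```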
Related to #9306 which also uses `ProxyAgent`
Signed-off-by: Timothy Leung tim95@hotmail.co.uk