Mirror of https://github.com/matrix-org/pantalaimon.git, synced 2025-04-06 21:23:38 -04:00

Compare commits

74 Commits
Commits in this comparison (SHA1s; author and date columns did not survive the export):

257ef6a2e5, 0f52d303d4, c3ee162802, 26d9a55ce8, 21fb28d090, 42cdcc2519, 29d18653dc, 1af0a04282,
e9fb8a4a57, 46a762d93a, 93a1fcb36f, a2a2d704ee, a82652f6ad, 6d88b8a215, b8ca9b478d, bfb3b06153,
e2abab1ecc, 5426d5bf9d, 33909aa251, b66ed95319, 5caaaf5651, f459d585ca, 369f73f3fb, 76dc74d250,
adef63443e, 634ac7ed68, 8b2a1173fd, 3968c69aa8, 6638393042, 807deb94ee, 313a5d528c, 127373fdcc,
b5a419e488, 64a3fb0a48, 4254a5d9d5, 6b7b87bb6a, 2883df45c0, 492e4bb358, 8cdea3e637, 109ceed0bb,
eb4f3b54b4, 85c7b685e5, 82fcbf8e85, 86060a2f75, 3dd8051707, e62cfe068a, c89e87c22a, e5fb0b7f17,
cd36ca68d5, ed7aa55ef0, 054a3bcb0b, e3be7bee3b, c6b021ed11, a9f6ad2c7e, f875499d6b, 340dbf2eb6,
c36aca183b, 3dcaad8a9f, 95bf217657, c455b37b67, 48fd703772, a7286dfc9f, 2b359c22d2, 90cdc55451,
eabd5f5b51, 4cdf1be376, 5e81131ecf, 4cd28da781, 7b4cc0527d, 87169df413, 73f68c76fb, db9cf77891,
edab476a90, 376ca72014
.github/workflows/ci.yml (new file, vendored, 45 lines)

@@ -0,0 +1,45 @@
name: Build Status

on: [push, pull_request]

jobs:
  build:

    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.8', '3.9', '3.10']

    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install Tox and any other packages
        run: |
          wget https://gitlab.matrix.org/matrix-org/olm/-/archive/master/olm-master.tar.bz2
          tar -xvf olm-master.tar.bz2
          pushd olm-master && make && sudo make PREFIX="/usr" install && popd
          rm -r olm-master
          pip install tox
      - name: Run Tox
        run: tox -e py

  coverage:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2
      - name: Setup Python
        uses: actions/setup-python@v2
        with:
          python-version: "3.10"
      - name: Install Tox and any other packages
        run: |
          wget https://gitlab.matrix.org/matrix-org/olm/-/archive/master/olm-master.tar.bz2
          tar -xvf olm-master.tar.bz2
          pushd olm-master && make && sudo make PREFIX="/usr" install && popd
          rm -r olm-master
          pip install tox
      - name: Run Tox
        run: tox -e coverage
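The same test environment can be reproduced outside of CI. A minimal sketch, assuming a Linux machine with a C compiler, `make`, `sudo`, and Python/pip available; the commands simply mirror the workflow's `run` steps above:

```bash
# Build and install libolm from source, exactly as the workflow does.
wget https://gitlab.matrix.org/matrix-org/olm/-/archive/master/olm-master.tar.bz2
tar -xvf olm-master.tar.bz2
pushd olm-master && make && sudo make PREFIX="/usr" install && popd
rm -r olm-master

# Install tox and run the same environments the two CI jobs use.
pip install tox
tox -e py          # unit tests
tox -e coverage    # coverage run, as in the second job
```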
CHANGELOG.md (92 lines changed)

@@ -4,12 +4,103 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## 0.10.5 2022-09-28

### Added

- [[#137]] Proxy the v3 endpoints as well

### Fixed

- [[#130]] Make sure the token variable is declared

[#137]: https://github.com/matrix-org/pantalaimon/pull/137
[#130]: https://github.com/matrix-org/pantalaimon/pull/130

## 0.10.4 2022-02-04

### Fixed

- [[#122]] Fix the GLib import for panctl on some distributions
- [[#120]] Don't use strip to filter Bearer from the auth header
- [[#118]] Don't use the raw path if we need to sanitize filters, fixing room
  history fetching for Fractal

[#122]: https://github.com/matrix-org/pantalaimon/pull/122
[#120]: https://github.com/matrix-org/pantalaimon/pull/120
[#118]: https://github.com/matrix-org/pantalaimon/pull/118

## 0.10.3 2021-09-02

### Fixed

- [[#105]] Use the raw_path when forwarding requests, avoiding URL
  decoding/encoding issues.

[#105]: https://github.com/matrix-org/pantalaimon/pull/105

## 0.10.2 2021-07-14

### Fixed

- [[#103]] Prevent E2EE downgrade on failed syncs

[#103]: https://github.com/matrix-org/pantalaimon/pull/103

## 0.10.1 2021-07-06

### Fixed

- [[#100]] Don't require the rooms dicts in the sync response
- [[#99]] Thumbnails not generating for media uploaded in unencrypted rooms

[#100]: https://github.com/matrix-org/pantalaimon/pull/100
[#99]: https://github.com/matrix-org/pantalaimon/pull/99

## 0.10.0 2021-05-14

### Added

- [[#98]] Add the ability to remove old room keys
- [[#95]] Encrypt thumbnails uploaded by a client

### Fixed

- [[#96]] Split out the media cache loading logic to avoid returning the
  whole LRU cache when it shouldn't

[#98]: https://github.com/matrix-org/pantalaimon/pull/98
[#96]: https://github.com/matrix-org/pantalaimon/pull/96
[#95]: https://github.com/matrix-org/pantalaimon/pull/95

## 0.9.3 2021-05-14

### Changed

- [[73f68c7]] Bump the allowed nio version

[73f68c7]: https://github.com/matrix-org/pantalaimon/commit/73f68c76fb05037bd7fe71688ce39eb1f526a385

## 0.9.2 2021-03-10

### Changed

- [[#89]] Bump the allowed nio version

[#89]: https://github.com/matrix-org/pantalaimon/pull/89

## 0.9.1 2021-01-19

### Changed

- [[3baae08]] Bump the allowed nio version

[3baae08]: https://github.com/matrix-org/pantalaimon/commit/3baae08ac36e258632e224b655e177a765a939f3

## 0.9.0 2021-01-19

### Fixed

@@ -21,7 +112,6 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

- [[#79]] Support media uploads, thanks to @aspacca

[3baae08]: https://github.com/matrix-org/pantalaimon/commit/3baae08ac36e258632e224b655e177a765a939f3
[59051c5]: https://github.com/matrix-org/pantalaimon/commit/59051c530a343a6887ea0f9ccddd6f6964f6d923
[#79]: https://github.com/matrix-org/pantalaimon/pull/79
README.md (56 lines changed)

@@ -25,11 +25,19 @@ Installing pantalaimon works like usually with python packages:

    python setup.py install

or you can use `pip` and install it with:
```
pip install .[ui]
```

It is recommended that you create a virtual environment first or install dependencies
via your package manager. They are usually found with `python-<package-name>`.

Pantalaimon can also be found on pypi:

    pip install pantalaimon

-Pantalaimon contains a dbus based UI that can be used to controll the daemon.
+Pantalaimon contains a dbus based UI that can be used to control the daemon.
 The dbus based UI is completely optional and needs to be installed with the
 daemon:

@@ -77,6 +85,10 @@ docker build -t pantalaimon .
# volume below is for where Pantalaimon should dump some data.
docker run -it --rm -v /path/to/pantalaimon/dir:/data -p 8008:8008 pantalaimon
```
The Docker image in the above example can alternatively be built straight from any branch or tag without the need to clone the repo, just by using this syntax:
```bash
docker build -t pantalaimon github.com/matrix-org/pantalaimon#master
```

An example `pantalaimon.conf` for Docker is:
```conf

@@ -96,7 +108,7 @@ IgnoreVerification = True
 Usage
 =====

-While pantalaimon is a daemon, it is meant to be run as your own user. It won't
+While pantalaimon is a daemon, it is meant to be run as the same user as the app it is proxying for. It won't
 verify devices for you automatically, unless configured to do so, and requires
 user interaction to verify, ignore or blacklist devices. A more complete
 description of Pantalaimon can be found in the [man page](docs/man/pantalaimon.8.md).

@@ -107,7 +119,7 @@ specifies one or more homeservers for pantalaimon to connect to.
A minimal pantalaimon configuration looks like this:
```dosini
[local-matrix]
-Homeserver = https://localhost:8448
+Homeserver = https://localhost:443
ListenAddress = localhost
ListenPort = 8009
```
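Once the daemon is running with a configuration like the one above, a client simply speaks the normal Matrix client-server HTTP API to the listen address instead of the homeserver. A minimal sketch of a password login through the proxy with curl, assuming the daemon listens on `localhost:8009`; the username and password are placeholders, and the request body mirrors the login payload used in this repository's tests:

```bash
# Log in through pantalaimon instead of the homeserver directly.
# "example" / "wordpass" are placeholders; use your real Matrix credentials.
curl -X POST http://localhost:8009/_matrix/client/r0/login \
  -H 'Content-Type: application/json' \
  -d '{"type": "m.login.password", "user": "example", "password": "wordpass"}'
```

On a successful login the daemon logs that it is "starting a background sync client" for the user; the client keeps using the returned access token against the proxy from then on.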
@@ -136,3 +148,41 @@ To control the daemon an interactive utility is provided in the form of
`panctl` can be used to verify, blacklist or ignore devices, import or export
session keys, or to introspect devices of users that we share encrypted rooms
with.

### Setup
This is all coming from an excellent comment that you can find [here](https://github.com/matrix-org/pantalaimon/issues/154#issuecomment-1951591191).

1) Ensure you have an OS keyring installed. In my case I installed `gnome-keyring`. You may also want a GUI like `seahorse` to inspect the keyring. (pantalaimon will work without a keyring, but your client will have to log in with the password every time `pantalaimon` is restarted, instead of being able to reuse the access token from the previous successful login.)

2) In case you have prior attempts, clean the slate by deleting the `~/.local/share/pantalaimon` directory.

3) Start `pantalaimon`.

4) Connect a client to the `ListenAddress:ListenPort` you specified in `pantalaimon.conf`, e.g. to `127.0.0.1:8009`, using the same username and password you would've used to log in to your homeserver directly.

5) The login should succeed, but at this point all encrypted messages will fail to decrypt. This is fine.

6) Start another client that you were already using for your encrypted chats previously. In my case this was `app.element.io`, so the rest of the steps here assume that.

7) Run `panctl`. At the prompt, run `start-verification <user ID> <user ID> <Element's device ID>`. `<user ID>` here is the full user ID like `@arnavion:arnavion.dev`. If you only have the one Element session, `panctl` will show you the device ID as an autocomplete hint so you don't have to look it up. If you do need to look it up, go to Element -> profile icon -> All Settings -> Sessions, expand the "Current session" item, and the "Session ID" is the device ID.

8) In Element you will see a popup "Incoming Verification Request". Click "Continue". It will change to a popup containing some emojis, and `panctl` will print the same emojis. Click the "They match" button. It will now change to a popup like "Waiting for other client to confirm..."

9) In `panctl`, run `confirm-verification <user ID> <user ID> <Element's device ID>`, i.e. the same command as before but with `confirm-verification` instead of `start-verification`.

10) At this point, if you look at all your sessions in Element (profile icon -> All Settings -> Sessions), you should see "pantalaimon" in the "Other sessions" list as a "Verified" session.

11) Export the E2E room keys that Element was using via profile icon -> Security & Privacy -> Export E2E room keys. Pick any password and then save the file to some path.

12) Back in `panctl`, run `import-keys <user ID> <path of file> <password you used to encrypt the file>`. After a few seconds, in the output of `pantalaimon`, you should see a log like `INFO: pantalaimon: Successfully imported keys for <user ID> from <path of file>`.

13) Close and restart the client you had used in step 5, i.e. the one you want to connect to `pantalaimon`. Now, finally, you should be able to see the encrypted chats be decrypted.

14) Delete the E2E room keys backup file from step 12. You don't need it any more.

15) If in step 11 you had other unverified sessions from pantalaimon from your prior attempts, you can sign out of them too.

You will probably have to repeat steps 11-14 any time you start a new encrypted chat in Element. A sketch of the corresponding `panctl` session is shown below.
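For reference, the `panctl` side of the flow above condenses to three commands. A sketch with placeholder values: `@alice:example.org`, the device ID `ABCDEFGHIJ`, the key file path, and the passphrase are all hypothetical and must be replaced with your own:

```bash
# Inside the panctl prompt:
start-verification @alice:example.org @alice:example.org ABCDEFGHIJ
# ...click "They match" in Element, then:
confirm-verification @alice:example.org @alice:example.org ABCDEFGHIJ
# ...after exporting the E2E room keys from Element:
import-keys @alice:example.org /home/alice/element-keys.txt hunter2
```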
@@ -12,3 +12,4 @@ Proxy = http://localhost:8080
 SSL = False
 IgnoreVerification = False
 UseKeyring = True
+DropOldKeys = False
@@ -51,7 +51,7 @@ The message will be sent away after all devices are marked as ignored.
 In contrast to the
 .Cm send-anyways
 command this command cancels the sending of a message to an encrypted room with
-unverified devices and gives the user the oportunity to verify or blacklist
+unverified devices and gives the user the opportunity to verify or blacklist
 devices as they see fit.
 .It Cm import-keys Ar pan-user Ar file Ar passphrase
 Import end-to-end encryption keys from the given file for the given pan-user.
@@ -74,7 +74,7 @@ are as follows:
 > In contrast to the
 > **send-anyways**
 > command this command cancels the sending of a message to an encrypted room with
-> unverified devices and gives the user the oportunity to verify or blacklist
+> unverified devices and gives the user the opportunity to verify or blacklist
 > devices as they see fit.

 **import-keys** *pan-user* *file* *passphrase*
@@ -51,6 +51,11 @@ This option configures if a proxy instance should use the OS keyring to store
 its own access tokens. The access tokens are required for the daemon to resume
 operation. If this is set to "No", access tokens are stored in the pantalaimon
 database in plaintext. Defaults to "Yes".
+.It Cm DropOldKeys
+This option configures if a proxy instance should only keep the latest version
+of a room key from a certain user around. This effectively means that only newly
+incoming messages will be decryptable, the proxy will be unable to decrypt the
+room history. Defaults to "No".
 .It Cm SearchRequests
 This option configures if the proxy should make additional HTTP requests to the
 server when clients use the search API endpoint. Some data that is required to

@@ -81,7 +86,7 @@ The amount of time to wait between room message history requests to the
 Homeserver in ms. Defaults to 3000.
 .El
 .Pp
-Aditional to the homeserver section a special section with the name
+Additional to the homeserver section a special section with the name
 .Cm Default
 can be used to configure the following values for all homeservers:
 .Cm ListenAddress ,
@@ -62,7 +62,14 @@ The following keys are optional in the proxy instance sections:
 > operation. If this is set to "No", access tokens are stored in the pantalaimon
 > database in plaintext. Defaults to "Yes".

-Aditional to the homeserver section a special section with the name
+**DropOldKeys**
+
+> This option configures if a proxy instance should only keep the latest version
+> of a room key from a certain user around. This effectively means that only newly
+> incoming messages will be decryptable, the proxy will be unable to decrypt the
+> room history. Defaults to "No".
+
+Additional to the homeserver section a special section with the name
 **Default**
 can be used to configure the following values for all homeservers:
 **ListenAddress**,

@@ -150,4 +157,4 @@ pantalaimon(8)
 was written by
 Damir Jelić <[poljar@termina.org.uk](mailto:poljar@termina.org.uk)>.

-Linux 5.1.3-arch2-1-ARCH - May 8, 2019
+Linux 5.11.16-arch1-1 - May 8, 2019
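Putting the options documented in these man-page excerpts together, a configuration using them might look like the following sketch. The values are illustrative only; `Default` applies to every homeserver section, and `UseKeyring`/`DropOldKeys` shown here are simply the documented defaults spelled out:

```dosini
[Default]
ListenAddress = localhost
ListenPort = 8009

[local-matrix]
Homeserver = https://localhost:443
UseKeyring = True
DropOldKeys = False
```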
@@ -24,7 +24,7 @@ behalf of the client.
 is supposed to run as your own user and listen to connections on a
 non-privileged port. A client needs to log in using the standard Matrix HTTP
 calls to register itself to the daemon, such a registered user is called a pan
-user and will have it's own sync loop to keep up with the server. Multiple matrix
+user and will have its own sync loop to keep up with the server. Multiple matrix
 clients can connect and use the same pan user.

 If user interaction is required
@ -16,7 +16,6 @@ import asyncio
|
||||
import os
|
||||
from collections import defaultdict
|
||||
from pprint import pformat
|
||||
from typing import Any, Dict, Optional
|
||||
from urllib.parse import urlparse
|
||||
|
||||
from aiohttp.client_exceptions import ClientConnectionError
|
||||
@ -135,7 +134,7 @@ class InvalidLimit(Exception):
|
||||
class SqliteQStore(SqliteStore):
|
||||
def _create_database(self):
|
||||
return SqliteQueueDatabase(
|
||||
self.database_path, pragmas=(("foregign_keys", 1), ("secure_delete", 1))
|
||||
self.database_path, pragmas=(("foreign_keys", 1), ("secure_delete", 1))
|
||||
)
|
||||
|
||||
def close(self):
|
||||
@ -410,6 +409,10 @@ class PanClient(AsyncClient):
|
||||
except (asyncio.CancelledError, KeyboardInterrupt):
|
||||
return
|
||||
|
||||
@property
|
||||
def has_been_synced(self) -> bool:
|
||||
return self.last_sync_token is not None
|
||||
|
||||
async def sync_tasks(self, response):
|
||||
if self.index:
|
||||
await self.index.commit_events()
|
||||
@ -540,7 +543,6 @@ class PanClient(AsyncClient):
|
||||
timeout = 30000
|
||||
sync_filter = {"room": {"state": {"lazy_load_members": True}}}
|
||||
next_batch = self.pan_store.load_token(self.server_name, self.user_id)
|
||||
self.last_sync_token = next_batch
|
||||
|
||||
# We don't store any room state so initial sync needs to be with the
|
||||
# full_state parameter. Subsequent ones are normal.
|
||||
@ -551,6 +553,7 @@ class PanClient(AsyncClient):
|
||||
full_state=True,
|
||||
since=next_batch,
|
||||
loop_sleep_time=loop_sleep_time,
|
||||
set_presence="offline",
|
||||
)
|
||||
)
|
||||
self.task = task
|
||||
@ -705,7 +708,6 @@ class PanClient(AsyncClient):
|
||||
for share in self.get_active_key_requests(
|
||||
message.user_id, message.device_id
|
||||
):
|
||||
|
||||
continued = True
|
||||
|
||||
if not self.continue_key_share(share):
|
||||
@ -732,7 +734,7 @@ class PanClient(AsyncClient):
|
||||
pass
|
||||
|
||||
response = (
|
||||
f"Succesfully continued the key requests from "
|
||||
f"Successfully continued the key requests from "
|
||||
f"{message.user_id} via {message.device_id}"
|
||||
)
|
||||
ret = "m.ok"
|
||||
@ -757,7 +759,7 @@ class PanClient(AsyncClient):
|
||||
|
||||
if cancelled:
|
||||
response = (
|
||||
f"Succesfully cancelled key requests from "
|
||||
f"Successfully cancelled key requests from "
|
||||
f"{message.user_id} via {message.device_id}"
|
||||
)
|
||||
ret = "m.ok"
|
||||
@ -807,8 +809,9 @@ class PanClient(AsyncClient):
|
||||
|
||||
if not isinstance(event, MegolmEvent):
|
||||
logger.warn(
|
||||
"Encrypted event is not a megolm event:"
|
||||
"\n{}".format(pformat(event_dict))
|
||||
"Encrypted event is not a megolm event:" "\n{}".format(
|
||||
pformat(event_dict)
|
||||
)
|
||||
)
|
||||
return False
|
||||
|
||||
@ -832,9 +835,9 @@ class PanClient(AsyncClient):
|
||||
decrypted_event.source["content"]["url"] = decrypted_event.url
|
||||
|
||||
if decrypted_event.thumbnail_url:
|
||||
decrypted_event.source["content"]["info"][
|
||||
"thumbnail_url"
|
||||
] = decrypted_event.thumbnail_url
|
||||
decrypted_event.source["content"]["info"]["thumbnail_url"] = (
|
||||
decrypted_event.thumbnail_url
|
||||
)
|
||||
|
||||
event_dict.update(decrypted_event.source)
|
||||
event_dict["decrypted"] = True
|
||||
@ -905,7 +908,7 @@ class PanClient(AsyncClient):
|
||||
|
||||
self.handle_to_device_from_sync_body(body)
|
||||
|
||||
for room_id, room_dict in body["rooms"]["join"].items():
|
||||
for room_id, room_dict in body.get("rooms", {}).get("join", {}).items():
|
||||
try:
|
||||
if not self.rooms[room_id].encrypted:
|
||||
logger.info(
|
||||
@ -920,7 +923,7 @@ class PanClient(AsyncClient):
|
||||
# pan sync stream did. Let's assume that the room is encrypted.
|
||||
pass
|
||||
|
||||
for event in room_dict["timeline"]["events"]:
|
||||
for event in room_dict.get("timeline", {}).get("events", []):
|
||||
if "type" not in event:
|
||||
continue
|
||||
|
||||
|
@ -31,7 +31,7 @@ class PanConfigParser(configparser.ConfigParser):
|
||||
"IgnoreVerification": "False",
|
||||
"ListenAddress": "localhost",
|
||||
"ListenPort": "8009",
|
||||
"LogLevel": "warnig",
|
||||
"LogLevel": "warning",
|
||||
"Notifications": "on",
|
||||
"UseKeyring": "yes",
|
||||
"SearchRequests": "off",
|
||||
@ -39,6 +39,7 @@ class PanConfigParser(configparser.ConfigParser):
|
||||
"IndexingBatchSize": "100",
|
||||
"HistoryFetchDelay": "3000",
|
||||
"DebugEncryption": "False",
|
||||
"DropOldKeys": "False",
|
||||
},
|
||||
converters={
|
||||
"address": parse_address,
|
||||
@ -112,7 +113,7 @@ class ServerConfig:
|
||||
E2E encrypted messages.
|
||||
keyring (bool): Enable or disable the OS keyring for the storage of
|
||||
access tokens.
|
||||
search_requests (bool): Enable or disable aditional Homeserver requests
|
||||
search_requests (bool): Enable or disable additional Homeserver requests
|
||||
for the search API endpoint.
|
||||
index_encrypted_only (bool): Enable or disable message indexing fro
|
||||
non-encrypted rooms.
|
||||
@ -121,6 +122,8 @@ class ServerConfig:
|
||||
the room history.
|
||||
history_fetch_delay (int): The delay between room history fetching
|
||||
requests in seconds.
|
||||
drop_old_keys (bool): Should Pantalaimon only keep the most recent
|
||||
decryption key around.
|
||||
"""
|
||||
|
||||
name = attr.ib(type=str)
|
||||
@ -137,6 +140,7 @@ class ServerConfig:
|
||||
index_encrypted_only = attr.ib(type=bool, default=True)
|
||||
indexing_batch_size = attr.ib(type=int, default=100)
|
||||
history_fetch_delay = attr.ib(type=int, default=3)
|
||||
drop_old_keys = attr.ib(type=bool, default=False)
|
||||
|
||||
|
||||
@attr.s
|
||||
@ -182,7 +186,6 @@ class PanConfig:
|
||||
|
||||
try:
|
||||
for section_name, section in config.items():
|
||||
|
||||
if section_name == "Default":
|
||||
continue
|
||||
|
||||
@ -229,6 +232,7 @@ class PanConfig:
|
||||
f"already defined before."
|
||||
)
|
||||
listen_set.add(listen_tuple)
|
||||
drop_old_keys = section.getboolean("DropOldKeys")
|
||||
|
||||
server_conf = ServerConfig(
|
||||
section_name,
|
||||
@ -243,6 +247,7 @@ class PanConfig:
|
||||
index_encrypted_only,
|
||||
indexing_batch_size,
|
||||
history_fetch_delay / 1000,
|
||||
drop_old_keys,
|
||||
)
|
||||
|
||||
self.servers[section_name] = server_conf
|
||||
|
@ -85,6 +85,7 @@ CORS_HEADERS = {
|
||||
|
||||
class NotDecryptedAvailableError(Exception):
|
||||
"""Exception that signals that no decrypted upload is available"""
|
||||
|
||||
pass
|
||||
|
||||
|
||||
@ -120,10 +121,12 @@ class ProxyDaemon:
|
||||
self.hostname = self.homeserver.hostname
|
||||
self.store = PanStore(self.data_dir)
|
||||
accounts = self.store.load_users(self.name)
|
||||
self.media_info = self.store.load_media(self.name)
|
||||
self.media_info = self.store.load_media_cache(self.name)
|
||||
self.upload_info = self.store.load_upload(self.name)
|
||||
|
||||
for user_id, device_id in accounts:
|
||||
token = None
|
||||
|
||||
if self.conf.keyring:
|
||||
try:
|
||||
token = keyring.get_password(
|
||||
@ -224,7 +227,8 @@ class ProxyDaemon:
|
||||
|
||||
if ret:
|
||||
msg = (
|
||||
f"Device {device.id} of user " f"{device.user_id} succesfully verified."
|
||||
f"Device {device.id} of user "
|
||||
f"{device.user_id} successfully verified."
|
||||
)
|
||||
await client.send_update_device(device)
|
||||
else:
|
||||
@ -239,7 +243,7 @@ class ProxyDaemon:
|
||||
if ret:
|
||||
msg = (
|
||||
f"Device {device.id} of user "
|
||||
f"{device.user_id} succesfully unverified."
|
||||
f"{device.user_id} successfully unverified."
|
||||
)
|
||||
await client.send_update_device(device)
|
||||
else:
|
||||
@ -254,7 +258,7 @@ class ProxyDaemon:
|
||||
if ret:
|
||||
msg = (
|
||||
f"Device {device.id} of user "
|
||||
f"{device.user_id} succesfully blacklisted."
|
||||
f"{device.user_id} successfully blacklisted."
|
||||
)
|
||||
await client.send_update_device(device)
|
||||
else:
|
||||
@ -271,7 +275,7 @@ class ProxyDaemon:
|
||||
if ret:
|
||||
msg = (
|
||||
f"Device {device.id} of user "
|
||||
f"{device.user_id} succesfully unblacklisted."
|
||||
f"{device.user_id} successfully unblacklisted."
|
||||
)
|
||||
await client.send_update_device(device)
|
||||
else:
|
||||
@ -306,7 +310,6 @@ class ProxyDaemon:
|
||||
DeviceUnblacklistMessage,
|
||||
),
|
||||
):
|
||||
|
||||
device = client.device_store[message.user_id].get(message.device_id, None)
|
||||
|
||||
if not device:
|
||||
@ -355,7 +358,7 @@ class ProxyDaemon:
|
||||
|
||||
else:
|
||||
info_msg = (
|
||||
f"Succesfully exported keys for {client.user_id} " f"to {path}"
|
||||
f"Successfully exported keys for {client.user_id} " f"to {path}"
|
||||
)
|
||||
logger.info(info_msg)
|
||||
await self.send_response(
|
||||
@ -378,7 +381,7 @@ class ProxyDaemon:
|
||||
)
|
||||
else:
|
||||
info_msg = (
|
||||
f"Succesfully imported keys for {client.user_id} " f"from {path}"
|
||||
f"Successfully imported keys for {client.user_id} " f"from {path}"
|
||||
)
|
||||
logger.info(info_msg)
|
||||
await self.send_response(
|
||||
@ -417,7 +420,9 @@ class ProxyDaemon:
|
||||
access_token = request.query.get("access_token", "")
|
||||
|
||||
if not access_token:
|
||||
access_token = request.headers.get("Authorization", "").strip("Bearer ")
|
||||
access_token = request.headers.get("Authorization", "").replace(
|
||||
"Bearer ", "", 1
|
||||
)
|
||||
|
||||
return access_token
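The change to `get_access_token()` shown here is the fix referenced as [[#120]] in the changelog ("Don't use strip to filter Bearer from the auth header"). `str.strip` treats its argument as a set of characters to trim from both ends, so it can eat leading or trailing characters of the token itself, whereas `str.replace(..., 1)` removes only the literal prefix once. A small illustration with a made-up token value:

```python
header = "Bearer rare_token_area"   # hypothetical Authorization header value

# Old behaviour: strip("Bearer ") trims any of the characters B, e, a, r, space
# from both ends of the string, mangling the token.
print(header.strip("Bearer "))          # -> "_token_"

# New behaviour: remove only the literal "Bearer " prefix, a single time.
print(header.replace("Bearer ", "", 1)) # -> "rare_token_area"
```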
|
||||
|
||||
@ -459,6 +464,7 @@ class ProxyDaemon:
|
||||
data=None, # type: bytes
|
||||
session=None, # type: aiohttp.ClientSession
|
||||
token=None, # type: str
|
||||
use_raw_path=True, # type: bool
|
||||
):
|
||||
# type: (...) -> aiohttp.ClientResponse
|
||||
"""Forward the given request to our configured homeserver.
|
||||
@ -473,6 +479,10 @@ class ProxyDaemon:
|
||||
should be used to forward the request.
|
||||
token (str, optional): The access token that should be used for the
|
||||
request.
|
||||
use_raw_path (str, optional): Should the raw path be used from the
|
||||
request or should we use the path and re-encode it. Some may need
|
||||
their filters to be sanitized, this requires the parsed version of
|
||||
the path, otherwise we leave the path as is.
|
||||
"""
|
||||
if not session:
|
||||
if not self.default_session:
|
||||
@ -481,9 +491,7 @@ class ProxyDaemon:
|
||||
|
||||
assert session
|
||||
|
||||
path = urllib.parse.quote(
|
||||
request.path
|
||||
) # re-encode path stuff like room aliases
|
||||
path = request.raw_path if use_raw_path else urllib.parse.quote(request.path)
|
||||
method = request.method
|
||||
|
||||
headers = CIMultiDict(request.headers)
|
||||
@ -608,7 +616,9 @@ class ProxyDaemon:
|
||||
await pan_client.close()
|
||||
return
|
||||
|
||||
logger.info(f"Succesfully started new background sync client for " f"{user_id}")
|
||||
logger.info(
|
||||
f"Successfully started new background sync client for " f"{user_id}"
|
||||
)
|
||||
|
||||
await self.send_ui_message(
|
||||
UpdateUsersMessage(self.name, user_id, pan_client.device_id)
|
||||
@ -674,7 +684,7 @@ class ProxyDaemon:
|
||||
|
||||
if user_id and access_token:
|
||||
logger.info(
|
||||
f"User: {user} succesfully logged in, starting "
|
||||
f"User: {user} successfully logged in, starting "
|
||||
f"a background sync client."
|
||||
)
|
||||
await self.start_pan_client(
|
||||
@ -725,7 +735,7 @@ class ProxyDaemon:
|
||||
return decryption_method(body, ignore_failures=False)
|
||||
except EncryptionError:
|
||||
logger.info("Error decrypting sync, waiting for next pan " "sync")
|
||||
await client.synced.wait(),
|
||||
(await client.synced.wait(),)
|
||||
logger.info("Pan synced, retrying decryption.")
|
||||
|
||||
try:
|
||||
@ -762,7 +772,7 @@ class ProxyDaemon:
|
||||
|
||||
try:
|
||||
response = await self.forward_request(
|
||||
request, params=query, token=client.access_token
|
||||
request, params=query, token=client.access_token, use_raw_path=False
|
||||
)
|
||||
except ClientConnectionError as e:
|
||||
return web.Response(status=500, text=str(e))
|
||||
@ -785,6 +795,27 @@ class ProxyDaemon:
|
||||
body=await response.read(),
|
||||
)
|
||||
|
||||
async def createRoom(self, request):
|
||||
try:
|
||||
content = await request.json()
|
||||
except (JSONDecodeError, ContentTypeError):
|
||||
return self._not_json
|
||||
|
||||
invite = content.get("invite", ())
|
||||
if invite:
|
||||
access_token = self.get_access_token(request)
|
||||
|
||||
if not access_token:
|
||||
return self._missing_token
|
||||
|
||||
client = await self._find_client(access_token)
|
||||
if not client:
|
||||
return self._unknown_token
|
||||
|
||||
client.users_for_key_query.update(invite)
|
||||
|
||||
return await self.forward_to_web(request)
|
||||
|
||||
async def messages(self, request):
|
||||
access_token = self.get_access_token(request)
|
||||
|
||||
@ -810,7 +841,9 @@ class ProxyDaemon:
|
||||
query["filter"] = request_filter
|
||||
|
||||
try:
|
||||
response = await self.forward_request(request, params=query)
|
||||
response = await self.forward_request(
|
||||
request, params=query, use_raw_path=False
|
||||
)
|
||||
except ClientConnectionError as e:
|
||||
return web.Response(status=500, text=str(e))
|
||||
|
||||
@ -834,9 +867,7 @@ class ProxyDaemon:
|
||||
body=await response.read(),
|
||||
)
|
||||
|
||||
def _get_upload_and_media_info(self, content_key, content):
|
||||
content_uri = content[content_key]
|
||||
|
||||
def _get_upload_and_media_info(self, content_uri: str):
|
||||
try:
|
||||
upload_info = self.upload_info[content_uri]
|
||||
except KeyError:
|
||||
@ -846,7 +877,6 @@ class ProxyDaemon:
|
||||
|
||||
self.upload_info[content_uri] = upload_info
|
||||
|
||||
content_uri = content[content_key]
|
||||
mxc = urlparse(content_uri)
|
||||
mxc_server = mxc.netloc.strip("/")
|
||||
mxc_path = mxc.path.strip("/")
|
||||
@ -859,13 +889,14 @@ class ProxyDaemon:
|
||||
|
||||
return upload_info, media_info
|
||||
|
||||
async def _map_decrypted_uri(self, content_key, content, request, client):
|
||||
upload_info, media_info = self._get_upload_and_media_info(content_key, content)
|
||||
async def _decrypt_uri(self, content_uri, client):
|
||||
upload_info, media_info = self._get_upload_and_media_info(content_uri)
|
||||
if not upload_info or not media_info:
|
||||
raise NotDecryptedAvailableError
|
||||
|
||||
response, decrypted_file = await self._load_decrypted_file(media_info.mxc_server, media_info.mxc_path,
|
||||
upload_info.filename)
|
||||
response, decrypted_file = await self._load_decrypted_file(
|
||||
media_info.mxc_server, media_info.mxc_path, upload_info.filename
|
||||
)
|
||||
|
||||
if response is None and decrypted_file is None:
|
||||
raise NotDecryptedAvailableError
|
||||
@ -875,7 +906,7 @@ class ProxyDaemon:
|
||||
|
||||
decrypted_upload, _ = await client.upload(
|
||||
data_provider=BufferedReader(BytesIO(decrypted_file)),
|
||||
content_type=response.content_type,
|
||||
content_type=upload_info.mimetype,
|
||||
filename=upload_info.filename,
|
||||
encrypt=False,
|
||||
filesize=len(decrypted_file),
|
||||
@ -884,9 +915,7 @@ class ProxyDaemon:
|
||||
if not isinstance(decrypted_upload, UploadResponse):
|
||||
raise NotDecryptedAvailableError
|
||||
|
||||
content[content_key] = decrypted_upload.content_uri
|
||||
|
||||
return content
|
||||
return decrypted_upload.content_uri
|
||||
|
||||
async def send_message(self, request):
|
||||
access_token = self.get_access_token(request)
|
||||
@ -900,12 +929,36 @@ class ProxyDaemon:
|
||||
|
||||
room_id = request.match_info["room_id"]
|
||||
|
||||
# The room is not in the joined rooms list, just forward it.
|
||||
try:
|
||||
room = client.rooms[room_id]
|
||||
encrypt = room.encrypted
|
||||
except KeyError:
|
||||
return await self.forward_to_web(request, token=client.access_token)
|
||||
# The room is not in the joined rooms list, either the pan client
|
||||
# didn't manage to sync the state or we're not joined, in either
|
||||
# case send an error response.
|
||||
if client.has_been_synced:
|
||||
return web.json_response(
|
||||
{
|
||||
"errcode": "M_FORBIDDEN",
|
||||
"error": "You do not have permission to send the event.",
|
||||
},
|
||||
headers=CORS_HEADERS,
|
||||
status=403,
|
||||
)
|
||||
else:
|
||||
logger.error(
|
||||
"The internal Pantalaimon client did not manage "
|
||||
"to sync with the server."
|
||||
)
|
||||
return web.json_response(
|
||||
{
|
||||
"errcode": "M_UNKNOWN",
|
||||
"error": "The pantalaimon client did not manage to sync with "
|
||||
"the server",
|
||||
},
|
||||
headers=CORS_HEADERS,
|
||||
status=500,
|
||||
)
|
||||
|
||||
# Don't encrypt reactions for now - they are weird and clients
|
||||
# need to support them like this.
|
||||
@ -923,10 +976,23 @@ class ProxyDaemon:
|
||||
# The room isn't encrypted just forward the message.
|
||||
if not encrypt:
|
||||
content_msgtype = content.get("msgtype")
|
||||
if content_msgtype in ["m.image", "m.video", "m.audio", "m.file"] or msgtype == "m.room.avatar":
|
||||
if (
|
||||
content_msgtype in ["m.image", "m.video", "m.audio", "m.file"]
|
||||
or msgtype == "m.room.avatar"
|
||||
):
|
||||
try:
|
||||
content = await self._map_decrypted_uri("url", content, request, client)
|
||||
return await self.forward_to_web(request, data=json.dumps(content), token=client.access_token)
|
||||
content["url"] = await self._decrypt_uri(content["url"], client)
|
||||
if (
|
||||
"info" in content
|
||||
and "thumbnail_url" in content["info"]
|
||||
and content["info"]["thumbnail_url"] is not None
|
||||
):
|
||||
content["info"]["thumbnail_url"] = await self._decrypt_uri(
|
||||
content["info"]["thumbnail_url"], client
|
||||
)
|
||||
return await self.forward_to_web(
|
||||
request, data=json.dumps(content), token=client.access_token
|
||||
)
|
||||
except ClientConnectionError as e:
|
||||
return web.Response(status=500, text=str(e))
|
||||
except (KeyError, NotDecryptedAvailableError):
|
||||
@ -939,8 +1005,13 @@ class ProxyDaemon:
|
||||
async def _send(ignore_unverified=False):
|
||||
try:
|
||||
content_msgtype = content.get("msgtype")
|
||||
if content_msgtype in ["m.image", "m.video", "m.audio", "m.file"] or msgtype == "m.room.avatar":
|
||||
upload_info, media_info = self._get_upload_and_media_info("url", content)
|
||||
if (
|
||||
content_msgtype in ["m.image", "m.video", "m.audio", "m.file"]
|
||||
or msgtype == "m.room.avatar"
|
||||
):
|
||||
upload_info, media_info = self._get_upload_and_media_info(
|
||||
content["url"]
|
||||
)
|
||||
if not upload_info or not media_info:
|
||||
response = await client.room_send(
|
||||
room_id, msgtype, content, txnid, ignore_unverified
|
||||
@ -953,10 +1024,21 @@ class ProxyDaemon:
|
||||
body=await response.transport_response.read(),
|
||||
)
|
||||
|
||||
media_content = media_info.to_content(content, upload_info.mimetype)
|
||||
media_info.to_content(content, upload_info.mimetype)
|
||||
if content["info"].get("thumbnail_url", False):
|
||||
(
|
||||
thumb_upload_info,
|
||||
thumb_media_info,
|
||||
) = self._get_upload_and_media_info(
|
||||
content["info"]["thumbnail_url"]
|
||||
)
|
||||
if thumb_upload_info and thumb_media_info:
|
||||
thumb_media_info.to_thumbnail(
|
||||
content, thumb_upload_info.mimetype
|
||||
)
|
||||
|
||||
response = await client.room_send(
|
||||
room_id, msgtype, media_content, txnid, ignore_unverified
|
||||
room_id, msgtype, content, txnid, ignore_unverified
|
||||
)
|
||||
else:
|
||||
response = await client.room_send(
|
||||
@ -974,7 +1056,7 @@ class ProxyDaemon:
|
||||
except SendRetryError as e:
|
||||
return web.Response(status=503, text=str(e))
|
||||
|
||||
# Aquire a semaphore here so we only send out one
|
||||
# Acquire a semaphore here so we only send out one
|
||||
# UnverifiedDevicesSignal
|
||||
sem = client.send_semaphores[room_id]
|
||||
|
||||
@ -1158,14 +1240,22 @@ class ProxyDaemon:
|
||||
body=await response.transport_response.read(),
|
||||
)
|
||||
|
||||
self.store.save_upload(self.name, response.content_uri, file_name, content_type)
|
||||
self.store.save_upload(
|
||||
self.name, response.content_uri, file_name, content_type
|
||||
)
|
||||
|
||||
mxc = urlparse(response.content_uri)
|
||||
mxc_server = mxc.netloc.strip("/")
|
||||
mxc_path = mxc.path.strip("/")
|
||||
|
||||
logger.info(f"Adding media info for {mxc_server}/{mxc_path} to the store")
|
||||
media_info = MediaInfo(mxc_server, mxc_path, maybe_keys["key"], maybe_keys["iv"], maybe_keys["hashes"])
|
||||
media_info = MediaInfo(
|
||||
mxc_server,
|
||||
mxc_path,
|
||||
maybe_keys["key"],
|
||||
maybe_keys["iv"],
|
||||
maybe_keys["hashes"],
|
||||
)
|
||||
self.store.save_media(self.name, media_info)
|
||||
|
||||
return web.Response(
|
||||
@ -1206,7 +1296,9 @@ class ProxyDaemon:
|
||||
client = next(iter(self.pan_clients.values()))
|
||||
|
||||
try:
|
||||
response = await client.download(server_name, media_id, file_name)
|
||||
response = await client.download(
|
||||
server_name=server_name, media_id=media_id, filename=file_name
|
||||
)
|
||||
except ClientConnectionError as e:
|
||||
raise e
|
||||
|
||||
@ -1239,8 +1331,12 @@ class ProxyDaemon:
|
||||
return self._not_json
|
||||
|
||||
try:
|
||||
content = await self._map_decrypted_uri("avatar_url", content, request, client)
|
||||
return await self.forward_to_web(request, data=json.dumps(content), token=client.access_token)
|
||||
content["avatar_url"] = await self._decrypt_uri(
|
||||
content["avatar_url"], client
|
||||
)
|
||||
return await self.forward_to_web(
|
||||
request, data=json.dumps(content), token=client.access_token
|
||||
)
|
||||
except ClientConnectionError as e:
|
||||
return web.Response(status=500, text=str(e))
|
||||
except (KeyError, NotDecryptedAvailableError):
|
||||
@ -1252,7 +1348,9 @@ class ProxyDaemon:
|
||||
file_name = request.match_info.get("file_name")
|
||||
|
||||
try:
|
||||
response, decrypted_file = await self._load_decrypted_file(server_name, media_id, file_name)
|
||||
response, decrypted_file = await self._load_decrypted_file(
|
||||
server_name, media_id, file_name
|
||||
)
|
||||
|
||||
if response is None and decrypted_file is None:
|
||||
return await self.forward_to_web(request)
|
||||
|
@ -23,7 +23,6 @@ if False:
|
||||
import json
|
||||
import os
|
||||
from functools import partial
|
||||
from typing import Any, Dict, List, Optional, Tuple
|
||||
|
||||
import attr
|
||||
import tantivy
|
||||
@ -230,7 +229,6 @@ if False:
|
||||
)
|
||||
|
||||
for message in query:
|
||||
|
||||
event = message.event
|
||||
|
||||
event_dict = {
|
||||
@ -501,6 +499,5 @@ if False:
|
||||
|
||||
return search_result
|
||||
|
||||
|
||||
else:
|
||||
INDEXING_ENABLED = False
|
||||
|
@ -15,7 +15,6 @@
|
||||
import asyncio
|
||||
import os
|
||||
import signal
|
||||
from typing import Optional
|
||||
|
||||
import click
|
||||
import janus
|
||||
@ -23,12 +22,13 @@ import keyring
|
||||
import logbook
|
||||
import nio
|
||||
from aiohttp import web
|
||||
from appdirs import user_config_dir, user_data_dir
|
||||
from platformdirs import user_config_dir, user_data_dir
|
||||
from logbook import StderrHandler
|
||||
|
||||
from pantalaimon.config import PanConfig, PanConfigError, parse_log_level
|
||||
from pantalaimon.daemon import ProxyDaemon
|
||||
from pantalaimon.log import logger
|
||||
from pantalaimon.store import KeyDroppingSqliteStore
|
||||
from pantalaimon.thread_messages import DaemonResponse
|
||||
from pantalaimon.ui import UI_ENABLED
|
||||
|
||||
@ -47,6 +47,8 @@ def create_dirs(data_dir, conf_dir):
|
||||
|
||||
async def init(data_dir, server_conf, send_queue, recv_queue):
|
||||
"""Initialize the proxy and the http server."""
|
||||
store_class = KeyDroppingSqliteStore if server_conf.drop_old_keys else None
|
||||
|
||||
proxy = ProxyDaemon(
|
||||
server_conf.name,
|
||||
server_conf.homeserver,
|
||||
@ -56,36 +58,56 @@ async def init(data_dir, server_conf, send_queue, recv_queue):
|
||||
recv_queue=recv_queue.async_q if recv_queue else None,
|
||||
proxy=server_conf.proxy.geturl() if server_conf.proxy else None,
|
||||
ssl=None if server_conf.ssl is True else False,
|
||||
client_store_class=store_class,
|
||||
)
|
||||
|
||||
# 100 MB max POST size
|
||||
app = web.Application(client_max_size=1024 ** 2 * 100)
|
||||
app = web.Application(client_max_size=1024**2 * 100)
|
||||
|
||||
app.add_routes(
|
||||
[
|
||||
web.post("/_matrix/client/r0/login", proxy.login),
|
||||
web.post("/_matrix/client/v3/login", proxy.login),
|
||||
web.get("/_matrix/client/r0/sync", proxy.sync),
|
||||
web.get("/_matrix/client/v3/sync", proxy.sync),
|
||||
web.post("/_matrix/client/r0/createRoom", proxy.createRoom),
|
||||
web.post("/_matrix/client/v3/createRoom", proxy.createRoom),
|
||||
web.get("/_matrix/client/r0/rooms/{room_id}/messages", proxy.messages),
|
||||
web.get("/_matrix/client/v3/rooms/{room_id}/messages", proxy.messages),
|
||||
web.put(
|
||||
r"/_matrix/client/r0/rooms/{room_id}/send/{event_type}/{txnid}",
|
||||
proxy.send_message,
|
||||
),
|
||||
web.put(
|
||||
r"/_matrix/client/v3/rooms/{room_id}/send/{event_type}/{txnid}",
|
||||
proxy.send_message,
|
||||
),
|
||||
web.post(
|
||||
r"/_matrix/client/r0/rooms/{room_id}/send/{event_type}",
|
||||
proxy.send_message,
|
||||
),
|
||||
web.post("/_matrix/client/r0/user/{user_id}/filter", proxy.filter),
|
||||
web.post("/_matrix/client/v3/user/{user_id}/filter", proxy.filter),
|
||||
web.post("/.well-known/matrix/client", proxy.well_known),
|
||||
web.get("/.well-known/matrix/client", proxy.well_known),
|
||||
web.post("/_matrix/client/r0/search", proxy.search),
|
||||
web.post("/_matrix/client/v3/search", proxy.search),
|
||||
web.options("/_matrix/client/r0/search", proxy.search_opts),
|
||||
web.options("/_matrix/client/v3/search", proxy.search_opts),
|
||||
web.get(
|
||||
"/_matrix/media/v1/download/{server_name}/{media_id}", proxy.download
|
||||
),
|
||||
web.get(
|
||||
"/_matrix/media/v3/download/{server_name}/{media_id}", proxy.download
|
||||
),
|
||||
web.get(
|
||||
"/_matrix/media/v1/download/{server_name}/{media_id}/{file_name}",
|
||||
proxy.download,
|
||||
),
|
||||
web.get(
|
||||
"/_matrix/media/v3/download/{server_name}/{media_id}/{file_name}",
|
||||
proxy.download,
|
||||
),
|
||||
web.get(
|
||||
"/_matrix/media/r0/download/{server_name}/{media_id}", proxy.download
|
||||
),
|
||||
@ -97,11 +119,18 @@ async def init(data_dir, server_conf, send_queue, recv_queue):
|
||||
r"/_matrix/media/r0/upload",
|
||||
proxy.upload,
|
||||
),
|
||||
web.post(
|
||||
r"/_matrix/media/v3/upload",
|
||||
proxy.upload,
|
||||
),
|
||||
web.put(
|
||||
r"/_matrix/client/r0/profile/{userId}/avatar_url",
|
||||
proxy.profile,
|
||||
),
|
||||
|
||||
web.put(
|
||||
r"/_matrix/client/v3/profile/{userId}/avatar_url",
|
||||
proxy.profile,
|
||||
),
|
||||
]
|
||||
)
|
||||
app.router.add_route("*", "/" + "{proxyPath:.*}", proxy.router)
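With the route table above, the newly added `/v3` paths are served by the same proxy handlers as their `/r0` counterparts, and anything not listed falls through to the catch-all route that simply forwards to the homeserver. A quick way to check that the proxy answers on a v3 endpoint, assuming the default `localhost:8009` listener; `<ACCESS_TOKEN>` is a placeholder for the token obtained by logging in through the proxy:

```bash
curl -H "Authorization: Bearer <ACCESS_TOKEN>" \
  "http://localhost:8009/_matrix/client/v3/sync?timeout=30000"
```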
|
||||
@ -259,7 +288,7 @@ async def daemon(context, log_level, debug_encryption, config, data_path):
|
||||
"connect to pantalaimon."
|
||||
)
|
||||
)
|
||||
@click.version_option(version="0.9.1", prog_name="pantalaimon")
|
||||
@click.version_option(version="0.10.5", prog_name="pantalaimon")
|
||||
@click.option(
|
||||
"--log-level",
|
||||
type=click.Choice(["error", "warning", "info", "debug"]),
|
||||
|
@ -20,10 +20,16 @@ import sys
|
||||
from collections import defaultdict
|
||||
from itertools import zip_longest
|
||||
from typing import List
|
||||
from shlex import split
|
||||
|
||||
import attr
|
||||
import click
|
||||
from gi.repository import GLib
|
||||
|
||||
try:
|
||||
from gi.repository import GLib
|
||||
except ModuleNotFoundError:
|
||||
from pgi.repository import GLib
|
||||
|
||||
from prompt_toolkit import __version__ as ptk_version
|
||||
from prompt_toolkit import HTML, PromptSession, print_formatted_text
|
||||
from prompt_toolkit.completion import Completer, Completion, PathCompleter
|
||||
@ -459,7 +465,7 @@ class PanCtl:
|
||||
def sas_done(self, pan_user, user_id, device_id, _):
|
||||
print(
|
||||
f"Device {device_id} of user {user_id}"
|
||||
f" succesfully verified for pan user {pan_user}."
|
||||
f" successfully verified for pan user {pan_user}."
|
||||
)
|
||||
|
||||
def show_sas_invite(self, pan_user, user_id, device_id, _):
|
||||
@ -584,7 +590,7 @@ class PanCtl:
|
||||
parser = PanctlParser(self.commands)
|
||||
|
||||
try:
|
||||
args = parser.parse_args(result.split())
|
||||
args = parser.parse_args(split(result, posix=False))
|
||||
except ParseError:
|
||||
continue
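The switch from `result.split()` to `shlex.split(result, posix=False)` above matters for commands whose arguments contain spaces, such as an `import-keys` file path or passphrase: plain `str.split` breaks a quoted argument apart, while `shlex.split` keeps it as one token (with `posix=False` the surrounding quotes are preserved in the token). A quick illustration using only the standard library:

```python
from shlex import split

line = 'import-keys @alice:example.org /tmp/keys.txt "pass phrase"'

print(line.split())
# ['import-keys', '@alice:example.org', '/tmp/keys.txt', '"pass', 'phrase"']

print(split(line, posix=False))
# ['import-keys', '@alice:example.org', '/tmp/keys.txt', '"pass phrase"']
```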
|
||||
|
||||
@ -690,9 +696,9 @@ class PanCtl:
|
||||
"the pantalaimon daemon."
|
||||
)
|
||||
)
|
||||
@click.version_option(version="0.9.1", prog_name="panctl")
|
||||
@click.version_option(version="0.10.5", prog_name="panctl")
|
||||
def main():
|
||||
loop = asyncio.get_event_loop()
|
||||
loop = asyncio.new_event_loop()
|
||||
glib_loop = GLib.MainLoop()
|
||||
|
||||
try:
|
||||
|
@ -15,13 +15,15 @@
|
||||
import json
|
||||
import os
|
||||
from collections import defaultdict
|
||||
from typing import Any, Dict, List, Optional, Tuple
|
||||
from typing import Any, Dict
|
||||
|
||||
import attr
|
||||
from nio.crypto import TrustState
|
||||
from nio.crypto import TrustState, GroupSessionStore
|
||||
from nio.store import (
|
||||
Accounts,
|
||||
MegolmInboundSessions,
|
||||
DeviceKeys,
|
||||
SqliteStore,
|
||||
DeviceTrustState,
|
||||
use_database,
|
||||
use_database_atomic,
|
||||
@ -29,7 +31,6 @@ from nio.store import (
|
||||
from peewee import SQL, DoesNotExist, ForeignKeyField, Model, SqliteDatabase, TextField
|
||||
from cachetools import LRUCache
|
||||
|
||||
|
||||
MAX_LOADED_MEDIA = 10000
|
||||
MAX_LOADED_UPLOAD = 10000
|
||||
|
||||
@ -50,15 +51,23 @@ class MediaInfo:
|
||||
|
||||
def to_content(self, content: Dict, mime_type: str) -> Dict[Any, Any]:
|
||||
content["file"] = {
|
||||
"v": "v2",
|
||||
"key": self.key,
|
||||
"iv": self.iv,
|
||||
"hashes": self.hashes,
|
||||
"url": content["url"],
|
||||
"mimetype": mime_type,
|
||||
"v": "v2",
|
||||
"key": self.key,
|
||||
"iv": self.iv,
|
||||
"hashes": self.hashes,
|
||||
"url": content["url"],
|
||||
"mimetype": mime_type,
|
||||
}
|
||||
|
||||
return content
|
||||
def to_thumbnail(self, content: Dict, mime_type: str) -> Dict[Any, Any]:
|
||||
content["info"]["thumbnail_file"] = {
|
||||
"v": "v2",
|
||||
"key": self.key,
|
||||
"iv": self.iv,
|
||||
"hashes": self.hashes,
|
||||
"url": content["info"]["thumbnail_url"],
|
||||
"mimetype": mime_type,
|
||||
}
|
||||
|
||||
|
||||
@attr.s
|
||||
@ -245,32 +254,34 @@ class PanStore:
|
||||
hashes=media.hashes,
|
||||
).on_conflict_ignore().execute()
|
||||
|
||||
@use_database
|
||||
def load_media_cache(self, server):
|
||||
server, _ = Servers.get_or_create(name=server)
|
||||
media_cache = LRUCache(maxsize=MAX_LOADED_MEDIA)
|
||||
|
||||
for i, m in enumerate(server.media):
|
||||
if i > MAX_LOADED_MEDIA:
|
||||
break
|
||||
|
||||
media = MediaInfo(m.mxc_server, m.mxc_path, m.key, m.iv, m.hashes)
|
||||
media_cache[(m.mxc_server, m.mxc_path)] = media
|
||||
|
||||
return media_cache
|
||||
|
||||
@use_database
|
||||
def load_media(self, server, mxc_server=None, mxc_path=None):
|
||||
server, _ = Servers.get_or_create(name=server)
|
||||
|
||||
if not mxc_path:
|
||||
media_cache = LRUCache(maxsize=MAX_LOADED_MEDIA)
|
||||
m = PanMediaInfo.get_or_none(
|
||||
PanMediaInfo.server == server,
|
||||
PanMediaInfo.mxc_server == mxc_server,
|
||||
PanMediaInfo.mxc_path == mxc_path,
|
||||
)
|
||||
|
||||
for i, m in enumerate(server.media):
|
||||
if i > MAX_LOADED_MEDIA:
|
||||
break
|
||||
if not m:
|
||||
return None
|
||||
|
||||
media = MediaInfo(m.mxc_server, m.mxc_path, m.key, m.iv, m.hashes)
|
||||
media_cache[(m.mxc_server, m.mxc_path)] = media
|
||||
|
||||
return media_cache
|
||||
else:
|
||||
m = PanMediaInfo.get_or_none(
|
||||
PanMediaInfo.server == server,
|
||||
PanMediaInfo.mxc_server == mxc_server,
|
||||
PanMediaInfo.mxc_path == mxc_path,
|
||||
)
|
||||
|
||||
if not m:
|
||||
return None
|
||||
|
||||
return MediaInfo(m.mxc_server, m.mxc_path, m.key, m.iv, m.hashes)
|
||||
return MediaInfo(m.mxc_server, m.mxc_path, m.key, m.iv, m.hashes)
|
||||
|
||||
@use_database_atomic
|
||||
def replace_fetcher_task(self, server, pan_user, old_task, new_task):
|
||||
@ -420,7 +431,6 @@ class PanStore:
|
||||
device_store = defaultdict(dict)
|
||||
|
||||
for d in account.device_keys:
|
||||
|
||||
if d.deleted:
|
||||
continue
|
||||
|
||||
@ -443,3 +453,47 @@ class PanStore:
|
||||
store[account.user_id] = device_store
|
||||
|
||||
return store
|
||||
|
||||
|
||||
class KeyDroppingSqliteStore(SqliteStore):
|
||||
@use_database
|
||||
def save_inbound_group_session(self, session):
|
||||
"""Save the provided Megolm inbound group session to the database.
|
||||
|
||||
Args:
|
||||
session (InboundGroupSession): The session to save.
|
||||
"""
|
||||
account = self._get_account()
|
||||
assert account
|
||||
|
||||
MegolmInboundSessions.delete().where(
|
||||
MegolmInboundSessions.sender_key == session.sender_key,
|
||||
MegolmInboundSessions.account == account,
|
||||
MegolmInboundSessions.room_id == session.room_id,
|
||||
).execute()
|
||||
|
||||
super().save_inbound_group_session(session)
|
||||
|
||||
@use_database
|
||||
def load_inbound_group_sessions(self):
|
||||
store = super().load_inbound_group_sessions()
|
||||
|
||||
return KeyDroppingGroupSessionStore.from_group_session_store(store)
|
||||
|
||||
|
||||
class KeyDroppingGroupSessionStore(GroupSessionStore):
|
||||
def from_group_session_store(store):
|
||||
new_store = KeyDroppingGroupSessionStore()
|
||||
new_store._entries = store._entries
|
||||
|
||||
return new_store
|
||||
|
||||
def add(self, session) -> bool:
|
||||
room_id = session.room_id
|
||||
sender_key = session.sender_key
|
||||
if session in self._entries[room_id][sender_key].values():
|
||||
return False
|
||||
|
||||
self._entries[room_id][sender_key].clear()
|
||||
self._entries[room_id][sender_key][session.id] = session
|
||||
return True
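The `add()` override above is what implements the `DropOldKeys` behaviour: before a new inbound group session is stored for a given room and sender, every previously known session for that pair is cleared, so only the newest key survives. A minimal sketch of the effect, using a stand-in session object; the real sessions are matrix-nio `InboundGroupSession` instances and the sketch assumes nio's `GroupSessionStore` nested-defaultdict `_entries` layout, so `FakeSession` is purely illustrative:

```python
from collections import namedtuple

from pantalaimon.store import KeyDroppingGroupSessionStore

# Stand-in with the three attributes KeyDroppingGroupSessionStore.add() relies on.
FakeSession = namedtuple("FakeSession", ["room_id", "sender_key", "id"])

store = KeyDroppingGroupSessionStore()

store.add(FakeSession("!room:example.org", "sender_key_A", "session-1"))
store.add(FakeSession("!room:example.org", "sender_key_A", "session-2"))

# Only the most recently added session for this room/sender pair is kept.
print(list(store._entries["!room:example.org"]["sender_key_A"]))
# -> ['session-2']
```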
|
||||
|
@ -470,14 +470,14 @@ if UI_ENABLED:
|
||||
self.bus.publish("org.pantalaimon1", self.control_if, self.device_if)
|
||||
|
||||
def unverified_notification(self, message):
|
||||
notificaton = notify2.Notification(
|
||||
notification = notify2.Notification(
|
||||
"Unverified devices.",
|
||||
message=(
|
||||
f"There are unverified devices in the room "
|
||||
f"{message.room_display_name}."
|
||||
),
|
||||
)
|
||||
notificaton.set_category("im")
|
||||
notification.set_category("im")
|
||||
|
||||
def send_cb(notification, action_key, user_data):
|
||||
message = user_data
|
||||
@ -488,20 +488,20 @@ if UI_ENABLED:
|
||||
self.control_if.CancelSending(message.pan_user, message.room_id)
|
||||
|
||||
if "actions" in notify2.get_server_caps():
|
||||
notificaton.add_action("send", "Send anyways", send_cb, message)
|
||||
notificaton.add_action("cancel", "Cancel sending", cancel_cb, message)
|
||||
notification.add_action("send", "Send anyways", send_cb, message)
|
||||
notification.add_action("cancel", "Cancel sending", cancel_cb, message)
|
||||
|
||||
notificaton.show()
|
||||
notification.show()
|
||||
|
||||
def sas_invite_notification(self, message):
|
||||
notificaton = notify2.Notification(
|
||||
notification = notify2.Notification(
|
||||
"Key verification invite",
|
||||
message=(
|
||||
f"{message.user_id} via {message.device_id} has started "
|
||||
f"a key verification process."
|
||||
),
|
||||
)
|
||||
notificaton.set_category("im")
|
||||
notification.set_category("im")
|
||||
|
||||
def accept_cb(notification, action_key, user_data):
|
||||
message = user_data
|
||||
@ -516,17 +516,17 @@ if UI_ENABLED:
|
||||
)
|
||||
|
||||
if "actions" in notify2.get_server_caps():
|
||||
notificaton.add_action("accept", "Accept", accept_cb, message)
|
||||
notificaton.add_action("cancel", "Cancel", cancel_cb, message)
|
||||
notification.add_action("accept", "Accept", accept_cb, message)
|
||||
notification.add_action("cancel", "Cancel", cancel_cb, message)
|
||||
|
||||
notificaton.show()
|
||||
notification.show()
|
||||
|
||||
def sas_show_notification(self, message):
|
||||
emojis = [x[0] for x in message.emoji]
|
||||
|
||||
emoji_str = " ".join(emojis)
|
||||
|
||||
notificaton = notify2.Notification(
|
||||
notification = notify2.Notification(
|
||||
"Short authentication string",
|
||||
message=(
|
||||
f"Short authentication string for the key verification of"
|
||||
@ -534,7 +534,7 @@ if UI_ENABLED:
|
||||
f"{emoji_str}"
|
||||
),
|
||||
)
|
||||
notificaton.set_category("im")
|
||||
notification.set_category("im")
|
||||
|
||||
def confirm_cb(notification, action_key, user_data):
|
||||
message = user_data
|
||||
@ -549,21 +549,21 @@ if UI_ENABLED:
|
||||
)
|
||||
|
||||
if "actions" in notify2.get_server_caps():
|
||||
notificaton.add_action("confirm", "Confirm", confirm_cb, message)
|
||||
notificaton.add_action("cancel", "Cancel", cancel_cb, message)
|
||||
notification.add_action("confirm", "Confirm", confirm_cb, message)
|
||||
notification.add_action("cancel", "Cancel", cancel_cb, message)
|
||||
|
||||
notificaton.show()
|
||||
notification.show()
|
||||
|
||||
def sas_done_notification(self, message):
|
||||
notificaton = notify2.Notification(
|
||||
notification = notify2.Notification(
|
||||
"Device successfully verified.",
|
||||
message=(
|
||||
f"Device {message.device_id} of user {message.user_id} "
|
||||
f"successfully verified."
|
||||
),
|
||||
)
|
||||
notificaton.set_category("im")
|
||||
notificaton.show()
|
||||
notification.set_category("im")
|
||||
notification.show()
|
||||
|
||||
def message_callback(self):
|
||||
try:
|
||||
|
setup.py (19 lines changed)

@@ -7,12 +7,11 @@ with open("README.md", encoding="utf-8") as f:

 setup(
     name="pantalaimon",
-    version="0.9.1",
+    version="0.10.5",
     url="https://github.com/matrix-org/pantalaimon",
     author="The Matrix.org Team",
     author_email="poljar@termina.org.uk",
-    description=("A Matrix proxy daemon that adds E2E encryption "
-                 "capabilities."),
+    description=("A Matrix proxy daemon that adds E2E encryption " "capabilities."),
     long_description=long_description,
     long_description_content_type="text/markdown",
     license="Apache License, Version 2.0",
@@ -20,7 +19,7 @@ setup(
     install_requires=[
         "attrs >= 19.3.0",
         "aiohttp >= 3.6, < 4.0",
-        "appdirs >= 1.4.4",
+        "platformdirs >= 4.3.6",
         "click >= 7.1.2",
         "keyring >= 21.2.1",
         "logbook >= 1.5.3",
@@ -29,19 +28,21 @@ setup(
         "cachetools >= 3.0.0",
         "prompt_toolkit > 2, < 4",
         "typing;python_version<'3.5'",
-        "matrix-nio[e2e] >= 0.14, < 0.17"
+        "matrix-nio[e2e] >= 0.24, < 0.25.2",
     ],
     extras_require={
         "ui": [
             "dbus-python >= 1.2, < 1.3",
-            "PyGObject >= 3.36, < 3.39",
+            "PyGObject >= 3.46, < 3.50",
             "pydbus >= 0.6, < 0.7",
             "notify2 >= 0.3, < 0.4",
         ]
     },
     entry_points={
-        "console_scripts": ["pantalaimon=pantalaimon.main:main",
-                            "panctl=pantalaimon.panctl:main"],
+        "console_scripts": [
+            "pantalaimon=pantalaimon.main:main",
+            "panctl=pantalaimon.panctl:main",
+        ],
     },
-    zip_safe=False
+    zip_safe=False,
 )
@@ -34,11 +34,9 @@ class Provider(BaseProvider):
     def client(self):
         return ClientInfo(faker.mx_id(), faker.access_token())


     def avatar_url(self):
         return "mxc://{}/{}#auto".format(
-            faker.hostname(),
-            "".join(choices(ascii_letters) for i in range(24))
+            faker.hostname(), "".join(choices(ascii_letters) for i in range(24))
         )

     def olm_key_pair(self):
@@ -56,7 +54,6 @@ class Provider(BaseProvider):
         )



 faker.add_provider(Provider)


@@ -80,13 +77,7 @@ def tempdir():
 @pytest.fixture
 def panstore(tempdir):
     for _ in range(10):
-        store = SqliteStore(
-            faker.mx_id(),
-            faker.device_id(),
-            tempdir,
-            "",
-            "pan.db"
-        )
+        store = SqliteStore(faker.mx_id(), faker.device_id(), tempdir, "", "pan.db")
         account = OlmAccount()
         store.save_account(account)

@@ -130,21 +121,23 @@ async def pan_proxy_server(tempdir, aiohttp_server):
         recv_queue=ui_queue.async_q,
         proxy=None,
         ssl=False,
-        client_store_class=SqliteStore
+        client_store_class=SqliteStore,
     )

-    app.add_routes([
-        web.post("/_matrix/client/r0/login", proxy.login),
-        web.get("/_matrix/client/r0/sync", proxy.sync),
-        web.get("/_matrix/client/r0/rooms/{room_id}/messages", proxy.messages),
-        web.put(
-            r"/_matrix/client/r0/rooms/{room_id}/send/{event_type}/{txnid}",
-            proxy.send_message
-        ),
-        web.post("/_matrix/client/r0/user/{user_id}/filter", proxy.filter),
-        web.post("/_matrix/client/r0/search", proxy.search),
-        web.options("/_matrix/client/r0/search", proxy.search_opts),
-    ])
+    app.add_routes(
+        [
+            web.post("/_matrix/client/r0/login", proxy.login),
+            web.get("/_matrix/client/r0/sync", proxy.sync),
+            web.get("/_matrix/client/r0/rooms/{room_id}/messages", proxy.messages),
+            web.put(
+                r"/_matrix/client/r0/rooms/{room_id}/send/{event_type}/{txnid}",
+                proxy.send_message,
+            ),
+            web.post("/_matrix/client/r0/user/{user_id}/filter", proxy.filter),
+            web.post("/_matrix/client/r0/search", proxy.search),
+            web.options("/_matrix/client/r0/search", proxy.search_opts),
+        ]
+    )

     server = await aiohttp_server(app)

@@ -161,7 +154,7 @@ async def running_proxy(pan_proxy_server, aioresponse, aiohttp_client):
         "access_token": "abc123",
         "device_id": "GHTYAJCE",
         "home_server": "example.org",
-        "user_id": "@example:example.org"
+        "user_id": "@example:example.org",
     }

     aioclient = await aiohttp_client(server)
@@ -170,7 +163,7 @@ async def running_proxy(pan_proxy_server, aioresponse, aiohttp_client):
         "https://example.org/_matrix/client/r0/login",
         status=200,
         payload=login_response,
-        repeat=True
+        repeat=True,
     )

     await aioclient.post(
@@ -179,7 +172,7 @@ async def running_proxy(pan_proxy_server, aioresponse, aiohttp_client):
             "type": "m.login.password",
             "user": "example",
             "password": "wordpass",
-        }
+        },
     )

     yield server, aioclient, proxy, queues
@@ -28,7 +28,7 @@ ALICE_ID = "@alice:example.org"
 async def client(tmpdir, loop):
     store = PanStore(tmpdir)
     queue = janus.Queue()
-    conf = ServerConfig("example", "https://exapmle.org")
+    conf = ServerConfig("example", "https://example.org")
     conf.history_fetch_delay = 0.1

     store.save_server_user("example", "@example:example.org")
@@ -380,7 +380,9 @@ class TestClass(object):
         )

         aioresponse.get(
-            sync_url, status=200, payload=self.initial_sync_response,
+            sync_url,
+            status=200,
+            payload=self.initial_sync_response,
         )

         aioresponse.get(sync_url, status=200, payload=self.empty_sync, repeat=True)
@@ -421,7 +423,7 @@ class TestClass(object):
         tasks = client.pan_store.load_fetcher_tasks(client.server_name, client.user_id)
         assert len(tasks) == 1

-        # Check that the task is our prev_batch from the sync resposne
+        # Check that the task is our prev_batch from the sync response
         assert tasks[0].room_id == TEST_ROOM_ID
         assert tasks[0].token == "t392-516_47314_0_7_1_1_1_11444_1"

@@ -431,7 +433,7 @@ class TestClass(object):
         tasks = client.pan_store.load_fetcher_tasks(client.server_name, client.user_id)
         assert len(tasks) == 1

-        # Check that the task is our end token from the messages resposne
+        # Check that the task is our end token from the messages response
         assert tasks[0].room_id == TEST_ROOM_ID
         assert tasks[0].token == "t47409-4357353_219380_26003_2265"

@@ -454,7 +456,9 @@ class TestClass(object):
         )

         aioresponse.get(
-            sync_url, status=200, payload=self.initial_sync_response,
+            sync_url,
+            status=200,
+            payload=self.initial_sync_response,
         )

         aioresponse.get(sync_url, status=200, payload=self.empty_sync, repeat=True)
@@ -519,7 +523,7 @@ class TestClass(object):
         )
         assert len(tasks) == 1

-        # Check that the task is our end token from the messages resposne
+        # Check that the task is our end token from the messages response
         assert tasks[0].room_id == TEST_ROOM_ID
         assert tasks[0].token == "t47409-4357353_219380_26003_2265"
@@ -1,9 +1,7 @@
 import asyncio
-import json
 import re
 from collections import defaultdict

 from aiohttp import web
 from nio.crypto import OlmDevice

 from conftest import faker
@@ -27,7 +25,7 @@ class TestClass(object):
             "access_token": "abc123",
             "device_id": "GHTYAJCE",
             "home_server": "example.org",
-            "user_id": "@example:example.org"
+            "user_id": "@example:example.org",
         }

     @property
@@ -36,12 +34,7 @@ class TestClass(object):

     @property
     def keys_upload_response(self):
-        return {
-            "one_time_key_counts": {
-                "curve25519": 10,
-                "signed_curve25519": 20
-            }
-        }
+        return {"one_time_key_counts": {"curve25519": 10, "signed_curve25519": 20}}

     @property
     def example_devices(self):
@@ -52,10 +45,7 @@ class TestClass(object):
             devices[device.user_id][device.id] = device

         bob_device = OlmDevice(
-            BOB_ID,
-            BOB_DEVICE,
-            {"ed25519": BOB_ONETIME,
-             "curve25519": BOB_CURVE}
+            BOB_ID, BOB_DEVICE, {"ed25519": BOB_ONETIME, "curve25519": BOB_CURVE}
         )

         devices[BOB_ID][BOB_DEVICE] = bob_device
@@ -71,7 +61,7 @@ class TestClass(object):
             "https://example.org/_matrix/client/r0/login",
             status=200,
             payload=self.login_response,
-            repeat=True
+            repeat=True,
         )

         assert not daemon.pan_clients
@@ -82,7 +72,7 @@ class TestClass(object):
                 "type": "m.login.password",
                 "user": "example",
                 "password": "wordpass",
-            }
+            },
         )

         assert resp.status == 200
@@ -105,11 +95,11 @@ class TestClass(object):
             "https://example.org/_matrix/client/r0/login",
             status=200,
             payload=self.login_response,
-            repeat=True
+            repeat=True,
         )

         sync_url = re.compile(
-            r'^https://example\.org/_matrix/client/r0/sync\?access_token=.*'
+            r"^https://example\.org/_matrix/client/r0/sync\?access_token=.*"
         )

         aioresponse.get(
@@ -124,14 +114,16 @@ class TestClass(object):
                 "type": "m.login.password",
                 "user": "example",
                 "password": "wordpass",
-            }
+            },
         )

         # Check that the pan client started to sync after logging in.
         pan_client = list(daemon.pan_clients.values())[0]
         assert len(pan_client.rooms) == 1

-    async def test_pan_client_keys_upload(self, pan_proxy_server, aiohttp_client, aioresponse):
+    async def test_pan_client_keys_upload(
+        self, pan_proxy_server, aiohttp_client, aioresponse
+    ):
         server, daemon, _ = pan_proxy_server

         client = await aiohttp_client(server)
@@ -140,11 +132,11 @@ class TestClass(object):
             "https://example.org/_matrix/client/r0/login",
             status=200,
             payload=self.login_response,
-            repeat=True
+            repeat=True,
         )

         sync_url = re.compile(
-            r'^https://example\.org/_matrix/client/r0/sync\?access_token=.*'
+            r"^https://example\.org/_matrix/client/r0/sync\?access_token=.*"
         )

         aioresponse.get(
@@ -169,7 +161,7 @@ class TestClass(object):
                 "type": "m.login.password",
                 "user": "example",
                 "password": "wordpass",
-            }
+            },
         )

         pan_client = list(daemon.pan_clients.values())[0]
@@ -1,12 +1,10 @@
 import asyncio
-import pdb
-import pprint
 import pytest

 from nio import RoomMessage, RoomEncryptedMedia

 from urllib.parse import urlparse
 from conftest import faker
 from pantalaimon.index import INDEXING_ENABLED
 from pantalaimon.store import FetchTask, MediaInfo, UploadInfo

@@ -27,7 +25,7 @@ class TestClass(object):
                 "type": "m.room.message",
                 "unsigned": {"age": 43289803095},
                 "user_id": "@example2:localhost",
-                "age": 43289803095
+                "age": 43289803095,
             }
         )

@@ -43,43 +41,44 @@ class TestClass(object):
                 "type": "m.room.message",
                 "unsigned": {"age": 43289803095},
                 "user_id": "@example2:localhost",
-                "age": 43289803095
+                "age": 43289803095,
             }
         )

     @property
     def encrypted_media_event(self):
-        return RoomEncryptedMedia.from_dict({
-            "room_id": "!testroom:localhost",
-            "event_id": "$15163622445EBvZK:localhost",
-            "origin_server_ts": 1516362244030,
-            "sender": "@example2:localhost",
-            "type": "m.room.message",
-            "content": {
-                "body": "orange_cat.jpg",
-                "msgtype": "m.image",
-                "file": {
-                    "v": "v2",
-                    "key": {
-                        "alg": "A256CTR",
-                        "ext": True,
-                        "k": "yx0QvkgYlasdWEsdalkejaHBzCkKEBAp3tB7dGtWgrs",
-                        "key_ops": ["encrypt", "decrypt"],
-                        "kty": "oct"
-                    },
-                    "iv": "0pglXX7fspIBBBBAEERLFd",
-                    "hashes": {
-                        "sha256": "eXRDFvh+aXsQRj8a+5ZVVWUQ9Y6u9DYiz4tq1NvbLu8"
-                    },
-                    "url": "mxc://localhost/maDtasSiPFjROFMnlwxIhhyW",
-                    "mimetype": "image/jpeg"
-                }
-            },
-        })
+        return RoomEncryptedMedia.from_dict(
+            {
+                "room_id": "!testroom:localhost",
+                "event_id": "$15163622445EBvZK:localhost",
+                "origin_server_ts": 1516362244030,
+                "sender": "@example2:localhost",
+                "type": "m.room.message",
+                "content": {
+                    "body": "orange_cat.jpg",
+                    "msgtype": "m.image",
+                    "file": {
+                        "v": "v2",
+                        "key": {
+                            "alg": "A256CTR",
+                            "ext": True,
+                            "k": "yx0QvkgYlasdWEsdalkejaHBzCkKEBAp3tB7dGtWgrs",
+                            "key_ops": ["encrypt", "decrypt"],
+                            "kty": "oct",
+                        },
+                        "iv": "0pglXX7fspIBBBBAEERLFd",
+                        "hashes": {
+                            "sha256": "eXRDFvh+aXsQRj8a+5ZVVWUQ9Y6u9DYiz4tq1NvbLu8"
+                        },
+                        "url": "mxc://localhost/maDtasSiPFjROFMnlwxIhhyW",
+                        "mimetype": "image/jpeg",
+                    },
+                },
+            }
+        )

     def test_account_loading(self, panstore):
         accounts = panstore.load_all_users()
         # pdb.set_trace()
         assert len(accounts) == 10

     def test_token_saving(self, panstore, access_token):
@@ -130,7 +129,8 @@ class TestClass(object):
         if not INDEXING_ENABLED:
             pytest.skip("Indexing needs to be enabled to test this")

-        from pantalaimon.index import Index, IndexStore
+        from pantalaimon.index import IndexStore

         loop = asyncio.get_event_loop()

         store = IndexStore("example", tempdir)
@@ -148,12 +148,14 @@ class TestClass(object):
         assert len(result["results"]) == 1
         assert result["count"] == 1
         assert result["results"][0]["result"] == self.test_event.source
-        assert (result["results"][0]["context"]["events_after"][0]
-                == self.another_event.source)
+        assert (
+            result["results"][0]["context"]["events_after"][0]
+            == self.another_event.source
+        )

     def test_media_storage(self, panstore):
         server_name = "test"
-        media_cache = panstore.load_media(server_name)
+        media_cache = panstore.load_media_cache(server_name)
         assert not media_cache

         event = self.encrypted_media_event
@@ -171,7 +173,7 @@ class TestClass(object):

         panstore.save_media(server_name, media)

-        media_cache = panstore.load_media(server_name)
+        media_cache = panstore.load_media_cache(server_name)

         assert (mxc_server, mxc_path) in media_cache
         media_info = media_cache[(mxc_server, mxc_path)]
tox.ini (13 changed lines)
@@ -1,21 +1,14 @@
 # content of: tox.ini , put in same dir as setup.py
 [tox]
-envlist = py38,py39,coverage
-[testenv]
-basepython =
-    py38: python3.8
-    py39: python3.9
-    py3: python3.9
+envlist = coverage

+[testenv]
 deps = -rtest-requirements.txt
 install_command = pip install {opts} {packages}

-passenv = TOXENV CI TRAVIS TRAVIS_*
+passenv = TOXENV,CI
 commands = pytest
 usedevelop = True

 [testenv:coverage]
-basepython = python3.9
 commands =
     pytest --cov=pantalaimon --cov-report term-missing
     coverage xml