Mirror of https://github.com/matrix-org/pantalaimon.git (synced 2025-04-18 23:06:06 -04:00)
Compare commits
53 Commits
Commit SHAs included in this comparison:

9fe0e80128, 659bf093f8, 257ef6a2e5, 0f52d303d4, c3ee162802, 26d9a55ce8, 21fb28d090, 42cdcc2519, 29d18653dc, 1af0a04282, e9fb8a4a57, 46a762d93a, 93a1fcb36f, a2a2d704ee, a82652f6ad, 6d88b8a215, b8ca9b478d, bfb3b06153, e2abab1ecc, 5426d5bf9d, 33909aa251, b66ed95319, 5caaaf5651, f459d585ca, 369f73f3fb, 76dc74d250, adef63443e, 634ac7ed68, 8b2a1173fd, 3968c69aa8, 6638393042, 807deb94ee, 313a5d528c, 127373fdcc, b5a419e488, 64a3fb0a48, 4254a5d9d5, 6b7b87bb6a, 2883df45c0, 492e4bb358, 8cdea3e637, 109ceed0bb, eb4f3b54b4, 85c7b685e5, 82fcbf8e85, 86060a2f75, 3dd8051707, e62cfe068a, c89e87c22a, e5fb0b7f17, cd36ca68d5, ed7aa55ef0, 054a3bcb0b
.github/workflows/ci.yml (vendored, new file, 45 lines added)

@@ -0,0 +1,45 @@
+name: Build Status
+
+on: [push, pull_request]
+
+jobs:
+  build:
+
+    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        python-version: ['3.8', '3.9', '3.10']
+
+    steps:
+      - uses: actions/checkout@v2
+      - uses: actions/setup-python@v2
+        with:
+          python-version: ${{ matrix.python-version }}
+      - name: Install Tox and any other packages
+        run: |
+          wget https://gitlab.matrix.org/matrix-org/olm/-/archive/master/olm-master.tar.bz2
+          tar -xvf olm-master.tar.bz2
+          pushd olm-master && make && sudo make PREFIX="/usr" install && popd
+          rm -r olm-master
+          pip install tox
+      - name: Run Tox
+        run: tox -e py
+
+  coverage:
+    runs-on: ubuntu-latest
+
+    steps:
+      - uses: actions/checkout@v2
+      - name: Setup Python
+        uses: actions/setup-python@v2
+        with:
+          python-version: "3.10"
+      - name: Install Tox and any other packages
+        run: |
+          wget https://gitlab.matrix.org/matrix-org/olm/-/archive/master/olm-master.tar.bz2
+          tar -xvf olm-master.tar.bz2
+          pushd olm-master && make && sudo make PREFIX="/usr" install && popd
+          rm -r olm-master
+          pip install tox
+      - name: Run Tox
+        run: tox -e coverage
CHANGELOG.md (36 lines added)

@@ -4,6 +4,42 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## 0.10.5 2022-09-28
+
+### Added
+
+- [[#137]] Proxy the v3 endpoints as well
+
+### Fixed
+
+- [[#130]] Make sure the token variable is declared
+
+[#137]: https://github.com/matrix-org/pantalaimon/pull/137
+[#130]: https://github.com/matrix-org/pantalaimon/pull/130
+
+## 0.10.4 2022-02-04
+
+### Fixed
+
+- [[#122]] Fix the GLib import for panctl on some distributions
+- [[#120]] Don't use strip to filter Bearer from the auth header
+- [[#118]] Don't use the raw path if we need to sanitize filters, fixing room
+  history fetching for Fractal
+
+[#122]: https://github.com/matrix-org/pantalaimon/pull/122
+[#120]: https://github.com/matrix-org/pantalaimon/pull/120
+[#118]: https://github.com/matrix-org/pantalaimon/pull/118
+
+## 0.10.3 2021-09-02
+
+### Fixed
+
+- [[#105]] Use the raw_path when forwarding requests, avoiding URL
+  decoding/encoding issues.
+
+[#105]: https://github.com/matrix-org/pantalaimon/pull/105
+
+
 ## 0.10.2 2021-07-14
 
 ### Fixed
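As a side note on the [#120] entry above: `str.strip` treats its argument as a set of characters to trim from both ends, not as a literal prefix, which is why stripping `"Bearer "` can corrupt an access token. A minimal sketch of the difference (the token value below is made up; the actual fix in the daemon code appears further down in this comparison):

```python
# Hypothetical Authorization header value; only the "Bearer " prefix is realistic.
header = "Bearer reaB123"

# str.strip() removes any leading/trailing run of the characters
# "B", "e", "a", "r" and " ", so it also eats the start of the token itself.
print(header.strip("Bearer "))           # -> "123"  (token corrupted)

# Removing the literal "Bearer " prefix exactly once keeps the token intact.
print(header.replace("Bearer ", "", 1))  # -> "reaB123"
```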
README.md (56 lines changed)

@@ -25,11 +25,19 @@ Installing pantalaimon works like usually with python packages:
 
 python setup.py install
 
+or you can use `pip` and install it with:
+```
+pip install .[ui]
+```
+
+It is recommended that you create a virtual environment first or install dependencies
+via your package manager. They are usually found with `python-<package-name>`.
+
 Pantalaimon can also be found on pypi:
 
 pip install pantalaimon
 
-Pantalaimon contains a dbus based UI that can be used to controll the daemon.
+Pantalaimon contains a dbus based UI that can be used to control the daemon.
 The dbus based UI is completely optional and needs to be installed with the
 daemon:
 

@@ -77,6 +85,10 @@ docker build -t pantalaimon .
 # volume below is for where Pantalaimon should dump some data.
 docker run -it --rm -v /path/to/pantalaimon/dir:/data -p 8008:8008 pantalaimon
 ```
+The Docker image in the above example can alternatively be built straight from any branch or tag without the need to clone the repo, just by using this syntax:
+```bash
+docker build -t pantalaimon github.com/matrix-org/pantalaimon#master
+```
 
 An example `pantalaimon.conf` for Docker is:
 ```conf

@@ -96,7 +108,7 @@ IgnoreVerification = True
 Usage
 =====
 
-While pantalaimon is a daemon, it is meant to be run as your own user. It won't
+While pantalaimon is a daemon, it is meant to be run as the same user as the app it is proxying for. It won't
 verify devices for you automatically, unless configured to do so, and requires
 user interaction to verify, ignore or blacklist devices. A more complete
 description of Pantalaimon can be found in the [man page](docs/man/pantalaimon.8.md).

@@ -107,7 +119,7 @@ specifies one or more homeservers for pantalaimon to connect to.
 A minimal pantalaimon configuration looks like this:
 ```dosini
 [local-matrix]
-Homeserver = https://localhost:8448
+Homeserver = https://localhost:443
 ListenAddress = localhost
 ListenPort = 8009
 ```

@@ -136,3 +148,41 @@ To control the daemon an interactive utility is provided in the form of
 `panctl` can be used to verify, blacklist or ignore devices, import or export
 session keys, or to introspect devices of users that we share encrypted rooms
 with.
+
+### Setup
+This is all coming from an excellent comment that you can find [here](https://github.com/matrix-org/pantalaimon/issues/154#issuecomment-1951591191).
+
+
+
+1) Ensure you have an OS keyring installed. In my case I installed `gnome-keyring`. You may also want a GUI like `seahorse` to inspect the keyring. (pantalaimon will work without a keyring but your client will have to log in with the password every time `pantalaimon` is restarted, instead of being able to reuse the access token from the previous successful login.)
+
+2) In case you have prior attempts, clean the slate by deleting the `~/.local/share/pantalaimon` directory.
+
+3) Start `pantalaimon`.
+
+4) Connect a client to the `ListenAddress:ListenPort` you specified in `pantalaimon.conf`, eg to `127.0.0.1:8009`, using the same username and password you would've used to login to your homeserver directly.
+
+5) The login should succeed, but at this point all encrypted messages will fail to decrypt. This is fine.
+
+6) Start another client that you were already using for your encrypted chats previously. In my case this was `app.element.io`, so the rest of the steps here assume that.
+
+7) Run `panctl`. At the prompt, run `start-verification <user ID> <user ID> <Element's device ID>`. `<user ID>` here is the full user ID like `@arnavion:arnavion.dev`. If you only have the one Element session, `panctl` will show you the device ID as an autocomplete hint so you don't have to look it up. If you do need to look it up, go to Element -> profile icon -> All Settings -> Sessions, expand the "Current session" item, and the "Session ID" is the device ID.
+
+8) In Element you will see a popup "Incoming Verification Request". Click "Continue". It will change to a popup containing some emojis, and `panctl` will print the same emojis. Click the "They match" button. It will now change to a popup like "Waiting for other client to confirm..."
+
+9) In `panctl`, run `confirm-verification <user ID> <user ID> <Element's device ID>`, ie the same command as before but with `confirm-verification` instead of `start-verification`.
+
+10) At this point, if you look at all your sessions in Element (profile icon -> All Settings -> Sessions), you should see "pantalaimon" in the "Other sessions" list as a "Verified" session.
+
+11) Export the E2E room keys that Element was using via profile icon -> Security & Privacy -> Export E2E room keys. Pick any password and then save the file to some path.
+
+12) Back in `panctl`, run `import-keys <user ID> <path of file> <password you used to encrypt the file>`. After a few seconds, in the output of `pantalaimon`, you should see a log like `INFO: pantalaimon: Successfully imported keys for <user ID> from <path of file>`.
+
+13) Close and restart the client you had used in step 5, ie the one you want to connect to `pantalaimon`. Now, finally, you should be able to see the encrypted chats be decrypted.
+
+14) Delete the E2E room keys backup file from step 12. You don't need it any more.
+
+
+15) If in step 11 you had other unverified sessions from pantalaimon from your prior attempts, you can sign out of them too.
+
+You will probably have to repeat steps 11-14 any time you start a new encrypted chat in Element.
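Condensed, the `panctl` part of that walkthrough boils down to three commands. The user ID, device ID, path, and passphrase below are placeholders, not values from the comment above:

```
# at the panctl prompt
start-verification @alice:example.org @alice:example.org ELEMENTDEVICEID
# confirm the emoji match in Element, then:
confirm-verification @alice:example.org @alice:example.org ELEMENTDEVICEID
# after exporting the E2E room keys from Element:
import-keys @alice:example.org /path/to/element-keys.txt "export passphrase"
```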
@@ -51,7 +51,7 @@ The message will be sent away after all devices are marked as ignored.
 In contrast to the
 .Cm send-anyways
 command this command cancels the sending of a message to an encrypted room with
-unverified devices and gives the user the oportunity to verify or blacklist
+unverified devices and gives the user the opportunity to verify or blacklist
 devices as they see fit.
 .It Cm import-keys Ar pan-user Ar file Ar passphrase
 Import end-to-end encryption keys from the given file for the given pan-user.

@@ -74,7 +74,7 @@ are as follows:
 > In contrast to the
 > **send-anyways**
 > command this command cancels the sending of a message to an encrypted room with
-> unverified devices and gives the user the oportunity to verify or blacklist
+> unverified devices and gives the user the opportunity to verify or blacklist
 > devices as they see fit.
 
 **import-keys** *pan-user* *file* *passphrase*

@@ -86,7 +86,7 @@ The amount of time to wait between room message history requests to the
 Homeserver in ms. Defaults to 3000.
 .El
 .Pp
-Aditional to the homeserver section a special section with the name
+Additional to the homeserver section a special section with the name
 .Cm Default
 can be used to configure the following values for all homeservers:
 .Cm ListenAddress ,

@@ -69,7 +69,7 @@ The following keys are optional in the proxy instance sections:
 > incoming messages will be decryptable, the proxy will be unable to decrypt the
 > room history. Defaults to "No".
 
-Aditional to the homeserver section a special section with the name
+Additional to the homeserver section a special section with the name
 **Default**
 can be used to configure the following values for all homeservers:
 **ListenAddress**,
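For readers unfamiliar with the `Default` section mentioned in the two hunks above, a small illustrative `pantalaimon.conf` sketch follows. The section names and values are examples only; `ListenAddress` is the one key the text above explicitly lists as configurable for all homeservers:

```dosini
[Default]
ListenAddress = localhost

[local-matrix]
Homeserver = https://localhost:443
ListenPort = 8009
```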
@@ -24,7 +24,7 @@ behalf of the client.
 is supposed to run as your own user and listen to connections on a
 non-privileged port. A client needs to log in using the standard Matrix HTTP
 calls to register itself to the daemon, such a registered user is called a pan
-user and will have it's own sync loop to keep up with the server. Multiple matrix
+user and will have its own sync loop to keep up with the server. Multiple matrix
 clients can connect and use the same pan user.
 
 If user interaction is required
@@ -16,7 +16,6 @@ import asyncio
 import os
 from collections import defaultdict
 from pprint import pformat
-from typing import Any, Dict, Optional
 from urllib.parse import urlparse
 
 from aiohttp.client_exceptions import ClientConnectionError

@@ -135,7 +134,7 @@ class InvalidLimit(Exception):
 class SqliteQStore(SqliteStore):
     def _create_database(self):
         return SqliteQueueDatabase(
-            self.database_path, pragmas=(("foregign_keys", 1), ("secure_delete", 1))
+            self.database_path, pragmas=(("foreign_keys", 1), ("secure_delete", 1))
         )
 
     def close(self):

@@ -554,6 +553,7 @@ class PanClient(AsyncClient):
                 full_state=True,
                 since=next_batch,
                 loop_sleep_time=loop_sleep_time,
+                set_presence="offline",
             )
         )
         self.task = task

@@ -708,7 +708,6 @@ class PanClient(AsyncClient):
         for share in self.get_active_key_requests(
            message.user_id, message.device_id
         ):
-
            continued = True
 
            if not self.continue_key_share(share):

@@ -735,7 +734,7 @@ class PanClient(AsyncClient):
            pass
 
        response = (
-            f"Succesfully continued the key requests from "
+            f"Successfully continued the key requests from "
            f"{message.user_id} via {message.device_id}"
        )
        ret = "m.ok"

@@ -760,7 +759,7 @@ class PanClient(AsyncClient):
 
        if cancelled:
            response = (
-                f"Succesfully cancelled key requests from "
+                f"Successfully cancelled key requests from "
                f"{message.user_id} via {message.device_id}"
            )
            ret = "m.ok"

@@ -810,8 +809,9 @@ class PanClient(AsyncClient):
 
        if not isinstance(event, MegolmEvent):
            logger.warn(
-                "Encrypted event is not a megolm event:"
-                "\n{}".format(pformat(event_dict))
+                "Encrypted event is not a megolm event:" "\n{}".format(
+                    pformat(event_dict)
+                )
            )
            return False
 

@@ -835,9 +835,9 @@ class PanClient(AsyncClient):
            decrypted_event.source["content"]["url"] = decrypted_event.url
 
            if decrypted_event.thumbnail_url:
-                decrypted_event.source["content"]["info"][
-                    "thumbnail_url"
-                ] = decrypted_event.thumbnail_url
+                decrypted_event.source["content"]["info"]["thumbnail_url"] = (
+                    decrypted_event.thumbnail_url
+                )
 
        event_dict.update(decrypted_event.source)
        event_dict["decrypted"] = True
@@ -31,7 +31,7 @@ class PanConfigParser(configparser.ConfigParser):
                "IgnoreVerification": "False",
                "ListenAddress": "localhost",
                "ListenPort": "8009",
-                "LogLevel": "warnig",
+                "LogLevel": "warning",
                "Notifications": "on",
                "UseKeyring": "yes",
                "SearchRequests": "off",

@@ -113,7 +113,7 @@ class ServerConfig:
            E2E encrypted messages.
        keyring (bool): Enable or disable the OS keyring for the storage of
            access tokens.
-        search_requests (bool): Enable or disable aditional Homeserver requests
+        search_requests (bool): Enable or disable additional Homeserver requests
            for the search API endpoint.
        index_encrypted_only (bool): Enable or disable message indexing fro
            non-encrypted rooms.

@@ -186,7 +186,6 @@ class PanConfig:
 
        try:
            for section_name, section in config.items():
-
                if section_name == "Default":
                    continue
 
@@ -125,6 +125,8 @@ class ProxyDaemon:
        self.upload_info = self.store.load_upload(self.name)
 
        for user_id, device_id in accounts:
+            token = None
+
            if self.conf.keyring:
                try:
                    token = keyring.get_password(

@@ -225,7 +227,8 @@ class ProxyDaemon:
 
        if ret:
            msg = (
-                f"Device {device.id} of user " f"{device.user_id} succesfully verified."
+                f"Device {device.id} of user "
+                f"{device.user_id} successfully verified."
            )
            await client.send_update_device(device)
        else:

@@ -240,7 +243,7 @@ class ProxyDaemon:
        if ret:
            msg = (
                f"Device {device.id} of user "
-                f"{device.user_id} succesfully unverified."
+                f"{device.user_id} successfully unverified."
            )
            await client.send_update_device(device)
        else:

@@ -255,7 +258,7 @@ class ProxyDaemon:
        if ret:
            msg = (
                f"Device {device.id} of user "
-                f"{device.user_id} succesfully blacklisted."
+                f"{device.user_id} successfully blacklisted."
            )
            await client.send_update_device(device)
        else:

@@ -272,7 +275,7 @@ class ProxyDaemon:
        if ret:
            msg = (
                f"Device {device.id} of user "
-                f"{device.user_id} succesfully unblacklisted."
+                f"{device.user_id} successfully unblacklisted."
            )
            await client.send_update_device(device)
        else:

@@ -307,7 +310,6 @@ class ProxyDaemon:
                DeviceUnblacklistMessage,
            ),
        ):
-
            device = client.device_store[message.user_id].get(message.device_id, None)
 
            if not device:

@@ -356,7 +358,7 @@ class ProxyDaemon:
 
            else:
                info_msg = (
-                    f"Succesfully exported keys for {client.user_id} " f"to {path}"
+                    f"Successfully exported keys for {client.user_id} " f"to {path}"
                )
                logger.info(info_msg)
                await self.send_response(

@@ -379,7 +381,7 @@ class ProxyDaemon:
                )
            else:
                info_msg = (
-                    f"Succesfully imported keys for {client.user_id} " f"from {path}"
+                    f"Successfully imported keys for {client.user_id} " f"from {path}"
                )
                logger.info(info_msg)
                await self.send_response(

@@ -418,7 +420,9 @@ class ProxyDaemon:
        access_token = request.query.get("access_token", "")
 
        if not access_token:
-            access_token = request.headers.get("Authorization", "").strip("Bearer ")
+            access_token = request.headers.get("Authorization", "").replace(
+                "Bearer ", "", 1
+            )
 
        return access_token
 

@@ -460,6 +464,7 @@ class ProxyDaemon:
        data=None,  # type: bytes
        session=None,  # type: aiohttp.ClientSession
        token=None,  # type: str
+        use_raw_path=True,  # type: bool
    ):
        # type: (...) -> aiohttp.ClientResponse
        """Forward the given request to our configured homeserver.

@@ -474,6 +479,10 @@ class ProxyDaemon:
                should be used to forward the request.
            token (str, optional): The access token that should be used for the
                request.
+            use_raw_path (str, optional): Should the raw path be used from the
+                request or should we use the path and re-encode it. Some may need
+                their filters to be sanitized, this requires the parsed version of
+                the path, otherwise we leave the path as is.
        """
        if not session:
            if not self.default_session:

@@ -482,9 +491,7 @@ class ProxyDaemon:
 
        assert session
 
-        path = urllib.parse.quote(
-            request.path
-        )  # re-encode path stuff like room aliases
+        path = request.raw_path if use_raw_path else urllib.parse.quote(request.path)
        method = request.method
 
        headers = CIMultiDict(request.headers)

@@ -609,7 +616,9 @@ class ProxyDaemon:
            await pan_client.close()
            return
 
-        logger.info(f"Succesfully started new background sync client for " f"{user_id}")
+        logger.info(
+            f"Successfully started new background sync client for " f"{user_id}"
+        )
 
        await self.send_ui_message(
            UpdateUsersMessage(self.name, user_id, pan_client.device_id)

@@ -675,7 +684,7 @@ class ProxyDaemon:
 
        if user_id and access_token:
            logger.info(
-                f"User: {user} succesfully logged in, starting "
+                f"User: {user} successfully logged in, starting "
                f"a background sync client."
            )
            await self.start_pan_client(

@@ -726,7 +735,7 @@ class ProxyDaemon:
            return decryption_method(body, ignore_failures=False)
        except EncryptionError:
            logger.info("Error decrypting sync, waiting for next pan " "sync")
-            await client.synced.wait(),
+            (await client.synced.wait(),)
            logger.info("Pan synced, retrying decryption.")
 
            try:

@@ -763,7 +772,7 @@ class ProxyDaemon:
 
        try:
            response = await self.forward_request(
-                request, params=query, token=client.access_token
+                request, params=query, token=client.access_token, use_raw_path=False
            )
        except ClientConnectionError as e:
            return web.Response(status=500, text=str(e))

@@ -786,6 +795,27 @@ class ProxyDaemon:
            body=await response.read(),
        )
 
+    async def createRoom(self, request):
+        try:
+            content = await request.json()
+        except (JSONDecodeError, ContentTypeError):
+            return self._not_json
+
+        invite = content.get("invite", ())
+        if invite:
+            access_token = self.get_access_token(request)
+
+            if not access_token:
+                return self._missing_token
+
+            client = await self._find_client(access_token)
+            if not client:
+                return self._unknown_token
+
+            client.users_for_key_query.update(invite)
+
+        return await self.forward_to_web(request)
+
    async def messages(self, request):
        access_token = self.get_access_token(request)
 

@@ -811,7 +841,9 @@ class ProxyDaemon:
            query["filter"] = request_filter
 
        try:
-            response = await self.forward_request(request, params=query)
+            response = await self.forward_request(
+                request, params=query, use_raw_path=False
+            )
        except ClientConnectionError as e:
            return web.Response(status=500, text=str(e))
 

@@ -908,7 +940,7 @@ class ProxyDaemon:
            return web.json_response(
                {
                    "errcode": "M_FORBIDDEN",
-                    "error": "You do not have permission to send the event."
+                    "error": "You do not have permission to send the event.",
                },
                headers=CORS_HEADERS,
                status=403,

@@ -950,7 +982,11 @@ class ProxyDaemon:
        ):
            try:
                content["url"] = await self._decrypt_uri(content["url"], client)
-                if "info" in content and "thumbnail_url" in content["info"]:
+                if (
+                    "info" in content
+                    and "thumbnail_url" in content["info"]
+                    and content["info"]["thumbnail_url"] is not None
+                ):
                    content["info"]["thumbnail_url"] = await self._decrypt_uri(
                        content["info"]["thumbnail_url"], client
                    )

@@ -1020,7 +1056,7 @@ class ProxyDaemon:
        except SendRetryError as e:
            return web.Response(status=503, text=str(e))
 
-        # Aquire a semaphore here so we only send out one
+        # Acquire a semaphore here so we only send out one
        # UnverifiedDevicesSignal
        sem = client.send_semaphores[room_id]
 

@@ -1260,7 +1296,9 @@ class ProxyDaemon:
        client = next(iter(self.pan_clients.values()))
 
        try:
-            response = await client.download(server_name, media_id, file_name)
+            response = await client.download(
+                server_name=server_name, media_id=media_id, filename=file_name
+            )
        except ClientConnectionError as e:
            raise e
 
@@ -23,7 +23,6 @@ if False:
    import json
    import os
    from functools import partial
-    from typing import Any, Dict, List, Optional, Tuple
 
    import attr
    import tantivy

@@ -230,7 +229,6 @@ if False:
            )
 
            for message in query:
-
                event = message.event
 
                event_dict = {

@@ -501,6 +499,5 @@ if False:
 
        return search_result
 
-
else:
    INDEXING_ENABLED = False
@@ -15,7 +15,6 @@
import asyncio
import os
import signal
-from typing import Optional
 
import click
import janus

@@ -23,7 +22,7 @@ import keyring
import logbook
import nio
from aiohttp import web
-from appdirs import user_config_dir, user_data_dir
+from platformdirs import user_config_dir, user_data_dir
from logbook import StderrHandler
 
from pantalaimon.config import PanConfig, PanConfigError, parse_log_level

@@ -63,33 +62,52 @@ async def init(data_dir, server_conf, send_queue, recv_queue):
    )
 
    # 100 MB max POST size
-    app = web.Application(client_max_size=1024 ** 2 * 100)
+    app = web.Application(client_max_size=1024**2 * 100)
 
    app.add_routes(
        [
            web.post("/_matrix/client/r0/login", proxy.login),
+            web.post("/_matrix/client/v3/login", proxy.login),
            web.get("/_matrix/client/r0/sync", proxy.sync),
+            web.get("/_matrix/client/v3/sync", proxy.sync),
+            web.post("/_matrix/client/r0/createRoom", proxy.createRoom),
+            web.post("/_matrix/client/v3/createRoom", proxy.createRoom),
            web.get("/_matrix/client/r0/rooms/{room_id}/messages", proxy.messages),
+            web.get("/_matrix/client/v3/rooms/{room_id}/messages", proxy.messages),
            web.put(
                r"/_matrix/client/r0/rooms/{room_id}/send/{event_type}/{txnid}",
                proxy.send_message,
            ),
+            web.put(
+                r"/_matrix/client/v3/rooms/{room_id}/send/{event_type}/{txnid}",
+                proxy.send_message,
+            ),
            web.post(
                r"/_matrix/client/r0/rooms/{room_id}/send/{event_type}",
                proxy.send_message,
            ),
            web.post("/_matrix/client/r0/user/{user_id}/filter", proxy.filter),
+            web.post("/_matrix/client/v3/user/{user_id}/filter", proxy.filter),
            web.post("/.well-known/matrix/client", proxy.well_known),
            web.get("/.well-known/matrix/client", proxy.well_known),
            web.post("/_matrix/client/r0/search", proxy.search),
+            web.post("/_matrix/client/v3/search", proxy.search),
            web.options("/_matrix/client/r0/search", proxy.search_opts),
+            web.options("/_matrix/client/v3/search", proxy.search_opts),
            web.get(
                "/_matrix/media/v1/download/{server_name}/{media_id}", proxy.download
            ),
+            web.get(
+                "/_matrix/media/v3/download/{server_name}/{media_id}", proxy.download
+            ),
            web.get(
                "/_matrix/media/v1/download/{server_name}/{media_id}/{file_name}",
                proxy.download,
            ),
+            web.get(
+                "/_matrix/media/v3/download/{server_name}/{media_id}/{file_name}",
+                proxy.download,
+            ),
            web.get(
                "/_matrix/media/r0/download/{server_name}/{media_id}", proxy.download
            ),

@@ -101,10 +119,18 @@ async def init(data_dir, server_conf, send_queue, recv_queue):
                r"/_matrix/media/r0/upload",
                proxy.upload,
            ),
+            web.post(
+                r"/_matrix/media/v3/upload",
+                proxy.upload,
+            ),
            web.put(
                r"/_matrix/client/r0/profile/{userId}/avatar_url",
                proxy.profile,
            ),
+            web.put(
+                r"/_matrix/client/v3/profile/{userId}/avatar_url",
+                proxy.profile,
+            ),
        ]
    )
    app.router.add_route("*", "/" + "{proxyPath:.*}", proxy.router)

@@ -262,7 +288,7 @@ async def daemon(context, log_level, debug_encryption, config, data_path):
        "connect to pantalaimon."
    )
)
-@click.version_option(version="0.10.2", prog_name="pantalaimon")
+@click.version_option(version="0.10.5", prog_name="pantalaimon")
@click.option(
    "--log-level",
    type=click.Choice(["error", "warning", "info", "debug"]),
@@ -20,10 +20,16 @@ import sys
from collections import defaultdict
from itertools import zip_longest
from typing import List
+from shlex import split
 
import attr
import click
-from gi.repository import GLib
+
+try:
+    from gi.repository import GLib
+except ModuleNotFoundError:
+    from pgi.repository import GLib
+
from prompt_toolkit import __version__ as ptk_version
from prompt_toolkit import HTML, PromptSession, print_formatted_text
from prompt_toolkit.completion import Completer, Completion, PathCompleter

@@ -459,7 +465,7 @@ class PanCtl:
    def sas_done(self, pan_user, user_id, device_id, _):
        print(
            f"Device {device_id} of user {user_id}"
-            f" succesfully verified for pan user {pan_user}."
+            f" successfully verified for pan user {pan_user}."
        )
 
    def show_sas_invite(self, pan_user, user_id, device_id, _):

@@ -584,7 +590,7 @@ class PanCtl:
            parser = PanctlParser(self.commands)
 
            try:
-                args = parser.parse_args(result.split())
+                args = parser.parse_args(split(result, posix=False))
            except ParseError:
                continue
 

@@ -690,9 +696,9 @@ class PanCtl:
        "the pantalaimon daemon."
    )
)
-@click.version_option(version="0.10.2", prog_name="panctl")
+@click.version_option(version="0.10.5", prog_name="panctl")
def main():
-    loop = asyncio.get_event_loop()
+    loop = asyncio.new_event_loop()
    glib_loop = GLib.MainLoop()
 
    try:
@@ -15,7 +15,7 @@
import json
import os
from collections import defaultdict
-from typing import Any, Dict, List, Optional, Tuple
+from typing import Any, Dict
 
import attr
from nio.crypto import TrustState, GroupSessionStore

@@ -431,7 +431,6 @@ class PanStore:
        device_store = defaultdict(dict)
 
        for d in account.device_keys:
-
            if d.deleted:
                continue
 
@@ -470,14 +470,14 @@ if UI_ENABLED:
            self.bus.publish("org.pantalaimon1", self.control_if, self.device_if)
 
        def unverified_notification(self, message):
-            notificaton = notify2.Notification(
+            notification = notify2.Notification(
                "Unverified devices.",
                message=(
                    f"There are unverified devices in the room "
                    f"{message.room_display_name}."
                ),
            )
-            notificaton.set_category("im")
+            notification.set_category("im")
 
            def send_cb(notification, action_key, user_data):
                message = user_data

@@ -488,20 +488,20 @@ if UI_ENABLED:
                self.control_if.CancelSending(message.pan_user, message.room_id)
 
            if "actions" in notify2.get_server_caps():
-                notificaton.add_action("send", "Send anyways", send_cb, message)
-                notificaton.add_action("cancel", "Cancel sending", cancel_cb, message)
+                notification.add_action("send", "Send anyways", send_cb, message)
+                notification.add_action("cancel", "Cancel sending", cancel_cb, message)
 
-            notificaton.show()
+            notification.show()
 
        def sas_invite_notification(self, message):
-            notificaton = notify2.Notification(
+            notification = notify2.Notification(
                "Key verification invite",
                message=(
                    f"{message.user_id} via {message.device_id} has started "
                    f"a key verification process."
                ),
            )
-            notificaton.set_category("im")
+            notification.set_category("im")
 
            def accept_cb(notification, action_key, user_data):
                message = user_data

@@ -516,17 +516,17 @@ if UI_ENABLED:
                )
 
            if "actions" in notify2.get_server_caps():
-                notificaton.add_action("accept", "Accept", accept_cb, message)
-                notificaton.add_action("cancel", "Cancel", cancel_cb, message)
+                notification.add_action("accept", "Accept", accept_cb, message)
+                notification.add_action("cancel", "Cancel", cancel_cb, message)
 
-            notificaton.show()
+            notification.show()
 
        def sas_show_notification(self, message):
            emojis = [x[0] for x in message.emoji]
 
            emoji_str = " ".join(emojis)
 
-            notificaton = notify2.Notification(
+            notification = notify2.Notification(
                "Short authentication string",
                message=(
                    f"Short authentication string for the key verification of"

@@ -534,7 +534,7 @@ if UI_ENABLED:
                    f"{emoji_str}"
                ),
            )
-            notificaton.set_category("im")
+            notification.set_category("im")
 
            def confirm_cb(notification, action_key, user_data):
                message = user_data

@@ -549,21 +549,21 @@ if UI_ENABLED:
                )
 
            if "actions" in notify2.get_server_caps():
-                notificaton.add_action("confirm", "Confirm", confirm_cb, message)
-                notificaton.add_action("cancel", "Cancel", cancel_cb, message)
+                notification.add_action("confirm", "Confirm", confirm_cb, message)
+                notification.add_action("cancel", "Cancel", cancel_cb, message)
 
-            notificaton.show()
+            notification.show()
 
        def sas_done_notification(self, message):
-            notificaton = notify2.Notification(
+            notification = notify2.Notification(
                "Device successfully verified.",
                message=(
                    f"Device {message.device_id} of user {message.user_id} "
                    f"successfully verified."
                ),
            )
-            notificaton.set_category("im")
-            notificaton.show()
+            notification.set_category("im")
+            notification.show()
 
        def message_callback(self):
            try:
setup.py (19 lines changed)

@@ -7,12 +7,11 @@ with open("README.md", encoding="utf-8") as f:
 
setup(
    name="pantalaimon",
-    version="0.10.2",
+    version="0.10.5",
    url="https://github.com/matrix-org/pantalaimon",
    author="The Matrix.org Team",
    author_email="poljar@termina.org.uk",
-    description=("A Matrix proxy daemon that adds E2E encryption "
-                 "capabilities."),
+    description=("A Matrix proxy daemon that adds E2E encryption " "capabilities."),
    long_description=long_description,
    long_description_content_type="text/markdown",
    license="Apache License, Version 2.0",

@@ -20,7 +19,7 @@ setup(
    install_requires=[
        "attrs >= 19.3.0",
        "aiohttp >= 3.6, < 4.0",
-        "appdirs >= 1.4.4",
+        "platformdirs >= 4.3.6",
        "click >= 7.1.2",
        "keyring >= 21.2.1",
        "logbook >= 1.5.3",

@@ -29,19 +28,21 @@ setup(
        "cachetools >= 3.0.0",
        "prompt_toolkit > 2, < 4",
        "typing;python_version<'3.5'",
-        "matrix-nio[e2e] >= 0.18, < 0.19"
+        "matrix-nio[e2e] >= 0.24, < 0.25.2",
    ],
    extras_require={
        "ui": [
            "dbus-python >= 1.2, < 1.3",
-            "PyGObject >= 3.36, < 3.39",
+            "PyGObject >= 3.46, < 3.50",
            "pydbus >= 0.6, < 0.7",
            "notify2 >= 0.3, < 0.4",
        ]
    },
    entry_points={
-        "console_scripts": ["pantalaimon=pantalaimon.main:main",
-                            "panctl=pantalaimon.panctl:main"],
+        "console_scripts": [
+            "pantalaimon=pantalaimon.main:main",
+            "panctl=pantalaimon.panctl:main",
+        ],
    },
-    zip_safe=False
+    zip_safe=False,
)
@@ -34,11 +34,9 @@ class Provider(BaseProvider):
    def client(self):
        return ClientInfo(faker.mx_id(), faker.access_token())
 
-
    def avatar_url(self):
        return "mxc://{}/{}#auto".format(
-            faker.hostname(),
-            "".join(choices(ascii_letters) for i in range(24))
+            faker.hostname(), "".join(choices(ascii_letters) for i in range(24))
        )
 
    def olm_key_pair(self):

@@ -56,7 +54,6 @@ class Provider(BaseProvider):
    )
 
 
-
faker.add_provider(Provider)
 
 

@@ -80,13 +77,7 @@ def tempdir():
@pytest.fixture
def panstore(tempdir):
    for _ in range(10):
-        store = SqliteStore(
-            faker.mx_id(),
-            faker.device_id(),
-            tempdir,
-            "",
-            "pan.db"
-        )
+        store = SqliteStore(faker.mx_id(), faker.device_id(), tempdir, "", "pan.db")
        account = OlmAccount()
        store.save_account(account)
 

@@ -130,21 +121,23 @@ async def pan_proxy_server(tempdir, aiohttp_server):
        recv_queue=ui_queue.async_q,
        proxy=None,
        ssl=False,
-        client_store_class=SqliteStore
+        client_store_class=SqliteStore,
    )
 
-    app.add_routes([
-        web.post("/_matrix/client/r0/login", proxy.login),
-        web.get("/_matrix/client/r0/sync", proxy.sync),
-        web.get("/_matrix/client/r0/rooms/{room_id}/messages", proxy.messages),
-        web.put(
-            r"/_matrix/client/r0/rooms/{room_id}/send/{event_type}/{txnid}",
-            proxy.send_message
-        ),
-        web.post("/_matrix/client/r0/user/{user_id}/filter", proxy.filter),
-        web.post("/_matrix/client/r0/search", proxy.search),
-        web.options("/_matrix/client/r0/search", proxy.search_opts),
-    ])
+    app.add_routes(
+        [
+            web.post("/_matrix/client/r0/login", proxy.login),
+            web.get("/_matrix/client/r0/sync", proxy.sync),
+            web.get("/_matrix/client/r0/rooms/{room_id}/messages", proxy.messages),
+            web.put(
+                r"/_matrix/client/r0/rooms/{room_id}/send/{event_type}/{txnid}",
+                proxy.send_message,
+            ),
+            web.post("/_matrix/client/r0/user/{user_id}/filter", proxy.filter),
+            web.post("/_matrix/client/r0/search", proxy.search),
+            web.options("/_matrix/client/r0/search", proxy.search_opts),
+        ]
+    )
 
    server = await aiohttp_server(app)
 

@@ -161,7 +154,7 @@ async def running_proxy(pan_proxy_server, aioresponse, aiohttp_client):
        "access_token": "abc123",
        "device_id": "GHTYAJCE",
        "home_server": "example.org",
-        "user_id": "@example:example.org"
+        "user_id": "@example:example.org",
    }
 
    aioclient = await aiohttp_client(server)

@@ -170,7 +163,7 @@ async def running_proxy(pan_proxy_server, aioresponse, aiohttp_client):
        "https://example.org/_matrix/client/r0/login",
        status=200,
        payload=login_response,
-        repeat=True
+        repeat=True,
    )
 
    await aioclient.post(

@@ -179,7 +172,7 @@ async def running_proxy(pan_proxy_server, aioresponse, aiohttp_client):
            "type": "m.login.password",
            "user": "example",
            "password": "wordpass",
-        }
+        },
    )
 
    yield server, aioclient, proxy, queues
@ -25,10 +25,10 @@ ALICE_ID = "@alice:example.org"
|
|||||||
|
|
||||||
|
|
||||||
@pytest.fixture
|
@pytest.fixture
|
||||||
async def client(tmpdir, loop):
|
async def client(tmpdir):
|
||||||
store = PanStore(tmpdir)
|
store = PanStore(tmpdir)
|
||||||
queue = janus.Queue()
|
queue = janus.Queue()
|
||||||
conf = ServerConfig("example", "https://exapmle.org")
|
conf = ServerConfig("example", "https://example.org")
|
||||||
conf.history_fetch_delay = 0.1
|
conf.history_fetch_delay = 0.1
|
||||||
|
|
||||||
store.save_server_user("example", "@example:example.org")
|
store.save_server_user("example", "@example:example.org")
|
||||||
@ -371,7 +371,7 @@ class TestClass(object):
|
|||||||
|
|
||||||
await client.loop_stop()
|
await client.loop_stop()
|
||||||
|
|
||||||
async def test_history_fetching_tasks(self, client, aioresponse, loop):
|
async def test_history_fetching_tasks(self, client, aioresponse):
|
||||||
if not INDEXING_ENABLED:
|
if not INDEXING_ENABLED:
|
||||||
pytest.skip("Indexing needs to be enabled to test this")
|
pytest.skip("Indexing needs to be enabled to test this")
|
||||||
|
|
||||||
@ -380,7 +380,9 @@ class TestClass(object):
|
|||||||
)
|
)
|
||||||
|
|
||||||
aioresponse.get(
|
aioresponse.get(
|
||||||
sync_url, status=200, payload=self.initial_sync_response,
|
sync_url,
|
||||||
|
status=200,
|
||||||
|
payload=self.initial_sync_response,
|
||||||
)
|
)
|
||||||
|
|
||||||
aioresponse.get(sync_url, status=200, payload=self.empty_sync, repeat=True)
|
aioresponse.get(sync_url, status=200, payload=self.empty_sync, repeat=True)
|
||||||
@ -421,7 +423,7 @@ class TestClass(object):
|
|||||||
tasks = client.pan_store.load_fetcher_tasks(client.server_name, client.user_id)
|
tasks = client.pan_store.load_fetcher_tasks(client.server_name, client.user_id)
|
||||||
assert len(tasks) == 1
|
assert len(tasks) == 1
|
||||||
|
|
||||||
# Check that the task is our prev_batch from the sync resposne
|
# Check that the task is our prev_batch from the sync response
|
||||||
assert tasks[0].room_id == TEST_ROOM_ID
|
assert tasks[0].room_id == TEST_ROOM_ID
|
||||||
assert tasks[0].token == "t392-516_47314_0_7_1_1_1_11444_1"
|
assert tasks[0].token == "t392-516_47314_0_7_1_1_1_11444_1"
|
||||||
|
|
||||||
@ -431,7 +433,7 @@ class TestClass(object):
|
|||||||
tasks = client.pan_store.load_fetcher_tasks(client.server_name, client.user_id)
|
tasks = client.pan_store.load_fetcher_tasks(client.server_name, client.user_id)
|
||||||
assert len(tasks) == 1
|
assert len(tasks) == 1
|
||||||
|
|
||||||
# Check that the task is our end token from the messages resposne
|
# Check that the task is our end token from the messages response
|
||||||
assert tasks[0].room_id == TEST_ROOM_ID
|
assert tasks[0].room_id == TEST_ROOM_ID
|
||||||
assert tasks[0].token == "t47409-4357353_219380_26003_2265"
|
assert tasks[0].token == "t47409-4357353_219380_26003_2265"
|
||||||
|
|
||||||
@@ -445,7 +447,7 @@

         await client.loop_stop()

-    async def test_history_fetching_resume(self, client, aioresponse, loop):
+    async def test_history_fetching_resume(self, client, aioresponse):
         if not INDEXING_ENABLED:
             pytest.skip("Indexing needs to be enabled to test this")

@@ -454,7 +456,9 @@
         )

         aioresponse.get(
-            sync_url, status=200, payload=self.initial_sync_response,
+            sync_url,
+            status=200,
+            payload=self.initial_sync_response,
         )

         aioresponse.get(sync_url, status=200, payload=self.empty_sync, repeat=True)
@@ -519,7 +523,7 @@
         )
         assert len(tasks) == 1

-        # Check that the task is our end token from the messages resposne
+        # Check that the task is our end token from the messages response
         assert tasks[0].room_id == TEST_ROOM_ID
         assert tasks[0].token == "t47409-4357353_219380_26003_2265"

@@ -1,9 +1,7 @@
-import asyncio
 import json
 import re
 from collections import defaultdict

-from aiohttp import web
 from nio.crypto import OlmDevice

 from conftest import faker
@@ -27,7 +25,7 @@ class TestClass(object):
             "access_token": "abc123",
             "device_id": "GHTYAJCE",
             "home_server": "example.org",
-            "user_id": "@example:example.org"
+            "user_id": "@example:example.org",
         }

     @property
@@ -36,12 +34,7 @@ class TestClass(object):

     @property
     def keys_upload_response(self):
-        return {
-            "one_time_key_counts": {
-                "curve25519": 10,
-                "signed_curve25519": 20
-            }
-        }
+        return {"one_time_key_counts": {"curve25519": 10, "signed_curve25519": 20}}

     @property
     def example_devices(self):
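The collapsed keys_upload_response literal above is a formatting-only change; as a quick standalone sketch (plain Python, variable names chosen here purely for illustration), both spellings build the same mapping:

# Illustrative check only: the multi-line and single-line literals are equal dicts.
multi_line = {
    "one_time_key_counts": {
        "curve25519": 10,
        "signed_curve25519": 20
    }
}
single_line = {"one_time_key_counts": {"curve25519": 10, "signed_curve25519": 20}}
assert multi_line == single_line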
@@ -52,10 +45,7 @@ class TestClass(object):
             devices[device.user_id][device.id] = device

         bob_device = OlmDevice(
-            BOB_ID,
-            BOB_DEVICE,
-            {"ed25519": BOB_ONETIME,
-             "curve25519": BOB_CURVE}
+            BOB_ID, BOB_DEVICE, {"ed25519": BOB_ONETIME, "curve25519": BOB_CURVE}
         )

         devices[BOB_ID][BOB_DEVICE] = bob_device
@@ -71,7 +61,7 @@ class TestClass(object):
             "https://example.org/_matrix/client/r0/login",
             status=200,
             payload=self.login_response,
-            repeat=True
+            repeat=True,
         )

         assert not daemon.pan_clients
@@ -82,7 +72,7 @@
                 "type": "m.login.password",
                 "user": "example",
                 "password": "wordpass",
-            }
+            },
         )

         assert resp.status == 200
@@ -105,11 +95,11 @@
             "https://example.org/_matrix/client/r0/login",
             status=200,
             payload=self.login_response,
-            repeat=True
+            repeat=True,
         )

         sync_url = re.compile(
-            r'^https://example\.org/_matrix/client/r0/sync\?access_token=.*'
+            r"^https://example\.org/_matrix/client/r0/sync\?access_token=.*"
         )

         aioresponse.get(
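The quote change in the sync_url pattern above is likewise purely stylistic; a minimal standalone sketch (the access token below is a made-up example value) showing that the double-quoted raw string compiles to the same regex and still matches a sync URL:

import re

# Raw-string contents are unaffected by the quoting style, so the reformatted
# pattern matches exactly the same URLs as the single-quoted original.
sync_url = re.compile(r"^https://example\.org/_matrix/client/r0/sync\?access_token=.*")
assert sync_url.match("https://example.org/_matrix/client/r0/sync?access_token=abc123")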
@@ -124,14 +114,16 @@
                 "type": "m.login.password",
                 "user": "example",
                 "password": "wordpass",
-            }
+            },
         )

         # Check that the pan client started to sync after logging in.
         pan_client = list(daemon.pan_clients.values())[0]
         assert len(pan_client.rooms) == 1

-    async def test_pan_client_keys_upload(self, pan_proxy_server, aiohttp_client, aioresponse):
+    async def test_pan_client_keys_upload(
+        self, pan_proxy_server, aiohttp_client, aioresponse
+    ):
         server, daemon, _ = pan_proxy_server

         client = await aiohttp_client(server)
@@ -140,11 +132,11 @@
             "https://example.org/_matrix/client/r0/login",
             status=200,
             payload=self.login_response,
-            repeat=True
+            repeat=True,
         )

         sync_url = re.compile(
-            r'^https://example\.org/_matrix/client/r0/sync\?access_token=.*'
+            r"^https://example\.org/_matrix/client/r0/sync\?access_token=.*"
         )

         aioresponse.get(
@@ -169,7 +161,7 @@
                 "type": "m.login.password",
                 "user": "example",
                 "password": "wordpass",
-            }
+            },
         )

         pan_client = list(daemon.pan_clients.values())[0]
@@ -1,12 +1,10 @@
 import asyncio
-import pdb
 import pprint
 import pytest

 from nio import RoomMessage, RoomEncryptedMedia

 from urllib.parse import urlparse
-from conftest import faker
 from pantalaimon.index import INDEXING_ENABLED
 from pantalaimon.store import FetchTask, MediaInfo, UploadInfo

@@ -27,7 +25,7 @@ class TestClass(object):
                 "type": "m.room.message",
                 "unsigned": {"age": 43289803095},
                 "user_id": "@example2:localhost",
-                "age": 43289803095
+                "age": 43289803095,
             }
         )

@@ -43,43 +41,44 @@ class TestClass(object):
                 "type": "m.room.message",
                 "unsigned": {"age": 43289803095},
                 "user_id": "@example2:localhost",
-                "age": 43289803095
+                "age": 43289803095,
             }
         )

     @property
     def encrypted_media_event(self):
-        return RoomEncryptedMedia.from_dict({
-            "room_id": "!testroom:localhost",
-            "event_id": "$15163622445EBvZK:localhost",
-            "origin_server_ts": 1516362244030,
-            "sender": "@example2:localhost",
-            "type": "m.room.message",
-            "content": {
-                "body": "orange_cat.jpg",
-                "msgtype": "m.image",
-                "file": {
-                    "v": "v2",
-                    "key": {
-                        "alg": "A256CTR",
-                        "ext": True,
-                        "k": "yx0QvkgYlasdWEsdalkejaHBzCkKEBAp3tB7dGtWgrs",
-                        "key_ops": ["encrypt", "decrypt"],
-                        "kty": "oct"
-                    },
-                    "iv": "0pglXX7fspIBBBBAEERLFd",
-                    "hashes": {
-                        "sha256": "eXRDFvh+aXsQRj8a+5ZVVWUQ9Y6u9DYiz4tq1NvbLu8"
-                    },
-                    "url": "mxc://localhost/maDtasSiPFjROFMnlwxIhhyW",
-                    "mimetype": "image/jpeg"
-                }
-            }
-        })
+        return RoomEncryptedMedia.from_dict(
+            {
+                "room_id": "!testroom:localhost",
+                "event_id": "$15163622445EBvZK:localhost",
+                "origin_server_ts": 1516362244030,
+                "sender": "@example2:localhost",
+                "type": "m.room.message",
+                "content": {
+                    "body": "orange_cat.jpg",
+                    "msgtype": "m.image",
+                    "file": {
+                        "v": "v2",
+                        "key": {
+                            "alg": "A256CTR",
+                            "ext": True,
+                            "k": "yx0QvkgYlasdWEsdalkejaHBzCkKEBAp3tB7dGtWgrs",
+                            "key_ops": ["encrypt", "decrypt"],
+                            "kty": "oct",
+                        },
+                        "iv": "0pglXX7fspIBBBBAEERLFd",
+                        "hashes": {
+                            "sha256": "eXRDFvh+aXsQRj8a+5ZVVWUQ9Y6u9DYiz4tq1NvbLu8"
+                        },
+                        "url": "mxc://localhost/maDtasSiPFjROFMnlwxIhhyW",
+                        "mimetype": "image/jpeg",
+                    },
+                },
+            }
+        )

     def test_account_loading(self, panstore):
         accounts = panstore.load_all_users()
-        # pdb.set_trace()
         assert len(accounts) == 10

     def test_token_saving(self, panstore, access_token):
@@ -130,7 +129,8 @@
         if not INDEXING_ENABLED:
             pytest.skip("Indexing needs to be enabled to test this")

-        from pantalaimon.index import Index, IndexStore
+        from pantalaimon.index import IndexStore
+
         loop = asyncio.get_event_loop()

         store = IndexStore("example", tempdir)
@@ -148,8 +148,10 @@
         assert len(result["results"]) == 1
         assert result["count"] == 1
         assert result["results"][0]["result"] == self.test_event.source
-        assert (result["results"][0]["context"]["events_after"][0]
-                == self.another_event.source)
+        assert (
+            result["results"][0]["context"]["events_after"][0]
+            == self.another_event.source
+        )

     def test_media_storage(self, panstore):
         server_name = "test"
tox.ini (13 changed lines)
@@ -1,21 +1,14 @@
-# content of: tox.ini , put in same dir as setup.py
 [tox]
-envlist = py38,py39,coverage
-[testenv]
-basepython =
-    py38: python3.8
-    py39: python3.9
-    py3: python3.9
+envlist = coverage

+[testenv]
 deps = -rtest-requirements.txt
 install_command = pip install {opts} {packages}

-passenv = TOXENV CI TRAVIS TRAVIS_*
+passenv = TOXENV,CI
 commands = pytest
-usedevelop = True

 [testenv:coverage]
-basepython = python3.9
 commands =
     pytest --cov=pantalaimon --cov-report term-missing
     coverage xml