commit 45872d1dcc (parent 95da45b3da)
Author: osiris account
Date: 2023-03-11 21:07:23 -08:00
3 changed files with 39 additions and 13 deletions


@@ -12,7 +12,7 @@ TOKEN_DECIMALS =
 #########################
 MAX_RETRIES = 4
-SIZE_CHUNK_NEXT = 50000
+SIZE_CHUNK_NEXT = 5000
 #########################


@@ -1,12 +1,21 @@
-## token scanner api and cli
+## 🛠🪙 token scanner api and cli
 <br>
 ##### 👉 this project implements a cli tool that indexes transfer events for a particular token, and is deployed to a restful api for fast balance and ownership statistics retrieval. this is the first step for training machine learning models on on-chain data (*e.g.*, high-frequency trading with deep learning).
 ##### 📚 more details can be found in my mirror post, **[quant #3: building a scalable event scanner for ethereum](https://mirror.xyz/steinkirch.eth/vSF18xcLyfXLIWwxjreRa3I_XskwgnjSc6pScegNJWI)**.
 <br>
 ---
 ### setting up
 <br>
 #### installing dependencies
 create a venv using virtualenv, pipenv, or poetry.
 because of some of the dependencies in this code, we will be developing on a python3.9 environment (install it if you don't have that version on disk):
 ```
@@ -18,9 +27,9 @@ pip3 install -r requirements.txt
 <br>
-#### add environment variables
-now, create an .env file and add an RPC_PROVIDER_URL to connect to ethereum mainnet nodes (you can pick from any of this list of nodes as a service):
+#### adding environment variables
+create a `.env` file and add an `RPC_PROVIDER_URL` to connect to ethereum mainnet nodes (for example, from [this list](https://ethereumnodes.com/)):
 ```
 cp .env.example .env
@@ -33,24 +42,40 @@ vim .env
 ```
 make install
 indexer -h
 ```
 <br>
-#### deploying on production
-----
-we use vercel to deploy this app at .
+### running
 <br>
-to deploy new changes, first install vercel:
 ```
-yarn
+indexer -h
 ```
-then run:
 <br>
+---
+### development
+<br>
+#### deploying in production
+we use vercel to deploy this app:
 ```
 vercel login
 vercel .
 ```
-```
 <br>


@@ -117,6 +117,7 @@ class TokenIndexer:
     def _process_logs(self, logs: list) -> dict:
         """Process the logs and return a dictionary with the results."""
         log_info(f'Processing {len(logs)} logs...')
+        processed_logs = defaultdict()
        try:
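A note on the line added above: `defaultdict()` called with no `default_factory` behaves like a plain `dict` — missing keys still raise `KeyError` rather than being auto-created. If the intent is to accumulate per-key results, a factory (e.g. `defaultdict(list)` — an assumption here, since the rest of the method is not shown) is usually what's wanted. A minimal sketch:

```python
from collections import defaultdict

# With no default_factory, defaultdict degrades to a plain dict:
# missing keys raise KeyError instead of being auto-created.
d = defaultdict()
try:
    d["missing"].append(1)
except KeyError:
    print("KeyError: no default_factory set")

# With a factory (list is hypothetical here), missing keys are
# auto-created, so per-key accumulation works without a guard.
logs = defaultdict(list)
logs["0xabc"].append({"value": 100})
print(logs["0xabc"])  # → [{'value': 100}]
```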