## Introduction

This is a simple Matrix bot that uses the OpenAI API and Langchain to generate responses from user input. The bot responds to the commands `!gpt`, `!chat`, `!v`, `!pic`, `!new`, `!lc` and `!help`, dispatching on the first word of the prompt.
## Features

- Supports the official OpenAI API and self-hosted models (LocalAI)
- Supports E2E-encrypted rooms
- Colorful code blocks
- Langchain (Flowise)
- Image generation with DALL·E, LocalAI or stable-diffusion-webui
- GPT Vision (OpenAI, or a GPT Vision API-compatible backend such as LocalAI)
- Room-level and thread-level chat context
## Installation and Setup

### Docker method (recommended)

Edit `config.json` or `.env` with proper values.

For explanations and the complete parameter list see: https://github.com/hibobmaster/matrix_chatgpt_bot/wiki

Create three empty files, used only to persist the databases:

```bash
touch sync_db context.db manage_db
sudo docker compose up -d
```

`sync_db` is the Matrix sync database, `context.db` holds the bot's chat context, and `manage_db` (optional) is used by the langchain agent.
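These files are persisted by bind-mounting them into the container. A minimal sketch of the idea is below — the service name, image tag and container paths are assumptions; treat the repository's own `compose.yaml` as the source of truth:

```yaml
# Sketch only: names and paths are guesses, see the repo's compose.yaml.
services:
  app:
    image: hibobmaster/matrixchatgptbot:latest
    volumes:
      - ./config.json:/app/config.json
      - ./sync_db:/app/sync_db        # Matrix sync database
      - ./context.db:/app/context.db  # bot chat context
      - ./manage_db:/app/manage_db    # langchain agent (optional)
    restart: unless-stopped
```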
### Normal method

System dependency: `libolm-dev`
- Clone the repository and create a virtual environment:

```bash
git clone https://github.com/hibobmaster/matrix_chatgpt_bot.git
python -m venv venv
source venv/bin/activate
```
- Install the required dependencies:

```bash
pip install -U pip setuptools wheel
pip install -r requirements.txt
```
- Create a new `config.json` file and complete it with the necessary information:

  If `room_id` is not set, the bot will work in any room it has joined.

```json
{
  "homeserver": "YOUR_HOMESERVER",
  "user_id": "YOUR_USER_ID",
  "password": "YOUR_PASSWORD",
  "device_id": "YOUR_DEVICE_ID",
  "room_id": "YOUR_ROOM_ID",
  "openai_api_key": "YOUR_API_KEY",
  "gpt_api_endpoint": "xxxxxxxxx"
}
```
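As a rough illustration of how such a config might be read — a sketch only; `load_config` and the upper-cased environment-variable fallbacks are assumptions, not the bot's actual loader:

```python
import json
import os

def load_config(path: str = "config.json") -> dict:
    """Load config.json if present; fall back to upper-cased env vars (assumed names)."""
    config: dict = {}
    if os.path.exists(path):
        with open(path, encoding="utf-8") as f:
            config = json.load(f)
    # room_id is deliberately optional: when unset, the bot works in any room it has joined.
    for key in ("homeserver", "user_id", "password", "openai_api_key"):
        if key not in config and key.upper() in os.environ:
            config[key] = os.environ[key.upper()]
    return config
```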
- Launch the bot:

```bash
python src/main.py
```
## Usage

To interact with the bot, send a message in the Matrix room starting with one of the following commands:

- `!help`: show the help message
- `!gpt`: generate a one-time response, e.g. `!gpt What is the meaning of life?`
- `!chat`: chat using the official API with conversation context, e.g. `!chat Can you tell me a joke?`
- GPT Vision (refer to the screenshot):
  - Room level: quote an image and mention the bot with a `{prompt}`
  - Thread level: quote an image with a `{prompt}`
- `!lc`: chat using a langchain API endpoint, e.g. `!lc All the world is a stage`
- `!pic`: generate an image using OpenAI DALL·E or LocalAI, e.g. `!pic A bridal bouquet made of succulents`
- `!agent`: display or set the langchain agent, e.g. `!agent list` or `!agent use {agent_name}`
- `!new {chat}`: start a new conversation
LangChain (Flowise) admin guide: https://github.com/hibobmaster/matrix_chatgpt_bot/wiki/Langchain-(flowise)

Image generation: https://github.com/hibobmaster/matrix_chatgpt_bot/wiki/
## GPT Vision

Thread-level context: mention the bot with a prompt, and it will reply in a thread. To keep the context, send further prompts directly in the thread without mentioning the bot.