A simple Matrix bot that supports chatting with ChatGPT, Bing AI, and Google Bard.

Introduction

This is a simple Matrix bot that uses the OpenAI API and Langchain to generate responses to user input. The bot responds to the commands !gpt, !chat, !v, !pic, !new, !lc and !help, dispatching on the first word of the prompt.

Features

  1. Supports the official OpenAI API and self-hosted models (LocalAI)
  2. Supports E2E-encrypted rooms
  3. Colorful code blocks
  4. Langchain (Flowise) integration
  5. Image generation with DALL·E, LocalAI, or stable-diffusion-webui
  6. GPT Vision (OpenAI, or a GPT Vision API-compatible endpoint such as LocalAI)
  7. Room-level and thread-level chat context

Installation and Setup

Docker method (recommended):
Edit config.json or .env with proper values.
For explanations and the complete parameter list see: https://github.com/hibobmaster/matrix_chatgpt_bot/wiki
Create three empty files, used only to persist the databases:

```sh
touch sync_db context.db manage_db
sudo docker compose up -d
```

manage_db (can be omitted) is for the Langchain agent, sync_db is the Matrix sync database, and context.db stores the bot's chat context.
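As an illustration, a minimal .env might look like the fragment below. The variable names here are an assumption based on the config.json keys shown later in this README; check .env.example and .full-env.example in the repository for the authoritative names:

```env
HOMESERVER="https://matrix.example.org"
USER_ID="@bot:matrix.example.org"
PASSWORD="YOUR_PASSWORD"
DEVICE_ID="YOUR_DEVICE_ID"
ROOM_ID="YOUR_ROOM_ID"
OPENAI_API_KEY="YOUR_API_KEY"
GPT_API_ENDPOINT="xxxxxxxxx"
```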


Normal method:
System dependency: libolm-dev
  1. Clone the repository and create a virtual environment:

```sh
git clone https://github.com/hibobmaster/matrix_chatgpt_bot.git
cd matrix_chatgpt_bot
python -m venv venv
source venv/bin/activate
```

  2. Install the required dependencies:

```sh
pip install -U pip setuptools wheel
pip install -r requirements.txt
```

  3. Create a new config.json file and complete it with the necessary information. If room_id is not set, the bot will work in whatever rooms it has joined:

```json
{
  "homeserver": "YOUR_HOMESERVER",
  "user_id": "YOUR_USER_ID",
  "password": "YOUR_PASSWORD",
  "device_id": "YOUR_DEVICE_ID",
  "room_id": "YOUR_ROOM_ID",
  "openai_api_key": "YOUR_API_KEY",
  "gpt_api_endpoint": "xxxxxxxxx"
}
```

  4. Launch the bot:

```sh
python src/main.py
```

Usage

To interact with the bot, send a message in the Matrix room beginning with one of the following commands:

  • !help Show the help message

  • !gpt Generate a one-time response:

    !gpt What is the meaning of life?

  • !chat Chat with conversation context using the official API:

    !chat Can you tell me a joke?

  • GPT Vision (refer to the screenshots below):

    Room level: quote an image and mention the bot with a {prompt}
    Thread level: quote an image together with a {prompt}

  • !lc Chat through a Langchain (Flowise) API endpoint:

    !lc All the world is a stage

  • !pic Generate an image with OpenAI DALL·E or LocalAI:

    !pic A bridal bouquet made of succulents

  • !agent Display or set the Langchain agent:

    !agent list
    !agent use {agent_name}

  • !new + {chat} Start a new conversation

LangChain(flowise) admin: https://github.com/hibobmaster/matrix_chatgpt_bot/wiki/Langchain-(flowise)

Image Generation

Demo screenshots (demo1, demo2) are available in the wiki: https://github.com/hibobmaster/matrix_chatgpt_bot/wiki/

GPT Vision

Room level: (screenshot: GPT Vision at room level)

Thread level: (screenshot: GPT Vision at thread level)

Thread-level Context

Mention the bot together with a prompt and it will reply in a thread.

To keep the context, send follow-up prompts directly in the thread without mentioning the bot.

(screenshots: thread-level context)
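Under the hood, a Matrix thread reply is an ordinary m.room.message event carrying an m.thread relation to the thread's root event, as defined in the Matrix client-server specification. A sketch of the payload such a reply would carry (the helper name is hypothetical, not part of this bot's code):

```python
def thread_reply_content(body: str, root_event_id: str) -> dict:
    """Build m.room.message content that replies inside a Matrix thread."""
    return {
        "msgtype": "m.text",
        "body": body,
        "m.relates_to": {
            "rel_type": "m.thread",       # thread relation per the Matrix spec
            "event_id": root_event_id,    # event ID of the thread's root message
        },
    }
```

Because follow-up messages in the thread all point at the same root event, the bot can collect them to reconstruct the conversation context without requiring a mention each time.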

Thanks

  1. matrix-nio
  2. acheong08
  3. 8go