# GPT4All

Demo, data, and code to train an assistant-style large language model on ~440k GPT-3.5-Turbo generations.
## Try it yourself

Clone this repository, go to the `chat` directory, and download the CPU-quantized gpt4all model. Place the quantized model in the `chat` directory and start chatting by running:

- `./gpt4all-lora-quantized-OSX-m1` on Mac/OSX
- `./gpt4all-lora-quantized-linux-x86` on Windows/Linux
To compile for custom hardware, see our fork of the Alpaca C++ repo.
## Reproducibility

Trained LoRA weights:

- gpt4all-lora: https://huggingface.co/nomic-ai/gpt4all-lora
- gpt4all-lora-epoch-2: https://huggingface.co/nomic-ai/gpt4all-lora-epoch-2
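As a hedged sketch of how these adapters might be loaded with the pinned `peft` and `transformers` submodules (the base-model path below is a placeholder; you must supply your own converted LLaMa 7B checkpoint, since we do not distribute one):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Placeholder: point this at your own converted LLaMa 7B checkpoint.
base_model_path = "path/to/llama-7b-hf"

tokenizer = AutoTokenizer.from_pretrained(base_model_path)
base = AutoModelForCausalLM.from_pretrained(base_model_path, torch_dtype=torch.float16)

# Apply the published gpt4all-lora adapter on top of the frozen base model.
model = PeftModel.from_pretrained(base, "nomic-ai/gpt4all-lora")
model.eval()
```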
Raw Data:
We are not distributing a LLaMa 7B checkpoint.
You can reproduce our trained model by doing the following:
### Setup

Clone the repo:

```bash
git clone --recurse-submodules git@github.com:nomic-ai/gpt4all.git
git submodule init && git submodule update
```
Set up the environment:

```bash
python -m pip install -r requirements.txt

cd transformers
pip install -e .

cd ../peft
pip install -e .
```
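As a quick, optional sanity check (an assumption on our part, not a step from the instructions above), you can confirm that both editable installs resolve to the pinned submodules rather than to PyPI releases:

```python
# Both packages should import from the repo's local submodule checkouts.
import transformers
import peft

print(transformers.__file__)  # should point into the local transformers/ submodule
print(peft.__file__)          # should point into the local peft/ submodule
```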
### Training

```bash
accelerate launch --dynamo_backend=inductor --num_processes=8 --num_machines=1 --machine_rank=0 --deepspeed_multinode_launcher standard --mixed_precision=bf16 --use_deepspeed --deepspeed_config_file=configs/deepspeed/ds_config.json train.py --config configs/train/finetune-7b.yaml
```
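For orientation, here is a hedged sketch of the kind of LoRA setup a script like `train.py` typically performs with `peft`; the hyperparameters below are illustrative placeholders, not the values in `configs/train/finetune-7b.yaml`:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Placeholder path: your own converted LLaMa 7B checkpoint.
base = AutoModelForCausalLM.from_pretrained("path/to/llama-7b-hf")

# Illustrative hyperparameters only; the real values live in the training config.
lora_config = LoraConfig(
    r=8,                                  # adapter rank (placeholder)
    lora_alpha=32,                        # adapter scaling (placeholder)
    target_modules=["q_proj", "v_proj"],  # attention projections commonly adapted for LLaMa
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable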
### Generate

```bash
python generate.py --config configs/generate/generate.yaml --prompt "Write a script to reverse a string in Python"
```
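This presumably wraps a standard Hugging Face generation call; a minimal sketch, assuming `model` and `tokenizer` are loaded as in the LoRA-loading example above (the generation settings are placeholders, not the values in `generate.yaml`):

```python
prompt = "Write a script to reverse a string in Python"

# Tokenize the prompt and generate a completion with the adapted model.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```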
If you use this repository, models, or data in a downstream project, please consider citing it with:
```bibtex
@misc{gpt4all,
  author = {Yuvanesh Anand and Zach Nussbaum and Brandon Duderstadt and Benjamin Schmidt and Andriy Mulyar},
  title = {GPT4All: Training an Assistant-style Chatbot with Large Scale Data Distillation from GPT-3.5-Turbo},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/nomic-ai/gpt4all}},
}
```