gpt4all: an ecosystem of open-source chatbots trained on a massive collection of clean assistant data including code, stories and dialogue

GPT4All

Demo, data and code to train an assistant-style large language model with ~800k GPT-3.5-Turbo generations based on LLaMA

📗 Technical Report

gpt4all-lora-demo

Run on M1 Mac (not sped up!)

Try it yourself

Clone this repository and download the CPU-quantized gpt4all model.

Place the quantized model in the chat directory and start chatting by running:

  • ./chat/gpt4all-lora-quantized-OSX-m1 on M1 Mac/OSX
  • ./chat/gpt4all-lora-quantized-linux-x86 on Linux

To compile for custom hardware, see our fork of the Alpaca C++ repo.

Note: the full model on GPU (16GB of RAM required) performs much better in our qualitative evaluations.
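The quantized CPU binaries above trade precision for memory, which is why the full-precision model performs better. As a rough illustration of the trade-off, here is a toy symmetric 8-bit quantization round-trip in plain Python — illustrative only, not the actual scheme used by the gpt4all-lora binaries:

```python
# Toy symmetric 8-bit quantization: compress floats to int8-range
# integers plus a scale factor, then recover an approximation.
# NOT the scheme used by the real quantized gpt4all binaries.

def quantize(weights, bits=8):
    """Map floats to signed integers in [-(2^(b-1)-1), 2^(b-1)-1]."""
    qmax = 2 ** (bits - 1) - 1            # 127 for 8 bits
    scale = max(abs(w) for w in weights) / qmax
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate floats from integers and the stored scale."""
    return [x * scale for x in q]

weights = [0.1, -0.5, 0.25, 1.0]
q, scale = quantize(weights)
recovered = dequantize(q, scale)
# Each recovered value is within one quantization step of the original,
# so the model shrinks ~4x (vs fp32) at a small cost in fidelity.
assert all(abs(a - b) <= scale for a, b in zip(weights, recovered))
```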

Reproducibility

Trained LoRA Weights:

Raw Data:

We are not distributing a LLaMA 7B checkpoint.

You can reproduce our trained model by doing the following:

Setup

Clone the repo

git clone --recurse-submodules git@github.com:nomic-ai/gpt4all.git

git submodule init && git submodule update

Setup the environment

python -m pip install -r requirements.txt

cd transformers
pip install -e . 

cd ../peft
pip install -e .

Training

accelerate launch --dynamo_backend=inductor --num_processes=8 --num_machines=1 --machine_rank=0 --deepspeed_multinode_launcher standard --mixed_precision=bf16  --use_deepspeed --deepspeed_config_file=configs/deepspeed/ds_config.json train.py --config configs/train/finetune-7b.yaml
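The command above fine-tunes LoRA adapters rather than the full 7B weight matrices: each frozen weight matrix W is augmented with a trainable low-rank product B·A. A toy pure-Python sketch of the arithmetic (the real implementation lives in the peft submodule; matrix sizes here are made up for illustration):

```python
# Toy LoRA arithmetic: the effective weight is W + (alpha / r) * B @ A,
# where the adapter rank r is much smaller than W's dimensions.
# Illustrative only; see the peft submodule for the real implementation.

def matmul(B, A):
    """Multiply two matrices given as lists of rows."""
    return [[sum(B[i][k] * A[k][j] for k in range(len(A)))
             for j in range(len(A[0]))]
            for i in range(len(B))]

def lora_merge(W, A, B, alpha):
    """Return W + (alpha / r) * B @ A, with rank r inferred from A."""
    r = len(A)                       # adapter rank
    BA = matmul(B, A)
    return [[W[i][j] + (alpha / r) * BA[i][j]
             for j in range(len(W[0]))]
            for i in range(len(W))]

# Frozen 2x2 base weight, rank-1 adapter (B: 2x1, A: 1x2).
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[0.5], [1.0]]
A = [[2.0, 4.0]]
merged = lora_merge(W, A, B, alpha=1)
# Only the 4 adapter entries were trainable; for real layer sizes the
# savings grow quadratically with the matrix dimensions.
```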

Generate

python generate.py --config configs/generate/generate.yaml --prompt "Write a script to reverse a string in Python"
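For reference, the script the example prompt asks for can be as short as a slice; a minimal version of what a correct generation might look like:

```python
def reverse_string(s: str) -> str:
    """Return s with its characters in reverse order."""
    # An extended slice with step -1 walks the string backwards.
    return s[::-1]

print(reverse_string("gpt4all"))  # lla4tpg
```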

Sample Generations

Provide instructions for the given exercise. Leg Raises

Leg Raises

  1. Stand with your feet shoulder-width apart and your knees slightly bent.
  2. Raise your arms straight out in front of you.
  3. Slowly bend your knees and raise your heels off the ground.
  4. Hold this position for a count of 3, then slowly lower your heels back to the ground.
  5. Repeat the exercise 10 times.
  6. Do not let your knees go beyond your toes.
  7. Keep your back straight and your abdominal muscles engaged.
  8. Do not let your hips sag or your back arch.
  9. Keep your arms straight and your elbows locked.
  10. Do not let your head drop or your shoulders rise.
  11. Breathe normally throughout the exercise.
  12. Rest for 30 seconds before repeating the exercise.
  13. Gradually increase the number of repetitions as you become stronger.

A color description has been provided. Find the CSS code associated with that color. A light red color with a medium light shade of pink

The CSS code associated with the color provided is: #FF6347

Come up with an interesting idea for a new movie plot. Your plot should be described with a title and a summary.

  1. The title of your movie plot should be "The Last Stand".
  2. The summary of your movie plot should be: A group of mercenaries are hired to protect a small town from a dangerous criminal who is on the run. However, the town's sheriff is not willing to let them take over and decides to take matters into his own hands.

If you utilize this repository, its models, or its data in a downstream project, please consider citing it with:

@misc{gpt4all,
  author = {Yuvanesh Anand and Zach Nussbaum and Brandon Duderstadt and Benjamin Schmidt and Andriy Mulyar},
  title = {GPT4All: Training an Assistant-style Chatbot with Large Scale Data Distillation from GPT-3.5-Turbo},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/nomic-ai/gpt4all}},
}