| Author | Commit | Message | Date |
| --- | --- | --- | --- |
| Carl Kenner | 814f754451 | Support for MPT, INCITE, WizardLM, StableLM, Galactica, Vicuna, Guanaco, and Baize instruction following (#1596) | 2023-05-09 20:37:31 -03:00 |
| Matthew McAllister | 06c7db017d | Add config for pygmalion-7b and metharme-7b (#1887) | 2023-05-09 20:31:27 -03:00 |
| oobabooga | b5260b24f1 | Add support for custom chat styles (#1917) | 2023-05-08 12:35:03 -03:00 |
| oobabooga | 00e333d790 | Add MOSS support | 2023-05-04 23:20:34 -03:00 |
| oobabooga | 97a6a50d98 | Use oasst tokenizer instead of universal tokenizer | 2023-05-04 15:55:39 -03:00 |
| oobabooga | dbddedca3f | Detect oasst-sft-6-llama-30b | 2023-05-04 15:13:37 -03:00 |
| oobabooga | 91745f63c3 | Use Vicuna-v0 by default for Vicuna models | 2023-04-26 17:45:38 -03:00 |
| TiagoGF | a941c19337 | Fixing Vicuna text generation (#1579) | 2023-04-26 16:20:27 -03:00 |
| oobabooga | d87ca8f2af | LLaVA fixes | 2023-04-26 03:47:34 -03:00 |
| oobabooga | a777c058af | Precise prompts for instruct mode | 2023-04-26 03:21:53 -03:00 |
| Wojtab | 12212cf6be | LLaVA support (#1487) | 2023-04-23 20:32:22 -03:00 |
| Forkoz | c6fe1ced01 | Add ChatGLM support (#1256) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) | 2023-04-16 19:15:03 -03:00 |
| oobabooga | cb95a2432c | Add Koala support | 2023-04-16 14:41:06 -03:00 |
| oobabooga | b937c9d8c2 | Add skip_special_tokens checkbox for Dolly model (#1218) | 2023-04-16 14:24:49 -03:00 |
| oobabooga | 7d7d122edb | Cover one more model | 2023-04-14 11:15:59 -03:00 |
| oobabooga | 8eba88061a | Remove unused config | 2023-04-14 11:12:17 -03:00 |
| oobabooga | 8e31f2bad4 | Automatically set wbits/groupsize/instruct based on model name (#1167) | 2023-04-14 11:07:28 -03:00 |
| oobabooga | dd70f7edd5 | Add the default folders | 2023-01-06 02:38:09 -03:00 |