Replace old presets with the results of Preset Arena (#2830)

This commit is contained in:
oobabooga 2023-06-23 01:48:29 -03:00 committed by GitHub
parent aa1f1ef46a
commit 383c50f05b
31 changed files with 68 additions and 82 deletions


@@ -1,7 +1,3 @@
-**Help find the best parameters for text generation!**
-
-https://oobabooga.github.io/arena/preliminary-results.html
-
 # Text generation web UI
 
 A gradio web UI for running Large Language Models like LLaMA, llama.cpp, GPT-J, Pythia, OPT, and GALACTICA.
@@ -340,7 +336,7 @@ Out of memory errors? [Check the low VRAM guide](docs/Low-VRAM-guide.md).
 Inference settings presets can be created under `presets/` as yaml files. These files are detected automatically at startup.
 
-By default, 10 presets based on NovelAI and KoboldAI presets are included. These were selected out of a sample of 43 presets after applying a K-Means clustering algorithm and selecting the elements closest to the average of each cluster: [tSNE visualization](https://user-images.githubusercontent.com/112222186/228956352-1addbdb9-2456-465a-b51d-089f462cd385.png).
+The presets that are included by default are the result of a contest that received 7215 votes. More details can be found [here](https://github.com/oobabooga/oobabooga.github.io/blob/main/arena/results.md).
 
 ## Contributing
@@ -352,5 +348,5 @@ By default, 10 presets based on NovelAI and KoboldAI presets are included. These
 ## Credits
 
 - Gradio dropdown menu refresh button, code for reloading the interface: https://github.com/AUTOMATIC1111/stable-diffusion-webui
-- NovelAI and KoboldAI presets: https://github.com/KoboldAI/KoboldAI-Client/wiki/Settings-Presets
+- Godlike preset: https://github.com/KoboldAI/KoboldAI-Client/wiki/Settings-Presets
 - Code for early stopping in chat mode, code for some of the sliders: https://github.com/PygmalionAI/gradio-ui/
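The README hunk above notes that preset files under `presets/` are plain yaml detected automatically at startup. As a stdlib-only sketch of that idea (hypothetical — `load_presets` is an invented name, and the real project parses these files with a proper YAML library), the flat `key: value` presets added in this commit could be read like so:

```python
from pathlib import Path

def load_presets(preset_dir="presets"):
    """Collect generation presets from flat ``key: value`` yaml files.

    Hypothetical stand-in for the web UI's startup scan. Note that in
    this simplified sketch every numeric value, including top_k,
    comes back as a float.
    """
    presets = {}
    for path in sorted(Path(preset_dir).glob("*.yaml")):
        params = {}
        for line in path.read_text().splitlines():
            key, sep, value = line.partition(":")
            if not sep:
                continue  # skip blank or malformed lines
            value = value.strip()
            if value.lower() in ("true", "false"):
                params[key.strip()] = value.lower() == "true"
            else:
                try:
                    params[key.strip()] = float(value)
                except ValueError:
                    params[key.strip()] = value
        # the preset name is the file name without the extension,
        # e.g. "simple-1" for presets/simple-1.yaml
        presets[path.stem] = params
    return presets
```

The preset name shown in the UI dropdown corresponds to the file stem, which is why the settings change below refers to the new default simply as `simple-1`.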


@@ -65,7 +65,7 @@ settings = {
     'chat_generation_attempts_max': 10,
     'default_extensions': [],
     'chat_default_extensions': ['gallery'],
-    'preset': 'LLaMA-Precise',
+    'preset': 'simple-1',
     'prompt': 'QA',
 }

presets/Asterism.yaml (new file)

@@ -0,0 +1,6 @@
+temperature: 1.68
+top_p: 0.17
+tfs: 0.97
+top_a: 0.42
+repetition_penalty: 1.02
+top_k: 77

presets/Big O.yaml (new file)

@@ -0,0 +1,6 @@
+temperature: 0.87
+top_p: 0.99
+typical_p: 0.68
+tfs: 0.68
+repetition_penalty: 1.01
+top_k: 85

@@ -0,0 +1,3 @@
+do_sample: false
+top_k: 4
+penalty_alpha: 0.3

@@ -1 +1 @@
-do_sample: False
+do_sample: false

@@ -0,0 +1,7 @@
+temperature: 1.31
+top_p: 0.14
+epsilon_cutoff: 1.49
+eta_cutoff: 10.42
+top_a: 0.52
+repetition_penalty: 1.17
+top_k: 49

@@ -1,6 +1,4 @@
-do_sample: true
-top_p: 0.5
-top_k: 0
 temperature: 0.7
-repetition_penalty: 1.1
+top_p: 0.5
 typical_p: 0.19
+repetition_penalty: 1.1

@@ -1,6 +0,0 @@
-do_sample: true
-top_p: 1.0
-top_k: 0
-temperature: 0.66
-repetition_penalty: 1.1
-typical_p: 0.6

@@ -1,6 +1,4 @@
-do_sample: true
-top_p: 0.1
-top_k: 40
 temperature: 0.7
+top_p: 0.1
 repetition_penalty: 1.18
-typical_p: 1.0
+top_k: 40

@@ -0,0 +1,4 @@
+temperature: 0.98
+top_p: 0.37
+repetition_penalty: 1.18
+top_k: 100

presets/Mirostat.yaml (new file)

@@ -0,0 +1,2 @@
+mirostat_mode: 2
+mirostat_tau: 8

@@ -1,4 +0,0 @@
-do_sample: true
-temperature: 0.7
-top_p: 0.85
-top_k: 50

@@ -1,6 +0,0 @@
-do_sample: true
-top_p: 0.9
-top_k: 100
-temperature: 0.8
-repetition_penalty: 1.15
-typical_p: 1.0

@@ -1,6 +0,0 @@
-do_sample: true
-top_p: 1.0
-top_k: 100
-temperature: 2
-repetition_penalty: 1
-typical_p: 0.97

@@ -1,6 +0,0 @@
-do_sample: true
-top_p: 0.98
-top_k: 0
-temperature: 0.63
-repetition_penalty: 1.05
-typical_p: 1.0

@@ -1,6 +0,0 @@
-do_sample: true
-top_p: 0.85
-top_k: 12
-temperature: 2
-repetition_penalty: 1.15
-typical_p: 1.0

@@ -1,6 +0,0 @@
-do_sample: true
-top_p: 1.0
-top_k: 100
-temperature: 1.07
-repetition_penalty: 1.05
-typical_p: 1.0

@@ -1,6 +0,0 @@
-do_sample: true
-top_p: 1.0
-top_k: 0
-temperature: 0.44
-repetition_penalty: 1.15
-typical_p: 1.0

@@ -1,6 +0,0 @@
-do_sample: true
-top_p: 0.18
-top_k: 30
-temperature: 2.0
-repetition_penalty: 1.15
-typical_p: 1.0

@@ -1,6 +0,0 @@
-do_sample: true
-top_p: 0.73
-top_k: 0
-temperature: 0.72
-repetition_penalty: 1.1
-typical_p: 1.0

presets/Shortwave.yaml (new file)

@@ -0,0 +1,5 @@
+temperature: 1.53
+top_p: 0.64
+top_a: 0.04
+repetition_penalty: 1.07
+top_k: 33

presets/Space Alien.yaml (new file)

@@ -0,0 +1,4 @@
+temperature: 1.31
+top_p: 0.29
+repetition_penalty: 1.09
+top_k: 72

@@ -1,3 +0,0 @@
-do_sample: False
-penalty_alpha: 0.6
-top_k: 4

@@ -1,4 +0,0 @@
-do_sample: true
-eta_cutoff: 3
-temperature: 0.7
-repetition_penalty: 1.18

presets/StarChat.yaml (new file)

@@ -0,0 +1,3 @@
+temperature: 0.2
+top_p: 0.95
+top_k: 50

presets/Titanic.yaml (new file)

@@ -0,0 +1,7 @@
+temperature: 1.01
+top_p: 0.21
+eta_cutoff: 10.78
+top_a: 0.75
+repetition_penalty: 1.21
+encoder_repetition_penalty: 1.07
+top_k: 91

presets/Yara.yaml (new file)

@@ -0,0 +1,4 @@
+temperature: 0.82
+top_p: 0.21
+repetition_penalty: 1.19
+top_k: 72

presets/simple-1.yaml (new file)

@@ -0,0 +1,4 @@
+temperature: 0.7
+top_p: 0.9
+repetition_penalty: 1.15
+top_k: 20
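For orientation on what the numbers in these presets control: temperature rescales the logits before softmax, top_k caps how many candidate tokens survive, and top_p keeps the smallest set of tokens whose cumulative probability reaches the threshold. A rough NumPy illustration (not the web UI's actual sampling code; `sample_probs` is a made-up helper), using simple-1's values as defaults:

```python
import numpy as np

def sample_probs(logits, temperature=0.7, top_k=20, top_p=0.9):
    """Turn raw logits into a filtered sampling distribution."""
    # Temperature rescales the logits; lower values sharpen the
    # distribution, higher values flatten it.
    logits = np.asarray(logits, dtype=np.float64) / temperature
    probs = np.exp(logits - logits.max())  # stable softmax
    probs /= probs.sum()
    # Keep tokens from most to least likely until either the top_k
    # count or the top_p cumulative-probability budget is reached.
    order = np.argsort(probs)[::-1]
    keep = np.zeros(probs.shape, dtype=bool)
    cumulative = 0.0
    for rank, idx in enumerate(order):
        keep[idx] = True
        cumulative += probs[idx]
        if rank + 1 >= top_k or cumulative >= top_p:
            break
    probs[~keep] = 0.0
    return probs / probs.sum()  # renormalize over surviving tokens
```

This is only a conceptual sketch: real implementations typically mask filtered logits to negative infinity before the softmax and handle batching, but the interaction of the three preset parameters is the same.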

@@ -0,0 +1,4 @@
+temperature: 0.7
+tfs: 0.95
+top_a: 0.2
+repetition_penalty: 1.15

@@ -39,5 +39,5 @@ chat_generation_attempts_max: 10
 default_extensions: []
 chat_default_extensions:
 - gallery
-preset: LLaMA-Precise
+preset: simple-1
 prompt: QA
prompt: QA