oobabooga
1d1d9e40cd
Add seed to settings
2023-03-31 12:22:07 -03:00
oobabooga
fd72afd8e7
Increase the textbox sizes
2023-03-31 00:43:00 -03:00
oobabooga
bd65940a48
Increase --chat box height
2023-03-30 00:43:49 -03:00
oobabooga
55755e27b9
Don't hardcode prompts in the settings dict/json
2023-03-29 22:47:01 -03:00
oobabooga
1cb9246160
Adapt to the new model names
2023-03-29 21:47:36 -03:00
oobabooga
cac577d99f
Fix interface reloading
2023-03-28 13:25:58 -03:00
Alex "mcmonkey" Goodwin
9cc811a0e6
fix LoRA path typo in #549
2023-03-27 22:16:40 -07:00
Alex "mcmonkey" Goodwin
31f04dc615
Merge branch 'main' into add-train-lora-tab
2023-03-27 20:03:30 -07:00
oobabooga
005f552ea3
Some simplifications
2023-03-27 23:29:52 -03:00
oobabooga
fde92048af
Merge branch 'main' into catalpaaa-lora-and-model-dir
2023-03-27 23:16:44 -03:00
oobabooga
2f0571bfa4
Small style changes
2023-03-27 21:24:39 -03:00
oobabooga
c2cad30772
Merge branch 'main' into mcmonkey4eva-add-train-lora-tab
2023-03-27 21:05:44 -03:00
oobabooga
641e1a09a7
Don't flash when selecting a new prompt
2023-03-27 14:48:43 -03:00
oobabooga
268abd1cba
Add some space in notebook mode
2023-03-27 13:52:12 -03:00
Alex "mcmonkey" Goodwin
c07bcd0850
add some outputs to indicate progress updates (sorta)
An actual progress bar is still needed. Also minor formatting fixes.
2023-03-27 09:41:06 -07:00
oobabooga
af65c12900
Change Stop button behavior
2023-03-27 13:23:59 -03:00
oobabooga
572bafcd24
Less verbose message
2023-03-27 12:43:37 -03:00
Alex "mcmonkey" Goodwin
2afe1c13c1
move Training to before Interface mode
as Interface Mode seems to be a core 'settings' page that naturally belongs at the very end
2023-03-27 08:32:32 -07:00
oobabooga
202e981d00
Make Generate/Stop buttons smaller in notebook mode
2023-03-27 12:30:57 -03:00
Alex "mcmonkey" Goodwin
e439228ed8
Merge branch 'main' into add-train-lora-tab
2023-03-27 08:21:19 -07:00
oobabooga
57345b8f30
Add prompt loading/saving menus + reorganize interface
2023-03-27 12:16:37 -03:00
oobabooga
95c97e1747
Unload the model using the "Remove all" button
2023-03-26 23:47:29 -03:00
oobabooga
e07c9e3093
Merge branch 'main' into Brawlence-main
2023-03-26 23:40:51 -03:00
oobabooga
1c77fdca4c
Change notebook mode appearance
2023-03-26 22:20:30 -03:00
oobabooga
49c10c5570
Add support for the latest GPTQ models with group-size (#530)
**Warning: old 4-bit weights will not work anymore!**
See here for how to get up-to-date weights: https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model#step-2-get-the-pre-converted-weights
2023-03-26 00:11:33 -03:00
Alex "mcmonkey" Goodwin
566898a79a
initial lora training tab
2023-03-25 12:08:26 -07:00
catalpaaa
d51cb8292b
Update server.py
yea i should go to bed
2023-03-24 17:36:31 -07:00
catalpaaa
9e2963e0c8
Update server.py
2023-03-24 17:35:45 -07:00
catalpaaa
ec2a1facee
Update server.py
2023-03-24 17:34:33 -07:00
catalpaaa
b37c54edcf
lora-dir, model-dir and login auth
Added lora-dir, model-dir, and a login auth argument that points to a file containing usernames and passwords in the format "u:pw,u:pw,..."
2023-03-24 17:30:18 -07:00
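For illustration only, a minimal Python sketch of parsing a credentials string in that "u:pw,u:pw,..." format; the helper name and file name below are hypothetical and not taken from the commit:

# Hypothetical helper: split a "u:pw,u:pw,..." string into (user, password) pairs.
def parse_auth_creds(creds: str) -> list[tuple[str, str]]:
    pairs = []
    for entry in creds.split(","):
        entry = entry.strip()
        if entry:
            user, _, password = entry.partition(":")
            pairs.append((user, password))
    return pairs

# Usage, e.g. after reading an assumed credentials file:
# with open("auth.txt") as f:
#     credentials = parse_auth_creds(f.read())
print(parse_auth_creds("alice:secret1,bob:secret2"))  # [('alice', 'secret1'), ('bob', 'secret2')]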
oobabooga
d8e950d6bd
Don't load the model twice when using --lora
2023-03-24 16:30:32 -03:00
oobabooga
fd99995b01
Make the Stop button more consistent in chat mode
2023-03-24 15:59:27 -03:00
oobabooga
9bdb3c784d
Minor fix
2023-03-23 22:02:40 -03:00
oobabooga
bf22d16ebc
Clear cache while switching LoRAs
2023-03-23 21:56:26 -03:00
Φφ
483d173d23
Code reuse + indication
Now shows a message in the console when unloading weights. Also, reload_model() calls unload_model() first to free memory, so multiple reloads won't overfill it.
2023-03-23 07:06:26 +03:00
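As a rough sketch only (not the project's actual implementation), the unload-before-reload pattern described above can be expressed in Python like this; load_model() is a stub, and in practice GPU caches would also be cleared (e.g. via torch.cuda.empty_cache()):

import gc

model = None
tokenizer = None

def load_model():
    # Stub standing in for the real model loader.
    return object(), object()

def unload_model():
    global model, tokenizer
    model = None
    tokenizer = None
    gc.collect()                      # release the old weights before anything else is loaded
    print("Model weights unloaded.")  # console indication mentioned in the commit

def reload_model():
    global model, tokenizer
    unload_model()                    # free memory first so repeated reloads don't overfill it
    model, tokenizer = load_model()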
Φφ
1917b15275
Unload and reload models on request
2023-03-23 07:06:26 +03:00
wywywywy
61346b88ea
Add "seed" menu in the Parameters tab
2023-03-22 15:40:20 -03:00
oobabooga
4d701a6eb9
Create a mirror for the preset menu
2023-03-19 12:51:47 -03:00
oobabooga
20f5b455bf
Add parameters reference #386 #331
2023-03-17 20:19:04 -03:00
oobabooga
a717fd709d
Sort the imports
2023-03-17 11:42:25 -03:00
oobabooga
29fe7b1c74
Remove LoRA tab, move it into the Parameters menu
2023-03-17 11:39:48 -03:00
oobabooga
214dc6868e
Several QoL changes related to LoRA
2023-03-17 11:24:52 -03:00
oobabooga
104293f411
Add LoRA support
2023-03-16 21:31:39 -03:00
oobabooga
38d7017657
Add all command-line flags to "Interface mode"
2023-03-16 12:44:03 -03:00
oobabooga
d54f3f4a34
Add no-stream checkbox to the interface
2023-03-16 10:19:00 -03:00
oobabooga
25a00eaf98
Add "Experimental" warning
2023-03-15 23:43:35 -03:00
oobabooga
599d3139fd
Increase the reload timeout a bit
2023-03-15 23:34:08 -03:00
oobabooga
4d64a57092
Add Interface mode tab
2023-03-15 23:29:56 -03:00
oobabooga
ffb898608b
Mini refactor
2023-03-15 20:44:34 -03:00
oobabooga
67d62475dc
Further reorganize chat UI
2023-03-15 18:56:26 -03:00