Commit Graph

437 Commits

Author SHA1 Message Date
oobabooga
20484f26f3 Trying to make character bias more consistent 2023-02-15 23:38:52 -03:00
oobabooga
7bd2ae05bf Force your name to be "You" for pygmalion 2023-02-15 21:32:53 -03:00
This allows you to customize your displayed name.
oobabooga
3746d72853 More style fixes 2023-02-15 21:13:12 -03:00
oobabooga
6f213b8c14 Style fix 2023-02-15 20:58:17 -03:00
oobabooga
ccf10db60f Move stuff into tabs in chat mode 2023-02-15 20:55:32 -03:00
oobabooga
a55e8836f6 Bump gradio version 2023-02-15 20:20:56 -03:00
It looks uglier, but the old one was bugged and unstable.
oobabooga
0e89ff4b13 Clear the persistent history after clicking on "Clear history" 2023-02-15 16:49:52 -03:00
oobabooga
05b53e4626 Update README 2023-02-15 14:43:34 -03:00
oobabooga
ed73d00bd5 Update README 2023-02-15 14:43:13 -03:00
oobabooga
30fcb26737 Update README 2023-02-15 14:42:41 -03:00
oobabooga
b3bcd2881d Implement regenerate/impersonate the proper way (fixes #78) 2023-02-15 14:39:26 -03:00
oobabooga
5ee9283cae Mention BLIP 2023-02-15 13:53:38 -03:00
oobabooga
8d3b3959e7 Document --picture option 2023-02-15 13:50:18 -03:00
oobabooga
2eea0f4edb Minor change 2023-02-15 12:58:11 -03:00
oobabooga
3c31fa7079 Simplifications 2023-02-15 12:46:11 -03:00
oobabooga
80fbc584f7 Readability 2023-02-15 11:38:44 -03:00
oobabooga
b397bea387 Make chat history persistent 2023-02-15 11:30:38 -03:00
oobabooga
7be372829d Set chat prompt size in tokens 2023-02-15 10:18:50 -03:00
oobabooga
1622059179 Move BLIP to the CPU 2023-02-15 00:03:19 -03:00
It's just as fast
oobabooga
d4d90a8000 Merge pull request #76 from SillyLossy/main 2023-02-14 23:57:44 -03:00
Use BLIP to send a picture to model
oobabooga
8c3ef58e00 Use BLIP directly + some simplifications 2023-02-14 23:55:46 -03:00
SillyLossy
a7d98f494a Use BLIP to send a picture to model 2023-02-15 01:38:21 +02:00
oobabooga
79d3a524f2 Add a file 2023-02-14 15:18:05 -03:00
oobabooga
f6bf74dcd5 Add Silero TTS extension 2023-02-14 15:06:06 -03:00
oobabooga
01e5772302 Update README.md 2023-02-14 13:06:26 -03:00
oobabooga
d910d435cd Consider the softprompt in the maximum prompt length calculation 2023-02-14 12:06:47 -03:00
oobabooga
8b3bb512ef Minor bug fix (soft prompt was being loaded twice) 2023-02-13 23:34:04 -03:00
oobabooga
56bbc996a4 Minor CSS change for readability 2023-02-13 23:01:14 -03:00
oobabooga
210c918199 Update README.md 2023-02-13 21:49:19 -03:00
oobabooga
2fe9d7f372 Merge branch 'main' of github.com:oobabooga/text-generation-webui 2023-02-13 18:48:46 -03:00
oobabooga
7739a29524 Some simplifications 2023-02-13 18:48:32 -03:00
oobabooga
b7ddcab53a Update README.md 2023-02-13 15:52:49 -03:00
oobabooga
3277b751f5 Add softprompt support (for real this time) 2023-02-13 15:25:16 -03:00
Is this too much voodoo for our purposes?
oobabooga
aa1177ff15 Send last internal reply to input rather than visible 2023-02-13 03:29:23 -03:00
oobabooga
61aed97439 Slightly increase a margin 2023-02-12 17:38:54 -03:00
oobabooga
2c3abcf57a Add support for rosey/chip/joi instruct models 2023-02-12 09:46:34 -03:00
oobabooga
7ef7bba6e6 Add progress bar for model loading 2023-02-12 09:36:27 -03:00
oobabooga
939e9d00a2 Update README.md 2023-02-12 00:47:03 -03:00
oobabooga
bf9dd8f8ee Add --text-only option to the download script 2023-02-12 00:42:56 -03:00
oobabooga
42cc307409 Update README.md 2023-02-12 00:34:55 -03:00
oobabooga
66862203fc Only download safetensors if both pytorch and safetensors are present 2023-02-12 00:06:22 -03:00
oobabooga
5d3f15b915 Use the CPU if no GPU is detected 2023-02-11 23:17:06 -03:00
oobabooga
337290777b Rename example extension to "softprompt" 2023-02-11 17:17:10 -03:00
oobabooga
b3c4657c47 Remove commas from preset files 2023-02-11 14:54:29 -03:00
oobabooga
144857acfe Update README 2023-02-11 14:49:11 -03:00
oobabooga
0dd1409f24 Add penalty_alpha parameter (contrastive search) 2023-02-11 14:48:12 -03:00
oobabooga
8aafb55693 1-click installer now also works for AMD GPUs 2023-02-11 14:24:47 -03:00
(I think)
oobabooga
7eed553337 Merge branch 'main' of github.com:oobabooga/text-generation-webui 2023-02-11 08:00:29 -03:00
oobabooga
2ed0386d87 Fix replace last reply in --chat mode (for #69) 2023-02-11 07:59:54 -03:00
oobabooga
1e97cb9570 Merge pull request #68 from Spencer-Dawson/patch-1 2023-02-11 07:56:30 -03:00
Added ROCm Install instructions to README