Commit Graph

363 Commits

Author SHA1 Message Date
Martin J
06a4664805 Fix a regex issue in tokenize_dialogue.
The existing regex would fail for character names that start with a number, for example 9S or 2B.
2023-02-05 07:42:57 +01:00
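A minimal sketch of the kind of change described in the entry above (the pattern and helper below are illustrative, not the repository's actual code): a name pattern anchored on a leading letter misses names such as 9S or 2B, while one that accepts any word character handles them.

```python
import re

# Illustrative only: a pattern such as r"([A-Za-z]\w*):\s*(.*)" would skip
# names that start with a digit. Accepting any word character first fixes it.
DIALOGUE_LINE = re.compile(r"^(\w[\w ]*?):\s*(.*)$")

def split_dialogue_line(line):
    """Split one 'Name: text' line into (name, text), or return None."""
    match = DIALOGUE_LINE.match(line.strip())
    return match.groups() if match else None

print(split_dialogue_line("9S: Hello there"))  # ('9S', 'Hello there')
print(split_dialogue_line("2B: Understood."))  # ('2B', 'Understood.')
```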
oobabooga
2fe235738e Reorganize chat buttons 2023-02-04 22:53:42 -03:00
oobabooga
2207d44986 Windows doesn't like : in filenames 2023-02-04 20:07:39 -03:00
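A minimal sketch of the constraint behind the entry above (illustrative, not the project's actual code): Windows rejects ':' in filenames, so a timestamp such as 20:07:39 has to be reformatted before it can go into a saved chat-log name.

```python
from datetime import datetime

# ':' is invalid in Windows filenames, so the time uses '-' separators.
character = "Example"  # placeholder character name
stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
filename = f"{character}_{stamp}.json"
print(filename)  # e.g. Example_20230204-200739.json
```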
oobabooga
b4fc8dfa8f Add safetensors version 2023-02-04 18:58:17 -03:00
oobabooga
65266f3349 Fix loading official colab chat logs 2023-02-03 22:43:02 -03:00
oobabooga
54a6a74b7d Merge branch 'main' of github.com:oobabooga/text-generation-webui 2023-02-03 20:07:38 -03:00
oobabooga
3dbebe30b1 Remove deepspeed requirement (only works on Linux for now) 2023-02-03 20:07:13 -03:00
oobabooga
90bb2caffd Update README.md 2023-02-03 19:45:11 -03:00
oobabooga
9215e281ba Add --threads option to the download script 2023-02-03 18:57:12 -03:00
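A hedged sketch of what a --threads option on a download script usually amounts to (the requests-based downloader and the empty URL list are placeholders, not the actual script):

```python
import argparse
from concurrent.futures import ThreadPoolExecutor

import requests  # assumption: model files are fetched over plain HTTP(S)

def download(url):
    """Stream one file to disk, named after the last URL segment."""
    local_name = url.split("/")[-1]
    with requests.get(url, stream=True) as r:
        r.raise_for_status()
        with open(local_name, "wb") as f:
            for chunk in r.iter_content(chunk_size=1024 * 1024):
                f.write(chunk)

parser = argparse.ArgumentParser()
parser.add_argument("--threads", type=int, default=1)
args = parser.parse_args()

urls = []  # placeholder: the list of file URLs for the chosen model
with ThreadPoolExecutor(max_workers=args.threads) as pool:
    list(pool.map(download, urls))
```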
oobabooga
03f084f311 Add safetensors support 2023-02-03 18:36:32 -03:00
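For context on the safetensors entries above, a minimal sketch of loading such a checkpoint with the safetensors library (the filename is a placeholder):

```python
from safetensors.torch import load_file

# Loads a .safetensors checkpoint into an ordinary state dict without
# unpickling anything, which is both faster and safer than torch.load.
state_dict = load_file("model.safetensors", device="cpu")
print(sum(t.numel() for t in state_dict.values()), "parameters loaded")
```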
oobabooga
93b0d1b1b8 Update README.md 2023-02-03 10:14:52 -03:00
oobabooga
44e8c671f9 Fix API documentation formatting in chat mode 2023-02-03 10:00:05 -03:00
oobabooga
03ebfba0fb Bump bitsandbytes version 2023-02-03 09:29:31 -03:00
oobabooga
6212b41930 Update README 2023-02-03 09:13:14 -03:00
oobabooga
a28f0d8bd7 Show it/s in the same units with or without streaming
Closes #49
2023-02-03 09:11:11 -03:00
oobabooga
4e4cd67223 Save chat history with name/date in filename
Closes #50
2023-02-03 09:02:35 -03:00
oobabooga
3af3ffeb90 Make --help output more readable 2023-02-02 23:36:28 -03:00
oobabooga
638495b633 Simplify generate() function 2023-02-02 13:47:08 -03:00
oobabooga
3f05cf5ddd Simplify encode() function 2023-02-02 13:31:32 -03:00
oobabooga
afc2b0f4c8 Remove .gitignore 2023-02-02 12:42:49 -03:00
oobabooga
224be31a74 Use main bs4 package 2023-02-02 12:20:58 -03:00
oobabooga
cecaebc291 Add bs4 requirement (fixes #47) 2023-02-02 12:18:32 -03:00
oobabooga
0c2eb444e6 Merge branch 'main' of github.com:oobabooga/text-generation-webui 2023-02-02 12:17:02 -03:00
oobabooga
2583bc5840 Simplify deepspeed implementation (#40) 2023-02-02 12:15:44 -03:00
oobabooga
7f4315b120 Mention 8bit fix for Windows users
Closes #44, #20
2023-02-02 11:00:57 -03:00
oobabooga
d6b2d68527 Remove redundant requirements 2023-02-02 10:40:09 -03:00
oobabooga
f38c9bf428 Fix deepspeed (oops) 2023-02-02 10:39:37 -03:00
oobabooga
90f1067598 Move deepspeed parameters to another file 2023-02-02 10:25:09 -03:00
oobabooga
1a658b41aa Merge pull request #43 from 81300/ds
Add DeepSpeed ZeRO-3 integration
2023-02-02 10:03:19 -03:00
oobabooga
39461bd796 Delete .gitignore 2023-02-02 10:01:56 -03:00
81300
248ec4fa21 Merge branch 'oobabooga:main' into ds 2023-02-01 20:50:51 +02:00
81300
a6f4760772 Add arg for bfloat16 2023-02-01 20:22:07 +02:00
81300
c515282f5c no_split_module_classes not needed 2023-02-01 19:47:26 +02:00
81300
0a0d289537 Fix issue with generating on multiple GPUs 2023-02-01 19:02:07 +02:00
81300
a97afa6965 Add DeepSpeed ZeRO-3 integration 2023-02-01 18:48:13 +02:00
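A hedged sketch of what a DeepSpeed ZeRO-3 setup for transformers inference generally looks like (config values, model name, and import path are assumptions and may differ from this PR):

```python
import deepspeed
from transformers import AutoModelForCausalLM
from transformers.deepspeed import HfDeepSpeedConfig  # path varies by version

ds_config = {
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 3,
        "offload_param": {"device": "cpu", "pin_memory": True},
    },
    "train_micro_batch_size_per_gpu": 1,
    "train_batch_size": 1,
}

# Must exist before from_pretrained so the weights are loaded ZeRO-3 sharded.
dschf = HfDeepSpeedConfig(ds_config)
model = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder model
engine = deepspeed.initialize(model=model, config_params=ds_config)[0]
engine.module.eval()
```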
oobabooga
6b13816c47 Change default --disk behavior 2023-02-01 10:43:28 -03:00
oobabooga
119be56390 Add back low_cpu_mem_usage=True
Removing it didn't help with anything, so I am adding it back on a purely
superstitious basis.
2023-02-01 10:01:44 -03:00
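For reference on the flag mentioned above: low_cpu_mem_usage is a regular from_pretrained argument in transformers; a minimal sketch with a placeholder model name:

```python
from transformers import AutoModelForCausalLM

# Builds the model with empty weights and fills it straight from the
# checkpoint, avoiding a second full copy in RAM while loading.
model = AutoModelForCausalLM.from_pretrained(
    "gpt2",  # placeholder model name
    low_cpu_mem_usage=True,
)
```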
oobabooga
d4a0b377ab Allow standalone --cpu-memory
I think that what I am doing probably makes sense, but I could be wrong.
2023-01-31 21:23:16 -03:00
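A hedged sketch of what a CPU-memory cap typically maps to when loading with transformers/accelerate (the values, model name, and exact mapping are assumptions, not the project's code):

```python
from transformers import AutoModelForCausalLM

# device_map="auto" places layers on the GPU first and spills the rest to
# CPU RAM (and disk) according to the per-device limits in max_memory.
model = AutoModelForCausalLM.from_pretrained(
    "gpt2",                                   # placeholder model name
    device_map="auto",
    max_memory={0: "10GiB", "cpu": "30GiB"},  # illustrative limits
)
```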
oobabooga
efb0ab502e New preset 2023-01-31 21:03:25 -03:00
oobabooga
8ef89df746 Try to leave at least 1GiB free to prevent oom errors 2023-01-31 20:47:05 -03:00
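A hedged sketch of the headroom calculation described above, assuming torch.cuda.mem_get_info is available (the device index and CPU limit are placeholders):

```python
import torch

# Cap GPU usage at (total - 1 GiB) so generation has headroom and is less
# likely to hit a CUDA out-of-memory error.
free_bytes, total_bytes = torch.cuda.mem_get_info(0)
gpu_limit_gib = max(total_bytes // (1024 ** 3) - 1, 1)
max_memory = {0: f"{gpu_limit_gib}GiB", "cpu": "32GiB"}
print(max_memory)
```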
oobabooga
bb77f20a6c Don't use low_cpu_mem_usage and device_map together 2023-01-31 13:24:05 -03:00
oobabooga
824329749d Merge pull request #38 from Silver267/patch-2
Fix an error
2023-01-31 08:14:50 -03:00
oobabooga
001ecf95b2 Update server.py 2023-01-31 08:14:16 -03:00
Silver267
a85bb5e9a2 Fix an error
Fixes "UnboundLocalError: local variable 'substring_found' referenced before assignment" when loading non-pygmalion models in cai chat mode.
2023-01-31 01:34:10 -05:00
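The quoted traceback is the classic symptom of a variable that is only assigned inside a loop or conditional; a hedged sketch of the kind of fix involved (function name and surrounding logic are illustrative, not the actual server.py code):

```python
def looks_like_pygmalion(model_name, substrings=("pygmalion",)):
    # Initialize before the loop so the variable exists even when nothing
    # matches; without this line, the return statement raises
    # "UnboundLocalError: local variable 'substring_found' referenced
    # before assignment" for names that contain none of the substrings.
    substring_found = False
    for s in substrings:
        if s in model_name.lower():
            substring_found = True
    return substring_found
```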
oobabooga
5b0bbfa6e8 Clean up 2023-01-30 14:17:12 -03:00
oobabooga
7aa3d6583e Update README.md 2023-01-30 09:45:31 -03:00
oobabooga
239f96a9c5 Add extensions guide 2023-01-30 09:44:57 -03:00
oobabooga
dfbca86533 Add **bold** support in chat mode 2023-01-30 08:36:58 -03:00
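A minimal sketch of **bold** handling in rendered chat output (a hand-rolled regex is shown for illustration; the project may well rely on a Markdown library instead):

```python
import html
import re

def render_bold(text):
    """Escape the text for HTML, then turn **bold** markers into <b> tags."""
    text = html.escape(text)
    return re.sub(r"\*\*(.+?)\*\*", r"<b>\1</b>", text)

print(render_bold("This is **important**."))  # This is <b>important</b>.
```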
oobabooga
2dadf42cb5 Print the tokenized example dialogue in a prettier way 2023-01-30 08:29:49 -03:00
oobabooga
161cae001b I needed this 2023-01-29 23:20:22 -03:00