text-generation-webui/modules
Latest commit 5a79863df3 by oobabooga (2023-03-03 15:54:13 -03:00):
Increase the sequence length, decrease batch size
I have no idea what I am doing
chat.py                  Ensure proper no-streaming with generation_attempts > 1   2023-03-02 00:10:10 -03:00
deepspeed_parameters.py  Fix deepspeed (oops)                                      2023-02-02 10:39:37 -03:00
extensions.py            Move bot_picture.py inside the extension                  2023-02-25 03:00:19 -03:00
html_generator.py        Store thumbnails as files instead of base64 strings       2023-02-27 13:41:00 -03:00
LLaMA.py                 Increase the sequence length, decrease batch size         2023-03-03 15:54:13 -03:00
models.py                Add LLaMA support                                         2023-03-03 14:39:14 -03:00
RWKV.py                  Remove some unused imports                                2023-03-02 00:36:20 -03:00
shared.py                Add LLaMA support                                         2023-03-03 14:39:14 -03:00
stopping_criteria.py     Improve the imports                                       2023-02-23 14:41:42 -03:00
text_generation.py       Add a tokenizer placeholder                               2023-03-03 15:16:55 -03:00
ui.py                    Stop chat from flashing dark when processing              2023-03-03 13:19:13 -03:00