text-generation-webui/modules
Last commit to this directory: 2023-10-22 15:57:19 -03:00
| File | Last commit message | Date |
|---|---|---|
| AutoGPTQ_loader.py | Add the --disable_exllama option for AutoGPTQ | 2023-08-12 02:26:58 -04:00 |
| block_requests.py | Bump to latest gradio (3.47) (#4258) | 2023-10-10 22:20:49 -03:00 |
| callbacks.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00 |
| chat.py | Add a rename menu for chat histories | 2023-09-21 19:16:51 -07:00 |
| ctransformers_model.py | Fix ctransformers model unload (#3711) | 2023-08-27 10:53:48 -03:00 |
| deepspeed_parameters.py | Fix typo in deepspeed_parameters.py (#3222) | 2023-07-24 11:17:28 -03:00 |
| evaluate.py | Clear the torch cache while evaluating | 2023-10-16 10:52:50 -07:00 |
| exllama_hf.py | Fix off-by-one error in exllama_hf caching logic (#4145) | 2023-10-05 12:20:56 -03:00 |
| exllama.py | Exllamav2 lora support (#4229) | 2023-10-14 16:12:41 -03:00 |
| exllamav2_hf.py | Add missing exception | 2023-10-20 23:53:24 -07:00 |
| exllamav2.py | Add flash-attention 2 for windows (#4235) | 2023-10-21 03:46:23 -03:00 |
| extensions.py | Fix unexpected extensions load after gradio restart (#3965) | 2023-09-17 17:35:43 -03:00 |
| github.py | Lint | 2023-09-25 20:31:11 -07:00 |
| GPTQ_loader.py | Remove unused import | 2023-08-10 00:10:14 -05:00 |
| grammar.py | Add grammar to transformers and _HF loaders (#4091) | 2023-10-05 10:01:36 -03:00 |
| html_generator.py | Fix default/notebook tabs css | 2023-10-10 18:45:12 -07:00 |
| llama_attn_hijack.py | Remove import. (#4247) | 2023-10-09 18:06:11 -03:00 |
| llamacpp_hf.py | Llamacpp_HF: Fix CFG cache init (#4219) | 2023-10-07 19:38:29 -03:00 |
| llamacpp_model.py | Add threads_batch parameter | 2023-10-01 21:28:00 -07:00 |
| loaders.py | Enable special token support for exllamav2 (#4314) | 2023-10-21 01:54:06 -03:00 |
| logging_colors.py | Add menus for saving presets/characters/instruction templates/prompts (#2621) | 2023-06-11 12:19:18 -03:00 |
| logits.py | Remove exllama import that causes problems | 2023-09-17 18:00:32 -07:00 |
| LoRA.py | Exllamav2 lora support (#4229) | 2023-10-14 16:12:41 -03:00 |
| metadata_gguf.py | Lint | 2023-09-19 13:51:57 -07:00 |
| models_settings.py | Use ExLlama_HF for GPTQ models by default | 2023-10-21 20:45:38 -07:00 |
| models.py | Check for torch.xpu.is_available() | 2023-10-16 12:53:40 -07:00 |
| monkey_patch_gptq_lora.py | fix lora training with alpaca_lora_4bit (#3853) | 2023-09-11 01:22:20 -03:00 |
| one_click_installer_check.py | Add instructions | 2023-09-22 13:10:03 -07:00 |
| presets.py | Add customizable ban tokens (#3899) | 2023-09-15 18:27:27 -03:00 |
| prompts.py | Remove an error message | 2023-09-17 18:46:08 -07:00 |
| relative_imports.py | Add ExLlama+LoRA support (#2756) | 2023-06-19 12:31:24 -03:00 |
| RoPE.py | Add missing file | 2023-08-25 07:10:26 -07:00 |
| RWKV.py | Add a note about RWKV loader | 2023-09-26 17:43:39 -07:00 |
| sampler_hijack.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00 |
| shared.py | Ignore warnings on Colab | 2023-10-21 21:45:25 -07:00 |
| text_generation.py | Experimental Intel Arc transformers support (untested) | 2023-10-15 20:51:11 -07:00 |
| training.py | Option to select/target additional linear modules/layers in LORA training (#4178) | 2023-10-22 15:57:19 -03:00 |
| ui_chat.py | Bump to latest gradio (3.47) (#4258) | 2023-10-10 22:20:49 -03:00 |
| ui_default.py | Improve --multi-user mode | 2023-09-26 06:42:33 -07:00 |
| ui_file_saving.py | Bump to latest gradio (3.47) (#4258) | 2023-10-10 22:20:49 -03:00 |
| ui_model_menu.py | Fix a warning | 2023-10-10 20:53:38 -07:00 |
| ui_notebook.py | Improve --multi-user mode | 2023-09-26 06:42:33 -07:00 |
| ui_parameters.py | Another minor fix | 2023-09-26 06:54:21 -07:00 |
| ui_session.py | Improve --multi-user mode | 2023-09-26 06:42:33 -07:00 |
| ui.py | Bump to latest gradio (3.47) (#4258) | 2023-10-10 22:20:49 -03:00 |
| utils.py | Bump to latest gradio (3.47) (#4258) | 2023-10-10 22:20:49 -03:00 |
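A per-file "last commit" listing like the one above can be regenerated offline from a local clone by shelling out to `git log -1` for each file. The sketch below is a hypothetical helper, not part of text-generation-webui: the `repo` path and the `modules` subdirectory name are assumptions you would adjust for your checkout.

```python
# Minimal sketch: build (filename, commit subject, author date) rows for each
# .py file in a subdirectory, mirroring GitHub's directory view. Assumes `git`
# is on PATH and `repo` points at a clone (e.g. of text-generation-webui).
import subprocess
from pathlib import Path


def last_commit_listing(repo, subdir="modules"):
    """Return one (name, subject, date) tuple per *.py file, sorted
    case-insensitively like GitHub's file listing."""
    rows = []
    files = sorted(Path(repo, subdir).glob("*.py"), key=lambda p: p.name.lower())
    for f in files:
        # %s = commit subject, %x1f = unit-separator byte, %ad = author date.
        out = subprocess.run(
            ["git", "-C", str(repo), "log", "-1",
             "--format=%s%x1f%ad", "--date=iso",
             "--", f"{subdir}/{f.name}"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        subject, date = out.split("\x1f")
        rows.append((f.name, subject, date))
    return rows
```

The unit-separator byte (`%x1f`) keeps the split unambiguous even when a commit subject itself contains spaces, slashes, or pipes, as many of the messages above do.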