text-generation-webui/modules

Latest commit: e4c3e1bdd2 by Ravindra Marella, 2023-08-27 10:53:48 -03:00
Fix ctransformers model unload (#3711)
Add missing comma in model types list. Fixes marella/ctransformers#111
File | Last commit | Date
AutoGPTQ_loader.py | Add the --disable_exllama option for AutoGPTQ | 2023-08-12 02:26:58 -04:00
block_requests.py | Block a cloudfare request | 2023-07-06 22:24:52 -07:00
callbacks.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00
chat.py | Unescape last message (#3623) | 2023-08-19 09:29:08 -03:00
ctransformers_model.py | Fix ctransformers model unload (#3711) | 2023-08-27 10:53:48 -03:00
deepspeed_parameters.py | Fix typo in deepspeed_parameters.py (#3222) | 2023-07-24 11:17:28 -03:00
evaluate.py | Sort some imports | 2023-06-25 01:44:36 -03:00
exllama_hf.py | Add rope_freq_base parameter for CodeLlama | 2023-08-25 06:55:15 -07:00
exllama.py | Add rope_freq_base parameter for CodeLlama | 2023-08-25 06:55:15 -07:00
extensions.py | Unify the 3 interface modes (#3554) | 2023-08-13 01:12:15 -03:00
github.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00
GPTQ_loader.py | Remove unused import | 2023-08-10 00:10:14 -05:00
html_generator.py | Unescape inline code blocks | 2023-08-24 21:01:09 -07:00
llama_attn_hijack.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00
llamacpp_hf.py | Fix llamacpp_HF loading | 2023-08-26 22:15:06 -07:00
llamacpp_model.py | Minor fixes/cosmetics | 2023-08-26 22:11:07 -07:00
loaders.py | Fix ctransformers model unload (#3711) | 2023-08-27 10:53:48 -03:00
logging_colors.py | Add menus for saving presets/characters/instruction templates/prompts (#2621) | 2023-06-11 12:19:18 -03:00
logits.py | Improve logit viewer format | 2023-08-22 20:35:12 -07:00
LoRA.py | Allow --lora to use an absolute path | 2023-08-10 10:03:12 -07:00
models_settings.py | Use separate llama-cpp-python packages for GGML support | 2023-08-26 10:40:08 -05:00
models.py | Use separate llama-cpp-python packages for GGML support | 2023-08-26 10:40:08 -05:00
monkey_patch_gptq_lora.py | Revert "Remove GPTQ-for-LLaMa monkey patch support" | 2023-08-10 08:39:41 -07:00
presets.py | Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) | 2023-08-06 17:22:48 -03:00
prompts.py | Add a token counter similar to automatic1111 | 2023-08-20 19:37:33 -07:00
relative_imports.py | Add ExLlama+LoRA support (#2756) | 2023-06-19 12:31:24 -03:00
RoPE.py | Add missing file | 2023-08-25 07:10:26 -07:00
RWKV.py | Add ExLlama support (#2444) | 2023-06-16 20:35:38 -03:00
sampler_hijack.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00
shared.py | Fix a typo | 2023-08-25 07:08:38 -07:00
text_generation.py | Fix an escaping bug | 2023-08-20 21:50:42 -07:00
training.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00
ui_chat.py | Add "send to default/notebook" buttons to chat tab | 2023-08-20 19:54:59 -07:00
ui_default.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00
ui_file_saving.py | Refresh the character dropdown when saving/deleting a character | 2023-08-14 08:23:41 -07:00
ui_model_menu.py | Update truncation length based on max_seq_len/n_ctx | 2023-08-26 23:10:45 -07:00
ui_notebook.py | Fix Notebook Logits tab | 2023-08-22 21:00:12 -07:00
ui_parameters.py | Update truncation length based on max_seq_len/n_ctx | 2023-08-26 23:10:45 -07:00
ui_session.py | Make "Show controls" customizable through settings.yaml | 2023-08-16 07:04:18 -07:00
ui.py | Add rope_freq_base parameter for CodeLlama | 2023-08-25 06:55:15 -07:00
utils.py | Minor fixes/cosmetics | 2023-08-26 22:11:07 -07:00