text-generation-webui/modules

Latest commit: oobabooga 8705eba830 "Remove universal llama tokenizer support: instead replace it with a warning if the tokenizer files look off" (2023-07-04 19:43:19 -07:00)
| File | Last commit | Date |
| --- | --- | --- |
| AutoGPTQ_loader.py | Add --no_use_cuda_fp16 param for AutoGPTQ | 2023-06-23 12:22:56 -03:00 |
| block_requests.py | Add modules/block_requests.py | 2023-06-18 16:31:14 -03:00 |
| callbacks.py | Make stop_everything work with non-streamed generation (#2848) | 2023-06-24 11:19:16 -03:00 |
| chat.py | Fix start_with | 2023-07-03 23:32:02 -07:00 |
| deepspeed_parameters.py | Style improvements (#1957) | 2023-05-09 22:49:39 -03:00 |
| evaluate.py | Sort some imports | 2023-06-25 01:44:36 -03:00 |
| exllama_hf.py | Add Support for Static NTK RoPE scaling for exllama/exllama_hf (#2955) | 2023-07-04 01:13:16 -03:00 |
| exllama.py | Add Support for Static NTK RoPE scaling for exllama/exllama_hf (#2955) | 2023-07-04 01:13:16 -03:00 |
| extensions.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |
| github.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |
| GPTQ_loader.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00 |
| html_generator.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |
| llama_attn_hijack.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00 |
| llamacpp_model.py | fix usage of self in classmethod (#2781) | 2023-06-20 16:18:42 -03:00 |
| loaders.py | Add Support for Static NTK RoPE scaling for exllama/exllama_hf (#2955) | 2023-07-04 01:13:16 -03:00 |
| logging_colors.py | Add menus for saving presets/characters/instruction templates/prompts (#2621) | 2023-06-11 12:19:18 -03:00 |
| LoRA.py | Update LoRA.py - avoid potential error (#2953) | 2023-07-03 17:40:22 -03:00 |
| models_settings.py | Keep ExLlama_HF if already selected | 2023-06-25 19:06:28 -03:00 |
| models.py | Remove universal llama tokenizer support | 2023-07-04 19:43:19 -07:00 |
| monkey_patch_gptq_lora.py | Sort some imports | 2023-06-25 01:44:36 -03:00 |
| presets.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |
| relative_imports.py | Add ExLlama+LoRA support (#2756) | 2023-06-19 12:31:24 -03:00 |
| RWKV.py | Add ExLlama support (#2444) | 2023-06-16 20:35:38 -03:00 |
| sampler_hijack.py | Add repetition penalty range parameter to transformers (#2916) | 2023-06-29 13:40:13 -03:00 |
| shared.py | Add Support for Static NTK RoPE scaling for exllama/exllama_hf (#2955) | 2023-07-04 01:13:16 -03:00 |
| text_generation.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |
| training.py | Update training.py - correct use of lora_names (#2988) | 2023-07-03 17:41:18 -03:00 |
| ui.py | Fix #3003 | 2023-07-04 11:38:35 -03:00 |
| utils.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |