| File | Last commit message | Date |
| --- | --- | --- |
| AutoGPTQ_loader.py | Add the --disable_exllama option for AutoGPTQ | 2023-08-12 02:26:58 -04:00 |
| block_requests.py | Block a cloudfare request | 2023-07-06 22:24:52 -07:00 |
| callbacks.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00 |
| chat.py | Show progress on impersonate | 2023-09-13 11:22:53 -07:00 |
| ctransformers_model.py | Fix ctransformers model unload (#3711) | 2023-08-27 10:53:48 -03:00 |
| deepspeed_parameters.py | Fix typo in deepspeed_parameters.py (#3222) | 2023-07-24 11:17:28 -03:00 |
| evaluate.py | Prevent extra keys from being saved to settings.yaml | 2023-09-11 20:13:10 -07:00 |
| exllama_hf.py | Lint | 2023-09-11 07:57:38 -07:00 |
| exllama.py | Simplified ExLlama cloning instructions and failure message (#3972) | 2023-09-17 19:26:05 -03:00 |
| exllamav2_hf.py | Fix NTK (alpha) and RoPE scaling for exllamav2 and exllamav2_HF (#3897) | 2023-09-13 02:35:09 -03:00 |
| exllamav2.py | Tokenization improvements | 2023-09-17 07:02:00 -07:00 |
| extensions.py | Fix unexpected extensions load after gradio restart (#3965) | 2023-09-17 17:35:43 -03:00 |
| github.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |
| GPTQ_loader.py | Remove unused import | 2023-08-10 00:10:14 -05:00 |
| html_generator.py | Prevent code blocks from flickering while streaming | 2023-09-15 07:46:26 -07:00 |
| llama_attn_hijack.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00 |
| llamacpp_hf.py | llamacpp_HF prefix matching | 2023-09-17 11:51:01 -07:00 |
| llamacpp_model.py | Fix llama.cpp double decoding | 2023-09-17 13:07:48 -07:00 |
| loaders.py | Add customizable ban tokens (#3899) | 2023-09-15 18:27:27 -03:00 |
| logging_colors.py | Add menus for saving presets/characters/instruction templates/prompts (#2621) | 2023-06-11 12:19:18 -03:00 |
| logits.py | Remove exllama import that causes problems | 2023-09-17 18:00:32 -07:00 |
| LoRA.py | Allow --lora to use an absolute path | 2023-08-10 10:03:12 -07:00 |
| metadata_gguf.py | Read more GGUF metadata (scale_linear and freq_base) (#3877) | 2023-09-12 17:02:42 -03:00 |
| models_settings.py | User config precedence over GGUF metadata | 2023-09-14 12:15:52 -07:00 |
| models.py | Allow custom tokenizer for llamacpp_HF loader (#3941) | 2023-09-15 12:38:38 -03:00 |
| monkey_patch_gptq_lora.py | fix lora training with alpaca_lora_4bit (#3853) | 2023-09-11 01:22:20 -03:00 |
| presets.py | Add customizable ban tokens (#3899) | 2023-09-15 18:27:27 -03:00 |
| prompts.py | Add a token counter similar to automatic1111 | 2023-08-20 19:37:33 -07:00 |
| relative_imports.py | Add ExLlama+LoRA support (#2756) | 2023-06-19 12:31:24 -03:00 |
| RoPE.py | Add missing file | 2023-08-25 07:10:26 -07:00 |
| RWKV.py | Add ExLlama support (#2444) | 2023-06-16 20:35:38 -03:00 |
| sampler_hijack.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00 |
| shared.py | Add back chat buttons with --chat-buttons (#3947) | 2023-09-16 00:39:37 -03:00 |
| text_generation.py | Undo part of ad8ac545a5 | 2023-09-17 08:12:23 -07:00 |
| training.py | fix lora training with alpaca_lora_4bit (#3853) | 2023-09-11 01:22:20 -03:00 |
| ui_chat.py | Add back chat buttons with --chat-buttons (#3947) | 2023-09-16 00:39:37 -03:00 |
| ui_default.py | Improve the UI tokenizer | 2023-09-15 19:30:44 -07:00 |
| ui_file_saving.py | Refresh the character dropdown when saving/deleting a character | 2023-08-14 08:23:41 -07:00 |
| ui_model_menu.py | Recommend mul_mat_q for llama.cpp | 2023-09-17 08:27:11 -07:00 |
| ui_notebook.py | Improve the UI tokenizer | 2023-09-15 19:30:44 -07:00 |
| ui_parameters.py | Add a simple tokenizer to the UI | 2023-09-15 19:09:03 -07:00 |
| ui_session.py | Make "Show controls" customizable through settings.yaml | 2023-08-16 07:04:18 -07:00 |
| ui.py | Add customizable ban tokens (#3899) | 2023-09-15 18:27:27 -03:00 |
| utils.py | Remove GGML support | 2023-09-11 07:44:00 -07:00 |