| File | Latest commit message | Commit date |
| --- | --- | --- |
| AutoGPTQ_loader.py | Add the --disable_exllama option for AutoGPTQ | 2023-08-12 02:26:58 -04:00 |
| block_requests.py | Block a cloudfare request | 2023-07-06 22:24:52 -07:00 |
| callbacks.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00 |
| chat.py | Clear instruction template before loading new one | 2023-08-29 13:11:32 -07:00 |
| ctransformers_model.py | Fix ctransformers model unload (#3711) | 2023-08-27 10:53:48 -03:00 |
| deepspeed_parameters.py | Fix typo in deepspeed_parameters.py (#3222) | 2023-07-24 11:17:28 -03:00 |
| evaluate.py | Sort some imports | 2023-06-25 01:44:36 -03:00 |
| exllama_hf.py | Exllama new rope settings (#3852) | 2023-09-11 01:14:36 -03:00 |
| exllama.py | Exllama new rope settings (#3852) | 2023-09-11 01:14:36 -03:00 |
| extensions.py | Unify the 3 interface modes (#3554) | 2023-08-13 01:12:15 -03:00 |
| github.py | Implement sessions + add basic multi-user support (#2991) | 2023-07-04 00:03:30 -03:00 |
| GPTQ_loader.py | Remove unused import | 2023-08-10 00:10:14 -05:00 |
| html_generator.py | Prevent `<ul>` lists from flickering during streaming | 2023-08-28 20:45:07 -07:00 |
| llama_attn_hijack.py | Prevent unwanted log messages from modules | 2023-05-21 22:42:34 -03:00 |
| llamacpp_hf.py | Fix llamacpp_HF loading | 2023-08-26 22:15:06 -07:00 |
| llamacpp_model.py | Minor fixes/cosmetics | 2023-08-26 22:11:07 -07:00 |
| loaders.py | Fix ctransformers model unload (#3711) | 2023-08-27 10:53:48 -03:00 |
| logging_colors.py | Add menus for saving presets/characters/instruction templates/prompts (#2621) | 2023-06-11 12:19:18 -03:00 |
| logits.py | Improve logit viewer format | 2023-08-22 20:35:12 -07:00 |
| LoRA.py | Allow --lora to use an absolute path | 2023-08-10 10:03:12 -07:00 |
| models_settings.py | Use separate llama-cpp-python packages for GGML support | 2023-08-26 10:40:08 -05:00 |
| models.py | Use separate llama-cpp-python packages for GGML support | 2023-08-26 10:40:08 -05:00 |
| monkey_patch_gptq_lora.py | Revert "Remove GPTQ-for-LLaMa monkey patch support" | 2023-08-10 08:39:41 -07:00 |
| presets.py | Add Classifier Free Guidance (CFG) for Transformers/ExLlama (#3325) | 2023-08-06 17:22:48 -03:00 |
| prompts.py | Add a token counter similar to automatic1111 | 2023-08-20 19:37:33 -07:00 |
| relative_imports.py | Add ExLlama+LoRA support (#2756) | 2023-06-19 12:31:24 -03:00 |
| RoPE.py | Add missing file | 2023-08-25 07:10:26 -07:00 |
| RWKV.py | Add ExLlama support (#2444) | 2023-06-16 20:35:38 -03:00 |
| sampler_hijack.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00 |
| shared.py | Add max_tokens_second param (#3533) | 2023-08-29 17:44:31 -03:00 |
| text_generation.py | Set use_cache=True by default for all models | 2023-08-30 13:26:27 -07:00 |
| training.py | Add the option to use samplers in the logit viewer | 2023-08-22 20:18:16 -07:00 |
| ui_chat.py | Add a typing dots (...) animation to chat tab | 2023-08-28 13:50:36 -07:00 |
| ui_default.py | Autoscroll Notebook/Default textareas during streaming | 2023-08-28 18:22:03 -07:00 |
| ui_file_saving.py | Refresh the character dropdown when saving/deleting a character | 2023-08-14 08:23:41 -07:00 |
| ui_model_menu.py | Do not impose instruct mode while loading models | 2023-09-02 11:31:33 -07:00 |
| ui_notebook.py | Autoscroll Notebook/Default textareas during streaming | 2023-08-28 18:22:03 -07:00 |
| ui_parameters.py | Add max_tokens_second param (#3533) | 2023-08-29 17:44:31 -03:00 |
| ui_session.py | Make "Show controls" customizable through settings.yaml | 2023-08-16 07:04:18 -07:00 |
| ui.py | Add max_tokens_second param (#3533) | 2023-08-29 17:44:31 -03:00 |
| utils.py | Minor fixes/cosmetics | 2023-08-26 22:11:07 -07:00 |