| File | Last commit message | Last commit date |
| --- | --- | --- |
| grammar | | |
| AutoGPTQ_loader.py | | |
| block_requests.py | Bump gradio to 4.19 (#5419) | 2024-02-14 23:28:26 -03:00 |
| callbacks.py | Dynamic Temperature HF loader support (#5174) | 2024-01-07 10:36:26 -03:00 |
| chat.py | Improve a log message after previous commit | 2024-02-04 21:44:53 -08:00 |
| ctransformers_model.py | | |
| deepspeed_parameters.py | | |
| evaluate.py | | |
| exllamav2_hf.py | | |
| exllamav2.py | Revert "Remove non-HF ExLlamaV2 loader (#5431)" | 2024-02-06 06:21:36 -08:00 |
| extensions.py | UI: Do not save unchanged extension settings to settings.yaml | 2024-01-10 03:48:30 -08:00 |
| github.py | | |
| GPTQ_loader.py | | |
| gradio_hijack.py | Bump gradio to 4.19 (#5419) | 2024-02-14 23:28:26 -03:00 |
| html_generator.py | Lint | 2024-01-22 03:25:55 -08:00 |
| llama_cpp_python_hijack.py | Lint | 2024-02-13 16:05:41 -08:00 |
| llamacpp_hf.py | llama.cpp: add a progress bar for prompt evaluation | 2024-02-07 21:56:10 -08:00 |
| llamacpp_model.py | llama.cpp: add a progress bar for prompt evaluation | 2024-02-07 21:56:10 -08:00 |
| loaders.py | Revert "Remove non-HF ExLlamaV2 loader (#5431)" | 2024-02-06 06:21:36 -08:00 |
| logging_colors.py | Lint | 2023-12-19 21:36:57 -08:00 |
| logits.py | Revert "Remove non-HF ExLlamaV2 loader (#5431)" | 2024-02-06 06:21:36 -08:00 |
| LoRA.py | Revert "Remove non-HF ExLlamaV2 loader (#5431)" | 2024-02-06 06:21:36 -08:00 |
| metadata_gguf.py | | |
| models_settings.py | Revert "Remove non-HF ExLlamaV2 loader (#5431)" | 2024-02-06 06:21:36 -08:00 |
| models.py | llamacpp_HF: do not use oobabooga/llama-tokenizer (#5499) | 2024-02-14 00:28:51 -03:00 |
| monkey_patch_gptq_lora.py | | |
| one_click_installer_check.py | | |
| presets.py | Lint | 2024-02-13 16:05:41 -08:00 |
| prompts.py | | |
| relative_imports.py | | |
| RoPE.py | Lint | 2024-01-09 16:27:50 -08:00 |
| sampler_hijack.py | Better warpers logging | 2024-02-06 07:09:21 -08:00 |
| shared.py | Minor doc changes | 2024-02-06 06:35:01 -08:00 |
| text_generation.py | Handle empty sampler priority field, use default values | 2024-02-06 07:05:32 -08:00 |
| training.py | LoRA: Fix error "Attempting to unscale FP16 gradients" when training (#5268) | 2024-01-17 17:11:49 -03:00 |
| ui_chat.py | Bump gradio to 4.19 (#5419) | 2024-02-14 23:28:26 -03:00 |
| ui_default.py | Bump gradio to 4.19 (#5419) | 2024-02-14 23:28:26 -03:00 |
| ui_file_saving.py | Improve the file saving/deletion menus | 2024-01-09 06:33:47 -08:00 |
| ui_model_menu.py | Bump gradio to 4.19 (#5419) | 2024-02-14 23:28:26 -03:00 |
| ui_notebook.py | Bump gradio to 4.19 (#5419) | 2024-02-14 23:28:26 -03:00 |
| ui_parameters.py | Add custom sampler order support (#5443) | 2024-02-06 11:20:10 -03:00 |
| ui_session.py | Bump gradio to 4.19 (#5419) | 2024-02-14 23:28:26 -03:00 |
| ui.py | Add custom sampler order support (#5443) | 2024-02-06 11:20:10 -03:00 |
| utils.py | Change some log messages when deleting files | 2024-01-09 03:32:01 -08:00 |