File                         | Last commit message                                                  | Last commit date
-----------------------------|----------------------------------------------------------------------|--------------------------
AutoGPTQ_loader.py           | Fix is_ccl_available & is_xpu_available imports                      | 2023-10-26 20:27:04 -07:00
block_requests.py            |                                                                      |
callbacks.py                 | Intel Gpu support initialization (#4340)                             | 2023-10-26 23:39:51 -03:00
chat.py                      | Minor style change                                                   | 2023-11-18 21:11:04 -08:00
ctransformers_model.py       |                                                                      |
deepspeed_parameters.py      |                                                                      |
evaluate.py                  |                                                                      |
exllama_hf.py                |                                                                      |
exllama.py                   |                                                                      |
exllamav2_hf.py              | Add cache_8bit option                                                | 2023-11-02 11:23:04 -07:00
exllamav2.py                 | Add missing exllamav2 samplers                                       | 2023-11-16 07:09:40 -08:00
extensions.py                |                                                                      |
github.py                    |                                                                      |
GPTQ_loader.py               | make torch.load a bit safer (#4448)                                  | 2023-11-02 14:07:08 -03:00
grammar.py                   |                                                                      |
html_generator.py            | New feature: enlarge character pictures on click (#4654)             | 2023-11-19 02:05:17 -03:00
llama_attn_hijack.py         | Lint                                                                 | 2023-10-23 13:09:03 -07:00
llamacpp_hf.py               | Bump llama-cpp-python to 0.2.18 (2nd attempt) (#4637)                | 2023-11-18 00:31:27 -03:00
llamacpp_model.py            | llama.cpp: minor log change & lint                                   | 2023-11-27 10:44:55 -08:00
loaders.py                   | Sort                                                                 | 2023-11-28 18:43:33 -08:00
logging_colors.py            |                                                                      |
logits.py                    | Use convert_ids_to_tokens instead of decode in logits endpoint       | 2023-11-19 09:22:08 -08:00
LoRA.py                      | Minor LoRA bug fix                                                   | 2023-11-19 07:59:29 -08:00
metadata_gguf.py             |                                                                      |
models_settings.py           | Style changes                                                        | 2023-11-15 15:51:37 -08:00
models.py                    | Set use_fast=True by default, create --no_use_fast flag              | 2023-11-16 19:55:28 -08:00
monkey_patch_gptq_lora.py    |                                                                      |
one_click_installer_check.py | Lint                                                                 | 2023-11-16 18:03:06 -08:00
presets.py                   | New feature: "random preset" button (#4647)                          | 2023-11-18 18:31:41 -03:00
prompts.py                   | Fix "send instruction template to..." buttons (closes #4625)         | 2023-11-16 18:16:42 -08:00
relative_imports.py          |                                                                      |
RoPE.py                      |                                                                      |
RWKV.py                      | Intel Gpu support initialization (#4340)                             | 2023-10-26 23:39:51 -03:00
sampler_hijack.py            | Add temperature_last parameter (#4472)                               | 2023-11-04 13:09:07 -03:00
shared.py                    | Hide deprecated args from Session tab                                | 2023-11-21 15:15:16 -08:00
text_generation.py           | fix detection of stopping strings when HTML escaping is used (#4728) | 2023-11-27 15:42:08 -03:00
training.py                  | make torch.load a bit safer (#4448)                                  | 2023-11-02 14:07:08 -03:00
ui_chat.py                   | New feature: enlarge character pictures on click (#4654)             | 2023-11-19 02:05:17 -03:00
ui_default.py                |                                                                      |
ui_file_saving.py            | Refresh the Preset menu after saving a preset                        | 2023-11-18 14:03:42 -08:00
ui_model_menu.py             | Bump llama-cpp-python to 0.2.18 (2nd attempt) (#4637)                | 2023-11-18 00:31:27 -03:00
ui_notebook.py               |                                                                      |
ui_parameters.py             | New feature: "random preset" button (#4647)                          | 2023-11-18 18:31:41 -03:00
ui_session.py                | Hide deprecated args from Session tab                                | 2023-11-21 15:15:16 -08:00
ui.py                        | New feature: enlarge character pictures on click (#4654)             | 2023-11-19 02:05:17 -03:00
utils.py                     | Refactor the /v1/models endpoint                                     | 2023-11-07 19:59:27 -08:00