oobabooga | 8513028968 | Fix lag in the chat tab during streaming | 2023-12-12 13:01:25 -08:00
oobabooga | 736fe4aa3e | Fix server refusing to close on Ctrl+C | 2023-12-12 12:27:40 -08:00
oobabooga | 39d2fe1ed9 | Jinja templates for Instruct and Chat (#4874) | 2023-12-12 17:23:14 -03:00
oobabooga | aab0dd962d | Revert "Update callbacks.py to show tracebacks on ValueError (#4892)" | 2023-12-12 11:47:11 -08:00
    This reverts commit 993ca51a65.
dependabot[bot] | 7a987417bb | Bump optimum from 1.14.0 to 1.15.0 (#4885) | 2023-12-12 02:32:19 -03:00
dependabot[bot] | a17750db91 | Update peft requirement from ==0.6.* to ==0.7.* (#4886) | 2023-12-12 02:31:30 -03:00
dependabot[bot] | a8a92c6c87 | Update transformers requirement from ==4.35.* to ==4.36.* (#4882) | 2023-12-12 02:30:25 -03:00
Nehereus | 993ca51a65 | Update callbacks.py to show tracebacks on ValueError (#4892) | 2023-12-12 02:29:27 -03:00
Morgan Schweers | 602b8c6210 | Make new browser reloads recognize current model. (#4865) | 2023-12-11 02:51:01 -03:00
oobabooga | 8c8825b777 | Add QuIP# to README | 2023-12-08 08:40:42 -08:00
oobabooga | 2a335b8aa7 | Cleanup: set shared.model_name only once | 2023-12-08 06:35:23 -08:00
oobabooga | 62d59a516f | Add trust_remote_code to all HF loaders | 2023-12-08 06:29:26 -08:00
oobabooga | 181743fd97 | Fix missing spaces tokenizer issue (closes #4834) | 2023-12-08 05:16:46 -08:00
oobabooga | 00aedf9209 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-12-08 05:02:25 -08:00
oobabooga | 7bbe7e803a | Minor fix | 2023-12-08 05:01:25 -08:00
Yiximail | 1c74b3ab45 | Fix partial unicode characters issue (#4837) | 2023-12-08 09:50:53 -03:00
oobabooga | 2c5a1e67f9 | Parameters: change max_new_tokens & repetition_penalty_range defaults (#4842) | 2023-12-07 20:04:52 -03:00
Song Fuchang | e16e5997ef | Update IPEX install URL. (#4825) | 2023-12-06 21:07:01 -03:00
    Old pip URL no longer works. Use the latest URL from https://intel.github.io/intel-extension-for-pytorch/index.html#installation
oobabooga | d516815c9c | Model downloader: download only fp16 if both fp16 and GGUF are present | 2023-12-05 21:09:12 -08:00
oobabooga | 98361af4d5 | Add QuIP# support (#4803) | 2023-12-06 00:01:01 -03:00
    It has to be installed manually for now.
oobabooga | 6430acadde | Minor bug fix after https://github.com/oobabooga/text-generation-webui/pull/4814 | 2023-12-05 10:08:11 -08:00
oobabooga | c21a9668a5 | Lint | 2023-12-04 21:17:05 -08:00
erew123 | f786aa3caa | Clean-up Ctrl+C Shutdown (#4802) | 2023-12-05 02:16:16 -03:00
oobabooga | 0f828ea441 | Do not limit API updates/second | 2023-12-04 20:45:43 -08:00
oobabooga | 9edb193def | Optimize HF text generation (#4814) | 2023-12-05 00:00:40 -03:00
俞航 | ac9f154bcc | Bump exllamav2 from 0.0.8 to 0.0.10 & Fix code change (#4782) | 2023-12-04 21:15:05 -03:00
oobabooga | 131a5212ce | UI: update context upper limit to 200000 | 2023-12-04 15:48:34 -08:00
oobabooga | f7145544f9 | Update README | 2023-12-04 15:44:44 -08:00
oobabooga | 8e1f86a866 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-12-04 15:41:56 -08:00
oobabooga | be88b072e9 | Update --loader flag description | 2023-12-04 15:41:25 -08:00
dependabot[bot] | 801ba87c68 | Update accelerate requirement from ==0.24.* to ==0.25.* (#4810) | 2023-12-04 20:36:01 -03:00
oobabooga | 7fc9033b2e | Recommend ExLlama_HF and ExLlamav2_HF | 2023-12-04 15:28:46 -08:00
oobabooga | 3f993280e4 | Minor changes | 2023-12-04 07:27:44 -08:00
oobabooga | 0931ed501b | Minor changes | 2023-12-04 07:25:18 -08:00
oobabooga | 427a165597 | Bump TTS version in coqui_tts | 2023-12-04 07:21:56 -08:00
Song Fuchang | 0bfd5090be | Import accelerate very early to make Intel GPU happy (#4704) | 2023-12-03 22:51:18 -03:00
dependabot[bot] | 2e83844f35 | Bump safetensors from 0.4.0 to 0.4.1 (#4750) | 2023-12-03 22:50:10 -03:00
Ikko Eltociear Ashimine | 06cc9a85f7 | README: minor typo fix (#4793) | 2023-12-03 22:46:34 -03:00
Lounger | 7c0a17962d | Gallery improvements (#4789) | 2023-12-03 22:45:50 -03:00
oobabooga | 77d6ccf12b | Add a LOADER debug message while loading models | 2023-11-30 12:00:32 -08:00
oobabooga | 1c90e02243 | Update Colab-TextGen-GPU.ipynb | 2023-11-30 11:55:18 -08:00
oobabooga | 092a2c3516 | Fix a bug in llama.cpp get_logits() function | 2023-11-30 11:21:40 -08:00
oobabooga | 000b77a17d | Minor docker changes | 2023-11-29 21:27:23 -08:00
Callum | 88620c6b39 | feature/docker_improvements (#4768) | 2023-11-30 02:20:23 -03:00
oobabooga | 2698d7c9fd | Fix llama.cpp model unloading | 2023-11-29 15:19:48 -08:00
oobabooga | fa89d305e3 | Merge remote-tracking branch 'refs/remotes/origin/dev' into dev | 2023-11-29 15:13:17 -08:00
oobabooga | 9940ed9c77 | Sort the loaders | 2023-11-29 15:13:03 -08:00
Manu Kashyap | 78fd7f6aa8 | Fixed naming for sentence-transformers library (#4764) | 2023-11-29 12:15:03 -03:00
oobabooga | a7670c31ca | Sort | 2023-11-28 18:43:33 -08:00
oobabooga | 6e51bae2e0 | Sort the loaders menu | 2023-11-28 18:41:11 -08:00