Commit Graph

3636 Commits

Author SHA1 Message Date
oobabooga d8064c00e8 UI: hide chat scrollbar on desktop when not hovered 2024-02-17 20:47:14 -08:00
oobabooga 36c29084bb UI: fix instruct style background for multiline inputs 2024-02-17 20:09:47 -08:00
oobabooga 904867a139 UI: fix scroll down after sending a multiline message 2024-02-17 19:27:13 -08:00
oobabooga d6bd71db7f ExLlamaV2: fix loading when autosplit is not set 2024-02-17 12:54:37 -08:00
oobabooga af0bbf5b13 Lint 2024-02-17 09:01:04 -08:00
fschuh fa1019e8fe Removed extra spaces from Mistral instruction template that were causing Mistral to misbehave (#5517) 2024-02-16 21:40:51 -03:00
oobabooga c375c753d6 Bump bitsandbytes to 0.42 (Linux only) 2024-02-16 10:47:57 -08:00
oobabooga a6730f88f7 Add --autosplit flag for ExLlamaV2 (#5524) 2024-02-16 15:26:10 -03:00
oobabooga 4039999be5 Autodetect llamacpp_HF loader when tokenizer exists 2024-02-16 09:29:26 -08:00
oobabooga 76d28eaa9e Add a menu for customizing the instruction template for the model (#5521) 2024-02-16 14:21:17 -03:00
oobabooga 0e1d8d5601 Instruction template: make "Send to default/notebook" work without a tokenizer 2024-02-16 08:01:07 -08:00
oobabooga f465b7b486 Downloader: start one session per file (#5520) 2024-02-16 12:55:27 -03:00
oobabooga 44018c2f69 Add a "llamacpp_HF creator" menu (#5519) 2024-02-16 12:43:24 -03:00
oobabooga b2b74c83a6 Fix Qwen1.5 in llamacpp_HF 2024-02-15 19:04:19 -08:00
oobabooga 080f7132c0 Revert gradio to 3.50.2 (#5513) 2024-02-15 20:40:23 -03:00
oobabooga ea0e1feee7 Bump llama-cpp-python to 0.2.43 2024-02-14 21:58:24 -08:00
oobabooga 549f106879 Bump ExLlamaV2 to v0.0.13.2 2024-02-14 21:57:48 -08:00
oobabooga 7123ac3f77 Remove "Maximum UI updates/second" parameter (#5507) 2024-02-14 23:34:30 -03:00
DominikKowalczyk 33c4ce0720 Bump gradio to 4.19 (#5419) (Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>) 2024-02-14 23:28:26 -03:00
oobabooga 04d8bdf929 Fix ExLlamaV2 requirement on Windows 2024-02-14 06:31:20 -08:00
oobabooga b16958575f Minor bug fix 2024-02-13 19:48:32 -08:00
oobabooga d47182d9d1 llamacpp_HF: do not use oobabooga/llama-tokenizer (#5499) 2024-02-14 00:28:51 -03:00
oobabooga 3a9ce3cfa6 Update stalebot message 2024-02-13 19:06:32 -08:00
oobabooga 93dd31fc0f Increase stalebot timeout 2024-02-13 16:07:33 -08:00
oobabooga 069ed7c6ef Lint 2024-02-13 16:05:41 -08:00
oobabooga 193548edce Minor fix to ExLlamaV2 requirements 2024-02-13 16:00:06 -08:00
oobabooga 25b655faeb Merge remote-tracking branch 'refs/remotes/origin/dev' into dev 2024-02-13 15:49:53 -08:00
oobabooga f99f1fc68e Bump llama-cpp-python to 0.2.42 2024-02-13 15:49:20 -08:00
dependabot[bot] d8081e85ec Update peft requirement from ==0.7.* to ==0.8.* (#5446) 2024-02-13 16:27:18 -03:00
dependabot[bot] 653b195b1e Update numpy requirement from ==1.24.* to ==1.26.* (#5490) 2024-02-13 16:26:35 -03:00
dependabot[bot] 147b4cf3e0 Bump hqq from 0.1.2.post1 to 0.1.3 (#5489) 2024-02-13 16:25:02 -03:00
Steven K 512933fa44 Update main.css to allow scrolling in code blocks (#5495) 2024-02-13 16:24:30 -03:00
oobabooga e9fea353c5 Bump llama-cpp-python to 0.2.40 2024-02-13 11:22:34 -08:00
oobabooga 7342afaf19 Update the PyTorch installation instructions 2024-02-08 20:36:11 -08:00
oobabooga 86c320ab5a llama.cpp: add a progress bar for prompt evaluation 2024-02-07 21:56:10 -08:00
oobabooga acea6a6669 Add more exllamav2 wheels 2024-02-07 08:24:29 -08:00
oobabooga 35537ad3d1 Bump exllamav2 to 0.0.13.1 (#5463) 2024-02-07 13:17:04 -03:00
oobabooga b8e25e8678 Bump llama-cpp-python to 0.2.39 2024-02-07 06:50:47 -08:00
oobabooga c55b8ce932 Improved random preset generation 2024-02-06 08:51:52 -08:00
oobabooga 4e34ae0587 Minor logging improvements 2024-02-06 08:22:08 -08:00
oobabooga 3add2376cd Better warpers logging 2024-02-06 07:09:21 -08:00
oobabooga 494cc3c5b0 Handle empty sampler priority field, use default values 2024-02-06 07:05:32 -08:00
oobabooga 775902c1f2 Sampler priority: better logging, always save to presets 2024-02-06 06:49:22 -08:00
oobabooga acfbe6b3b3 Minor doc changes 2024-02-06 06:35:01 -08:00
oobabooga 8ee3cea7cb Improve some log messages 2024-02-06 06:31:27 -08:00
oobabooga 8a6d9abb41 Small fixes 2024-02-06 06:26:27 -08:00
oobabooga 2a1063eff5 Revert "Remove non-HF ExLlamaV2 loader (#5431)" (reverts commit cde000d478) 2024-02-06 06:21:36 -08:00
oobabooga 8c35fefb3b Add custom sampler order support (#5443) 2024-02-06 11:20:10 -03:00
oobabooga 7301c7618f Minor change to Models tab 2024-02-04 21:49:58 -08:00
oobabooga f234fbe83f Improve a log message after previous commit 2024-02-04 21:44:53 -08:00