c74326de02 | oobabooga | 2023-09-22 10:37:22 -07:00 | Fixes by @jllllll
b4b5f45558 | oobabooga | 2023-09-22 10:28:22 -07:00 | Join the installation instructions
2d2a8cfb48 | oobabooga | 2023-09-22 10:08:08 -07:00 | Remove a file
3314b7d795 | oobabooga | 2023-09-22 10:03:56 -07:00 | Allow start scripts to have command-line flags
8ab3eca9ec | oobabooga | 2023-09-22 09:35:19 -07:00 | Add a warning for outdated installations
86648d4085 | oobabooga | 2023-09-22 08:13:11 -07:00 | Remove CUDA, keep only pytorch
66363a4d70 | oobabooga | 2023-09-22 08:02:21 -07:00 | Minor changes / reorder some functions
84b5a519cb | oobabooga | 2023-09-22 11:55:01 -03:00 | Merge pull request #4029 from jllllll/one-click (Various one-click-installer updates and fixes)
69b0aedd95 | jllllll | 2023-09-22 01:12:08 -05:00 | Fix missing models warning
060bb76aa0 | jllllll | 2023-09-22 01:10:30 -05:00 | Update WSL installer
9054c98eca | jllllll | 2023-09-21 23:00:33 -05:00 | Use --autostash on git pull
498552a92b | jllllll | 2023-09-21 22:23:23 -05:00 | More robust installation check for installer
cd1049eded | jllllll | 2023-09-21 21:52:29 -05:00 | Add Conda env deactivation to installer scripts (avoids conflicts with existing Conda installations)
6bbfc40d10 | jllllll | 2023-09-21 21:51:58 -05:00 | Add .git creation to installer
193fe18c8c | oobabooga | 2023-09-21 17:45:11 -07:00 | Resolve conflicts
df39f455ad | oobabooga | 2023-09-21 17:39:54 -07:00 | Merge remote-tracking branch 'second-repo/main' into merge-second-repo
fc2b831692 | oobabooga | 2023-09-21 15:55:09 -07:00 | Basic changes
b04b3957f9 | oobabooga | 2023-09-21 15:35:53 -07:00 | Move one-click-installers into the repository
05c4a4f83c | oobabooga | 2023-09-21 14:56:01 -07:00 | Bump exllamav2
9a5ab454b4 | oobabooga | 2023-09-21 14:49:00 -07:00 | Improve list styles
00ab450c13 | oobabooga | 2023-09-21 17:19:32 -03:00 | Multiple histories for each character (#4022)
029da9563f | oobabooga | 2023-09-19 14:14:40 -07:00 | Avoid redundant function call in llamacpp_hf
9b7646140c | oobabooga | 2023-09-19 13:51:57 -07:00 | Trim model path if using absolute path
869f47fff9 | oobabooga | 2023-09-19 13:51:57 -07:00 | Lint
13ac55fa18 | oobabooga | 2023-09-19 13:51:57 -07:00 | Reorder some functions
e2fddd9584 | oobabooga | 2023-09-19 13:12:34 -07:00 | More robust autoscrolling (attempt)
03dc69edc5 | oobabooga | 2023-09-19 13:12:19 -07:00 | ExLlama_HF (v1 and v2) prefix matching
5075087461 | oobabooga | 2023-09-19 13:11:46 -07:00 | Fix command-line arguments being ignored
ff5d3d2d09 | oobabooga | 2023-09-18 16:26:54 -07:00 | Add missing import
605ec3c9f2 | oobabooga | 2023-09-18 12:26:35 -07:00 | Add a warning about ExLlamaV2 without flash-attn
f0ef971edb | oobabooga | 2023-09-18 12:25:10 -07:00 | Remove obsolete warning
745807dc03 | oobabooga | 2023-09-18 11:02:45 -07:00 | Faster llamacpp_HF prefix matching
893a72a1c5 | BadisG | 2023-09-18 14:27:06 -03:00 | Stop generation immediately when using "Maximum tokens/second" (#3952) (Co-authored-by: oobabooga)
b7c55665c1 | jllllll | 2023-09-18 14:08:37 -03:00 | Bump llama-cpp-python to 0.2.6 (#3982)
8466cf229a | Cebtenzzre | 2023-09-18 12:15:02 -03:00 | llama.cpp: fix ban_eos_token (#3987)
0ede2965d5 | oobabooga | 2023-09-17 18:46:08 -07:00 | Remove an error message
661bfaac8e | dependabot[bot] | 2023-09-17 22:42:12 -03:00 | Update accelerate from ==0.22.* to ==0.23.* (#3981)
347aed4254 | Chenxiao Wang | 2023-09-17 22:39:29 -03:00 | extensions/openai: load extension settings via settings.yaml (#3953)
cc8eda298a | missionfloyd | 2023-09-17 22:33:00 -03:00 | Move hover menu shortcuts to right side (#3951)
280cca9f66 | oobabooga | 2023-09-17 18:01:27 -07:00 | Merge remote-tracking branch 'refs/remotes/origin/main'
b062d50c45 | oobabooga | 2023-09-17 18:00:32 -07:00 | Remove exllama import that causes problems
fee38e0601 | James Braza | 2023-09-17 19:26:05 -03:00 | Simplified ExLlama cloning instructions and failure message (#3972)
45335fa8f4 | Thireus ☠ | 2023-09-17 19:24:40 -03:00 | Bump ExLlamav2 to v0.0.2 (#3970)
9858acee7b | Lu Guanghua | 2023-09-17 17:35:43 -03:00 | Fix unexpected extensions load after gradio restart (#3965)
d9b0f2c9c3 | oobabooga | 2023-09-17 13:07:48 -07:00 | Fix llama.cpp double decoding
230b562d53 | FartyPants | 2023-09-17 17:00:00 -03:00 | Training_PRO extension - added target selector (#3969)
d71465708c | oobabooga | 2023-09-17 11:51:01 -07:00 | llamacpp_HF prefix matching
763ea3bcb2 | oobabooga | 2023-09-17 09:22:16 -07:00 | Improved multimodal error message
37e2980e05 | oobabooga | 2023-09-17 08:27:11 -07:00 | Recommend mul_mat_q for llama.cpp
a069f3904c | oobabooga | 2023-09-17 08:12:23 -07:00 | Undo part of ad8ac545a5