| Author | Commit | Message | Date |
|--------|--------|---------|------|
| oobabooga | cd1cad1b47 | Bump exllamav2 | 2023-10-14 11:23:07 -07:00 |
| oobabooga | fae8062d39 | Bump to latest gradio (3.47) (#4258) | 2023-10-10 22:20:49 -03:00 |
| dependabot[bot] | 520cbb2ab1 | Bump safetensors from 0.3.2 to 0.4.0 (#4249) | 2023-10-10 17:41:09 -03:00 |
| jllllll | 0eda9a0549 | Use GPTQ wheels compatible with Pytorch 2.1 (#4210) | 2023-10-07 00:35:41 -03:00 |
| AG-w | 06fff3b2e9 | Fix python wheels for avx requirements (#4189) | 2023-10-06 15:42:44 -03:00 |
| turboderp | 8a98646a21 | Bump ExLlamaV2 to 0.0.5 (#4186) | 2023-10-05 19:12:22 -03:00 |
| oobabooga | 3f56151f03 | Bump to transformers 4.34 | 2023-10-05 08:55:14 -07:00 |
| oobabooga | ae4ba3007f | Add grammar to transformers and _HF loaders (#4091) | 2023-10-05 10:01:36 -03:00 |
| jllllll | 41a2de96e5 | Bump llama-cpp-python to 0.2.11 | 2023-10-01 18:08:10 -05:00 |
| oobabooga | 92a39c619b | Add Mistral support | 2023-09-28 15:41:03 -07:00 |
| jllllll | 2bd23c29cb | Bump llama-cpp-python to 0.2.7 (#4110) | 2023-09-27 23:45:36 -03:00 |
| jllllll | 13a54729b1 | Bump exllamav2 to 0.0.4 and use pre-built wheels (#4095) | 2023-09-26 21:36:14 -03:00 |
| oobabooga | 08c4fb12ae | Use bitsandbytes==0.38.1 for AMD | 2023-09-24 08:11:59 -07:00 |
| oobabooga | a3ad9fe6c0 | Add comments | 2023-09-24 06:08:39 -07:00 |
| oobabooga | 2e7b6b0014 | Create alternative requirements.txt with AMD and Metal wheels (#4052) | 2023-09-24 09:58:29 -03:00 |