Commit Graph

2014 Commits

ramblingcoder
0d0d849478 Update Dockerfile to resolve superbooga requirement error (#2401) 2023-06-20 18:31:28 -03:00
EugeoSynthesisThirtyTwo
7625c6de89 fix usage of self in classmethod (#2781) 2023-06-20 16:18:42 -03:00
MikoAL
c40932eb39 Added Falcon LoRA training support (#2684)
I am 50% sure this will work
2023-06-20 01:03:44 -03:00
oobabooga
c623e142ac Bump llama-cpp-python 2023-06-20 00:49:38 -03:00
FartyPants
ce86f726e9 Added saving of training logs to training_log.json (#2769) 2023-06-20 00:47:36 -03:00
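The training-log commit above boils down to a small pattern: serialize the current training statistics to a JSON file on every logging step. A minimal sketch of that idea — the field names below are illustrative, not necessarily the exact keys the PR writes:

```python
import json
from pathlib import Path

def save_training_log(log: dict, path: str = "training_log.json") -> None:
    """Persist the latest training statistics as JSON.

    Rewriting the whole file on each call keeps the log readable and
    valid even if the training run is interrupted mid-step.
    """
    Path(path).write_text(json.dumps(log, indent=2))

# Illustrative fields; a real trainer would update these every step.
save_training_log({"current_steps": 128, "loss": 1.73, "learning_rate": 3e-4})
```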
oobabooga
017884132f Merge remote-tracking branch 'refs/remotes/origin/main' 2023-06-20 00:46:29 -03:00
oobabooga
e1cd6cc410 Minor style change 2023-06-20 00:46:18 -03:00
Cebtenzzre
59e7ecb198 llama.cpp: implement ban_eos_token via logits_processor (#2765) 2023-06-19 21:31:19 -03:00
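Implementing ban_eos_token via a logits processor, as the commit above does, is a standard sampling trick: a processor receives the raw next-token logits and pushes the EOS logit to negative infinity, so the token gets zero probability after softmax and generation cannot stop early. A hedged, framework-free sketch (the function names and EOS id here are illustrative, not the project's actual code):

```python
import math

def make_ban_eos_processor(eos_token_id: int):
    """Build a logits processor that forbids the end-of-sequence token.

    A logits processor takes the token ids generated so far plus the raw
    next-token logits and returns adjusted logits.
    """
    def processor(input_ids, scores):
        # -inf becomes probability 0 after softmax, banning the token.
        scores[eos_token_id] = -math.inf
        return scores
    return processor

ban_eos = make_ban_eos_processor(eos_token_id=2)  # 2 is a common LLaMA EOS id
logits = ban_eos(input_ids=[], scores=[0.1, 3.2, 5.0, 0.7])
print(logits[2])  # -inf
```

In practice such a processor is passed into the generation loop so the ban is re-applied at every sampling step.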
oobabooga
0d9d70ec7e Update docs 2023-06-19 12:52:23 -03:00
oobabooga
f6a602861e Update docs 2023-06-19 12:51:30 -03:00
oobabooga
5d4b4d15a5 Update Using-LoRAs.md 2023-06-19 12:43:57 -03:00
oobabooga
eb30f4441f Add ExLlama+LoRA support (#2756) 2023-06-19 12:31:24 -03:00
oobabooga
a1cac88c19 Update README.md 2023-06-19 01:28:23 -03:00
oobabooga
5f418f6171 Fix a memory leak (credits for the fix: Ph0rk0z) 2023-06-19 01:19:28 -03:00
ThisIsPIRI
def3b69002 Fix loading condition for universal llama tokenizer (#2753) 2023-06-18 18:14:06 -03:00
oobabooga
490a1795f0 Bump peft commit 2023-06-18 16:42:11 -03:00
oobabooga
09c781b16f Add modules/block_requests.py
This has become unnecessary, but it could be useful in the future
for other libraries.
2023-06-18 16:31:14 -03:00
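A module like block_requests.py exists to stop libraries from making network calls while they are being imported or initialized. The actual module patches the third-party requests library; the sketch below illustrates the same technique against the standard library's urllib instead, purely to stay dependency-free — an assumption, not the project's implementation:

```python
import urllib.request

class RequestBlocker:
    """Context manager that temporarily replaces urllib.request.urlopen
    so any library initialized inside the block cannot phone home."""

    def __enter__(self):
        self._original = urllib.request.urlopen
        urllib.request.urlopen = self._blocked
        return self

    def __exit__(self, *exc):
        # Restore the real function so normal requests work afterwards.
        urllib.request.urlopen = self._original

    @staticmethod
    def _blocked(*args, **kwargs):
        raise RuntimeError("network access is blocked during startup")

with RequestBlocker():
    try:
        urllib.request.urlopen("http://example.com")
    except RuntimeError as err:
        print(err)  # network access is blocked during startup
```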
oobabooga
687fd2604a Improve code/ul styles in chat mode 2023-06-18 15:52:59 -03:00
oobabooga
e8588d7077 Merge remote-tracking branch 'refs/remotes/origin/main' 2023-06-18 15:23:38 -03:00
oobabooga
44f28830d1 Chat CSS: fix ul, li, pre styles + remove redefinitions 2023-06-18 15:20:51 -03:00
Forkoz
3cae1221d4 Update exllama.py - Respect model dir parameter (#2744) 2023-06-18 13:26:30 -03:00
oobabooga
5b4c0155f6 Move a button 2023-06-18 01:56:43 -03:00
oobabooga
0686a2e75f Improve instruct colors in dark mode 2023-06-18 01:44:52 -03:00
oobabooga
c5641b65d3 Handle leading spaces properly in ExLlama 2023-06-17 19:35:12 -03:00
matatonic
1e97aaac95 extensions/openai: docs update, model loader, minor fixes (#2557) 2023-06-17 19:15:24 -03:00
matatonic
2220b78e7a models/config.yaml: +alpacino, +alpasta, +hippogriff, +gpt4all-snoozy, +lazarus, +based, -airoboros 4k (#2580) 2023-06-17 19:14:25 -03:00
oobabooga
05a743d6ad Make llama.cpp use tfs parameter 2023-06-17 19:08:25 -03:00
oobabooga
e19cbea719 Add a variable to modules/shared.py 2023-06-17 19:02:29 -03:00
oobabooga
cbd63eeeff Fix repeated tokens with exllama 2023-06-17 19:02:08 -03:00
oobabooga
766c760cd7 Use gen_begin_reuse in exllama 2023-06-17 18:00:10 -03:00
oobabooga
239b11c94b Minor bug fixes 2023-06-17 17:57:56 -03:00
Bhavika Tekwani
d8d29edf54 Install wheel using pip3 (#2719) 2023-06-16 23:46:40 -03:00
Jonathan Yankovich
a1ca1c04a1 Update ExLlama.md (#2729)
Add details for configuring exllama
2023-06-16 23:46:25 -03:00
oobabooga
b27f83c0e9 Make exllama stoppable 2023-06-16 22:03:23 -03:00
oobabooga
7f06d551a3 Fix streaming callback 2023-06-16 21:44:56 -03:00
oobabooga
1e400218e9 Fix a typo 2023-06-16 21:01:57 -03:00
oobabooga
5f392122fd Add gpu_split param to ExLlama
Adapted from code created by Ph0rk0z. Thank you Ph0rk0z.
2023-06-16 20:49:36 -03:00
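A gpu_split value for ExLlama is conventionally given as a comma-separated list of per-GPU VRAM allocations in GB, e.g. 17.2,24 for two devices. A small parsing sketch under that assumption (illustrative, not the project's code):

```python
def parse_gpu_split(value: str) -> list[float]:
    """Parse a comma-separated VRAM split such as "17.2,24" into one
    float (GB of VRAM to use) per GPU, in device order."""
    if not value.strip():
        return []  # empty string means no manual split
    return [float(part) for part in value.split(",")]

print(parse_gpu_split("17.2,24"))  # [17.2, 24.0]
```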
oobabooga
cb9be5db1c Update ExLlama.md 2023-06-16 20:40:12 -03:00
oobabooga
83be8eacf0 Minor fix 2023-06-16 20:38:32 -03:00
oobabooga
9f40032d32 Add ExLlama support (#2444) 2023-06-16 20:35:38 -03:00
oobabooga
dea43685b0 Add some clarifications 2023-06-16 19:10:53 -03:00
oobabooga
7ef6a50e84 Reorganize model loading UI completely (#2720) 2023-06-16 19:00:37 -03:00
oobabooga
57be2eecdf Update README.md 2023-06-16 15:04:16 -03:00
Meng-Yuan Huang
772d4080b2 Update llama.cpp-models.md for macOS (#2711) 2023-06-16 00:00:24 -03:00
Tom Jobbins
646b0c889f AutoGPTQ: Add UI and command line support for disabling fused attention and fused MLP (#2648) 2023-06-15 23:59:54 -03:00
dependabot[bot]
909d8c6ae3 Bump transformers from 4.30.0 to 4.30.2 (#2695) 2023-06-14 19:56:28 -03:00
oobabooga
2b9a6b9259 Merge remote-tracking branch 'refs/remotes/origin/main' 2023-06-14 18:45:24 -03:00
oobabooga
4d508cbe58 Add some checks to AutoGPTQ loader 2023-06-14 18:44:43 -03:00
FartyPants
56c19e623c Add LORA name instead of "default" in PeftModel (#2689) 2023-06-14 18:29:42 -03:00
oobabooga
134430bbe2 Minor change 2023-06-14 11:34:42 -03:00