Author | Commit | Message | Date
oobabooga | 9f40032d32 | Add ExLlama support (#2444) | 2023-06-16 20:35:38 -03:00
IJumpAround | 020fe7b50b | Remove mutable defaults from function signature. (#1663) | 2023-05-08 22:55:41 -03:00
oobabooga | 057b1b2978 | Add credits | 2023-05-03 21:49:55 -03:00
Alex "mcmonkey" Goodwin | 0df0b2d0f9 | optimize stopping strings processing (#1625) | 2023-05-02 01:21:54 -03:00
oobabooga | ea6e77df72 | Make the code more like PEP8 for readability (#862) | 2023-04-07 00:15:45 -03:00
Maya | b246d17513 | Fix `type object is not subscriptable` on python 3.8 | 2023-03-31 14:20:31 +03:00
oobabooga | 304f812c63 | Gracefully handle CUDA out of memory errors with streaming | 2023-03-28 19:20:50 -03:00
oobabooga | af65c12900 | Change Stop button behavior | 2023-03-27 13:23:59 -03:00
oobabooga | 8c8e8b4450 | Fix the early stopping callback #559 | 2023-03-25 12:35:52 -03:00
oobabooga | 8747c74339 | Another missing import | 2023-03-23 22:19:01 -03:00
oobabooga | 7078d168c3 | Missing import | 2023-03-23 22:16:08 -03:00
oobabooga | d1327f99f9 | Fix broken callbacks.py | 2023-03-23 22:12:24 -03:00
oobabooga | bf22d16ebc | Clear cache while switching LoRAs | 2023-03-23 21:56:26 -03:00
oobabooga | 4578e88ffd | Stop the bot from talking for you in chat mode | 2023-03-23 21:38:20 -03:00
oobabooga | a717fd709d | Sort the imports | 2023-03-17 11:42:25 -03:00
oobabooga | 341e135036 | Various fixes in chat mode | 2023-03-12 02:53:08 -03:00
oobabooga | 0bd5430988 | Use 'with' statement to better handle streaming memory | 2023-03-12 02:04:28 -03:00
oobabooga | 37f0166b2d | Fix memory leak in new streaming (second attempt) | 2023-03-11 23:14:49 -03:00
oobabooga | ab50f80542 | New text streaming method (much faster) | 2023-03-08 02:46:35 -03:00