Commit Graph

3495 Commits

Author SHA1 Message Date
Philipp Claßen
3eca20c015
Typo fixed in variable names (#5184) 2024-01-06 03:05:03 -03:00
oobabooga
91c2b8e11c Improvements to character_bias extension 2024-01-04 20:48:26 -08:00
oobabooga
248742df1c Save extension fields to settings.yaml on "Save UI defaults" 2024-01-04 20:33:42 -08:00
oobabooga
9e86bea8e9 Use requirements_cpu.txt for Intel 2024-01-04 18:52:14 -08:00
oobabooga
3d854ee516
Pin PyTorch version to 2.1 (#5056) 2024-01-04 23:50:23 -03:00
Matthew Raaff
c9c31f71b8
Various one-click installer improvements (#4994)
Co-authored-by: oobabooga <112222186+oobabooga@users.noreply.github.com>
2024-01-04 23:41:54 -03:00
oobabooga
c9d814592e Increase maximum temperature value to 5 2024-01-04 17:28:15 -08:00
Guanghua Lu
3bb4b0504e
Close the menu on second click. (#5110) 2024-01-04 13:52:11 -03:00
oobabooga
e4d724eb3f Fix cache_folder bug introduced in 37eff915d6 2024-01-04 07:49:40 -08:00
Alberto Cano
37eff915d6
Use --disk-cache-dir for all caches 2024-01-04 00:27:26 -03:00
Lounger
7965f6045e
Fix loading latest history for file names with dots (#5162) 2024-01-03 22:39:41 -03:00
Adam Florizone
894e1a0700
Docker: added build args for non-AVX2 CPU (#5154) 2024-01-03 20:43:02 -03:00
AstrisCantCode
b80e6365d0
Fix various bugs for LoRA training (#5161) 2024-01-03 20:42:20 -03:00
oobabooga
f6a204d7c9 Bump llama-cpp-python to 0.2.26 2024-01-03 11:06:36 -08:00
oobabooga
3a6cba9021 Add top_k=1 to Debug-deterministic preset
Makes it work with llama.cpp
2024-01-02 15:54:56 -08:00
oobabooga
7cce88c403 Remove an unnecessary exception 2024-01-02 07:20:59 -08:00
oobabooga
90c7e84b01 UI: improve chat style margin for last bot message 2024-01-01 19:50:13 -08:00
oobabooga
a4b4708560 Decrease "Show controls" button opacity 2024-01-01 19:08:30 -08:00
oobabooga
94afa0f9cf Minor style changes 2024-01-01 16:00:22 -08:00
oobabooga
cbf6f9e695 Update some UI messages 2023-12-30 21:31:17 -08:00
oobabooga
2aad91f3c9
Remove deprecated command-line flags (#5131) 2023-12-31 02:07:48 -03:00
TheInvisibleMage
485b85ee76
Superboogav2 Quick Fixes (#5089) 2023-12-31 02:03:23 -03:00
oobabooga
2734ce3e4c
Remove RWKV loader (#5130) 2023-12-31 02:01:40 -03:00
oobabooga
0e54a09bcb
Remove exllamav1 loaders (#5128) 2023-12-31 01:57:06 -03:00
oobabooga
8e397915c9
Remove --sdp-attention, --xformers flags (#5126) 2023-12-31 01:36:51 -03:00
B611
b7dd1f9542
Specify utf-8 encoding for model metadata file open (#5125) 2023-12-31 01:34:32 -03:00
oobabooga
20a2eaaf95 Add .vs to .gitignore 2023-12-27 12:58:07 -08:00
oobabooga
a4079e879e CSS: don't change --chat-height when outside the chat tab 2023-12-27 11:51:55 -08:00
oobabooga
c419206ce1 Lint the JS/CSS 2023-12-27 09:59:23 -08:00
oobabooga
648c2d1cc2 Update settings-template.yaml 2023-12-25 15:25:16 -08:00
oobabooga
c21e3d6300
Merge pull request #5044 from TheLounger/style_improvements
Improve chat styles
2023-12-25 20:00:50 -03:00
oobabooga
2ad6c526b8 Check if extensions block exists before changing it 2023-12-25 14:43:12 -08:00
oobabooga
63553b41ed Improve some paddings 2023-12-25 14:25:31 -08:00
oobabooga
abd227594c Fix a border radius 2023-12-25 14:17:00 -08:00
oobabooga
8d0359a6d8 Rename some CSS variables 2023-12-25 14:10:07 -08:00
oobabooga
5466ae59a7 Prevent input/chat area overlap with new --my-delta variable 2023-12-25 14:07:31 -08:00
oobabooga
02d063fb9f Fix extra space after 18ca35faaa 2023-12-25 08:38:17 -08:00
oobabooga
ae927950a8 Remove instruct style border radius 2023-12-25 08:35:33 -08:00
oobabooga
18ca35faaa Space between chat tab and extensions block 2023-12-25 08:34:02 -08:00
oobabooga
73ba7a8921 Change height -> min-height for .chat 2023-12-25 08:32:02 -08:00
oobabooga
29b0f14d5a
Bump llama-cpp-python to 0.2.25 (#5077) 2023-12-25 12:36:32 -03:00
oobabooga
c06f630bcc Increase max_updates_second maximum value 2023-12-24 13:29:47 -08:00
Casper
92d5e64a82
Bump AutoAWQ to 0.1.8 (#5061) 2023-12-24 14:27:34 -03:00
oobabooga
4aeebfc571 Merge branch 'dev' into TheLounger-style_improvements 2023-12-24 09:24:55 -08:00
oobabooga
d76b00c211 Pin lm_eval package version 2023-12-24 09:22:31 -08:00
oobabooga
8c60495878 UI: add "Maximum UI updates/second" parameter 2023-12-24 09:17:40 -08:00
zhangningboo
1b8b61b928
Fix output_ids decoding for Qwen/Qwen-7B-Chat (#5045) 2023-12-22 23:11:02 -03:00
kabachuha
dbe438564e
Support for sending images into OpenAI chat API (#4827) 2023-12-22 22:45:53 -03:00
Stefan Daniel Schwarz
8956f3ebe2
Synthia instruction templates (#5041) 2023-12-22 22:19:43 -03:00
Yiximail
afc91edcb2
Reset the model_name after unloading the model (#5051) 2023-12-22 22:18:24 -03:00