Commit Graph

3285 Commits

Author  SHA1  Message  Date
oobabooga  1fba6db69f  Merge pull request #4488 from oobabooga/dev (Merge dev branch)  2023-11-06 12:18:55 -03:00
oobabooga  0ed6a17ed4  Update warning  2023-11-06 07:17:49 -08:00
oobabooga  0db81355bc  Reorder a parameter  2023-11-06 07:11:49 -08:00
oobabooga  b87c6213ae  Remove obsolete endpoint  2023-11-06 05:45:45 -08:00
oobabooga  fcc9114b58  Merge remote-tracking branch 'refs/remotes/origin/dev' into dev  2023-11-06 05:38:47 -08:00
oobabooga  ceb8c92dfc  Update 12 - OpenAI API.md  2023-11-06 10:38:22 -03:00
oobabooga  28fd535f9c  Make chat API more robust  2023-11-06 05:22:01 -08:00
oobabooga  5b5ef57049  Remove file  2023-11-05 21:39:59 -08:00
oobabooga  ec17a5d2b7  Make OpenAI API the default API (#4430)  2023-11-06 02:38:29 -03:00
俞航  84d957ba62  [Fix] fix openai embedding_model loading as str (#4147)  2023-11-05 20:42:45 -03:00
kabachuha  e18a0460d4  fix openai extension not working because of absent new defaults (#4477)  2023-11-04 16:12:51 -03:00
oobabooga  b7a409ef57  Merge pull request #4476 from oobabooga/dev (Merge dev branch)  2023-11-04 15:04:43 -03:00
oobabooga  fb3bd0203d  Update docs  2023-11-04 11:02:24 -07:00
oobabooga  1d8c7c1fc4  Update docs  2023-11-04 11:01:15 -07:00
oobabooga  b5c53041b8  Merge pull request #4475 from oobabooga/dev (Merge dev branch)  2023-11-04 14:19:55 -03:00
oobabooga  40f7f37009  Update requirements  2023-11-04 10:12:06 -07:00
Orang  2081f43ac2  Bump transformers to 4.35.* (#4474)  2023-11-04 14:00:24 -03:00
feng lui  4766a57352  transformers: add use_flash_attention_2 option (#4373)  2023-11-04 13:59:33 -03:00
wouter van der plas  add359379e  fixed two links in the ui (#4452)  2023-11-04 13:41:42 -03:00
Casper  cfbd108826  Bump AWQ to 0.1.6 (#4470)  2023-11-04 13:09:41 -03:00
oobabooga  aa5d671579  Add temperature_last parameter (#4472)  2023-11-04 13:09:07 -03:00
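The temperature_last entry refers to a sampling-order switch: when enabled, temperature is applied after the other samplers rather than before them, which changes which tokens survive probability-mass cutoffs such as top-p. A minimal pure-Python sketch of the idea — the function names and the two-stage pipeline are illustrative assumptions, not the webui's actual implementation (which works on PyTorch tensors through transformers logits processors):

```python
import math

def softmax(logits):
    # numerically stable softmax; exp(-inf) evaluates to 0.0
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_p_filter(logits, top_p):
    # keep the smallest set of tokens whose cumulative probability
    # reaches top_p; mask everything else to -inf
    probs = softmax(logits)
    order = sorted(range(len(logits)), key=lambda i: -probs[i])
    keep, cum = set(), 0.0
    for i in order:
        keep.add(i)
        cum += probs[i]
        if cum >= top_p:
            break
    return [x if i in keep else float("-inf") for i, x in enumerate(logits)]

def sample_pipeline(logits, temperature, top_p, temperature_last=False):
    """With temperature_last=True, temperature is applied after top-p,
    so the top-p cutoff sees the un-tempered distribution."""
    if temperature_last:
        logits = top_p_filter(logits, top_p)
        logits = [x / temperature for x in logits]
    else:
        logits = [x / temperature for x in logits]
        logits = top_p_filter(logits, top_p)
    return softmax(logits)
```

With a low temperature applied first, the distribution sharpens and top-p can collapse to a single candidate; applying temperature last leaves more candidates in play, which is the behavioral difference the parameter exposes.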
oobabooga  1ab8700d94  Change frequency/presence penalty ranges  2023-11-03 17:38:19 -07:00
oobabooga  45fcb60e7a  Make truncation_length_max apply to max_seq_len/n_ctx  2023-11-03 11:29:31 -07:00
oobabooga  7f9c1cbb30  Change min_p default to 0.0  2023-11-03 08:25:22 -07:00
oobabooga  4537853e2c  Change min_p default to 1.0  2023-11-03 08:13:50 -07:00
kalomaze  367e5e6e43  Implement Min P as a sampler option in HF loaders (#4449)  2023-11-02 16:32:51 -03:00
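Min P keeps only tokens whose probability is at least min_p times that of the most likely token, which also explains the default churn above: 0.0 disables the filter entirely, while 1.0 would keep only the single top token. A hedged sketch of the filter in pure Python, not kalomaze's actual tensor-based implementation:

```python
import math

def min_p_filter(logits, min_p):
    """Min P: discard tokens whose probability falls below
    min_p times the probability of the most likely token."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    threshold = min_p * max(probs)
    # mask sub-threshold tokens to -inf so a later softmax zeroes them
    return [x if p >= threshold else float("-inf")
            for x, p in zip(logits, probs)]
```

Because the threshold scales with the top token's probability, the filter adapts: it prunes aggressively when the model is confident and permissively when the distribution is flat.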
oobabooga  fcb7017b7a  Remove a checkbox  2023-11-02 12:24:09 -07:00
Julien Chaumond  fdcaa955e3  transformers: Add a flag to force load from safetensors (#4450)  2023-11-02 16:20:54 -03:00
oobabooga  c0655475ae  Add cache_8bit option  2023-11-02 11:23:04 -07:00
oobabooga  42f816312d  Merge remote-tracking branch 'refs/remotes/origin/dev' into dev  2023-11-02 11:09:26 -07:00
oobabooga  77abd9b69b  Add no_flash_attn option  2023-11-02 11:08:53 -07:00
Julien Chaumond  a56ef2a942  make torch.load a bit safer (#4448)  2023-11-02 14:07:08 -03:00
deevis  deba039c03  (fix): OpenOrca-Platypus2 models should use correct instruction_template and custom_stopping_strings (#4435)  2023-11-01 01:51:00 -03:00
Mehran Ziadloo  aaf726dbfb  Updating the shared settings object when loading a model (#4425)  2023-11-01 01:29:57 -03:00
oobabooga  9bd0724d85  Change frequency/presence penalty ranges  2023-10-31 20:57:56 -07:00
Orang  6b7fa45cc3  Update exllamav2 version (#4417)  2023-10-31 19:12:14 -03:00
Casper  41e159e88f  Bump AutoAWQ to v0.1.5 (#4410)  2023-10-31 19:11:22 -03:00
Meheret  0707ed7677  updated wiki link (#4415)  2023-10-31 19:09:05 -03:00
oobabooga  262f8ae5bb  Use default gr.Dataframe for evaluation table  2023-10-27 06:49:14 -07:00
James Braza  f481ce3dd8  Adding platform_system to autoawq (#4390)  2023-10-27 01:02:28 -03:00
dependabot[bot]  af98587580  Update accelerate requirement from ==0.23.* to ==0.24.* (#4400)  2023-10-27 00:46:16 -03:00
oobabooga  839a87bac8  Fix is_ccl_available & is_xpu_available imports  2023-10-26 20:27:04 -07:00
Abhilash Majumder  778a010df8  Intel Gpu support initialization (#4340)  2023-10-26 23:39:51 -03:00
GuizzyQC  317e2c857e  sd_api_pictures: fix Gradio warning message regarding custom value (#4391)  2023-10-26 23:03:21 -03:00
oobabooga  92b2f57095  Minor metadata bug fix (second attempt)  2023-10-26 18:57:32 -07:00
oobabooga  2d97897a25  Don't install flash-attention on windows + cuda 11  2023-10-25 11:21:18 -07:00
LightningDragon  0ced78fdfa  Replace hashlib.sha256 with hashlib.file_digest so we don't need to load entire files into ram before hashing them. (#4383)  2023-10-25 12:15:34 -03:00
tdrussell  72f6fc6923  Rename additive_repetition_penalty to presence_penalty, add frequency_penalty (#4376)  2023-10-25 12:10:28 -03:00
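The renaming aligns the sampler with the OpenAI-style additive scheme: presence_penalty subtracts a flat amount from the logit of any token that has already appeared, while frequency_penalty subtracts in proportion to how many times it appeared. A rough sketch under those assumptions, in plain Python rather than the repository's tensor-based logits processor:

```python
from collections import Counter

def apply_penalties(logits, generated_token_ids,
                    presence_penalty=0.0, frequency_penalty=0.0):
    """Additive penalties on previously generated tokens:
    presence_penalty is flat (did the token appear at all?);
    frequency_penalty scales with the token's occurrence count."""
    counts = Counter(generated_token_ids)
    out = list(logits)
    for tok, n in counts.items():
        out[tok] -= presence_penalty + frequency_penalty * n
    return out
```

Being additive on logits (rather than multiplicative, like the classic repetition_penalty) keeps the penalty's effect consistent regardless of a logit's sign, which is the usual argument for this formulation.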
oobabooga  ef1489cd4d  Remove unused parameter in AutoAWQ  2023-10-23 20:45:43 -07:00
oobabooga  1edf321362  Lint  2023-10-23 13:09:03 -07:00