Commit Graph

72 Commits

Author SHA1 Message Date
oobabooga
d272ac46dd Add Pillow as a requirement 2023-04-08 18:48:46 -03:00
oobabooga
58ed87e5d9 Update requirements.txt 2023-04-06 18:42:54 -03:00
dependabot[bot]
21be80242e Bump rwkv from 0.7.2 to 0.7.3 (#842) 2023-04-06 17:52:27 -03:00
oobabooga
113f94b61e Bump transformers (16-bit llama must be reconverted/redownloaded) 2023-04-06 16:04:03 -03:00
oobabooga
59058576b5 Remove unused requirement 2023-04-06 13:28:21 -03:00
oobabooga
03cb44fc8c Add new llama.cpp library (2048 context, temperature, etc now work) 2023-04-06 13:12:14 -03:00
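For reference, a minimal sketch of what the new bindings expose, assuming the library this commit switches to is llama-cpp-python; the model path is a placeholder and this is not the web UI's own loader code:

```python
# Illustrative sketch only (assumes llama-cpp-python; not the web UI's loader).
# Exercises the 2048-token context and sampling settings the commit message
# says now work.
from llama_cpp import Llama

llm = Llama(model_path="models/ggml-model-q4_0.bin", n_ctx=2048)  # placeholder path
out = llm(
    "Q: What is the capital of France? A:",
    max_tokens=32,
    temperature=0.7,
    top_p=0.9,
)
print(out["choices"][0]["text"])
```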
oobabooga
b2ce7282a1 Use past transformers version #773 2023-04-04 16:11:42 -03:00
dependabot[bot]
ad37f396fc Bump rwkv from 0.7.1 to 0.7.2 (#747) 2023-04-03 14:29:57 -03:00
dependabot[bot]
18f756ada6 Bump gradio from 3.24.0 to 3.24.1 (#746) 2023-04-03 14:29:37 -03:00
TheTerrasque
2157bb4319 New yaml character format (#337 from TheTerrasque/feature/yaml-characters)
This doesn't break backward compatibility with JSON characters.
2023-04-02 20:34:25 -03:00
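As a rough illustration of that compatibility, here is a sketch of loading a character definition from either format; the field names (`name`, `greeting`) and the path are assumptions made for the example, not taken from the PR:

```python
# Illustrative only: accept both the new YAML character files and legacy JSON
# ones. Field names and paths below are assumptions, not the PR's actual schema.
import json
from pathlib import Path

import yaml  # PyYAML


def load_character(path: str) -> dict:
    p = Path(path)
    text = p.read_text(encoding="utf-8")
    if p.suffix.lower() in (".yaml", ".yml"):
        return yaml.safe_load(text)
    return json.loads(text)  # older JSON characters keep working


char = load_character("characters/Example.yaml")
print(char.get("name"), char.get("greeting"))
```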
oobabooga
a5c9b7d977 Bump llamacpp version 2023-03-31 15:08:01 -03:00
oobabooga
4d98623041 Merge branch 'main' into feature/llamacpp 2023-03-31 14:37:04 -03:00
oobabooga
9d1dcf880a General improvements 2023-03-31 14:27:01 -03:00
oobabooga
f27a66b014 Bump gradio version (make sure to update)
This fixes the textbox shrinking vertically once it reaches a certain number of lines.
2023-03-31 00:42:26 -03:00
Thomas Antony
8953a262cb Add llamacpp to requirements.txt 2023-03-30 11:22:38 +01:00
Alex "mcmonkey" Goodwin
b0f05046b3 remove duplicate import 2023-03-27 22:50:37 -07:00
Alex "mcmonkey" Goodwin
31f04dc615 Merge branch 'main' into add-train-lora-tab 2023-03-27 20:03:30 -07:00
dependabot[bot]
1e02f75f2b Bump accelerate from 0.17.1 to 0.18.0
Bumps [accelerate](https://github.com/huggingface/accelerate) from 0.17.1 to 0.18.0.
- [Release notes](https://github.com/huggingface/accelerate/releases)
- [Commits](https://github.com/huggingface/accelerate/compare/v0.17.1...v0.18.0)

---
updated-dependencies:
- dependency-name: accelerate
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-03-28 01:19:34 +00:00
oobabooga
37f11803e3 Merge pull request #603 from oobabooga/dependabot/pip/rwkv-0.7.1
Bump rwkv from 0.7.0 to 0.7.1
2023-03-27 22:19:08 -03:00
dependabot[bot]
e9c0226b09 Bump rwkv from 0.7.0 to 0.7.1
Bumps [rwkv](https://github.com/BlinkDL/ChatRWKV) from 0.7.0 to 0.7.1.
- [Release notes](https://github.com/BlinkDL/ChatRWKV/releases)
- [Commits](https://github.com/BlinkDL/ChatRWKV/commits)

---
updated-dependencies:
- dependency-name: rwkv
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-03-27 21:05:35 +00:00
dependabot[bot]
9c96919121 Bump bitsandbytes from 0.37.1 to 0.37.2
Bumps [bitsandbytes](https://github.com/TimDettmers/bitsandbytes) from 0.37.1 to 0.37.2.
- [Release notes](https://github.com/TimDettmers/bitsandbytes/releases)
- [Changelog](https://github.com/TimDettmers/bitsandbytes/blob/main/CHANGELOG.md)
- [Commits](https://github.com/TimDettmers/bitsandbytes/commits)

---
updated-dependencies:
- dependency-name: bitsandbytes
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-03-27 21:05:19 +00:00
Alex "mcmonkey" Goodwin
e439228ed8 Merge branch 'main' into add-train-lora-tab 2023-03-27 08:21:19 -07:00
oobabooga
9ff6a538b6 Bump gradio version
Make sure to upgrade with

`pip install -r requirements.txt --upgrade`
2023-03-26 22:11:19 -03:00
Alex "mcmonkey" Goodwin
566898a79a initial lora training tab 2023-03-25 12:08:26 -07:00
oobabooga
7073e96093 Add back RWKV dependency #98 2023-03-19 12:05:28 -03:00
oobabooga
86b99006d9 Remove rwkv dependency 2023-03-18 10:27:52 -03:00
oobabooga
104293f411 Add LoRA support 2023-03-16 21:31:39 -03:00
oobabooga
23a5e886e1 The LLaMA PR has been merged into transformers
https://github.com/huggingface/transformers/pull/21955

The tokenizer class has been changed from "LLaMATokenizer" to "LlamaTokenizer".

You need to apply this change to every tokenizer_config.json you have for LLaMA so far (a sketch of the edit follows this entry).
2023-03-16 11:18:32 -03:00
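A minimal sketch of that edit, assuming a typical converted-model layout (the folder name below is a placeholder; point it at your own model directory):

```python
# Sketch of the rename described above: update tokenizer_class in place.
import json
from pathlib import Path

cfg_path = Path("models/llama-7b-hf/tokenizer_config.json")  # placeholder path
cfg = json.loads(cfg_path.read_text(encoding="utf-8"))

if cfg.get("tokenizer_class") == "LLaMATokenizer":
    cfg["tokenizer_class"] = "LlamaTokenizer"
    cfg_path.write_text(json.dumps(cfg, indent=2), encoding="utf-8")
    print(f"updated {cfg_path}")
```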
oobabooga
29b7c5ac0c Sort the requirements 2023-03-15 12:40:03 -03:00
oobabooga
693b53d957 Merge branch 'main' into HideLord-main 2023-03-15 12:08:56 -03:00
dependabot[bot]
02d407542c Bump accelerate from 0.17.0 to 0.17.1
Bumps [accelerate](https://github.com/huggingface/accelerate) from 0.17.0 to 0.17.1.
- [Release notes](https://github.com/huggingface/accelerate/releases)
- [Commits](https://github.com/huggingface/accelerate/compare/v0.17.0...v0.17.1)

---
updated-dependencies:
- dependency-name: accelerate
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-03-14 01:40:42 +00:00
oobabooga
d685332c10 Merge pull request #307 from oobabooga/dependabot/pip/bitsandbytes-0.37.1
Bump bitsandbytes from 0.37.0 to 0.37.1
2023-03-13 22:39:59 -03:00
dependabot[bot]
df83088593 Bump bitsandbytes from 0.37.0 to 0.37.1
Bumps [bitsandbytes](https://github.com/TimDettmers/bitsandbytes) from 0.37.0 to 0.37.1.
- [Release notes](https://github.com/TimDettmers/bitsandbytes/releases)
- [Changelog](https://github.com/TimDettmers/bitsandbytes/blob/main/CHANGELOG.md)
- [Commits](https://github.com/TimDettmers/bitsandbytes/commits)

---
updated-dependencies:
- dependency-name: bitsandbytes
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-03-14 01:36:18 +00:00
dependabot[bot]
715c3ecba6 Bump rwkv from 0.3.1 to 0.4.2
Bumps [rwkv](https://github.com/BlinkDL/ChatRWKV) from 0.3.1 to 0.4.2.
- [Release notes](https://github.com/BlinkDL/ChatRWKV/releases)
- [Commits](https://github.com/BlinkDL/ChatRWKV/commits)

---
updated-dependencies:
- dependency-name: rwkv
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2023-03-14 01:36:02 +00:00
Alexander Hristov Hristov
63c5a139a2 Merge branch 'main' into main 2023-03-13 19:50:08 +02:00
Luis Cosio
435a69e357 Fix for issue #282
RuntimeError: Tensors must have same number of dimensions: got 3 and 4
2023-03-13 11:41:35 -06:00
HideLord
683556f411 Adding markdown support and slight refactoring. 2023-03-12 21:34:09 +02:00
oobabooga
441e993c51 Bump accelerate, RWKV and safetensors 2023-03-12 14:25:14 -03:00
oobabooga
3c25557ef0 Add tqdm to requirements.txt 2023-03-12 08:48:16 -03:00
oobabooga
501afbc234 Add requests to requirements.txt 2023-03-11 14:47:30 -03:00
oobabooga
fd540b8930 Use new LLaMA implementation (this will break stuff. I am sorry)
https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model
2023-03-09 17:59:15 -03:00
oobabooga
8660227e1b Add top_k to RWKV 2023-03-07 17:24:28 -03:00
oobabooga
153dfeb4dd Add --rwkv-cuda-on parameter, bump rwkv version 2023-03-06 20:12:54 -03:00
oobabooga
145c725c39 Bump RWKV version 2023-03-05 16:28:21 -03:00
oobabooga
5492e2e9f8 Add sentencepiece 2023-03-05 10:02:24 -03:00
oobabooga
c33715ad5b Move towards HF LLaMA implementation 2023-03-05 01:20:31 -03:00
oobabooga
bcea196c9d Bump flexgen version 2023-03-02 12:03:57 -03:00
oobabooga
7a9b4407b0 Settle for 0.0.6 for now 2023-03-01 17:37:14 -03:00
oobabooga
f351dce032 Keep rwkv up to date 2023-03-01 17:36:16 -03:00
oobabooga
9c86a1cd4a Add RWKV pip package 2023-03-01 11:42:49 -03:00