oobabooga
|
bf56b6c1fb
|
Load settings.json without the need for --settings settings.json
This is for setting UI defaults
|
2023-03-06 10:57:45 -03:00 |
|
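The commit above makes UI defaults load from a local settings.json automatically, without passing --settings. A minimal sketch of that behavior follows; the default keys and file layout are illustrative assumptions, not the repository's actual code.

import json
from pathlib import Path

# Assumed example defaults; the real project defines its own UI settings.
DEFAULT_SETTINGS = {"dark_theme": True, "chat_prompt_size": 2048}

def load_settings(settings_flag=None):
    # Prefer an explicit --settings path; otherwise fall back to ./settings.json if present.
    path = Path(settings_flag) if settings_flag else Path("settings.json")
    settings = dict(DEFAULT_SETTINGS)
    if path.exists():
        settings.update(json.loads(path.read_text()))
    return settings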
oobabooga
|
bcea196c9d
|
Bump flexgen version
|
2023-03-02 12:03:57 -03:00 |
|
oobabooga
|
169209805d
|
Model-aware prompts and presets
|
2023-03-02 11:25:04 -03:00 |
|
oobabooga
|
99dc95e14e
|
Minor aesthetic change
|
2023-03-01 19:32:04 -03:00 |
|
oobabooga
|
a1429d1607
|
Add default extensions to the settings
|
2023-02-28 02:20:11 -03:00 |
|
oobabooga
|
365e1089b3
|
Move some buttons
|
2023-02-28 01:34:07 -03:00 |
|
oobabooga
|
43b6ab8673
|
Store thumbnails as files instead of base64 strings
This improves the UI responsiveness for large histories.
|
2023-02-27 13:41:00 -03:00 |
|
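The entry above replaces base64-embedded thumbnails with files on disk so large chat histories no longer bloat the page. A rough sketch of the idea using Pillow; the cache directory, file naming, and thumbnail size are assumptions, not the project's implementation.

from pathlib import Path
from PIL import Image  # Pillow

def cache_thumbnail(image_path, cache_dir="cache", size=(64, 64)):
    # Write a small thumbnail once and return its path, so the UI can
    # reference a file instead of embedding a base64 data URI.
    Path(cache_dir).mkdir(exist_ok=True)
    thumb_path = Path(cache_dir) / f"thumb-{Path(image_path).stem}.png"
    if not thumb_path.exists():
        img = Image.open(image_path).copy()
        img.thumbnail(size)  # in-place resize, preserves aspect ratio
        img.save(thumb_path)
    return str(thumb_path)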
oobabooga
|
611010e8af
|
Add a confirmation to clear history
|
2023-02-27 11:41:21 -03:00 |
|
oobabooga
|
7a776ccf87
|
Make the gallery interactive to load characters
|
2023-02-26 17:19:36 -03:00 |
|
oobabooga
|
e91eb24649
|
Decrease the repetition penalty upper limit to 3
|
2023-02-26 01:51:59 -03:00 |
|
oobabooga
|
3d94ebfdd0
|
Change --chat colors
|
2023-02-26 00:51:15 -03:00 |
|
oobabooga
|
b3d2365d92
|
Rename a button
|
2023-02-25 16:33:46 -03:00 |
|
oobabooga
|
03d25c1c61
|
Reorder the chat buttons
|
2023-02-25 15:35:43 -03:00 |
|
oobabooga
|
e2cf4e4968
|
Reorder the custom parameters
|
2023-02-25 15:21:40 -03:00 |
|
oobabooga
|
381f747181
|
Reorganize the custom parameters for mobile usage
|
2023-02-25 15:17:44 -03:00 |
|
oobabooga
|
01acb250c5
|
Add a comment
|
2023-02-25 02:07:29 -03:00 |
|
oobabooga
|
7c2babfe39
|
Rename greed to "generation attempts"
|
2023-02-25 01:42:19 -03:00 |
|
oobabooga
|
2dfb999bf1
|
Add greed parameter
|
2023-02-25 01:31:01 -03:00 |
|
oobabooga
|
7a527a5581
|
Move "send picture" into an extension
I am not proud of how I did it for now.
|
2023-02-25 00:23:51 -03:00 |
|
oobabooga
|
e51ece21c0
|
Add ui() function to extensions
|
2023-02-24 19:00:11 -03:00 |
|
oobabooga
|
77f58e5dab
|
Remove a space
|
2023-02-24 17:32:34 -03:00 |
|
oobabooga
|
c5066f1192
|
Rename some variables, be consistent about ' and "
|
2023-02-24 17:31:23 -03:00 |
|
oobabooga
|
78ad55641b
|
Remove duplicate max_new_tokens parameter
|
2023-02-24 17:19:42 -03:00 |
|
oobabooga
|
65326b545a
|
Move all gradio elements to shared (so that extensions can use them)
|
2023-02-24 16:46:50 -03:00 |
|
oobabooga
|
0a3590da8c
|
Add a progress bar
|
2023-02-24 14:19:27 -03:00 |
|
oobabooga
|
3b8cecbab7
|
Reload the default chat on page refresh
|
2023-02-23 19:50:23 -03:00 |
|
oobabooga
|
f1914115d3
|
Fix minor issue with chat logs
|
2023-02-23 16:04:47 -03:00 |
|
oobabooga
|
2e86a1ec04
|
Move chat history into shared module
|
2023-02-23 15:11:18 -03:00 |
|
oobabooga
|
c87800341c
|
Move function to extensions module
|
2023-02-23 14:55:21 -03:00 |
|
oobabooga
|
7224343a70
|
Improve the imports
|
2023-02-23 14:41:42 -03:00 |
|
oobabooga
|
364529d0c7
|
Further refactor
|
2023-02-23 14:31:28 -03:00 |
|
oobabooga
|
e46c43afa6
|
Move some stuff from server.py to modules
|
2023-02-23 13:42:23 -03:00 |
|
oobabooga
|
1dacd34165
|
Further refactor
|
2023-02-23 13:28:30 -03:00 |
|
oobabooga
|
ce7feb3641
|
Further refactor
|
2023-02-23 13:03:52 -03:00 |
|
oobabooga
|
98af4bfb0d
|
Refactor the code to make it more modular
|
2023-02-23 12:05:25 -03:00 |
|
oobabooga
|
18e0ec955e
|
Improve some descriptions in --help
|
2023-02-23 10:11:58 -03:00 |
|
oobabooga
|
c72892835a
|
Don't show *-np models in the list of choices
|
2023-02-22 11:38:16 -03:00 |
|
oobabooga
|
044b963987
|
Add stop parameter for flexgen (#105)
|
2023-02-22 11:23:36 -03:00 |
|
oobabooga
|
ea21a22940
|
Remove redundant preset
|
2023-02-22 01:01:26 -03:00 |
|
oobabooga
|
b8b3d4139c
|
Add --compress-weight parameter
|
2023-02-22 00:43:21 -03:00 |
|
oobabooga
|
eef6fc3cbf
|
Add a preset for FlexGen
|
2023-02-21 23:33:15 -03:00 |
|
oobabooga
|
311404e258
|
Reuse disk-cache-dir parameter for flexgen
|
2023-02-21 22:11:05 -03:00 |
|
oobabooga
|
f3c75bbd64
|
Add --percent flag for flexgen
|
2023-02-21 22:08:46 -03:00 |
|
oobabooga
|
b83f51ee04
|
Add FlexGen support #92 (experimental)
|
2023-02-21 21:00:06 -03:00 |
|
oobabooga
|
444cd69c67
|
Fix regex bug in loading character jsons with special characters
|
2023-02-20 19:38:19 -03:00 |
|
oobabooga
|
d7a738fb7a
|
Load any 13b/20b/30b model in 8-bit mode when no flags are supplied
|
2023-02-20 15:44:10 -03:00 |
|
oobabooga
|
77846ceef3
|
Minor change
|
2023-02-20 15:05:48 -03:00 |
|
oobabooga
|
e195377050
|
Deprecate torch dumps, move to safetensors (they load even faster)
|
2023-02-20 15:03:19 -03:00 |
|
oobabooga
|
14ffa0b418
|
Fix line breaks in --chat mode
|
2023-02-20 13:25:46 -03:00 |
|
SillyLossy
|
ded890c378
|
Escape regexp in message extraction
|
2023-02-19 12:55:45 +02:00 |
|
oobabooga
|
8c9dd95d55
|
Print the softprompt metadata when it is loaded
|
2023-02-19 01:48:23 -03:00 |
|
oobabooga
|
f79805f4a4
|
Change a comment
|
2023-02-18 22:58:40 -03:00 |
|
oobabooga
|
d58544a420
|
Some minor formatting changes
|
2023-02-18 11:07:55 -03:00 |
|
oobabooga
|
0dd41e4830
|
Reorganize the sliders some more
|
2023-02-17 16:33:27 -03:00 |
|
oobabooga
|
6b9ac2f88e
|
Reorganize the generation parameters
|
2023-02-17 16:18:01 -03:00 |
|
oobabooga
|
596732a981
|
The soft prompt length must be considered here too
|
2023-02-17 12:35:30 -03:00 |
|
oobabooga
|
edc0262889
|
Minor file uploading fixes
|
2023-02-17 10:27:41 -03:00 |
|
oobabooga
|
243244eeec
|
Attempt at fixing greyed out files on iPhone
|
2023-02-17 10:17:15 -03:00 |
|
oobabooga
|
a226f4cddb
|
No change, so reverting
|
2023-02-17 09:27:17 -03:00 |
|
oobabooga
|
40cb9f63f6
|
Try making Colab happy (tensorflow warnings)
|
2023-02-17 09:23:11 -03:00 |
|
oobabooga
|
aeddf902ec
|
Make the refresh button prettier
|
2023-02-16 21:55:20 -03:00 |
|
oobabooga
|
21512e2790
|
Make the Stop button work more reliably
|
2023-02-16 21:21:45 -03:00 |
|
oobabooga
|
08805b3374
|
Force "You" in impersonate too
|
2023-02-16 13:24:13 -03:00 |
|
oobabooga
|
d7db04403f
|
Fix --chat chatbox height
|
2023-02-16 12:45:05 -03:00 |
|
oobabooga
|
589069e105
|
Don't regenerate if no message has been sent
|
2023-02-16 12:32:35 -03:00 |
|
oobabooga
|
405dfbf57c
|
Force your name to be "You" for pygmalion (properly)
|
2023-02-16 12:16:12 -03:00 |
|
oobabooga
|
7bd2ae05bf
|
Force your name to be "You" for pygmalion
This allows you to customize your displayed name.
|
2023-02-15 21:32:53 -03:00 |
|
oobabooga
|
3746d72853
|
More style fixes
|
2023-02-15 21:13:12 -03:00 |
|
oobabooga
|
6f213b8c14
|
Style fix
|
2023-02-15 20:58:17 -03:00 |
|
oobabooga
|
ccf10db60f
|
Move stuff into tabs in chat mode
|
2023-02-15 20:55:32 -03:00 |
|
oobabooga
|
a55e8836f6
|
Bump gradio version
It looks uglier, but the old one was bugged and unstable.
|
2023-02-15 20:20:56 -03:00 |
|
oobabooga
|
0e89ff4b13
|
Clear the persistent history after clicking on "Clear history"
|
2023-02-15 16:49:52 -03:00 |
|
oobabooga
|
b3bcd2881d
|
Implement regenerate/impersonate the proper way (fixes #78)
|
2023-02-15 14:39:26 -03:00 |
|
oobabooga
|
5ee9283cae
|
Mention BLIP
|
2023-02-15 13:53:38 -03:00 |
|
oobabooga
|
8d3b3959e7
|
Document --picture option
|
2023-02-15 13:50:18 -03:00 |
|
oobabooga
|
2eea0f4edb
|
Minor change
|
2023-02-15 12:58:11 -03:00 |
|
oobabooga
|
3c31fa7079
|
Simplifications
|
2023-02-15 12:46:11 -03:00 |
|
oobabooga
|
80fbc584f7
|
Readability
|
2023-02-15 11:38:44 -03:00 |
|
oobabooga
|
b397bea387
|
Make chat history persistent
|
2023-02-15 11:30:38 -03:00 |
|
oobabooga
|
7be372829d
|
Set chat prompt size in tokens
|
2023-02-15 10:18:50 -03:00 |
|
oobabooga
|
8c3ef58e00
|
Use BLIP directly + some simplifications
|
2023-02-14 23:55:46 -03:00 |
|
SillyLossy
|
a7d98f494a
|
Use BLIP to send a picture to model
|
2023-02-15 01:38:21 +02:00 |
|
oobabooga
|
d910d435cd
|
Consider the softprompt in the maximum prompt length calculation
|
2023-02-14 12:06:47 -03:00 |
|
oobabooga
|
8b3bb512ef
|
Minor bug fix (soft prompt was being loaded twice)
|
2023-02-13 23:34:04 -03:00 |
|
oobabooga
|
7739a29524
|
Some simplifications
|
2023-02-13 18:48:32 -03:00 |
|
oobabooga
|
3277b751f5
|
Add softprompt support (for real this time)
Is this too much voodoo for our purposes?
|
2023-02-13 15:25:16 -03:00 |
|
oobabooga
|
aa1177ff15
|
Send last internal reply to input rather than visible
|
2023-02-13 03:29:23 -03:00 |
|
oobabooga
|
2c3abcf57a
|
Add support for rosey/chip/joi instruct models
|
2023-02-12 09:46:34 -03:00 |
|
oobabooga
|
7ef7bba6e6
|
Add progress bar for model loading
|
2023-02-12 09:36:27 -03:00 |
|
oobabooga
|
5d3f15b915
|
Use the CPU if no GPU is detected
|
2023-02-11 23:17:06 -03:00 |
|
oobabooga
|
b3c4657c47
|
Remove commas from preset files
|
2023-02-11 14:54:29 -03:00 |
|
oobabooga
|
0dd1409f24
|
Add penalty_alpha parameter (contrastive search)
|
2023-02-11 14:48:12 -03:00 |
|
oobabooga
|
2ed0386d87
|
Fix replace last reply in --chat mode (for #69)
|
2023-02-11 07:59:54 -03:00 |
|
oobabooga
|
316e07f06a
|
Auto-assign GPU memory with --auto-devices alone
|
2023-02-10 16:36:06 -03:00 |
|
oobabooga
|
219366342b
|
Sort imports according to PEP8 (based on #67)
|
2023-02-10 15:40:03 -03:00 |
|
81300
|
20dbef9623
|
Extend bfloat16 support
|
2023-02-09 20:00:03 +02:00 |
|
oobabooga
|
cadd100405
|
min_length has to be 0 when streaming is on
|
2023-02-08 00:23:35 -03:00 |
|
oobabooga
|
6be571cff7
|
Better variable names
|
2023-02-08 00:19:20 -03:00 |
|
oobabooga
|
58b07cca81
|
length_penalty can be negative (apparently)
|
2023-02-07 23:33:02 -03:00 |
|
oobabooga
|
7e4c25691d
|
Repetition penalty has to be < 5
|
2023-02-07 23:23:39 -03:00 |
|
oobabooga
|
1c30e1b49a
|
Add even more sliders
|
2023-02-07 23:11:04 -03:00 |
|
oobabooga
|
24dc705eca
|
Add lots of sliders
|
2023-02-07 22:08:21 -03:00 |
|
Martin J
|
06a4664805
|
Fix a regex issue in tokenize_dialogue.
The existing regex would fail if using character names that start with
numbers, for example: 9S or 2B.
|
2023-02-05 07:42:57 +01:00 |
|
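The fix above addresses a dialogue-splitting regex that broke on character names beginning with digits (9S, 2B). The project's exact pattern isn't shown here; as a hedged illustration of the general approach, matching the known speaker names literally (and escaping them) avoids assumptions about what a name looks like.

import re

def split_dialogue(text, names=("You", "2B")):
    # Build a pattern from the actual names with re.escape, so digits or
    # punctuation in a name cannot change the regex's meaning.
    name_pattern = "|".join(re.escape(n) for n in names)
    # Capture "Name:" markers and split the example dialogue into turns.
    parts = re.split(rf"({name_pattern}):", text)
    turns = []
    for i in range(1, len(parts) - 1, 2):
        turns.append((parts[i], parts[i + 1].strip()))
    return turns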
oobabooga
|
2fe235738e
|
Reorganize chat buttons
|
2023-02-04 22:53:42 -03:00 |
|
oobabooga
|
2207d44986
|
Windows doesn't like : in filenames
|
2023-02-04 20:07:39 -03:00 |
|
oobabooga
|
65266f3349
|
Fix loading official colab chat logs
|
2023-02-03 22:43:02 -03:00 |
|
oobabooga
|
44e8c671f9
|
Fix API documentation formatting in chat mode
|
2023-02-03 10:00:05 -03:00 |
|
oobabooga
|
a28f0d8bd7
|
Show it/s in the same units with or without streaming
Closes #49
|
2023-02-03 09:11:11 -03:00 |
|
oobabooga
|
4e4cd67223
|
Save chat history with name/date in filename
closes #50
|
2023-02-03 09:02:35 -03:00 |
|
oobabooga
|
3af3ffeb90
|
Make --help output more readable
|
2023-02-02 23:36:28 -03:00 |
|
oobabooga
|
638495b633
|
Simplify generate() function
|
2023-02-02 13:47:08 -03:00 |
|
oobabooga
|
3f05cf5ddd
|
Simplify encode() function
|
2023-02-02 13:31:32 -03:00 |
|
oobabooga
|
2583bc5840
|
Simplify deepspeed implementation (#40)
|
2023-02-02 12:15:44 -03:00 |
|
oobabooga
|
f38c9bf428
|
Fix deepspeed (oops)
|
2023-02-02 10:39:37 -03:00 |
|
oobabooga
|
90f1067598
|
Move deepspeed parameters to another file
|
2023-02-02 10:25:09 -03:00 |
|
81300
|
248ec4fa21
|
Merge branch 'oobabooga:main' into ds
|
2023-02-01 20:50:51 +02:00 |
|
81300
|
a6f4760772
|
Add arg for bfloat16
|
2023-02-01 20:22:07 +02:00 |
|
81300
|
c515282f5c
|
no_split_module_classes not needed
|
2023-02-01 19:47:26 +02:00 |
|
81300
|
0a0d289537
|
Fix issue with generating on multiple GPUs
|
2023-02-01 19:02:07 +02:00 |
|
81300
|
a97afa6965
|
Add DeepSpeed ZeRO-3 integration
|
2023-02-01 18:48:13 +02:00 |
|
oobabooga
|
6b13816c47
|
Change default --disk behavior
|
2023-02-01 10:43:28 -03:00 |
|
oobabooga
|
119be56390
|
Add back low_cpu_mem_usage=True
Removing it didn't help with anything, so I am adding it back on a purely superstitious basis.
|
2023-02-01 10:01:44 -03:00 |
|
oobabooga
|
d4a0b377ab
|
Allow standalone --cpu-memory
I think that what I am doing probably makes sense, but I could be wrong.
|
2023-01-31 21:23:16 -03:00 |
|
oobabooga
|
8ef89df746
|
Try to leave at least 1GiB free to prevent OOM errors
|
2023-01-31 20:47:05 -03:00 |
|
oobabooga
|
bb77f20a6c
|
Don't use low_cpu_mem_usage and device_map together
|
2023-01-31 13:24:05 -03:00 |
|
oobabooga
|
001ecf95b2
|
Update server.py
|
2023-01-31 08:14:16 -03:00 |
|
Silver267
|
a85bb5e9a2
|
Fix an error
Fixes "UnboundLocalError: local variable 'substring_found' referenced before assignment" when loading non-pygmalion models in cai chat mode.
|
2023-01-31 01:34:10 -05:00 |
|
oobabooga
|
5b0bbfa6e8
|
Clean up
|
2023-01-30 14:17:12 -03:00 |
|
oobabooga
|
2dadf42cb5
|
Print the tokenized example dialogue in a prettier way
|
2023-01-30 08:29:49 -03:00 |
|
oobabooga
|
161cae001b
|
I needed this
|
2023-01-29 23:20:22 -03:00 |
|
oobabooga
|
3ebca480f6
|
Minor fix
|
2023-01-29 23:05:17 -03:00 |
|
oobabooga
|
00707a0b3b
|
Add "Impersonate" button
|
2023-01-29 22:56:23 -03:00 |
|
oobabooga
|
de72e83508
|
Reorganize things
|
2023-01-29 14:27:22 -03:00 |
|
oobabooga
|
6fbfee9e6d
|
Remove some bloat
|
2023-01-29 12:05:18 -03:00 |
|
oobabooga
|
9c9bd1074f
|
Add option to replace the bot's last reply
|
2023-01-29 12:02:44 -03:00 |
|
oobabooga
|
e5ff4ddfc8
|
Add bot prefix modifier option in extensions
|
2023-01-29 10:11:59 -03:00 |
|
oobabooga
|
b6d01bb704
|
Enable extensions in all modes, not just chat
|
2023-01-29 09:48:18 -03:00 |
|
oobabooga
|
1a139664f5
|
Grammar
|
2023-01-29 02:54:36 -03:00 |
|
oobabooga
|
2d134031ca
|
Apply extensions to character greeting
|
2023-01-29 00:04:11 -03:00 |
|
oobabooga
|
e349b52256
|
Read extensions parameters from settings file
|
2023-01-28 23:21:40 -03:00 |
|
oobabooga
|
2239be2351
|
Support for number/bool extension parameters
|
2023-01-28 23:08:28 -03:00 |
|
oobabooga
|
6da94e358c
|
Add support for extensions parameters
Still experimental
|
2023-01-28 23:00:51 -03:00 |
|
oobabooga
|
e779fd795f
|
Save TavernAI characters with TavernAI- prefix
|
2023-01-28 21:01:56 -03:00 |
|
oobabooga
|
833a1138fa
|
Explain the dialogue tokenization output
|
2023-01-28 20:41:02 -03:00 |
|
oobabooga
|
545b7395b2
|
Prevent huge --help outputs
|
2023-01-28 20:36:51 -03:00 |
|
oobabooga
|
f4c455ce29
|
Merge pull request #30 from 10sa/patch-1
Add listening port options for listening mode.
|
2023-01-28 20:35:20 -03:00 |
|
oobabooga
|
7b283a4a3d
|
Update server.py
|
2023-01-28 20:35:05 -03:00 |
|
oobabooga
|
f4674d34a9
|
Reorganize chat UI elements
|
2023-01-28 20:28:08 -03:00 |
|
oobabooga
|
3687962e6c
|
Add support for TavernAI character cards (closes #31)
|
2023-01-28 20:18:23 -03:00 |
|
oobabooga
|
f71531186b
|
Upload profile pictures from the web UI
|
2023-01-28 19:16:37 -03:00 |
|
Tensa
|
3742d3b18a
|
Add listening port options for listening mode.
|
2023-01-28 03:38:34 +09:00 |
|
oobabooga
|
69ffef4391
|
History loading minor bug fix
|
2023-01-27 12:01:11 -03:00 |
|
oobabooga
|
8b8236c6ff
|
Fix Regenerate button bug
|
2023-01-27 11:14:19 -03:00 |
|
oobabooga
|
1d1f931757
|
Load extensions at startup
|
2023-01-27 10:53:05 -03:00 |
|
oobabooga
|
70e034589f
|
Update the export/load chat history functions
|
2023-01-27 02:16:05 -03:00 |
|
oobabooga
|
6b5dcd46c5
|
Add support for extensions
This is experimental.
|
2023-01-27 00:40:39 -03:00 |
|
oobabooga
|
e69990e37b
|
Change order of upload and download tabs in chat mode
|
2023-01-26 16:57:12 -03:00 |
|
oobabooga
|
ac6065d5ed
|
Fix character loading bug
|
2023-01-26 13:45:19 -03:00 |
|
oobabooga
|
61611197e0
|
Add --verbose option (oops)
|
2023-01-26 02:18:06 -03:00 |
|
oobabooga
|
abc920752f
|
Stop at eos_token while streaming text (for #26)
|
2023-01-25 22:27:04 -03:00 |
|
oobabooga
|
b77933d327
|
File names must be img_me.jpg and img_bot.jpg
|
2023-01-25 19:40:30 -03:00 |
|
oobabooga
|
fc73188ec7
|
Allow specifying your own profile picture in chat mode
|
2023-01-25 19:37:44 -03:00 |
|
oobabooga
|
3fa14befc5
|
Bump the gradio version, add back the queue
|
2023-01-25 16:10:35 -03:00 |
|
oobabooga
|
7a3717b824
|
Allow uploading characters
|
2023-01-25 15:45:25 -03:00 |
|
oobabooga
|
6388c7fbc0
|
Set queue size to 1 to prevent gradio undefined behavior
|
2023-01-25 14:37:41 -03:00 |
|
oobabooga
|
ec69c190ba
|
Keep the character's greeting/example dialogue when "clear history" is clicked
|
2023-01-25 10:52:35 -03:00 |
|
oobabooga
|
ebed1dea56
|
Generate 8 tokens at a time in streaming mode instead of just 1
This is a performance optimization.
|
2023-01-25 10:38:26 -03:00 |
|
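The commit above speeds up streaming by requesting 8 tokens per generation step instead of 1, cutting the number of generate() calls. A simplified sketch of the idea using the Hugging Face generate() API; the sampling settings, stop check, and per-chunk decoding are assumptions rather than the project's code.

def stream_reply(prompt, model, tokenizer, max_new_tokens=200, chunk=8):
    # Each loop iteration asks for `chunk` new tokens and yields them as text.
    # Real implementations usually re-decode the full sequence to avoid
    # word-boundary artifacts; this keeps the sketch short.
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    produced = 0
    while produced < max_new_tokens:
        output_ids = model.generate(input_ids, max_new_tokens=chunk, do_sample=True)
        new_ids = output_ids[0, input_ids.shape[1]:]
        yield tokenizer.decode(new_ids, skip_special_tokens=True)
        if new_ids.shape[0] < chunk or tokenizer.eos_token_id in new_ids.tolist():
            break  # the model stopped early or emitted an end-of-sequence token
        input_ids = output_ids  # feed the extended sequence back in
        produced += new_ids.shape[0]

# Example usage (model name is illustrative):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("gpt2")
# model = AutoModelForCausalLM.from_pretrained("gpt2")
# for piece in stream_reply("Hello", model, tokenizer):
#     print(piece, end="", flush=True)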
oobabooga
|
3b8f0021cc
|
Stop generating at \nYou: in chat mode
|
2023-01-25 10:17:55 -03:00 |
|
oobabooga
|
54e77acac4
|
Rename to "Generation parameters preset" for clarity
|
2023-01-23 20:49:44 -03:00 |
|
oobabooga
|
ce4756fb88
|
Allow uploading chat history in official pygmalion web ui format
|
2023-01-23 15:29:01 -03:00 |
|
oobabooga
|
8325e23923
|
Fix bug in loading chat history as text file
|
2023-01-23 14:28:02 -03:00 |
|
oobabooga
|
059d47edb5
|
Submit with enter instead of shift+enter in chat mode
|
2023-01-23 14:04:01 -03:00 |
|
oobabooga
|
4820379139
|
Add debug preset (deterministic, should always give the same responses)
|
2023-01-23 13:36:01 -03:00 |
|
oobabooga
|
947b50e8ea
|
Allow uploading chat history as simple text files
|
2023-01-23 09:45:10 -03:00 |
|
oobabooga
|
ebf720585b
|
Mention time and it/s in terminal with streaming off
|
2023-01-22 20:07:19 -03:00 |
|
oobabooga
|
d87310ad61
|
Send last input to the input box when "Remove last" is clicked
|
2023-01-22 19:40:22 -03:00 |
|
oobabooga
|
d0ea6d5f86
|
Make the maximum history size in prompt unlimited by default
|
2023-01-22 17:17:35 -03:00 |
|
oobabooga
|
00f3b0996b
|
Warn the user that chat mode becomes a lot slower with text streaming
|
2023-01-22 16:19:11 -03:00 |
|
oobabooga
|
c5cc3a3075
|
Fix bug in "remove last" button
|
2023-01-22 13:10:36 -03:00 |
|
oobabooga
|
a410cf1345
|
Mention that "Chat history size" means "Chat history size in prompt"
|
2023-01-22 03:15:35 -03:00 |
|
oobabooga
|
b3e1a874bc
|
Fix bug in loading history
|
2023-01-22 02:32:54 -03:00 |
|
oobabooga
|
62b533f344
|
Add "regenerate" button to the chat
|
2023-01-22 02:19:58 -03:00 |
|
oobabooga
|
94ecbc6dff
|
Export history as nicely formatted json
|
2023-01-22 01:24:16 -03:00 |
|
oobabooga
|
deacb96c34
|
Change the pygmalion default context
|
2023-01-22 00:49:59 -03:00 |
|
oobabooga
|
23f94f559a
|
Improve the chat prompt design
|
2023-01-22 00:35:42 -03:00 |
|
oobabooga
|
139e2f0ab4
|
Redesign the upload/download chat history buttons
|
2023-01-22 00:22:50 -03:00 |
|
oobabooga
|
434d4b128c
|
Add refresh buttons for the model/preset/character menus
|
2023-01-22 00:02:46 -03:00 |
|
oobabooga
|
1e5e56fa2e
|
Better recognize the 4chan model (for #19)
|
2023-01-21 22:13:01 -03:00 |
|
oobabooga
|
aadf4e899a
|
Improve example dialogue handling
|
2023-01-21 15:04:13 -03:00 |
|
oobabooga
|
f9dbe7e08e
|
Update README
|
2023-01-21 03:05:55 -03:00 |
|
oobabooga
|
27e2d932b0
|
Don't include the example dialogue in the export json
|
2023-01-21 02:55:13 -03:00 |
|
oobabooga
|
990ee54ddd
|
Move the example dialogue to the chat history, and keep it hidden.
This greatly improves the performance of text generation, as
histories can be quite long. It also makes more sense to implement
it this way.
|
2023-01-21 02:48:06 -03:00 |
|
oobabooga
|
d7299df01f
|
Rename parameters
|
2023-01-21 00:33:41 -03:00 |
|
oobabooga
|
5df03bf0fd
|
Merge branch 'main' into main
|
2023-01-21 00:25:34 -03:00 |
|
oobabooga
|
faaafe7c0e
|
Better parameter naming
|
2023-01-20 23:45:16 -03:00 |
|
Silver267
|
f4634e4c32
|
Update.
|
2023-01-20 17:05:43 -05:00 |
|
oobabooga
|
c0f2367b54
|
Minor fix
|
2023-01-20 17:09:25 -03:00 |
|
oobabooga
|
185587a33e
|
Add a history size parameter to the chat
If too many messages are used in the prompt, the model
gets really slow. It is useful to have the ability to
limit this.
|
2023-01-20 17:03:09 -03:00 |
|
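The entry above caps how many past exchanges are included when the prompt is rebuilt, since very long histories slow generation down. A minimal sketch of such a limit; the prompt layout and parameter names are illustrative assumptions.

def build_chat_prompt(context, history, user_name="You", bot_name="Bot", history_size=10):
    # Keep only the last `history_size` (user, bot) exchanges; 0 or less means unlimited.
    recent = history if history_size <= 0 else history[-history_size:]
    lines = [context.strip(), ""]
    for user_msg, bot_msg in recent:
        lines.append(f"{user_name}: {user_msg}")
        lines.append(f"{bot_name}: {bot_msg}")
    lines.append(f"{user_name}:")  # the caller appends the new message after this
    return "\n".join(lines)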
oobabooga
|
78d5a999e6
|
Improve prompt formatting
|
2023-01-20 01:54:38 -03:00 |
|
oobabooga
|
70ff685736
|
Encode the input string correctly
|
2023-01-20 00:45:02 -03:00 |
|