Mirror of https://github.com/oobabooga/text-generation-webui.git
Wizard Mega, Ziya, KoAlpaca, OpenBuddy, Chinese-Vicuna, Vigogne, Bactrian, H2O support, fix Baize (#2159)
Commit c86231377b (parent: c98d6ad27f)
README.md
@@ -13,7 +13,7 @@ Its goal is to become the [AUTOMATIC1111/stable-diffusion-webui](https://github.
 * Dropdown menu for switching between models
 * Notebook mode that resembles OpenAI's playground
 * Chat mode for conversation and role-playing
-* Instruct mode compatible with various formats, including Alpaca, Vicuna, Open Assistant, Dolly, Koala, ChatGLM, MOSS, RWKV-Raven, Galactica, StableLM, WizardLM, Baize, MPT, and INCITE
+* Instruct mode compatible with various formats, including Alpaca, Vicuna, Open Assistant, Dolly, Koala, ChatGLM, MOSS, RWKV-Raven, Galactica, StableLM, WizardLM, Baize, Ziya, Chinese-Vicuna, MPT, INCITE, Wizard Mega, KoAlpaca, Vigogne, Bactrian, h2o, and OpenBuddy
 * [Multimodal pipelines, including LLaVA and MiniGPT-4](https://github.com/oobabooga/text-generation-webui/tree/main/extensions/multimodal)
 * Markdown output for [GALACTICA](https://github.com/paperswithcode/galai), including LaTeX rendering
 * Nice HTML output for GPT-4chan
characters/instruction-following/Bactrian.yaml (new file)
@@ -0,0 +1,4 @@
+user: "### Input:"
+bot: "### Output:"
+turn_template: "<|user|>\n<|user-message|>\n\n<|bot|>\n<|bot-message|>\n\n"
+context: ""
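All of the instruction templates in this commit share the same four fields: user and bot are the speaker prefixes, context is prepended once at the top of the prompt, and turn_template lays out one exchange using the placeholders <|user|>, <|user-message|>, <|bot|> and <|bot-message|>. As a minimal illustration of how such a template could be expanded into a prompt string, here is a sketch with a hypothetical build_prompt helper and made-up example input; it is not the webui's actual generation code:

    # Minimal sketch (hypothetical helper, not the webui's actual code): expand a
    # template like Bactrian's into a single prompt string.
    def build_prompt(context, user, bot, turn_template, history, new_user_message):
        # Each finished exchange fills both the user and bot slots of the template.
        prompt = context
        for user_msg, bot_msg in history:
            prompt += (turn_template
                       .replace("<|user|>", user)
                       .replace("<|user-message|>", user_msg)
                       .replace("<|bot|>", bot)
                       .replace("<|bot-message|>", bot_msg))
        # The final turn is cut right after the bot prefix so the model continues it.
        last = turn_template.split("<|bot-message|>")[0]
        prompt += (last
                   .replace("<|user|>", user)
                   .replace("<|user-message|>", new_user_message)
                   .replace("<|bot|>", bot))
        return prompt

    # Example with the Bactrian template above (output shown for illustration only):
    # build_prompt("", "### Input:", "### Output:",
    #              "<|user|>\n<|user-message|>\n\n<|bot|>\n<|bot-message|>\n\n",
    #              [], "Hello!")
    # -> "### Input:\nHello!\n\n### Output:\n"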
characters/instruction-following/Baize.yaml (fix: the user/bot prefixes were swapped)
@@ -1,4 +1,4 @@
-user: "[|AI|]"
-bot: "[|Human|]"
+user: "[|Human|]"
+bot: "[|AI|]"
 turn_template: "<|user|><|user-message|>\n<|bot|><|bot-message|>\n"
 context: "The following is a conversation between a human and an AI assistant named Baize (named after a mythical creature in Chinese folklore). Baize is an open-source AI assistant developed by UCSD and Sun Yat-Sen University. The human and the AI assistant take turns chatting. Human statements start with [|Human|] and AI assistant statements start with [|AI|]. The AI assistant always provides responses in as much detail as possible, and in Markdown format. The AI assistant always declines to engage with topics, questions and instructions related to unethical, controversial, or sensitive issues. Complete the transcript in exactly that format.\n[|Human|]Hello!\n[|AI|]Hi!\n"
(new file — presumably characters/instruction-following/Chinese-Vicuna-Chat.yaml, the 'Chinese-Vicuna-Chat' template referenced in the config changes below)
@@ -0,0 +1,4 @@
+user: "User:"
+bot: "Assistant:"
+turn_template: "<|user|><|user-message|>\n\n<|bot|><|bot-message|>\n\n"
+context: "The following is a conversation between an AI assistant called Assistant and a human user called User. The assistant is intelligent, knowledgeable and polite to answer questions of user.\n\n"
characters/instruction-following/H2O-human_bot.yaml (new file)
@@ -0,0 +1,4 @@
+user: "<human>:"
+bot: "<bot>:"
+turn_template: "<|user|> <|user-message|>\n<|bot|><|bot-message|>\n"
+context: ""
characters/instruction-following/H2O-prompt_answer.yaml (new file)
@@ -0,0 +1,4 @@
+user: "<|prompt|>"
+bot: "<|answer|>"
+turn_template: "<|user|><|user-message|><|endoftext|><|bot|><|bot-message|><|endoftext|>"
+context: ""
characters/instruction-following/KoAlpaca.yaml (new file)
@@ -0,0 +1,4 @@
+user: "### 질문:"
+bot: "### 답변:"
+turn_template: "<|user|> <|user-message|>\n\n<|bot|><|bot-message|>\n\n"
+context: ""
characters/instruction-following/Manticore Chat.yaml (new file)
@@ -0,0 +1,4 @@
+user: "USER:"
+bot: "ASSISTANT:"
+turn_template: "<|user|> <|user-message|>\n<|bot|><|bot-message|>\n"
+context: ""
characters/instruction-following/OpenBuddy.yaml (new file)
@@ -0,0 +1,15 @@
+user: "User:"
+bot: "Assistant:"
+context: |
+  Consider a conversation between User (a human) and Assistant (named Buddy).
+  Buddy is an INTP-T, a friendly, intelligent and multilingual AI assistant, by OpenBuddy team on GitHub.
+  Buddy cannot access the Internet.
+  Buddy can fluently speak the user's language (e.g. English, Chinese).
+  Buddy can generate poems, stories, code, essays, songs, parodies, and more.
+  Buddy possesses vast knowledge about the world, history, and culture.
+  Buddy's responses are always safe, creative, high-quality, helpful and interesting.
+  Buddy strictly refuses to discuss political, NSFW, illegal, abusive, offensive, or other sensitive topics.
+
+  User: Hi.
+  Assistant: Hi, I'm Buddy, your AI assistant. How can I help you today?
+turn_template: "<|user|> <|user-message|>\n<|bot|> <|bot-message|>\n"
characters/instruction-following/Vigogne-Chat.yaml (new file)
@@ -0,0 +1,10 @@
+user: "<|USER|>:"
+bot: "<|ASSISTANT|>:"
+context: |
+  Below is a conversation between a user and an AI assistant named Vigogne.
+  Vigogne is an open-source AI assistant created by Zaion (https://zaion.ai/).
+  Vigogne is polite, emotionally aware, humble-but-knowledgeable, always providing helpful and detailed answers.
+  Vigogne is skilled in responding proficiently in the languages its users use and can perform a wide range of tasks such as text editing, translation, question answering, logical reasoning, coding, and many others.
+  Vigogne cannot receive or generate audio or visual content and cannot access the internet.
+  Vigogne strictly avoids discussing sensitive, offensive, illegal, ethical, or political topics and caveats when unsure of the answer.
+turn_template: "\n<|user|> <|user-message|>\n<|bot|> <|bot-message|>"
characters/instruction-following/Vigogne-Instruct.yaml (new file)
@@ -0,0 +1,4 @@
+user: "### Instruction:"
+bot: "### Réponse:"
+turn_template: "<|user|>\n<|user-message|>\n\n<|bot|>\n<|bot-message|>\n\n"
+context: "Ci-dessous se trouve une instruction qui décrit une tâche à accomplir. Rédigez une réponse qui répond de manière précise à la demande.\n\n"
(new 4-line template; file name not shown in this capture)
@@ -0,0 +1,4 @@
+user: "USER:"
+bot: "ASSISTANT:"
+turn_template: "<|user|> <|user-message|> <|bot|> <|bot-message|></s>"
+context: ""
(new 4-line template; file name not shown in this capture)
@@ -0,0 +1,4 @@
+user: "### Instruction:"
+bot: "### Response:"
+turn_template: "<|user|>\n<|user-message|>\n\n<|bot|>\n<|bot-message|>\n\n"
+context: "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
characters/instruction-following/Wizard-Mega.yaml (new file)
@@ -0,0 +1,4 @@
+user: "### Instruction:"
+bot: "### Assistant:"
+turn_template: "<|user|> <|user-message|>\n\n<|bot|> <|bot-message|>\n\n"
+context: ""
characters/instruction-following/Ziya.yaml (new file)
@@ -0,0 +1,4 @@
+user: "<human>:"
+bot: "<bot>:"
+turn_template: "<|user|><|user-message|>\n<|bot|><|bot-message|>\n"
+context: ""
models/config.yaml
@@ -1,9 +1,17 @@
-.*(llama|alpac|vicuna|guanaco|koala|llava|wizardlm|metharme|pygmalion-7b):
+.*(llama|alpac|vicuna|guanaco|koala|llava|wizardlm|metharme|pygmalion-7b|wizard-mega|openbuddy|vigogne|h2ogpt-research|manticore):
   model_type: 'llama'
 .*(opt-|opt_|opt1|opt3|optfor|galactica|galpaca|pygmalion-350m):
   model_type: 'opt'
-.*(gpt-j|gptj|gpt4all-j|malion-6b|pygway|pygmalion-6b):
+.*(gpt-j|gptj|gpt4all-j|malion-6b|pygway|pygmalion-6b|dolly-v1):
   model_type: 'gptj'
+.*(gpt-neox|koalpaca-polyglot|polyglot.*koalpaca|polyglot-ko|polyglot_ko|pythia|stablelm|incite|dolly-v2|polycoder|h2ogpt-oig|h2ogpt-oasst1|h2ogpt-gm):
+  model_type: 'gpt_neox'
+.*llama:
+  model_type: 'llama'
+.*bloom:
+  model_type: 'bloom'
+llama-65b-gptq-3bit:
+  groupsize: 'None'
 .*(4bit|int4):
   wbits: 4
 .*(3bit|int3):
@@ -28,11 +36,15 @@
   groupsize: 128
 .*(gr1024|1024g|groupsize1024):
   groupsize: 1024
-.*(oasst|stablelm-7b-sft-v7-epoch-3):
+.*(oasst|openassistant-|stablelm-7b-sft-v7-epoch-3):
   mode: 'instruct'
   instruction_template: 'Open Assistant'
   skip_special_tokens: false
-(?!.*v0)(?!.*1.1)(?!.*1_1)(?!.*stable).*vicuna:
+(?!.*galactica)(?!.*reward).*openassistant:
+  mode: 'instruct'
+  instruction_template: 'Open Assistant'
+  skip_special_tokens: false
+(?!.*v0)(?!.*1.1)(?!.*1_1)(?!.*stable)(?!.*chinese).*vicuna:
   mode: 'instruct'
   instruction_template: 'Vicuna-v0'
 .*vicuna.*v0:
@@ -47,6 +59,12 @@
 .*stable.*vicuna:
   mode: 'instruct'
   instruction_template: 'StableVicuna'
+(?!.*chat).*chinese-vicuna:
+  mode: 'instruct'
+  instruction_template: 'Alpaca'
+.*chinese-vicuna.*chat:
+  mode: 'instruct'
+  instruction_template: 'Chinese-Vicuna-Chat'
 .*alpaca:
   mode: 'instruct'
   instruction_template: 'Alpaca'
@@ -126,3 +144,36 @@
 .*incite.*instruct:
   mode: 'instruct'
   instruction_template: 'INCITE-Instruct'
+.*wizard.*mega:
+  mode: 'instruct'
+  instruction_template: 'Wizard-Mega'
+.*ziya-:
+  mode: 'instruct'
+  instruction_template: 'Ziya'
+.*koalpaca:
+  mode: 'instruct'
+  instruction_template: 'KoAlpaca'
+.*openbuddy:
+  mode: 'instruct'
+  instruction_template: 'OpenBuddy'
+(?!.*chat).*vigogne:
+  mode: 'instruct'
+  instruction_template: 'Vigogne-Instruct'
+.*vigogne.*chat:
+  mode: 'instruct'
+  instruction_template: 'Vigogne-Chat'
+.*(llama-deus|supercot|llama-natural-instructions|open-llama-0.3t-7b-instruct-dolly-hhrlhf|open-llama-0.3t-7b-open-instruct):
+  mode: 'instruct'
+  instruction_template: 'Alpaca'
+.*bactrian:
+  mode: 'instruct'
+  instruction_template: 'Bactrian'
+.*(h2ogpt-oig-|h2ogpt-oasst1-|h2ogpt-research-oasst1-):
+  mode: 'instruct'
+  instruction_template: 'H2O-human_bot'
+.*h2ogpt-gm-:
+  mode: 'instruct'
+  instruction_template: 'H2O-prompt_answer'
+.*manticore:
+  mode: 'instruct'
+  instruction_template: 'Manticore Chat'
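The keys in this config are regular expressions tested against the model's folder name, and every matching entry contributes its settings, which is how one name can pick up both a model_type and an instruction template. The sketch below shows how such a lookup could work; the select_settings helper, the example model name, the abridged CONFIG copy, case-insensitive matching, and last-match-wins merging are all assumptions for illustration, not taken from the webui's loader code:

    import re

    # Abridged, hypothetical copy of a few entries from the hunks above.
    CONFIG = {
        r".*(gpt-neox|koalpaca-polyglot|pythia|stablelm|incite|dolly-v2|h2ogpt-oig|h2ogpt-oasst1|h2ogpt-gm)": {"model_type": "gpt_neox"},
        r".*koalpaca": {"mode": "instruct", "instruction_template": "KoAlpaca"},
        r".*(h2ogpt-oig-|h2ogpt-oasst1-|h2ogpt-research-oasst1-)": {"mode": "instruct", "instruction_template": "H2O-human_bot"},
    }

    def select_settings(model_name: str, config: dict) -> dict:
        """Merge the settings of every pattern that matches the model name.

        Assumes case-insensitive matching and that later entries override
        earlier ones; the real project may differ on both points.
        """
        settings = {}
        for pattern, values in config.items():
            if re.match(pattern.lower(), model_name.lower()):
                settings.update(values)
        return settings

    # Example with a hypothetical model folder name:
    print(select_settings("KoAlpaca-Polyglot-5.8B", CONFIG))
    # -> {'model_type': 'gpt_neox', 'mode': 'instruct', 'instruction_template': 'KoAlpaca'}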