add information about token size parameters

Max Kammler 2023-11-07 10:18:30 +01:00
parent d8df41d101
commit b07ef97e33


@@ -17,6 +17,10 @@ CHATGPT_API_MODEL=gpt-3.5-turbo
#CHATGPT_REVERSE_PROXY=https://api.openai.com/v1/chat/completions
# (Optional) Set the temperature of the model. 0.0 is deterministic, 1.0 is very creative.
# CHATGPT_TEMPERATURE=0.8
# (Optional) Davinci models have a max context length of 4097 tokens, but you may need to change this for other models.
# CHATGPT_MAX_CONTEXT_TOKENS=4097
# You might want to lower this to save money if using a paid model. Earlier messages will be dropped until the prompt is within the limit.
# CHATGPT_MAX_PROMPT_TOKENS=3097
# Set data store settings
KEYV_BACKEND=file
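The comment for `CHATGPT_MAX_PROMPT_TOKENS` says earlier messages are dropped until the prompt fits within the limit. A minimal sketch of that pruning behavior (this is an illustration, not the bot's actual implementation; `trim_history` and `count_tokens` are hypothetical names, and real token counting would use the model's tokenizer rather than a word count):

```python
def trim_history(messages, max_prompt_tokens, count_tokens):
    """Drop the oldest messages until the remaining prompt fits the budget.

    messages: list of message strings, oldest first.
    max_prompt_tokens: budget such as CHATGPT_MAX_PROMPT_TOKENS.
    count_tokens: callable returning the token count of one message.
    """
    kept = list(messages)
    # Remove from the front (the earliest messages) while over budget.
    while kept and sum(count_tokens(m) for m in kept) > max_prompt_tokens:
        kept.pop(0)
    return kept


# Toy tokenizer for illustration only: one token per whitespace-separated word.
def toy_count_tokens(message):
    return len(message.split())
```

Keeping `CHATGPT_MAX_PROMPT_TOKENS` below `CHATGPT_MAX_CONTEXT_TOKENS` leaves room in the context window for the model's reply.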