Each provider function uses specific keys from the configuration's `options` dictionary. In addition to these, you can also set a `system_prompt` (outside of `options`) and, where applicable, an API key (for OpenAI). Below are the options available for each provider.
#### `call_llm_gpt4all`

Configuration keys in `options`:

- `model` (`str`): the model name; defaults to `"default-model"` if not provided.
- `max_tokens` (`int`): maximum number of tokens to generate.
- `temperature` (`float` or `int`): sampling temperature.
Additional configurations (outside `options`):

- `system_prompt`: an optional system prompt for the request.
- `url` (parameter): the server endpoint; defaults to `"http://localhost:4891"` if not provided.
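
The sketch below shows one way such a configuration might be assembled and passed to `call_llm_gpt4all`. The exact call signature (a positional prompt, a single config dictionary, and a `url` keyword) is an assumption for illustration; only the keys, defaults, and the `url` parameter listed above come from this documentation.

```python
# Hypothetical usage sketch. Assumes call_llm_gpt4all has been imported from
# this project and accepts a prompt, a config dict, and a url keyword; the
# signature is illustrative, only the keys below are documented.
config = {
    "options": {
        "model": "default-model",  # falls back to "default-model" if omitted
        "max_tokens": 256,         # int
        "temperature": 0.7,        # float or int
    },
    # system_prompt is set outside of options
    "system_prompt": "You are a helpful assistant.",
}

# url is a separate parameter; it defaults to "http://localhost:4891" if omitted.
response = call_llm_gpt4all(
    "Summarize the configuration options.",
    config,
    url="http://localhost:4891",
)
print(response)
```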