LLMWorkbook

LLMWorkbook Provider: OpenAI

Each provider function reads specific keys from the configuration's options dictionary. In addition to these, you can also set a system_prompt (outside of options) and, where applicable, an API key (for OpenAI). Below are the options available for each provider.

Important: this provider uses OpenAI's newer Responses endpoint instead of Chat Completions; please refer to the relevant OpenAI documentation. At the surface level, the change is backward compatible and non-breaking.


1. OpenAI Provider (call_llm_openai)

Configuration Keys in options:

Additional OpenAI API parameters: Any valid OpenAI API parameter (e.g., max_tokens, top_p, frequency_penalty) can be provided via options. This gives full control over the API request without modifying the function.

Example -

config = {
    "options": {
        "model": "gpt-4o-mini",
        "temperature": 0.7,
        # Additional parameters as needed
        "max_tokens": 500,
        "top_p": 0.9,
        "frequency_penalty": 0.5,
        # Output format
        "response_format": ...,
    },
}
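As noted above, system_prompt and the API key are set outside of options. A minimal sketch of a full configuration follows; the exact key names "system_prompt" and "api_key" are illustrative assumptions here, so check them against LLMWorkbook's configuration reference:

```python
import os

# Hypothetical full configuration: "options" holds OpenAI API parameters,
# while "system_prompt" and "api_key" (assumed key names) sit at the top level.
config = {
    "options": {
        "model": "gpt-4o-mini",
        "temperature": 0.7,
    },
    "system_prompt": "You are a helpful assistant.",
    # Prefer loading the key from the environment rather than hard-coding it.
    "api_key": os.environ.get("OPENAI_API_KEY", ""),
}
```

Keeping the prompt and credentials outside options means the options dict can be passed through to the OpenAI request untouched.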