LLMWorkbook Provider: Anthropic

Anthropic’s Claude API has a distinct structure and accepts only specific parameters. This documentation explains how to configure and use the call_llm_anthropic function through the LLMConfig system in llmworkbook.


1. Anthropic Provider (call_llm_anthropic)

Configuration Keys in options:

- model (required): the Claude model to call, e.g. "claude-3-haiku-20240307".
- max_tokens (required): the maximum number of tokens Claude may generate in the reply.
- Depending on your llmworkbook version, additional Anthropic Messages API parameters (for example temperature, top_p, or stop_sequences) may also be accepted here.

The system prompt is not placed in options; it is set via the system_prompt field of LLMConfig, as shown below.

✅ Example Configuration

from llmworkbook import LLMConfig

config = LLMConfig(
    provider="anthropic",
    system_prompt="You are a helpful assistant.",
    options={
        "model": "claude-3-haiku-20240307",
        "max_tokens": 512,
    }
)
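
How the config is typically used

The following is a minimal sketch of feeding this configuration into llmworkbook's DataFrame workflow. The LLMRunner and LLMDataFrameIntegrator classes and the add_llm_responses method follow the usage pattern shown in the project README; treat the exact names and arguments as assumptions and verify them against your installed version.

import pandas as pd

from llmworkbook import LLMConfig, LLMRunner, LLMDataFrameIntegrator

# Same Anthropic configuration as above.
config = LLMConfig(
    provider="anthropic",
    system_prompt="You are a helpful assistant.",
    options={
        "model": "claude-3-haiku-20240307",
        "max_tokens": 512,
    },
)

# A tiny DataFrame with one prompt per row.
df = pd.DataFrame({"prompt_text": ["Summarise: Claude models are built by Anthropic."]})

runner = LLMRunner(config)
integrator = LLMDataFrameIntegrator(runner=runner, df=df)

# add_llm_responses sends each row's prompt to Claude and stores the reply
# in a new column (method name per the README; confirm for your version).
updated_df = integrator.add_llm_responses(
    prompt_column="prompt_text",
    response_column="llm_response",
    async_mode=False,
)

print(updated_df["llm_response"].iloc[0])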

API Key
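
A minimal sketch of supplying the key, assuming it is read from the ANTHROPIC_API_KEY environment variable and that LLMConfig accepts an api_key field (the field name is an assumption; check your installed version's LLMConfig signature):

import os

from llmworkbook import LLMConfig

config = LLMConfig(
    provider="anthropic",
    # Assumption: api_key is a supported LLMConfig field; the provider may
    # alternatively read ANTHROPIC_API_KEY directly from the environment.
    api_key=os.environ["ANTHROPIC_API_KEY"],
    system_prompt="You are a helpful assistant.",
    options={"model": "claude-3-haiku-20240307", "max_tokens": 512},
)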