[`agentic_crew.config.llm`](#module-agentic_crew.config.llm)
LLM Configuration for CrewAI agents.
Uses Anthropic Claude directly via ANTHROPIC_API_KEY. Falls back to OpenRouter if OPENROUTER_API_KEY is set.
## Module Contents

### Classes

| Class | Description |
|---|---|
| `LLMProvider` | Available LLM providers. |
| `LLMConfig` | Configuration for specific LLM use cases. |
### Functions

| Function | Description |
|---|---|
| `get_llm` | Get configured LLM instance for CrewAI agents. |
| `_create_anthropic_llm` | Create Anthropic LLM instance. |
| `_create_openrouter_llm` | Create OpenRouter LLM instance. |
| `get_llm_or_raise` | Get configured LLM instance, raising if API key not set. |
| `get_llm_for_task` | Get LLM configured for a specific task type. |
| `get_reasoning_llm` | Get LLM optimized for complex reasoning tasks. |
| `get_creative_llm` | Get LLM optimized for creative tasks. |
| `get_code_llm` | Get LLM optimized for code generation. |
### Data

| Name | Description |
|---|---|
| `_CLAUDE_HAIKU_45` | |
| `_CLAUDE_SONNET_45` | |
| `_CLAUDE_SONNET_4` | |
| `_CLAUDE_OPUS_4` | |
| `DEFAULT_MODEL` | |
| `MODELS` | |
| `LLM_CONFIGS` | |
### `class agentic_crew.config.llm.LLMProvider(*args, **kwds)`

Bases: `enum.Enum`

Available LLM providers.

#### `ANTHROPIC`

`'anthropic'`

#### `OPENROUTER`

`'openrouter'`
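The enum above is small enough to reproduce as a minimal sketch, matching the documented base class and member values:

```python
from enum import Enum


class LLMProvider(Enum):
    """Available LLM providers (mirrors the documented members)."""

    ANTHROPIC = "anthropic"
    OPENROUTER = "openrouter"
```

Because the members carry string values, a provider name read from configuration can be converted with `LLMProvider("anthropic")`.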
### `class agentic_crew.config.llm.LLMConfig`

Configuration for specific LLM use cases.

#### `model : str`

#### `temperature : float`

#### `description : str`
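A plausible sketch of this class, assuming it is a plain dataclass (the decorator and the example values below are assumptions, not taken from the source):

```python
from dataclasses import dataclass


@dataclass
class LLMConfig:
    """Configuration for a specific LLM use case."""

    model: str
    temperature: float
    description: str


# Hypothetical instance; the model/temperature pairing is illustrative only.
code_config = LLMConfig(
    model="claude-sonnet-4-5-20250929",
    temperature=0.2,
    description="Low-temperature configuration for code generation",
)
```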
### `agentic_crew.config.llm._CLAUDE_HAIKU_45`

`'claude-haiku-4-5-20251001'`

### `agentic_crew.config.llm._CLAUDE_SONNET_45`

`'claude-sonnet-4-5-20250929'`

### `agentic_crew.config.llm._CLAUDE_SONNET_4`

`'claude-sonnet-4-20250514'`

### `agentic_crew.config.llm._CLAUDE_OPUS_4`

`'claude-opus-4-20250514'`

### `agentic_crew.config.llm.DEFAULT_MODEL`

`None`

### `agentic_crew.config.llm.MODELS`

`None`

### `agentic_crew.config.llm.LLM_CONFIGS`

`None`
### `agentic_crew.config.llm.get_llm(model: str = DEFAULT_MODEL, temperature: float = 0.7, provider: agentic_crew.config.llm.LLMProvider | None = None) → crewai.LLM | None`

Get configured LLM instance for CrewAI agents.

Args:
- `model`: Model identifier. Defaults to `claude-haiku-4-5-20251001` (`DEFAULT_MODEL`).
- `temperature`: Sampling temperature (0.0-1.0). Lower = more focused, higher = more creative.
- `provider`: Force specific provider (`ANTHROPIC` or `OPENROUTER`). If `None`, auto-detects based on available API keys.

Available models:
- `claude-haiku-4-5-20251001` (default: fast, cost-effective)
- `claude-sonnet-4-5-20250929` (best for code and creative)
- `claude-sonnet-4-20250514` (capable general purpose)
- `claude-opus-4-20250514` (most capable)
- `openrouter/auto` (fallback via OpenRouter)

Returns: Configured LLM instance, or `None` if no API key is set.

Note: Tries `ANTHROPIC_API_KEY` first, falls back to `OPENROUTER_API_KEY`. Returns `None` if neither is set, allowing CrewAI to use its default.

Example:

```python
llm = get_llm()  # Uses Claude Haiku 4.5
llm = get_llm("claude-opus-4-20250514", temperature=0.3)
llm = get_llm(provider=LLMProvider.OPENROUTER)
```
### `agentic_crew.config.llm._create_anthropic_llm(model: str, temperature: float, api_key: str) → crewai.LLM`

Create Anthropic LLM instance.

### `agentic_crew.config.llm._create_openrouter_llm(model: str, temperature: float, api_key: str) → crewai.LLM`

Create OpenRouter LLM instance.
### `agentic_crew.config.llm.get_llm_or_raise(model: str = DEFAULT_MODEL, temperature: float = 0.7, provider: agentic_crew.config.llm.LLMProvider | None = None) → crewai.LLM`

Get configured LLM instance, raising if API key not set.

Use this when you need to ensure an LLM is available.

Args:
- `model`: Model identifier (see `get_llm` for options)
- `temperature`: Sampling temperature (0.0-1.0)
- `provider`: Force specific provider (optional)

Returns: Configured LLM instance.

Raises: `ValueError`: If neither `ANTHROPIC_API_KEY` nor `OPENROUTER_API_KEY` is set.
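The raise-on-missing pattern amounts to a thin wrapper over `get_llm`; this sketch uses a hypothetical `require_llm` helper that takes any `get_llm`-style callable, so it is an illustration of the contract rather than the module's actual implementation:

```python
def require_llm(get_llm_fn, **kwargs):
    """Delegate to a get_llm-style callable; raise if it returns None."""
    llm = get_llm_fn(**kwargs)
    if llm is None:
        raise ValueError(
            "No LLM configured: set ANTHROPIC_API_KEY or OPENROUTER_API_KEY"
        )
    return llm
```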
### `agentic_crew.config.llm.get_llm_for_task(task: str) → crewai.LLM | None`

Get LLM configured for a specific task type.

Args:
- `task`: Task type, one of: `'reasoning'`, `'creative'`, `'code'`, `'default'`

Returns: Configured LLM instance, or `None` if no API key is set.

Raises: `ValueError`: If task type is unknown.

Example:

```python
llm = get_llm_for_task('code')      # Low temp, optimized for code
llm = get_llm_for_task('creative')  # High temp, creative output
```
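The dispatch over the four task names can be sketched with a lookup table; the specific model/temperature pairings below are illustrative assumptions (only "code is low temperature, creative is high" is stated above), and `resolve_task` is a hypothetical helper name:

```python
# Illustrative config table; real values live in LLM_CONFIGS.
TASK_CONFIGS = {
    "reasoning": {"model": "claude-opus-4-20250514", "temperature": 0.3},
    "creative": {"model": "claude-sonnet-4-5-20250929", "temperature": 0.9},
    "code": {"model": "claude-sonnet-4-5-20250929", "temperature": 0.2},
    "default": {"model": "claude-haiku-4-5-20251001", "temperature": 0.7},
}


def resolve_task(task: str) -> dict:
    """Look up a task's config, rejecting unknown task types."""
    if task not in TASK_CONFIGS:
        raise ValueError(
            f"Unknown task type: {task!r}; expected one of {sorted(TASK_CONFIGS)}"
        )
    return TASK_CONFIGS[task]
```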
### `agentic_crew.config.llm.get_reasoning_llm() → crewai.LLM | None`

Get LLM optimized for complex reasoning tasks.

### `agentic_crew.config.llm.get_creative_llm() → crewai.LLM | None`

Get LLM optimized for creative tasks.

### `agentic_crew.config.llm.get_code_llm() → crewai.LLM | None`

Get LLM optimized for code generation.