
[`agentic_crew.config.llm`](#module-agentic_crew.config.llm)

LLM Configuration for CrewAI agents.

Uses Anthropic Claude directly via ANTHROPIC_API_KEY. Falls back to OpenRouter if OPENROUTER_API_KEY is set.

  • LLMProvider: Available LLM providers.
  • LLMConfig: Configuration for specific LLM use cases.
  • get_llm: Get configured LLM instance for CrewAI agents.
  • _create_anthropic_llm: Create Anthropic LLM instance.
  • _create_openrouter_llm: Create OpenRouter LLM instance.
  • get_llm_or_raise: Get configured LLM instance, raising if API key not set.
  • get_llm_for_task: Get LLM configured for a specific task type.
  • get_reasoning_llm: Get LLM optimized for complex reasoning tasks.
  • get_creative_llm: Get LLM optimized for creative tasks.
  • get_code_llm: Get LLM optimized for code generation.
  • _CLAUDE_HAIKU_45
  • _CLAUDE_SONNET_45
  • _CLAUDE_SONNET_4
  • _CLAUDE_OPUS_4
  • DEFAULT_MODEL
  • MODELS
  • LLM_CONFIGS

class agentic_crew.config.llm.LLMProvider(*args, **kwds)


Bases: enum.Enum

Available LLM providers.

ANTHROPIC = 'anthropic'

OPENROUTER = 'openrouter'
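A minimal equivalent of the class, reconstructed from the two documented values:

```python
from enum import Enum

class LLMProvider(Enum):
    # The two providers this module can route requests to.
    ANTHROPIC = "anthropic"
    OPENROUTER = "openrouter"
```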

class agentic_crew.config.llm.LLMConfig

Configuration for specific LLM use cases.


_CLAUDE_HAIKU_45 = 'claude-haiku-4-5-20251001'

_CLAUDE_SONNET_45 = 'claude-sonnet-4-5-20250929'

_CLAUDE_SONNET_4 = 'claude-sonnet-4-20250514'

_CLAUDE_OPUS_4 = 'claude-opus-4-20250514'


agentic_crew.config.llm.get_llm(model, temperature, provider=None) → crewai.LLM | None

Get configured LLM instance for CrewAI agents.

Args:

  • model: Model identifier. Defaults to claude-haiku-4-5-20251001 (DEFAULT_MODEL).
  • temperature: Sampling temperature (0.0-1.0). Lower = more focused, higher = more creative.
  • provider: Force specific provider (ANTHROPIC or OPENROUTER). If None, auto-detects based on available API keys.

Available models:

  • claude-haiku-4-5-20251001 (default - fast, cost-effective)
  • claude-sonnet-4-5-20250929 (best for code and creative)
  • claude-sonnet-4-20250514 (capable general purpose)
  • claude-opus-4-20250514 (most capable)
  • openrouter/auto (fallback via OpenRouter)

Returns: Configured LLM instance, or None if no API key set

Note: Tries ANTHROPIC_API_KEY first, falls back to OPENROUTER_API_KEY. Returns None if neither is set, allowing CrewAI to use its default.

Example:

    llm = get_llm()  # Uses Claude Haiku 4.5
    llm = get_llm("claude-opus-4-20250514", temperature=0.3)
    llm = get_llm(provider=LLMProvider.OPENROUTER)

agentic_crew.config.llm._create_anthropic_llm(model: str, temperature: float, api_key: str) → crewai.LLM


Create Anthropic LLM instance.

agentic_crew.config.llm._create_openrouter_llm(model: str, temperature: float, api_key: str) → crewai.LLM


Create OpenRouter LLM instance.

agentic_crew.config.llm.get_llm_or_raise(model, temperature, provider=None) → crewai.LLM

Get configured LLM instance, raising if API key not set.

Use this when you need to ensure an LLM is available.

Args:

  • model: Model identifier (see get_llm for options).
  • temperature: Sampling temperature (0.0-1.0).
  • provider: Force specific provider (optional).

Returns: Configured LLM instance

Raises: ValueError: If neither ANTHROPIC_API_KEY nor OPENROUTER_API_KEY is set

agentic_crew.config.llm.get_llm_for_task(task: str) → crewai.LLM | None

Get LLM configured for a specific task type.

Args:

  • task: Task type, one of: 'reasoning', 'creative', 'code', 'default'.

Returns: Configured LLM instance, or None if no API key set

Raises: ValueError: If task type is unknown

Example:

    llm = get_llm_for_task('code')      # Low temp, optimized for code
    llm = get_llm_for_task('creative')  # High temp, creative output

agentic_crew.config.llm.get_reasoning_llm() → crewai.LLM | None


Get LLM optimized for complex reasoning tasks.

agentic_crew.config.llm.get_creative_llm() → crewai.LLM | None


Get LLM optimized for creative tasks.

agentic_crew.config.llm.get_code_llm() → crewai.LLM | None

Get LLM optimized for code generation.