Use Auriko as your LLM provider in CrewAI for cost-effective multi-agent workflows.

Prerequisites

  • An Auriko API key, available in the AURIKO_API_KEY environment variable.

Installation

pip install "auriko[crewai]"
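The adapter reads your API key from the AURIKO_API_KEY environment variable by default (see the options table below). The key value here is a placeholder:

```shell
# Make the key available to the adapter; it reads AURIKO_API_KEY by default.
export AURIKO_API_KEY="your-api-key"
```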

Use SDK adapter

Use the AurikoCrewAILLM adapter:
from auriko.frameworks.crewai import AurikoCrewAILLM

auriko_llm = AurikoCrewAILLM(model="gpt-5.4")
AurikoCrewAILLM adds an openai/ prefix internally so all models (including Claude) route through Auriko. Without this wrapper, CrewAI detects claude- model names and silently routes to the native Anthropic SDK, bypassing Auriko entirely.
from crewai import Agent, Task, Crew

researcher = Agent(
    role="Researcher",
    goal="Find accurate and comprehensive information",
    backstory="You are an expert researcher with attention to detail.",
    llm=auriko_llm.llm,
    verbose=True,
)

writer = Agent(
    role="Writer",
    goal="Write clear, engaging content based on research",
    backstory="You are a skilled technical writer.",
    llm=auriko_llm.llm,
    verbose=True,
)

research_task = Task(
    description="Research the latest trends in AI agents",
    agent=researcher,
    expected_output="A detailed summary of AI agent trends with sources",
)

writing_task = Task(
    description="Write a blog post based on the research findings",
    agent=writer,
    expected_output="A 500-word blog post about AI agent trends",
    context=[research_task],
)

crew = Crew(
    agents=[researcher, writer],
    tasks=[research_task, writing_task],
    verbose=True,
)

result = crew.kickoff()
print(result)

Configure options

Parameter   Type                     Default                       Description
model       str                      (required)                    Model ID (e.g., "gpt-5.4", "claude-sonnet-4-20250514")
api_key     str | None               AURIKO_API_KEY env var        API key
routing     RoutingOptions | None    None                          Routing configuration
base_url    str                      "https://api.auriko.ai/v1"    API base URL
**kwargs                                                           Passed through to crewai.LLM
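Because extra keyword arguments are forwarded to crewai.LLM unchanged, generation settings can be set on the wrapper. A sketch, assuming temperature and max_tokens are accepted by your crewai.LLM version:

```python
from auriko.frameworks.crewai import AurikoCrewAILLM

# Keyword arguments not consumed by the adapter are passed
# straight through to the underlying crewai.LLM constructor.
auriko_llm = AurikoCrewAILLM(
    model="gpt-5.4",
    temperature=0.2,
    max_tokens=1024,
)
```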

Configure routing

Pass a RoutingOptions instance to select an optimization strategy:
from auriko.frameworks.crewai import AurikoCrewAILLM
from auriko.route_types import RoutingOptions

auriko_llm = AurikoCrewAILLM(
    model="gpt-5.4",
    routing=RoutingOptions(optimize="cost"),
)

# After crew.kickoff(), access routing metadata from the last request
metadata = auriko_llm.last_routing_metadata
if metadata:
    print(f"Provider: {metadata.provider}")
Different agents can use different models and routing strategies:
fast_llm = AurikoCrewAILLM(model="gpt-4o", routing=RoutingOptions(optimize="speed"))
smart_llm = AurikoCrewAILLM(model="gpt-5.4", routing=RoutingOptions(optimize="balanced"))

researcher = Agent(role="Researcher", goal="Find information", backstory="Expert", llm=smart_llm.llm)
writer = Agent(role="Writer", goal="Write content", backstory="Skilled writer", llm=fast_llm.llm)

Configure manually

If you prefer not to use the SDK adapter, you can configure CrewAI’s LLM directly. You must add the openai/ prefix to the model name manually:
import os
from crewai import LLM

llm = LLM(
    model="openai/gpt-5.4",  # openai/ prefix required
    base_url="https://api.auriko.ai/v1",
    api_key=os.environ["AURIKO_API_KEY"],
)
Note: routing options and metadata access aren’t available with manual configuration.
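With manual configuration the object is already a crewai.LLM instance, so it is passed to Agent directly; there is no .llm property as with the adapter. A minimal sketch:

```python
import os
from crewai import LLM, Agent

llm = LLM(
    model="openai/gpt-5.4",  # manual openai/ prefix
    base_url="https://api.auriko.ai/v1",
    api_key=os.environ["AURIKO_API_KEY"],
)

# Pass the crewai.LLM instance itself, not a .llm attribute.
researcher = Agent(
    role="Researcher",
    goal="Find accurate information",
    backstory="You are an expert researcher.",
    llm=llm,
)
```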

Notes

  • AurikoCrewAILLM wraps crewai.LLM — pass the .llm property to Agent, not the wrapper itself.
  • The openai/ prefix is added automatically for all models, including Claude.
  • last_routing_metadata returns metadata from the most recent non-streaming response only.