Use Auriko as your LLM provider in Google’s Agent Development Kit (ADK).

Installation

pip install "auriko[adk]"

Use SDK adapter

Use the AurikoLlm adapter:
from auriko.frameworks.adk import AurikoLlm

llm = AurikoLlm(model="gpt-5.4")
AurikoLlm is a native BaseLlm implementation that doesn’t use LiteLLM as an intermediary. It converts directly between ADK (Gemini) types and OpenAI message format, supporting text and function calling.
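The conversion can be pictured with a simplified sketch (plain dicts stand in for the real `google.genai` and OpenAI types; the function name and logic here are hypothetical, not the adapter's actual internals):

```python
# Illustrative only: a simplified sketch of converting ADK (Gemini-style)
# content into OpenAI chat messages. The real AurikoLlm adapter operates on
# the actual google.genai types and also handles function calls.

def gemini_to_openai_messages(contents):
    """Map Gemini-style content dicts to OpenAI chat messages."""
    role_map = {"user": "user", "model": "assistant"}
    messages = []
    for content in contents:
        # Concatenate the text parts of each content item.
        text = "".join(
            part["text"] for part in content.get("parts", []) if "text" in part
        )
        messages.append({"role": role_map[content["role"]], "content": text})
    return messages

contents = [
    {"role": "user", "parts": [{"text": "What is 2+2?"}]},
    {"role": "model", "parts": [{"text": "4"}]},
]
print(gemini_to_openai_messages(contents))
```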
import asyncio
from auriko.frameworks.adk import AurikoLlm
from google.adk import Agent, Runner
from google.adk.sessions import InMemorySessionService
from google.genai import types

llm = AurikoLlm(model="gpt-5.4")

agent = Agent(
    model=llm,
    name="assistant",
    instruction="You are a helpful assistant.",
)

session_service = InMemorySessionService()
runner = Runner(
    agent=agent,
    app_name="my_app",
    session_service=session_service,
    auto_create_session=True,
)

user_message = types.Content(
    role="user", parts=[types.Part(text="What is 2+2?")]
)

async def main():
    async for event in runner.run_async(
        user_id="user-1", session_id="session-1", new_message=user_message
    ):
        if event.content and event.content.parts:
            for part in event.content.parts:
                if part.text:
                    print(part.text, end="", flush=True)

asyncio.run(main())

Configure options

Parameter   Type                    Default                                   Description
model       str                     (required)                                Model ID
api_key     str                     "" (reads AURIKO_API_KEY at first use)    API key
routing     RoutingOptions | None   None                                      Routing configuration
base_url    str                     "https://api.auriko.ai/v1"                API base URL
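The deferred api_key default can be reproduced with a generic pattern (a hedged sketch of the documented behavior, not the adapter's actual code): the environment variable is consulted only when the key is first needed.

```python
import os

# Sketch of a lazy credential lookup mirroring the documented default:
# an empty api_key falls back to AURIKO_API_KEY at first use.
# LazyKey is a hypothetical name for illustration.
class LazyKey:
    def __init__(self, api_key=""):
        self._api_key = api_key

    def resolve(self):
        # Only consult the environment when the key is actually needed.
        if not self._api_key:
            self._api_key = os.environ.get("AURIKO_API_KEY", "")
        return self._api_key

os.environ["AURIKO_API_KEY"] = "sk-test"
print(LazyKey().resolve())               # falls back to the env var
print(LazyKey("sk-explicit").resolve())  # an explicit key wins
```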

Configure routing

Configure routing options:
from auriko.frameworks.adk import AurikoLlm
from auriko.route_types import RoutingOptions

llm = AurikoLlm(
    model="gpt-5.4",
    routing=RoutingOptions(optimize="cost"),
)

Configure manually

If you prefer to use Google’s LiteLlm class directly:
import os
from google.adk.models.lite_llm import LiteLlm

llm = LiteLlm(
    model="openai/gpt-5.4",
    api_key=os.environ["AURIKO_API_KEY"],
    api_base="https://api.auriko.ai/v1",
    custom_llm_provider="openai",
)
LiteLLM ignores api_base for model names containing provider keywords (like gpt or claude). Always include custom_llm_provider="openai" to force LiteLLM to respect your custom base URL.
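The pitfall can be illustrated with a toy version of provider inference (hypothetical logic for demonstration, not LiteLLM's real code):

```python
# Toy illustration of why custom_llm_provider matters: routing logic that
# infers a provider from keywords in the model name can override api_base.
# This is NOT LiteLLM's actual implementation, just a sketch of the failure mode.

KNOWN_PROVIDER_KEYWORDS = {"gpt": "openai", "claude": "anthropic"}

def pick_base_url(model, api_base, custom_llm_provider=None):
    if custom_llm_provider:
        # Explicit provider: the caller's api_base is respected.
        return api_base
    for keyword, provider in KNOWN_PROVIDER_KEYWORDS.items():
        if keyword in model:
            # Inferred provider: the library's default endpoint wins.
            return f"https://api.{provider}.example/v1"
    return api_base

print(pick_base_url("openai/gpt-5.4", "https://api.auriko.ai/v1"))
print(pick_base_url("openai/gpt-5.4", "https://api.auriko.ai/v1", "openai"))
```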
Note: routing options and Auriko error mapping aren’t available with manual configuration.

Notes

  • Supports text and function calling. Inline data (inline_data) and file data (file_data) aren’t yet supported and raise NotImplementedError.
  • OpenAI API errors are automatically mapped to typed Auriko error classes.
  • The adapter uses AsyncOpenAI internally; the client is lazily initialized on first use.
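The error-mapping behavior can be sketched generically (hypothetical class names; the actual Auriko error hierarchy may differ): HTTP status codes from the OpenAI-compatible API are translated into typed exceptions that callers can catch individually.

```python
# Hypothetical sketch of mapping OpenAI-style HTTP errors to typed
# exception classes; the real Auriko error types may differ.

class AurikoError(Exception): ...
class AuthenticationError(AurikoError): ...
class RateLimitError(AurikoError): ...
class ServerError(AurikoError): ...

STATUS_TO_ERROR = {
    401: AuthenticationError,
    429: RateLimitError,
    500: ServerError,
}

def map_error(status_code, message):
    # Fall back to the base class for unrecognized status codes.
    error_cls = STATUS_TO_ERROR.get(status_code, AurikoError)
    return error_cls(message)

err = map_error(429, "Too many requests")
print(type(err).__name__)  # RateLimitError
```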