## Prerequisites

## Installation
## Use the SDK adapter

Use the `AurikoLlamaIndexLLM` adapter:
`AurikoLlamaIndexLLM` extends LlamaIndex’s `OpenAI` LLM class with routing injection, per-call routing overrides, and Auriko error mapping.
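The wrapping pattern can be sketched as follows. This is a simplified stand-in, not the real adapter: only the names `AurikoLlamaIndexLLM`, `RoutingOptions`, and the `https://api.auriko.ai/v1` default come from this page; the `OpenAI` stub and the `RoutingOptions` fields are assumptions for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

class OpenAI:
    """Simplified stand-in for LlamaIndex's OpenAI LLM class (illustration only)."""
    def __init__(self, model: str, api_key: Optional[str] = None,
                 api_base: str = "https://api.openai.com/v1", **kwargs):
        self.model = model
        self.api_key = api_key
        self.api_base = api_base
        self.extra = kwargs  # e.g., temperature, max_tokens

@dataclass
class RoutingOptions:
    """Hypothetical routing fields; the real SDK's fields may differ."""
    provider: Optional[str] = None
    fallbacks: list = field(default_factory=list)

class AurikoLlamaIndexLLM(OpenAI):
    """Sketch of the adapter: parent init plus a default routing configuration."""
    def __init__(self, model: str, api_key: Optional[str] = None,
                 routing: Optional[RoutingOptions] = None,
                 api_base: str = "https://api.auriko.ai/v1", **kwargs):
        super().__init__(model=model, api_key=api_key, api_base=api_base, **kwargs)
        self.routing = routing  # instance-level default, applied to every request

llm = AurikoLlamaIndexLLM(model="gpt-4o", api_key="sk-demo",
                          routing=RoutingOptions(provider="openai"))
print(llm.api_base)  # the default base URL points at Auriko, not OpenAI
```

Because the adapter subclasses the LlamaIndex class rather than wrapping it, anything that accepts a LlamaIndex LLM accepts it unchanged.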
## Configure options
| Parameter | Type | Default | Description |
|---|---|---|---|
| `model` | `str` | (required, via parent) | Model ID |
| `api_key` | `str \| None` | `AURIKO_API_KEY` env var | API key |
| `routing` | `RoutingOptions \| None` | `None` | Default routing configuration |
| `api_base` | `str` | `"https://api.auriko.ai/v1"` | API base URL |
| `**kwargs` | (varies) | (none) | Passed through to LlamaIndex’s `OpenAI` (e.g., `temperature`, `max_tokens`) |
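The table above says `api_key` falls back to the `AURIKO_API_KEY` environment variable. A minimal sketch of that fallback (the function name `resolve_api_key` is ours, not part of the SDK):

```python
import os
from typing import Optional

def resolve_api_key(api_key: Optional[str] = None) -> Optional[str]:
    """An explicit argument wins; otherwise fall back to AURIKO_API_KEY."""
    return api_key if api_key is not None else os.environ.get("AURIKO_API_KEY")

os.environ["AURIKO_API_KEY"] = "sk-from-env"
print(resolve_api_key())               # sk-from-env
print(resolve_api_key("sk-explicit"))  # sk-explicit
```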
## Configure routing
Instance-level routing applies to all requests:
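The interaction between the instance-level default and a per-call override can be sketched like this; the `RoutingOptions` fields and the merge rule (per-call wins wholesale) are assumptions for illustration, not the SDK's documented behavior.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RoutingOptions:
    """Hypothetical routing fields; the real SDK's fields may differ."""
    provider: Optional[str] = None
    max_cost: Optional[float] = None

def effective_routing(instance: Optional[RoutingOptions],
                      per_call: Optional[RoutingOptions]) -> Optional[RoutingOptions]:
    """A per-call override takes precedence over the instance-level default."""
    return per_call if per_call is not None else instance

default = RoutingOptions(provider="openai")
print(effective_routing(default, None))      # falls back to the instance default
print(effective_routing(default, RoutingOptions(provider="anthropic")))
```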
### Alternative: configure LlamaIndex manually
If you prefer to use LlamaIndex’s `OpenAI` class directly, configure `api_base` and `api_key` yourself. Note: routing options, per-call overrides, and Auriko error mapping aren’t available with manual configuration.
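Manual configuration might look like the following; this is an untested sketch that assumes Auriko exposes an OpenAI-compatible endpoint at the `api_base` shown in the table above, and the model ID and key placeholder are examples.

```python
from llama_index.llms.openai import OpenAI

# Point LlamaIndex's own OpenAI LLM at the Auriko endpoint.
llm = OpenAI(
    model="gpt-4o",                        # example model ID
    api_key="YOUR_AURIKO_API_KEY",         # placeholder
    api_base="https://api.auriko.ai/v1",
)
```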
## Notes
- `AurikoLlamaIndexLLM` inherits all LlamaIndex `OpenAI` capabilities: chat, completion, streaming, and async.
- OpenAI API errors are automatically mapped to typed Auriko error classes (`RateLimitError`, `BudgetExceededError`, etc.).
- Per-call routing overrides are unique to this adapter: pass `routing=RoutingOptions(...)` to any chat/complete call.
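The error-mapping idea in the notes above can be sketched as follows. Everything here is a stand-in: only the class names `RateLimitError` and `BudgetExceededError` come from this page; `AurikoError`, `map_error`, and the error-code strings are assumptions for illustration.

```python
class AurikoError(Exception):
    """Hypothetical base class for typed Auriko errors."""
    pass

class RateLimitError(AurikoError):
    pass

class BudgetExceededError(AurikoError):
    pass

# Hypothetical mapping from provider error codes to typed Auriko errors.
_ERROR_MAP = {
    "rate_limit_exceeded": RateLimitError,
    "budget_exceeded": BudgetExceededError,
}

def map_error(code: str, message: str) -> AurikoError:
    """Return a typed Auriko error, falling back to the base class."""
    return _ERROR_MAP.get(code, AurikoError)(message)

err = map_error("rate_limit_exceeded", "429 from upstream")
print(type(err).__name__)  # RateLimitError
```

Because every mapped class subclasses the same base, callers can catch the base class broadly or a specific subclass narrowly.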