Get your first LLM response through Auriko in under 2 minutes.

Prerequisites

- A Python 3 environment with pip (for the examples below)
- An Auriko account (created in step 1)

1. Get an API Key

Sign up to create your account, then generate an API key from the dashboard.
Base URL: all API requests go to https://api.auriko.ai/v1. This value matches servers[0].url in our OpenAPI spec and is the canonical endpoint.
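The base URL composes with per-resource paths. A minimal sketch of that composition, assuming the chat endpoint follows the OpenAI-compatible /chat/completions path (an assumption inferred from the client calls in step 3, not stated in the spec here):

```python
# Derive full endpoint URLs from the canonical base URL.
BASE_URL = "https://api.auriko.ai/v1"

def endpoint(path: str) -> str:
    """Join a resource path onto the base URL without doubling slashes."""
    return BASE_URL.rstrip("/") + "/" + path.lstrip("/")

print(endpoint("/chat/completions"))
# → https://api.auriko.ai/v1/chat/completions
```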

2. Install

pip install auriko
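The examples below read your key from an environment variable rather than hard-coding it. A typical setup (the variable name AURIKO_API_KEY matches the Python snippet in step 3; the placeholder value is yours to replace):

```shell
# Keep the key in the environment so it never lands in source code
export AURIKO_API_KEY="your-api-key-here"
```

Add this to your shell profile (e.g. ~/.bashrc or ~/.zshrc) to persist it across sessions.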

3. Make Your First Request

import os
from auriko import Client

# The client reads your key from the AURIKO_API_KEY environment variable
client = Client(
    api_key=os.environ["AURIKO_API_KEY"],
    base_url="https://api.auriko.ai/v1",
)

response = client.chat.completions.create(
    model="gpt-5.4",
    messages=[{"role": "user", "content": "Hello!"}],
)

print(response.choices[0].message.content)

# Routing metadata (provider, cost) is attached when available
if response.routing_metadata:
    print(f"Provider: {response.routing_metadata.provider}")
    if response.routing_metadata.cost:
        print(f"Cost: ${response.routing_metadata.cost.billable_cost_usd:.6f}")
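If you prefer not to use the SDK, the same call can be expressed over plain HTTP. A sketch that only builds the request without sending it, assuming Auriko accepts an OpenAI-compatible JSON body and a standard bearer-token Authorization header (both assumptions inferred from the client interface above):

```python
import json
import os
import urllib.request

# Build (but do not send) the HTTP equivalent of the SDK call above.
payload = {
    "model": "gpt-5.4",
    "messages": [{"role": "user", "content": "Hello!"}],
}
req = urllib.request.Request(
    "https://api.auriko.ai/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {os.environ.get('AURIKO_API_KEY', '')}",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url, req.get_method())
# To actually send it: urllib.request.urlopen(req)
```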

4. Enable Routing Features (Optional)

response = client.chat.completions.create(
    model="gpt-5.4",
    messages=[{"role": "user", "content": "Hello!"}],
    routing={
        "optimize": "cost",        # Optimize for cost
        "max_ttft_ms": 200,        # Max 200ms to first token
    }
)

Next Steps

- API Reference: full API documentation
- Routing Options: configure cost/speed optimization
- Streaming: real-time streaming responses
- LangChain: use with LangChain