Use Auriko as your LLM provider in the Vercel AI SDK with a first-party provider package.

Prerequisites

You need an Auriko API key, exposed through the AURIKO_API_KEY environment variable, and a project that uses the Vercel AI SDK.

Install

npm install @auriko/ai-sdk-provider ai

Use the provider

Create a provider instance and pass it to any AI SDK function:
import { createAuriko } from "@auriko/ai-sdk-provider";
import { generateText } from "ai";

const auriko = createAuriko();

const { text } = await generateText({
  model: auriko("gpt-4o"),
  prompt: "What is the capital of France?",
});

console.log(text);
createAuriko() reads your AURIKO_API_KEY environment variable automatically.
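If you prefer not to rely on the environment, you can pass the key explicitly via the apiKey option (listed in the configuration table below). A minimal sketch:

```typescript
import { createAuriko } from "@auriko/ai-sdk-provider";

// Explicit key instead of the AURIKO_API_KEY environment variable.
// Reading from process.env here is just one way to source the value.
const auriko = createAuriko({
  apiKey: process.env.MY_AURIKO_KEY,
});
```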

Stream responses

Use streamText for streaming:
import { createAuriko } from "@auriko/ai-sdk-provider";
import { streamText } from "ai";

const auriko = createAuriko();

const result = streamText({
  model: auriko("gpt-4o"),
  prompt: "Count to 10",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

Configure options

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| apiKey | string | AURIKO_API_KEY env | API key |
| baseURL | string | "https://api.auriko.ai/v1" | API base URL |
| headers | Record<string, string> | undefined | Custom headers |
| fetch | typeof fetch | globalThis.fetch | Custom fetch implementation |
| routing | RoutingOptions | undefined | Default routing configuration |
| metadata | AurikoMetadataParam | undefined | Request metadata (tags, user ID, trace ID) |
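All options are optional and can be combined. The sketch below sets a few of them; the header name and the metadata field shown are illustrative, not part of the documented API — consult the AurikoMetadataParam type for the exact shape:

```typescript
import { createAuriko } from "@auriko/ai-sdk-provider";

const auriko = createAuriko({
  baseURL: "https://api.auriko.ai/v1",     // the default, shown for clarity
  headers: { "X-Team": "search" },         // hypothetical custom header, sent with every request
  metadata: {
    // Field name is illustrative -- see AurikoMetadataParam for the real schema.
    tags: ["docs-example"],
  },
});
```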

Configure routing

Set routing defaults when you create the provider:
import { createAuriko, Optimize } from "@auriko/ai-sdk-provider";
import type { AurikoResponseMetadata } from "@auriko/ai-sdk-provider";
import { generateText } from "ai";

const auriko = createAuriko({
  routing: { optimize: Optimize.COST, max_ttft_ms: 200 },
});

const result = await generateText({
  model: auriko("gpt-4o"),
  prompt: "Hello!",
});

const meta = result.providerMetadata?.auriko as AurikoResponseMetadata | undefined;
if (meta) {
  console.log(`Provider: ${meta.provider}`);
  console.log(`Cost: $${meta.cost?.usd}`);
}
For all routing parameters, see the routing options guide and advanced routing guide.

Access routing metadata

For non-streaming calls, read result.providerMetadata?.auriko:
const meta = result.providerMetadata?.auriko as AurikoResponseMetadata | undefined;
console.log(meta?.provider);
For streaming calls, await the metadata:
const result = streamText({ model: auriko("gpt-4o"), prompt: "Hello!" });
const metadata = await result.providerMetadata;
const meta = metadata?.auriko as AurikoResponseMetadata | undefined;
console.log(meta?.provider);
Import AurikoResponseMetadata from @auriko/ai-sdk-provider for type-safe access.

Configure manually

If you prefer not to use the first-party package, you can point the @ai-sdk/openai provider at Auriko’s OpenAI-compatible API:
npm install @ai-sdk/openai ai
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const openai = createOpenAI({
  baseURL: "https://api.auriko.ai/v1",
  apiKey: process.env.AURIKO_API_KEY,
});

const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Hello!",
});
This approach loses automatic routing injection and metadata extraction. You can still pass routing via body:
const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: "Hello!",
  body: { gateway: { routing: { optimize: "cost" } } },
});

Notes

  • @auriko/ai-sdk-provider is in preview (v0.1.0-preview.1). The API may change before 1.0.
  • The provider inherits all @ai-sdk/openai-compatible capabilities: tool calling, structured output, streaming.
  • For a standalone TypeScript client, see the TypeScript SDK.
  • For streaming patterns, see the streaming guide.
  • To choose between integration options, see the integration guide.
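Because the provider inherits the @ai-sdk/openai-compatible capabilities, standard AI SDK tool calling works unchanged. A minimal sketch using the AI SDK's tool helper with a zod schema — the getWeather tool, its schema, and its return value are invented for illustration:

```typescript
import { createAuriko } from "@auriko/ai-sdk-provider";
import { generateText, tool } from "ai";
import { z } from "zod";

const auriko = createAuriko();

const { text } = await generateText({
  model: auriko("gpt-4o"),
  prompt: "What's the weather in Paris?",
  tools: {
    // Hypothetical tool; replace execute() with a real lookup.
    getWeather: tool({
      description: "Get the current weather for a city",
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, tempC: 18 }),
    }),
  },
  maxSteps: 2, // allow a follow-up generation after the tool result
});

console.log(text);
```

Structured output via the AI SDK's generateObject should work the same way, since it also rides on the OpenAI-compatible layer.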