The mock provider is always on. Each real provider registers only when its API key is set, and `AI_PROVIDER` picks the active one. That way you can develop offline, swap providers per environment, and never ship a half-configured stack.
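A minimal sketch of that registration pattern (names and shapes here are illustrative, not the real `src/ai/index.ts` exports):

```typescript
type Provider = { name: string };

// Sketch: mock is registered unconditionally; real providers join only
// when their API key is present in the environment.
function buildRegistry(env: Record<string, string | undefined>): Record<string, Provider> {
  const registry: Record<string, Provider> = { mock: { name: 'mock' } };
  if (env.OPENROUTER_API_KEY) registry.openrouter = { name: 'openrouter' };
  if (env.ANTHROPIC_API_KEY) registry.anthropic = { name: 'anthropic' };
  return registry;
}

// AI_PROVIDER picks the active provider; an unknown or unregistered name
// falls back to mock (see "Common pitfalls").
function getProvider(env: Record<string, string | undefined>): Provider {
  const registry = buildRegistry(env);
  return registry[env.AI_PROVIDER ?? 'mock'] ?? registry.mock;
}
```

The point of doing this at module scope is that a missing key means the provider simply never exists, rather than failing at call time.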
## Prerequisites

- A working install (`pnpm dev` boots).
- One real API key — OpenRouter is the easiest if you only want to pick one (openrouter.ai).
- Read `src/ai/index.ts` — it's 80 lines and shows the whole story.
## The six providers

| Name | Source file | Operations | Notes |
|---|---|---|---|
| mock | `providers/mock.ts` | chat, chatStream, image | Default. Returns canned text + picsum URLs. |
| openrouter | `providers/openai-compat.ts` | chat, chatStream | OpenAI-shape API; routes 100+ models. |
| openai | `providers/openai-compat.ts` | chat, chatStream | Direct OpenAI endpoint. |
| anthropic | `providers/anthropic.ts` | chat, chatStream | Distinct Messages API + SSE event types. |
| replicate | `providers/replicate.ts` | image | Poll-based prediction API. |
| fal | `providers/fal.ts` | image | Sync queue, fast for FLUX schnell. |
Calling an operation a provider doesn't support (e.g. `image()` on anthropic) throws `AIUnsupportedError`. The manager catches it and tries `opts.fallback` if you pass one.
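A self-contained sketch of that catch-and-fallback behavior (the registry and function shapes are simplified stand-ins, not the real manager code):

```typescript
class AIUnsupportedError extends Error {}

type ImageProvider = { name: string; image?: (prompt: string) => string };

const registry: Record<string, ImageProvider> = {
  anthropic: { name: 'anthropic' },                      // chat-only: no image()
  fal: { name: 'fal', image: (p) => `fal-image:${p}` },  // supports image()
};

// Unsupported operation throws AIUnsupportedError; a supplied fallback
// provider is tried exactly once.
function image(provider: string, prompt: string, opts: { fallback?: string } = {}): string {
  const p = registry[provider];
  try {
    if (!p.image) throw new AIUnsupportedError(`${provider} does not support image()`);
    return p.image(prompt);
  } catch (err) {
    if (err instanceof AIUnsupportedError && opts.fallback) {
      return image(opts.fallback, prompt); // retry with no further fallback
    }
    throw err;
  }
}
```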
## Step-by-step: switch to a real provider

1. Pick a provider and add its key to `.env.local`.
2. Restart the dev server. Conditional registration runs at import time.
3. Make a call from a server action or route handler.
`result` is `ChatResult | InsufficientCredits`. Always narrow with `isInsufficientCredits(result)` before reading `.text`.
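The narrowing can be sketched as a discriminated union plus a type guard. Note the discriminant field and shapes below are assumptions for illustration, not the real definitions in `src/ai/types.ts`:

```typescript
// Assumed shapes — the real types live in src/ai/types.ts.
type ChatResult = { kind: 'chat'; text: string };
type InsufficientCredits = { kind: 'insufficient_credits'; needed: number };

function isInsufficientCredits(r: ChatResult | InsufficientCredits): r is InsufficientCredits {
  return r.kind === 'insufficient_credits';
}

function render(result: ChatResult | InsufficientCredits): string {
  // Narrow first — TypeScript only allows .text on the ChatResult branch.
  if (isInsufficientCredits(result)) {
    return `Insufficient credits: ${result.needed} more needed`;
  }
  return result.text;
}
```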
## Adding a model to the price book

`src/ai/pricing.ts` is the single source of truth for cost. Keys are `provider:model`. Numbers are per-1k-token prices in micro-cents (one micro-cent = $0.000001, so $0.0001 = 100 micro-cents). Edit, save, done.
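An illustrative sketch of the price book shape and the fallback lookup — the keys follow `provider:model`, but these numbers are made up, not real prices from `src/ai/pricing.ts`:

```typescript
// Made-up example entries; values are micro-cents per 1k tokens.
const PRICES: Record<string, { inputPer1k: number; outputPer1k: number }> = {
  'mock:any': { inputPer1k: 1, outputPer1k: 1 },
  'openrouter:openai/gpt-4o-mini': { inputPer1k: 15, outputPer1k: 60 },
};

// Missing keys fall back to mock:any, so the call still succeeds
// but cost reports under-count.
function getTokenPrice(provider: string, model: string) {
  return PRICES[`${provider}:${model}`] ?? PRICES['mock:any'];
}
```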
If a `provider:model` key is missing, `getTokenPrice` falls back to `mock:any` (cheap), so the call still goes through but cost reports will under-count. Search for "forgetting price book" in Observability for the fix.
## Errors you can catch

`opts.fallback` is your retry knob. The manager invokes it once on `AIProviderError` or `AIUnsupportedError` — not on `InsufficientCredits` (refunds are already handled). Use `mock` as a fallback in dev so demos never break.
## Verify it works

- Set `AI_PROVIDER=mock` (the default). Hit the chat demo at `/playground/chat` — you should see the "(mock provider …)" canned response.
- Set a real key and `AI_PROVIDER=<name>`. Restart. The same demo should now stream a real reply.
- Tail the dev server. The `ai_call` insert log line includes `provider=<name>` — this confirms the active selection.
- `psql` into your dev DB and run `SELECT provider, model, status FROM ai_call ORDER BY created_at DESC LIMIT 5;`
## Common pitfalls

- `AI_PROVIDER` set but no key. The manager silently falls back to `mock`. Look at `Object.keys(PROVIDERS)` in `getProvider` — your provider is missing.
- OpenRouter model id format. It's `vendor/model` (e.g. `openai/gpt-4o-mini`), not just `gpt-4o-mini`. This differs from the direct OpenAI provider.
- Anthropic system messages. They live at the top level, not in `messages`. The provider auto-splits, but if you build a request manually, system goes separate.
- Replicate version pinning. `model` is a version hash, not a friendly name. Use `black-forest-labs/flux-schnell` only after looking up its current version on Replicate.
- `AIUnsupportedError` on image with Anthropic. Anthropic doesn't do images. Set `fallback: 'replicate'` or `'fal'`, or branch on `operation` upstream.
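The Anthropic system-message split above can be sketched like this (simplified shapes — the real logic lives in `providers/anthropic.ts`):

```typescript
type Msg = { role: 'system' | 'user' | 'assistant'; content: string };

// System content moves to a top-level field; only user/assistant
// turns stay in the messages array.
function toAnthropicShape(msgs: Msg[]): { system?: string; messages: Msg[] } {
  const system = msgs
    .filter((m) => m.role === 'system')
    .map((m) => m.content)
    .join('\n');
  return {
    system: system || undefined,
    messages: msgs.filter((m) => m.role !== 'system'),
  };
}
```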
## Official docs
- OpenRouter: openrouter.ai/docs
- OpenAI: platform.openai.com/docs
- Anthropic: docs.anthropic.com
- Replicate: replicate.com/docs
- fal.ai: fal.ai/docs
- Source: `src/ai/index.ts`, `src/ai/manager.ts`, `src/ai/types.ts`