Anthropic

Use Claude models from Anthropic as the LLM for your agents.

Get an API key

Sign up at console.anthropic.com and create an API key under API Keys.

Setup

pai create model-provider anthropic \
--provider anthropic \
--api-key sk-ant-...

This stores your API key securely and creates a ModelProvider that covers every Claude model under your subscription.

Verify:

pai get model-providers
# NAME        PROVIDER    ENDPOINT   MAX/DAY   LAST USED   AGE
# anthropic   anthropic   —          —         —           5s

Supported models

Model               Reference                      Best for
Claude Sonnet 4.6   anthropic/claude-sonnet-4-6    Best overall — reasoning, coding, analysis
Claude Haiku 4.5    anthropic/claude-haiku-4-5     Fast, lightweight tasks
Claude Opus 4.6     anthropic/claude-opus-4-6      Most capable, complex multi-step tasks

Use in an agent

spec:
  models:
    - anthropic/claude-sonnet-4-6

Multiple models — first is primary, rest are fallbacks:

spec:
  models:
    - anthropic/claude-sonnet-4-6   # primary
    - anthropic/claude-haiku-4-5    # fallback when the primary is over budget

Token budgets

Cap how many tokens this subscription burns per day across every agent — a hard safety net against runaway spend.

apiVersion: pai.io/v1
kind: ModelProvider
metadata:
  name: anthropic
spec:
  provider: anthropic
  apiKeySecretRef:
    name: anthropic-key
    key: api-key
  maxTokensPerDay: 5000000      # daily cap shared across all agents
  maxTokensPerRequest: 200000   # per-request context-window limit
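For a sense of scale, the two caps above bound the worst case: if every request consumed the full per-request window, the shared daily budget admits at most 25 such requests.

```python
max_tokens_per_day = 5_000_000     # spec.maxTokensPerDay
max_tokens_per_request = 200_000   # spec.maxTokensPerRequest

# Worst case: every request uses the entire per-request limit.
full_requests_per_day = max_tokens_per_day // max_tokens_per_request
print(full_requests_per_day)  # 25
```

In practice most requests use far less than the full window, so the daily cap stretches much further; it exists as a ceiling, not a quota to plan around.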

When the daily cap is hit, the gateway returns HTTP 429 until midnight UTC. Agents that list another provider in spec.models automatically fail over to it.
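The failover behavior can be pictured client-side: try each entry in spec.models in order and move on whenever the gateway answers 429. A minimal sketch (not Pai's implementation — `send` stands in for whatever actually performs the request):

```python
def call_with_fallback(models, send):
    """Try each model in spec.models order; on HTTP 429 (daily token
    cap exhausted) fall over to the next listed model."""
    for model in models:
        status, body = send(model)
        if status != 429:
            return model, body
    raise RuntimeError("all listed models are over budget")
```

With the two-model spec above, a 429 from claude-sonnet-4-6 would route the request to claude-haiku-4-5 transparently.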

Expose via the LLM Gateway

Set externalAccess.enabled: true to let developers outside the cluster — laptops, CI, scripts — route their own LLM traffic through this provider. The Anthropic API key stays inside Pai; clients authenticate with a Pai AccessKey instead.

spec:
  externalAccess:
    enabled: true
    maxTokensPerDay: 2000000   # separate budget for external usage

Once enabled, developers connect with three commands:

pai login https://api.pairun.dev --access-key pak_...
eval $(pai gateway env)
claude # Claude Code now routes through Pai
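Any Anthropic-compatible client can use the same route. As a sketch with curl — assuming `pai gateway env` exports `ANTHROPIC_BASE_URL` and `ANTHROPIC_API_KEY` (those variable names are an assumption) pointing at the gateway:

```shell
eval $(pai gateway env)
curl "$ANTHROPIC_BASE_URL/v1/messages" \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
        "model": "claude-haiku-4-5",
        "max_tokens": 64,
        "messages": [{"role": "user", "content": "ping"}]
      }'
```

The request counts against the external-usage budget, not the in-cluster one.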

See LLM Gateway for the full onboarding flow, AccessKey management, and per-developer rate limits.

Access control

Narrow which models agents may call on this provider with allowedModels / deniedModels, or attach prompt-injection guards. See Security controls on the Model page for the full field list.
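As an illustration of what that might look like on this provider — the exact field shape here is an assumption; see Security controls for the authoritative schema:

```yaml
spec:
  provider: anthropic
  allowedModels:
    - anthropic/claude-sonnet-4-6
    - anthropic/claude-haiku-4-5
  deniedModels:
    - anthropic/claude-opus-4-6   # keep the most expensive model off-limits
```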