# OpenClaw Agent
This walkthrough deploys a full-featured OpenClaw agent with multiple LLM models and external service integrations. By the end, you will have an agent that can:
- Use both Gemini and Claude for LLM reasoning
- Create pull requests and read issues on GitHub
- Receive and respond to messages via Telegram
- Be accessed through a public HTTPS URL
## Prerequisites
- Pai installed on your cluster (see Getting Started)
- A Gemini API key
- An Anthropic API key
- A GitHub Personal Access Token with `repo` scope
- A Telegram bot token (see Telegram Integration)
## Step 1: Create secrets
Store all credentials as Pai-managed secrets:
```shell
# LLM API keys
pai add secret gemini-key --from-literal api-key=YOUR_GEMINI_API_KEY
pai add secret anthropic-key --from-literal api-key=YOUR_ANTHROPIC_API_KEY

# GitHub PAT
pai add secret github-pat --from-literal token=ghp_YOUR_GITHUB_PAT

# Telegram bot token
pai add secret telegram-token --from-literal token=YOUR_TELEGRAM_BOT_TOKEN
```
Verify your secrets:
```shell
pai get secrets
# NAME             KEYS      AGE
# gemini-key       api-key   5s
# anthropic-key    api-key   5s
# github-pat       token     4s
# telegram-token   token     4s
```
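Secret values are opaque to Pai, so a mistyped token only surfaces at request time. Telegram bot tokens do have a recognizable `<numeric-id>:<secret>` shape, so a quick local format check can catch copy-paste mistakes before you store one. This is an illustrative sketch only (the token below is a placeholder, and the check does not contact Telegram):

```shell
# Quick shape check for a Telegram bot token: "<numeric-id>:<secret>".
# The value below is a placeholder, not a real token.
token="123456789:AAExampleExampleExample"
case "$token" in
  [0-9]*:?*) echo "token format looks OK" ;;
  *)         echo "token format looks wrong" >&2; exit 1 ;;
esac
```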
## Step 2: Add models
Create ModelBindings for both Gemini and Claude:
```shell
# Gemini Flash -- fast, high-volume model
pai add model gemini-flash \
  --provider gemini \
  --api-key gemini-key \
  --max-tokens-day 1000000 \
  --max-tokens-request 32000

# Claude Sonnet -- high-quality reasoning model
pai add model claude-sonnet \
  --provider anthropic \
  --api-key anthropic-key \
  --max-tokens-day 500000 \
  --max-tokens-request 16000
```
Verify:
```shell
pai get models
# NAME            PROVIDER    MODEL               MAX/DAY   AGE
# gemini-flash    gemini      gemini-2.0-flash    1000000   10s
# claude-sonnet   anthropic   claude-sonnet-4-6   500000    8s
```
## Step 3: Add providers
Create Providers for GitHub and Telegram:
```shell
# GitHub -- allow PR creation and issue reading
pai add service github-writer \
  --provider github \
  --secret-token github-pat \
  --repos "your-org/your-repo"

# Telegram -- allow bot messaging
pai add service telegram-bot \
  --provider telegram \
  --secret-token telegram-token
```
Verify:
```shell
pai get services
# NAME             PROVIDER   HOST               AUTH        AGE
# github-writer    github     api.github.com     pat         5s
# telegram-bot     telegram   api.telegram.org   bot-token   4s
```
## Step 4: Deploy the agent
Create a file called `openclaw-agent.yaml`:

```yaml
apiVersion: pai.io/v1
kind: AgentWorkload
metadata:
  name: openclaw
spec:
  image: ghcr.io/pai-platform/openclaw:latest
  runAsUser: 1000
  modelBindings:
    - gemini-flash
    - claude-sonnet
  providers:
    - github-writer
    - telegram-bot
  inbound:
    port: 3000
    allowCIDRs:
      - "0.0.0.0/0"
  volumes:
    - name: workspace
      mountPath: /home/node/workspace
      size: "5Gi"
  configFiles:
    - path: /home/node/.openclaw/openclaw.json
      content: |
        {
          "channels": ["web", "telegram"],
          "defaultModel": "gemini-flash",
          "models": {
            "gemini-flash": {
              "displayName": "Gemini Flash",
              "description": "Fast responses, high volume"
            },
            "claude-sonnet": {
              "displayName": "Claude Sonnet",
              "description": "Deep reasoning, high quality"
            }
          }
        }
  resources:
    requests:
      cpu: "500m"
      memory: "512Mi"
    limits:
      cpu: "2"
      memory: "2Gi"
```
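A malformed inline config only surfaces once the container reads it, so it can be worth validating the JSON locally before deploying. A minimal check using Python's standard `json.tool` module (the heredoc simply repeats the `content` block from the manifest above):

```shell
# Validate the inline openclaw.json locally before deploying.
# The heredoc mirrors the configFiles content from the manifest.
cat > /tmp/openclaw.json <<'EOF'
{
  "channels": ["web", "telegram"],
  "defaultModel": "gemini-flash",
  "models": {
    "gemini-flash": {
      "displayName": "Gemini Flash",
      "description": "Fast responses, high volume"
    },
    "claude-sonnet": {
      "displayName": "Claude Sonnet",
      "description": "Deep reasoning, high quality"
    }
  }
}
EOF
python3 -m json.tool /tmp/openclaw.json > /dev/null && echo "openclaw.json is valid JSON"
```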
Deploy:
```shell
pai create -f openclaw-agent.yaml
```
## Step 5: Wait for the agent to start
Watch the status until it reaches Running:
```shell
pai status openclaw
# Name:      openclaw
# Namespace: pai-system
# Status:    Running
# URL:       https://k9m2x4.pairun.dev
# Image:     ghcr.io/pai-platform/openclaw:latest
#
# Models:
#   - gemini-flash (gemini / gemini-2.0-flash) -- 0 / 1,000,000 tokens today
#   - claude-sonnet (anthropic / claude-sonnet-4-6) -- 0 / 500,000 tokens today
#
# Services:
#   - github-writer (github / api.github.com) -- 0 requests today
#   - telegram-bot (telegram / api.telegram.org) -- 0 requests today
```
## Step 6: Access the web UI
Open the OpenClaw web interface:
```shell
pai dashboard openclaw
```
This opens https://k9m2x4.pairun.dev in your browser. You will see the OpenClaw chat interface with a model selector showing both Gemini Flash and Claude Sonnet.
## Step 7: Configure Telegram
To pair Telegram with your agent:
- Open Telegram and find your bot (the one you created with BotFather)
- Send `/start` to the bot
- The bot is now connected to your OpenClaw agent via the Telegram Provider
Messages sent to the bot are routed through the Pai sidecar proxy, which injects the bot token and forwards to the Telegram API. See the Telegram Integration guide for detailed setup.
## Step 8: Test GitHub integration
Ask the agent to interact with your repository through the chat UI:
```
List the open issues in your-org/your-repo
```
The agent will call the GitHub API through the sidecar proxy, which:
- Intercepts the HTTPS request to `api.github.com`
- Resolves the action to `issues:read`
- Checks the policy (allowed)
- Checks the scope (repository is in the allowed list)
- Injects the GitHub PAT as a Bearer token
- Forwards the request to GitHub
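The policy and scope checks above can be sketched in plain shell. This is an illustrative toy, not Pai's actual proxy logic: the `pulls:write` action name and the exact matching rules are assumptions; only `issues:read` and the allowed repo list come from this walkthrough.

```shell
# Toy version of the sidecar's policy and scope checks.
# ALLOWED_* lists follow this walkthrough; matching logic is illustrative.
ALLOWED_REPOS="your-org/your-repo"
ALLOWED_ACTIONS="issues:read pulls:write"

request_path="/repos/your-org/your-repo/issues"   # from the intercepted request
repo=$(echo "$request_path" | cut -d/ -f3,4)      # -> your-org/your-repo
action="issues:read"                              # resolved from GET .../issues

# Policy check: is this action permitted at all?
case " $ALLOWED_ACTIONS " in
  *" $action "*) echo "policy: $action allowed" ;;
  *)             echo "policy: $action denied"; exit 1 ;;
esac

# Scope check: is the target repository in the allowed list?
case " $ALLOWED_REPOS " in
  *" $repo "*) echo "scope: $repo allowed" ;;
  *)           echo "scope: $repo denied"; exit 1 ;;
esac
```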
You can verify the activity in the logs:
```shell
pai logs openclaw --tail 20
```
## Updating the agent
To change the configuration, edit `openclaw-agent.yaml` and apply:

```shell
pai apply -f openclaw-agent.yaml
```
Pai will perform a rolling update of the deployment, preserving the hostname and URL.
## Monitoring
Check token consumption and service usage:
```shell
# Overall status
pai status openclaw

# Recent events
pai events openclaw

# Live logs
pai logs openclaw
```
## Cleanup
To remove the agent and all associated resources:
```shell
pai delete agent openclaw
pai delete model gemini-flash
pai delete model claude-sonnet
pai delete service github-writer
pai delete service telegram-bot
pai delete secret gemini-key
pai delete secret anthropic-key
pai delete secret github-pat
pai delete secret telegram-token
```