Connect OpenClaw to 9Router
Route OpenClaw's AI requests through 9Router for multi-provider LLM access with automatic fallback and cost tracking.
Overview
9Router is a central AI proxy hub that gives OpenClaw access to multiple LLM providers with automatic fallback, cost tracking, and load balancing. Instead of connecting OpenClaw directly to a single AI provider, route requests through 9Router for flexibility and reliability.
Prerequisites
- OpenClaw deployed and running (see Installation)
- 9Router addon deployed in the same project (see AI Connections)
Steps
- Deploy 9Router

  Go to your project > Addons section and deploy the 9Router addon. Wait for it to become active.

- Get the 9Router endpoint

  From the 9Router addon details, copy the API endpoint URL. It will look like: `http://9router-<project-id>:4000`

- Configure OpenClaw to use 9Router

  Go to your OpenClaw application > Environment Variables and update:

  | Variable | Value |
  | --- | --- |
  | `OPENCLAW_PRIMARY_MODEL` | `9router/<model-alias>` (e.g., `9router/gemini-2.5-flash`) |
  | `OPENAI_API_BASE` | Your 9Router endpoint URL |
  | `OPENAI_API_KEY` | Your 9Router API key |

- Create API keys in 9Router

  Open the 9Router dashboard and add API keys for the providers you want to use (OpenAI, Google, Anthropic, etc.).

- Configure model aliases

  In 9Router, set up model aliases so you can switch providers without changing the OpenClaw configuration.

- Restart OpenClaw

  Save your environment variables and restart the application.
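As a concrete sketch, the environment variables from the configuration step might look like the following. The endpoint URL, project ID, and API key below are placeholder values for illustration only; substitute the values from your own 9Router addon:

```shell
# Placeholder values -- replace with your own 9Router endpoint and key.
export OPENAI_API_BASE="http://9router-example-project:4000"
export OPENAI_API_KEY="sk-9router-placeholder-key"

# Prefix the model alias with "9router/" so OpenClaw routes through the proxy.
export OPENCLAW_PRIMARY_MODEL="9router/gemini-2.5-flash"
```

If your platform manages environment variables through a dashboard rather than a shell, enter the same name/value pairs there instead.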
Verify Connection
- Open OpenClaw and send a test message
- Check the 9Router dashboard to confirm the request was routed
- Verify the response comes back correctly
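Beyond the in-app test, you can probe the endpoint directly. This sketch assumes 9Router exposes an OpenAI-compatible `/v1/chat/completions` route (common for proxies of this kind, but verify against your 9Router version); the URL is a placeholder, and `OPENAI_API_KEY` must already be set to your 9Router key:

```shell
# Placeholder endpoint -- replace with your 9Router URL from the addon details.
NINE_ROUTER_URL="http://9router-example-project:4000"

# Minimal OpenAI-compatible chat completion request routed through 9Router.
# A JSON response here confirms routing; the request should also appear
# in the 9Router dashboard.
curl -sS "$NINE_ROUTER_URL/v1/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "9router/gemini-2.5-flash",
       "messages": [{"role": "user", "content": "ping"}]}' \
  || echo "9Router not reachable (expected if run outside the project network)"
```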
Benefits
- Multi-provider fallback — If one provider is down, 9Router automatically routes to another
- Cost tracking — Monitor AI spending per model and per request
- Load balancing — Distribute requests across multiple API keys
- Model switching — Change models in 9Router without redeploying OpenClaw
For a full guide on 9Router configuration, see AI Connections.