OpenClaw + Lumecoder Quick Setup
Codex (GPT-5.3) in 3 Steps
No plugins are needed: OpenClaw connects directly to Lumecoder's OpenAI Responses endpoint to reach Codex. This guide assumes OpenClaw is already installed.
Prerequisites
- OpenClaw installed (see OpenClaw + Lumecoder Integration Guide)
- Lumecoder account with an API key (starts with sk-)
How It Works
OpenClaw natively supports the OpenAI Responses protocol. Register a custom provider in openclaw.json, point its baseUrl at Lumecoder's /v1 endpoint, and set api to openai-responses.
Configuration Steps
Edit openclaw.json to add the Lumecoder OpenAI provider
Open the config file:
vim ~/.openclaw/openclaw.json
Add a top-level models node (same level as agents, channels):
"models": {
"providers": {
"lumecoder-openai": {
"baseUrl": "https://api.lumecoder.com/v1",
"apiKey": "sk-xxx",
"api": "openai-responses",
"models": [
{
"id": "gpt-5.4",
"name": "GPT-5.4",
"reasoning": false,
"input": ["text", "image"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 200000,
"maxTokens": 8192
},
{
"id": "gpt-5.3-codex",
"name": "GPT-5.3 Codex",
"reasoning": false,
"input": ["text", "image"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 200000,
"maxTokens": 8192
},
{
"id": "gpt-5.2-codex",
"name": "GPT-5.2 Codex",
"reasoning": false,
"input": ["text", "image"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 200000,
"maxTokens": 8192
},
{
"id": "gpt-5.1-codex-max",
"name": "GPT-5.1 Codex Max",
"reasoning": false,
"input": ["text", "image"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 200000,
"maxTokens": 8192
},
{
"id": "gpt-5.2",
"name": "GPT-5.2",
"reasoning": false,
"input": ["text", "image"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 200000,
"maxTokens": 8192
},
{
"id": "gpt-5.1-codex-mini",
"name": "GPT-5.1 Codex Mini",
"reasoning": false,
"input": ["text", "image"],
"cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
"contextWindow": 200000,
"maxTokens": 8192
}
]
}
}
}Also update the model allowlist in agents.defaults:
"model": {
"primary": "lumecoder-openai/gpt-5.3-codex",
"fallbacks": []
},
"models": {
"lumecoder-openai/gpt-5.4": {},
"lumecoder-openai/gpt-5.3-codex": {},
"lumecoder-openai/gpt-5.2-codex": {},
"lumecoder-openai/gpt-5.1-codex-max": {},
"lumecoder-openai/gpt-5.2": {},
"lumecoder-openai/gpt-5.1-codex-mini": {}
}Verification
Run a test command:
openclaw agent --local --session-id test -m "Hello"
If you receive a normal response, it means:
- OpenClaw is connected to Lumecoder Codex via OpenAI Responses
- Credits billing is working
- Ready for chat, code generation, and automation
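Beyond the live test, you can sanity-check the edited file itself. The sketch below (an assumption-laden helper, not part of OpenClaw) loads openclaw.json and verifies the provider fields described in this guide; any stricter validation OpenClaw performs on startup is out of scope here.

```python
import json
from pathlib import Path


def check_lumecoder_provider(config_path: str) -> list[str]:
    """Return a list of problems found in the Lumecoder provider block."""
    cfg = json.loads(Path(config_path).read_text())
    provider = cfg.get("models", {}).get("providers", {}).get("lumecoder-openai")
    if provider is None:
        return ["missing models.providers.lumecoder-openai"]
    problems = []
    if provider.get("baseUrl") != "https://api.lumecoder.com/v1":
        problems.append("baseUrl should be https://api.lumecoder.com/v1")
    if provider.get("api") != "openai-responses":
        problems.append("api should be 'openai-responses'")
    if not str(provider.get("apiKey", "")).startswith("sk-"):
        problems.append("apiKey must start with 'sk-'")
    model_ids = {m.get("id") for m in provider.get("models", [])}
    if "gpt-5.3-codex" not in model_ids:
        problems.append("default model gpt-5.3-codex is not listed")
    return problems


if __name__ == "__main__":
    issues = check_lumecoder_provider(str(Path.home() / ".openclaw" / "openclaw.json"))
    for issue in issues:
        print("FIX:", issue)
    if not issues:
        print("Lumecoder provider config looks OK")
```

An empty result means the provider block matches this guide; each listed problem names the field to fix before restarting.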
Available Models
You can configure the following models in openclaw.json:
| Model ID | Description | Recommended Use |
|---|---|---|
| gpt-5.4 | Latest general-purpose frontier model, strongest overall | Chat, reasoning & coding (newest) |
| gpt-5.3-codex | Latest frontier agentic coding model | Code generation, automation (recommended) |
| gpt-5.2-codex | Frontier agentic coding model | General coding tasks |
| gpt-5.1-codex-max | Codex-optimized flagship for deep and fast reasoning | High-complexity tasks |
| gpt-5.2 | Frontier model with improvements across knowledge, reasoning and coding | General-purpose scenarios |
| gpt-5.1-codex-mini | Codex-optimized, cheaper and faster but less capable | Low-cost batch work |
Switch default model:
openclaw models set lumecoder-openai/gpt-5.2-codex
Full Configuration Reference
Complete Lumecoder Codex configuration snippet for openclaw.json:
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "lumecoder-openai/gpt-5.3-codex",
        "fallbacks": []
      },
      "models": {
        "lumecoder-openai/gpt-5.4": {},
        "lumecoder-openai/gpt-5.3-codex": {},
        "lumecoder-openai/gpt-5.2-codex": {},
        "lumecoder-openai/gpt-5.1-codex-max": {},
        "lumecoder-openai/gpt-5.2": {},
        "lumecoder-openai/gpt-5.1-codex-mini": {}
      },
      "workspace": "/home/ubuntu/.openclaw/workspace",
      "compaction": { "mode": "safeguard" },
      "maxConcurrent": 4,
      "subagents": { "maxConcurrent": 8 }
    }
  },
  "models": {
    "providers": {
      "lumecoder-openai": {
        "baseUrl": "https://api.lumecoder.com/v1",
        "apiKey": "sk-xxx",
        "api": "openai-responses",
        "models": [
          {
            "id": "gpt-5.4",
            "name": "GPT-5.4",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-5.3-codex",
            "name": "GPT-5.3 Codex",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-5.2-codex",
            "name": "GPT-5.2 Codex",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-5.1-codex-max",
            "name": "GPT-5.1 Codex Max",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-5.2",
            "name": "GPT-5.2",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 8192
          },
          {
            "id": "gpt-5.1-codex-mini",
            "name": "GPT-5.1 Codex Mini",
            "reasoning": false,
            "input": ["text", "image"],
            "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
            "contextWindow": 200000,
            "maxTokens": 8192
          }
        ]
      }
    }
  }
}

Troubleshooting
Invalid key format
Lumecoder API keys must start with sk-. Check the models.providers.lumecoder-openai.apiKey value in openclaw.json.
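As a quick illustration of that check (only the sk- prefix is documented here; any further format rules are assumptions), a minimal validation helper might look like:

```python
def looks_like_lumecoder_key(key: str) -> bool:
    # Only the "sk-" prefix is documented; also require a non-empty remainder.
    # This is a hypothetical helper, not part of OpenClaw or Lumecoder.
    return key.startswith("sk-") and len(key) > len("sk-")


print(looks_like_lumecoder_key("sk-xxx"))   # placeholder key: passes the prefix check
print(looks_like_lumecoder_key("api-xxx"))  # wrong prefix: fails
```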
Config changes not taking effect
Restart the gateway after modifying openclaw.json for the changes to take effect:
openclaw gateway restart
Done
Setup complete. OpenClaw now calls Codex through Lumecoder, with support for streaming and tool calling.
