OpenClaw

Prerequisites

Follow the official documentation to install OpenClaw (formerly Moltbot/Clawdbot). macOS, Windows (WSL2), and Linux are supported. For pricing ratios, refer to Models & Pricing.

Locate the OpenClaw Configuration File

macOS / Linux / WSL2: ~/.openclaw/openclaw.json (installs from the Moltbot or Clawdbot era may still use ~/.moltbot/moltbot.json or ~/.clawdbot/clawdbot.json)
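If you are not sure which of these files your install uses, a quick check (a sketch, assuming a POSIX shell; the helper function name is ours, not part of OpenClaw) is:

```shell
# Check the candidate locations in order and print the first one that
# exists. The path list comes from the docs above; older installs may
# still use the Moltbot/Clawdbot names.
find_openclaw_config() {
  base="${1:-$HOME}"
  for p in "$base/.openclaw/openclaw.json" \
           "$base/.moltbot/moltbot.json" \
           "$base/.clawdbot/clawdbot.json"; do
    if [ -f "$p" ]; then
      printf '%s\n' "$p"
      return 0
    fi
  done
  return 1
}

find_openclaw_config || echo "no OpenClaw config found yet"
```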

Modify the models list in the configuration file

Refer to the following JSON to configure the models provided by liao: Claude models, GPT Codex-series models, Gemini-series models, and others:
  "models": {
    "mode": "merge",
    "providers": {
      "liao-claude": {
        "baseUrl": "https://ai.liaobots.work",
        "apiKey": "Your Authcode",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "claude-sonnet-4-6",
            "name": "Claude Sonnet 4.6",
            "reasoning": true,
            "input": [
              "text",
              "image"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 200000,
            "maxTokens": 64000
          },
          {
            "id": "claude-opus-4-6",
            "name": "Claude Opus 4.6",
            "reasoning": true,
            "input": [
              "text",
              "image"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 200000,
            "maxTokens": 64000
          }
        ]
      },
      "liao-codex": {
        "baseUrl": "https://ai.liaobots.work/v1",
        "apiKey": "Your Authcode",
        "api": "openai-completions",
        "models": [
          {
            "id": "gpt-5.3-codex",
            "name": "GPT 5.3 Codex",
            "reasoning": true,
            "input": [
              "text"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 200000,
            "maxTokens": 64000
          }
        ]
      },
      "liao": {
        "baseUrl": "https://ai.liaobots.work/",
        "apiKey": "Your Authcode",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "minimax-m2.5",
            "name": "MiniMax M2.5",
            "reasoning": true,
            "input": [
              "text"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 204800,
            "maxTokens": 131072
          },
          {
            "id": "kimi-k2.5",
            "name": "Kimi K2.5",
            "reasoning": true,
            "input": [
              "text",
              "image"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 204800,
            "maxTokens": 131072
          }
        ]
      }
    }
  },
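Note that the snippet above is a fragment that belongs inside the top-level object of the configuration file, which is why it ends with a trailing comma. After editing, it is worth confirming the whole file still parses as JSON before restarting OpenClaw; a minimal check (assuming python3 is installed, using only its standard json.tool module) looks like:

```shell
# Report whether a config file parses as strict JSON.
# A stray or missing comma is the most common editing mistake.
check_json() {
  python3 -m json.tool "$1" > /dev/null 2>&1 \
    && echo "$1: valid JSON" \
    || echo "$1: invalid JSON"
}

check_json "$HOME/.openclaw/openclaw.json"
```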

Switch the Currently Used Model in OpenClaw

After completing the configuration in the previous step, you can switch models with commands of the following form (examples only; run just the one for the model you need):
openclaw models set liao-claude/claude-sonnet-4-6
openclaw models set liao/minimax-m2.5
openclaw models set liao/kimi-k2.5
openclaw models set liao-codex/gpt-5.3-codex
Attention: if you change a provider's baseUrl, you may need to register it under a new provider name each time; restarting the gateway or reloading the configuration alone may not pick up the change.
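For example, if you later point the liao-claude provider at a different endpoint, add it under a fresh key instead of editing the old entry (the key and URL below are placeholders, not real values):

```json
"liao-claude-v2": {
  "baseUrl": "https://new-endpoint.example.com",
  "apiKey": "Your Authcode",
  "api": "anthropic-messages",
  "models": [ ... ]
}
```

Then switch to it with openclaw models set liao-claude-v2/claude-sonnet-4-6.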