Kimi K2.5 at Zero Cost! The Complete Free OpenClaw Setup Guide

Out of Qwen credits? Don't panic! Kimi K2.5, a top-tier Chinese multimodal model, can now be used completely free. This guide walks you through setting it up in OpenClaw at zero cost, so you can try its 256K context window and deep reasoning without spending a cent.

Below, we'll switch OpenClaw's model over to the free Kimi K2.5 and take this multimodal model for a spin.

1. Install OpenClaw

If you haven't installed OpenClaw yet, install it first.

1.1 Official (English) install command

curl -fsSL https://openclaw.ai/install.sh | bash

Official docs: https://docs.openclaw.ai/zh-CN

1.2 Chinese-version install command

curl -fsSL https://clawd.org.cn/install.sh | bash

Official site: https://clawd.org.cn/

This guide uses the official English version as the example.
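Both install commands pipe a remote script straight into bash. If you'd rather read the script before it runs (a generic precaution for any `curl | bash` installer, not an OpenClaw-specific requirement), you can download it to a file first. A minimal sketch:

```shell
# fetch_installer: download an install script to a local file so it can be
# reviewed before execution, instead of piping curl straight into bash.
fetch_installer() {
  local url="$1" out="$2"
  curl -fsSL "$url" -o "$out" && echo "saved to $out"
}
# Usage (after skimming the saved file, run it with bash):
#   fetch_installer https://openclaw.ai/install.sh ./install.sh
#   bash ./install.sh
```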

2. Configure the Kimi K2.5 model

2.1 Launch the configuration wizard

# English version
openclaw onboard
# Chinese version
openclaw-cn onboard

2.2 Select Yes

◆  I understand this is powerful and inherently risky. Continue?
│  ● Yes / ○ No

2.3 Select QuickStart

◆  Onboarding mode
│  ● QuickStart (Configure details later via openclaw configure.)
│  ○ Manual

2.4 Switch to Update values

◆  Config handling
│  ○ Use existing values
│  ● Update values
│  ○ Reset

2.5 Select OpenCode Zen

◆  Model/auth provider
│  ○ OpenAI
│  ○ Anthropic
│  ○ MiniMax
│  ○ Moonshot AI (Kimi K2.5)
│  ○ Google
│  ○ xAI (Grok)
│  ○ OpenRouter
│  ○ Qwen
│  ○ Z.AI (GLM 4.7)
│  ○ Qianfan
│  ○ Copilot
│  ○ Vercel AI Gateway
│  ● OpenCode Zen (API key)
│  ○ Xiaomi
│  ○ Synthetic
│  ○ Venice AI
│  ○ Cloudflare AI Gateway
│  ○ Skip for now

2.6 Select OpenCode Zen (multi-model proxy)

◆  OpenCode Zen auth method
│  ● OpenCode Zen (multi-model proxy) (Claude, GPT, Gemini via opencode.ai/zen)
│  ○ Back

2.7 Enter your OpenCode API key

Get your API key at https://opencode.ai/zen. Log in, copy the token, and paste it into the prompt:

◆  Enter OpenCode Zen API key
│  sk-1321321321312312323123█
└
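The wizard masks the key as you type it. If you ever print configuration for debugging, avoid leaking the full key yourself; a minimal masking helper (the key below is the same placeholder shown in the prompt, not a real key):

```shell
# Print only the first 6 and last 4 characters of a secret, for safe logging.
mask_key() {
  local k="$1"
  printf '%s...%s\n' "${k:0:6}" "${k: -4}"
}
mask_key "sk-1321321321312312323123"   # -> sk-132...3123
```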

2.8 Set the default model to opencode/kimi-k2.5-free

◆  Default model
│  ○ Keep current (opencode/claude-opus-4-6)
│  ○ Enter model manually
│  ○ opencode/big-pickle
│  ○ opencode/claude-3-5-haiku
│  ○ opencode/claude-haiku-4-5
│  ○ opencode/claude-opus-4-1
│  ○ opencode/claude-opus-4-5
│  ○ opencode/claude-opus-4-6
│  ○ opencode/claude-sonnet-4
│  ○ opencode/claude-sonnet-4-5
│  ○ opencode/gemini-3-flash
│  ○ opencode/gemini-3-pro
│  ○ opencode/glm-4.6
│  ○ opencode/glm-4.7
│  ○ opencode/glm-4.7-free
│  ○ opencode/gpt-5
│  ○ opencode/gpt-5-codex
│  ○ opencode/gpt-5-nano
│  ○ opencode/gpt-5.1
│  ○ opencode/gpt-5.1-codex
│  ○ opencode/gpt-5.1-codex-max
│  ○ opencode/gpt-5.1-codex-mini
│  ○ opencode/gpt-5.2
│  ○ opencode/gpt-5.2-codex
│  ○ opencode/kimi-k2
│  ○ opencode/kimi-k2-thinking
│  ○ opencode/kimi-k2.5
│  ● opencode/kimi-k2.5-free (Kimi K2.5 Free · ctx 256k · reasoning)
│  ○ opencode/minimax-m2.1
│  ○ opencode/minimax-m2.1-free
│  ○ opencode/qwen3-coder
│  ○ opencode/trinity-large-preview-free
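If you want to sanity-check the key and model outside the TUI: OpenCode Zen is an OpenAI-style multi-model proxy, so a chat-completions request should work, but the base URL (`https://opencode.ai/zen/v1`) and the bare model id (`kimi-k2.5-free` without the `opencode/` prefix) used below are assumptions to verify against the docs at https://opencode.ai/zen:

```shell
# Build an OpenAI-style chat payload and confirm it is well-formed JSON.
payload='{"model":"kimi-k2.5-free","messages":[{"role":"user","content":"ping"}]}'
echo "$payload" | python3 -m json.tool >/dev/null && echo "payload ok"
# With a real key in $OPENCODE_ZEN_API_KEY (endpoint is an assumption, see above):
#   curl -s https://opencode.ai/zen/v1/chat/completions \
#     -H "Authorization: Bearer $OPENCODE_ZEN_API_KEY" \
#     -H "Content-Type: application/json" \
#     -d "$payload"
```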

2.9 Configure a messaging channel (skip if you have already configured one)

◆  Select channel (QuickStart)
│  ○ Telegram (Bot API)
│  ○ WhatsApp (QR link)
│  ○ Discord (Bot API)
│  ○ Google Chat (Chat API)
│  ○ Slack (Socket Mode)
│  ○ Signal (signal-cli)
│  ○ iMessage (imsg)
│  ○ Feishu/Lark (飞书)
│  ○ Nostr (NIP-04 DMs)
│  ○ Microsoft Teams (Bot Framework)
│  ○ Mattermost (plugin)
│  ○ Nextcloud Talk (self-hosted)
│  ○ Matrix (plugin)
│  ○ BlueBubbles (macOS app)
│  ○ LINE (Messaging API)
│  ○ Zalo (Bot API)
│  ○ Zalo (Personal Account)
│  ○ Tlon (Urbit)
│  ● Skip for now (Yo

Skip the remaining steps until the final one, then choose Hatch in TUI (recommended) to test the setup:

◆  How do you want to hatch your bot?
│  ● Hatch in TUI (recommended)
│  ○ Open the Web UI
│  ○ Do this later

If it responds normally, the configuration succeeded.

 

Source: https://mp.weixin.qq.com/s/gjnG78aoYcZc75eYCeIhiw