zai-glm

Use this skill when the user asks about z.ai, Zhipu AI, setting up z.ai API access, getting a z.ai API key, configuring Claude Code to use z.ai models directly via the Anthropic-compatible endpoint, routing z.ai through Cloudflare AI Gateway or OpenRouter, or using z.ai's OpenAI-compatible API.

Install:

```
npx claudepluginhub nsheaps/ai-mktpl --plugin zai-glm
```

This skill uses the workspace's default tool permissions.
[z.ai](https://z.ai/) (formerly Zhipu AI / 智谱AI, rebranded July 2025) is a Chinese AI company that develops the GLM (General Language Model) family. They provide both OpenAI-compatible and Anthropic-compatible API endpoints.
z.ai provides two compatible API formats:

- OpenAI-compatible: `https://api.z.ai/api/paas/v4/` (Chinese mainland endpoint: `https://open.bigmodel.cn/api/paas/v4/`)
- Anthropic-compatible: `https://api.z.ai/api/anthropic`
cURL:

```bash
curl "https://api.z.ai/api/paas/v4/chat/completions" \
  -H "Authorization: Bearer YOUR_ZHIPU_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "glm-4.7",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
Python:

```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_ZHIPU_API_KEY",
    base_url="https://api.z.ai/api/paas/v4/",
)

response = client.chat.completions.create(
    model="glm-4.7",
    messages=[{"role": "user", "content": "Hello!"}],
)
```
TypeScript:

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "YOUR_ZHIPU_API_KEY",
  baseURL: "https://api.z.ai/api/paas/v4/",
});

const response = await client.chat.completions.create({
  model: "glm-4.7",
  messages: [{ role: "user", content: "Hello!" }],
});
```
z.ai also provides an endpoint that speaks the Anthropic Messages API protocol. This is what enables direct Claude Code integration.
z.ai provides a native Anthropic-compatible endpoint at https://api.z.ai/api/anthropic. Claude Code can connect directly — no proxy needed.
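Before wiring up Claude Code, it can help to sanity-check the endpoint with a raw Messages-API call. This is a minimal stdlib sketch, not z.ai's documented client flow: the `/v1/messages` path, `x-api-key` header, and `anthropic-version` value follow the standard Anthropic Messages API and are assumed to carry over to z.ai's compatibility endpoint; `ZAI_API_KEY` is a hypothetical variable name.

```python
import json
import os
import urllib.request

BASE_URL = "https://api.z.ai/api/anthropic"

def build_request(prompt: str, model: str = "glm-4.7") -> urllib.request.Request:
    """Prepare a Messages-API request against z.ai's Anthropic-compatible endpoint."""
    payload = {
        "model": model,
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        # z.ai keys are assumed to be accepted in the standard x-api-key header
        "x-api-key": os.environ.get("ZAI_API_KEY", ""),
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    }
    return urllib.request.Request(
        f"{BASE_URL}/v1/messages",
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )

# To actually send it (requires ZAI_API_KEY in the environment):
#   with urllib.request.urlopen(build_request("Hello!")) as resp:
#       reply = json.loads(resp.read())
#       print(reply["content"][0]["text"])
```

If this returns a normal Anthropic-style response, the same key and base URL will work in the Claude Code configuration below.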
In `~/.claude/settings.json`:

```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "https://api.z.ai/api/anthropic",
    "ANTHROPIC_AUTH_TOKEN": "your-zai-api-key",
    "CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC": "1"
  }
}
```
Or via shell alias:
```bash
# Add to ~/.bashrc or ~/.zshrc
zaiclaude() {
  ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic \
  ANTHROPIC_AUTH_TOKEN="your-zai-api-key" \
  CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1 \
  claude "$@"
}
```
Optional model mapping — map Claude Code's model slots to specific GLM models:
```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "https://api.z.ai/api/anthropic",
    "ANTHROPIC_AUTH_TOKEN": "your-zai-api-key",
    "ANTHROPIC_DEFAULT_HAIKU_MODEL": "glm-4.5-air",
    "ANTHROPIC_DEFAULT_SONNET_MODEL": "glm-4.7",
    "ANTHROPIC_DEFAULT_OPUS_MODEL": "glm-5",
    "CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC": "1"
  }
}
```
| Variable | Purpose |
|---|---|
| `ANTHROPIC_BASE_URL` | Points Claude Code to z.ai's Anthropic endpoint |
| `ANTHROPIC_AUTH_TOKEN` | Your z.ai API key |
| `ANTHROPIC_API_KEY` | Set to `""` to avoid conflicts with an existing Anthropic key |
| `CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC` | Prevents background traffic to Anthropic servers |
| `ANTHROPIC_DEFAULT_SONNET_MODEL` | GLM model for the "Sonnet" slot |
| `ANTHROPIC_DEFAULT_OPUS_MODEL` | GLM model for the "Opus" slot |
| `ANTHROPIC_DEFAULT_HAIKU_MODEL` | GLM model for the "Haiku" slot |
Note: See https://docs.z.ai/scenario-example/develop-tools/claude for z.ai's official Claude Code integration guide.
Route z.ai through Cloudflare AI Gateway for logging, caching, and cost tracking. See the cloudflare-ai-gateway skill for full setup.
```bash
# Use the universal endpoint to proxy z.ai through your gateway
curl "https://gateway.ai.cloudflare.com/v1/ACCT_ID/GATEWAY_ID" \
  -H "Content-Type: application/json" \
  -d '[{
    "provider": "zhipu",
    "endpoint": "https://api.z.ai/api/paas/v4/chat/completions",
    "headers": { "Authorization": "Bearer YOUR_ZHIPU_API_KEY" },
    "query": {
      "model": "glm-4.7",
      "messages": [{"role": "user", "content": "Hello"}]
    }
  }]'
```
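The same universal-endpoint request can be built programmatically. A minimal Python sketch, reusing the `zhipu` provider slug and payload shape from the curl example (the `ACCT_ID`/`GATEWAY_ID` placeholders remain yours to fill in):

```python
import json

def universal_payload(api_key: str, prompt: str, model: str = "glm-4.7") -> bytes:
    """Build the gateway's universal-endpoint body: a JSON array of
    provider request descriptors, matching the curl example above."""
    return json.dumps([
        {
            "provider": "zhipu",
            "endpoint": "https://api.z.ai/api/paas/v4/chat/completions",
            "headers": {"Authorization": f"Bearer {api_key}"},
            "query": {
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
            },
        }
    ]).encode("utf-8")

# POST this body to https://gateway.ai.cloudflare.com/v1/ACCT_ID/GATEWAY_ID
# with a Content-Type: application/json header (e.g. via urllib.request).
```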
OpenRouter aggregates many model providers including z.ai/Zhipu models:
```bash
curl "https://openrouter.ai/api/v1/chat/completions" \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "zhipu/glm-4.7",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```
For Claude Code web sessions, set environment variables in .claude/settings.local.json (gitignored):
```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "https://api.z.ai/api/anthropic",
    "ANTHROPIC_AUTH_TOKEN": "your-zai-api-key",
    "CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC": "1"
  }
}
```
Important: Never put API keys in `settings.json` (committed). Always use `settings.local.json` (gitignored) or a secrets manager like 1Password. These use the same Anthropic-compatible environment variables as Option 1 above.
z.ai offers a Coding Plan subscription for enhanced coding-model access, served from a dedicated base URL: `https://api.z.ai/api/coding/paas/v4`