From mise-toolkit
The four main AI CLIs compared — Claude Code, OpenAI Codex, Google Gemini, and aichat — with provider lock-in, model-agnostic fallback strategy, and why installing all four via mise is a lightweight dev-environment win. Use when picking AI CLIs for a project.
```
npx claudepluginhub ray-manaloto/claude-code-marketplace --plugin mise-toolkit
```

This skill uses the workspace's default tool permissions.
The AI-assisted coding landscape has consolidated around a few CLIs. All four are installable via mise (with one caveat for gemini). Running more than one is normal — they have different strengths and you'll want a model-agnostic fallback.
| CLI | Provider | Model backbone | Primary strength | Registry backend |
|---|---|---|---|---|
| Claude Code (`claude`) | Anthropic | Claude (Sonnet/Opus) | Agentic coding, long-context refactors, tool use, plugins | `aqua:anthropics/claude-code` |
| Codex (`codex`) | OpenAI | GPT-5 / o-series | Fast completions, strong at one-shot tasks | `aqua:openai/codex` |
| Gemini CLI (`gemini`) | Google | Gemini 2.x / 3.x | Huge context window, multimodal, Google Workspace integration | `npm:@google/gemini-cli` (not in registry yet) |
| aichat | Any (OpenAI/Anthropic/Gemini/Ollama/...) | Multi-provider adapter | Model-agnostic fallback, local models via Ollama | `aqua:sigoden/aichat` |
The cost of installing all four is negligible — a few MB of binaries + a few env vars. The upside is you're never stuck when one is unavailable.
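To get a feel for the differences, the same one-shot prompt can be pushed through each CLI's non-interactive mode. A sketch; the function name `ask_all` is invented here, and the flags (`-p`, `exec`) are assumptions to verify against each tool's `--help`:

```shell
# ask_all: send one prompt through every CLI that is installed.
# Flags are assumptions; verify with each tool's --help.
ask_all() {
  q="$1"
  command -v claude >/dev/null 2>&1 && claude -p "$q"   # print mode, no REPL
  command -v codex  >/dev/null 2>&1 && codex exec "$q"  # one-shot exec
  command -v gemini >/dev/null 2>&1 && gemini -p "$q"
  command -v aichat >/dev/null 2>&1 && aichat "$q"      # default model from config
}
```

Running `ask_all "summarize the failing test"` then gives you a side-by-side feel for each model on your own codebase.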
aichat is special: it's not tied to a specific provider. It speaks to OpenAI, Anthropic, Gemini, Ollama, Cohere, and more via config. One CLI, many backends.
Use it when:

- a vendor's CLI or API is unavailable or rate-limited and you need a fallback
- you want local models via Ollama (no API key, works offline)
- you want to send the same prompt to several backends without switching tools
Config lives at `~/.config/aichat/config.yaml`:

```yaml
model: claude:claude-sonnet-4-5
clients:
  - type: claude
    api_key: $ANTHROPIC_API_KEY
  - type: openai
    api_key: $OPENAI_API_KEY
  - type: gemini
    api_key: $GEMINI_API_KEY
  - type: ollama
    api_base: http://localhost:11434
```
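Once configured, switching backends is a flag, not a different tool. A sketch (the helper name `try_backends` and the `openai:gpt-4o` / `ollama:llama3` model names are examples, not prescriptions):

```shell
# try_backends: run the same prompt through several aichat backends.
# Model names are examples; check what your config actually exposes.
try_backends() {
  for m in claude:claude-sonnet-4-5 openai:gpt-4o ollama:llama3; do
    echo "== $m =="
    aichat -m "$m" "$1"   # -m selects client:model for this invocation
  done
}
```

This is the whole point of the adapter: one muscle-memory command, any backend.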
Each vendor's CLI is optimized for their models and their tool-calling conventions. Switching between them means:

- relearning flags, config locations, and session conventions
- different prompt and tool-calling behavior for the same task
- separate API keys and auth flows
You won't fully escape lock-in by installing all four, but you'll have off-ramps when you need them.
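One practical off-ramp is a tiny wrapper that prefers your main CLI but degrades to whatever else is installed. A sketch (the function name `ai` and the per-CLI flags are assumptions; the preference order is yours to choose):

```shell
# ai: one-shot prompt to the first CLI on the preference list
# that is actually installed. Flags are assumptions; adjust per tool.
ai() {
  for cli in claude codex gemini aichat; do
    command -v "$cli" >/dev/null 2>&1 || continue
    case "$cli" in
      claude) claude -p "$1" ;;
      codex)  codex exec "$1" ;;
      gemini) gemini -p "$1" ;;
      aichat) aichat "$1" ;;
    esac
    return $?
  done
  echo "no AI CLI installed" >&2
  return 127
}
```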
```toml
[tools]
node = "24"          # required for Gemini CLI (npm)
claude = "latest"    # aqua:anthropics/claude-code
codex = "latest"     # aqua:openai/codex
aichat = "latest"    # aqua:sigoden/aichat
"npm:@google/gemini-cli" = "latest"

[env]
ANTHROPIC_API_KEY = { required = "console.anthropic.com → API Keys", redact = true }
OPENAI_API_KEY = { required = "platform.openai.com → API keys", redact = true }
GEMINI_API_KEY = { required = "aistudio.google.com/app/apikey", redact = true }

[redactions]
patterns = ["*_API_KEY", "*_TOKEN", "*_SECRET"]
```
`node = "24"` is needed because Gemini CLI is an npm package. Even if you don't otherwise use Node, mise will install it transparently.
Never write the actual key values into `mise.toml`. The `required = "…"` form is a prompt — when the variable is missing, mise tells the user where to get it. The value is set via their shell rc, keychain, or secrets manager. See `mise-ai-cli-keys` for the threat model.
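For example, on macOS the keys can be loaded from the login keychain at shell startup. A sketch for `~/.zshrc`; the service names (`anthropic-api`, etc.) are placeholders you pick when storing each key with `security add-generic-password -s anthropic-api -a "$USER" -w`:

```shell
# Pull API keys from the macOS keychain instead of keeping them in plaintext.
# Service names are placeholders chosen when the keys were stored.
export ANTHROPIC_API_KEY="$(security find-generic-password -s anthropic-api -w 2>/dev/null)"
export OPENAI_API_KEY="$(security find-generic-password -s openai-api -w 2>/dev/null)"
export GEMINI_API_KEY="$(security find-generic-password -s gemini-api -w 2>/dev/null)"
```

On Linux, a secrets manager CLI (e.g. `pass` or 1Password's `op`) fills the same role.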
The `ai-status` task is a simple sanity check that all four CLIs are installed and their keys are set:
```toml
[tasks.ai-status]
description = "Verify AI CLIs are installed and authenticated"
run = [
  "claude --version",
  "codex --version",
  "aichat --version",
  "gemini --version",
]
```
If any CLI prints an auth error instead of a version, that tells you the key isn't loaded. Run /mise-ai-keys to fix.
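The same check can also live outside mise as a shell function that doesn't stop at the first failure. A sketch; the name `ai_status` is invented here:

```shell
# ai_status: report version or MISSING for each AI CLI, without aborting early.
ai_status() {
  for cli in claude codex aichat gemini; do
    if command -v "$cli" >/dev/null 2>&1; then
      printf '%-8s %s\n' "$cli" "$("$cli" --version 2>&1 | head -n 1)"
    else
      printf '%-8s MISSING\n' "$cli"
    fi
  done
}
```

Unlike the mise task, this prints one line per CLI even when an earlier one is broken, which makes it easier to see the full picture at a glance.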
- `.env` with keys "just for this project". Never.
- `mise set ANTHROPIC_API_KEY=…` — writes plaintext to `mise.local.toml`. Not a secret store.
- `npm view @google/gemini-cli` before pinning.
- `mise-ai-cli-setup` — the full `mise.toml` wiring and the `ai-status` task.
- `mise-ai-cli-keys` — threat model, keychain/1Password integration, rotation.
- `mise-env-directives` — how `required = ""` and `redact = true` work.
- `data/ai-cli-research.md` — authoritative registry mappings (fetched 2026-04-07).
- `/mise-ai-init` — scaffold this from a project.