Multi-model AI orchestration MCP server
npx claudepluginhub religa/multi_mcp
Code review, compare, and debate tools using multiple AI models
A multi-model AI orchestration MCP server for automated code review and LLM-powered analysis. Multi-MCP integrates with Claude Code CLI and OpenCode to orchestrate multiple AI models (OpenAI GPT, Anthropic Claude, Google Gemini) for code quality checks, security analysis (OWASP Top 10), and multi-agent consensus. Built on the Model Context Protocol (MCP), this tool enables Python developers and DevOps teams to automate code reviews with AI-powered insights directly in their development workflow.
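The multi-agent consensus idea mentioned above can be sketched as a simple aggregation over per-model verdicts. The `consensus` function and the verdict strings below are hypothetical illustrations, not Multi-MCP's actual API or output format:

```python
from collections import Counter

def consensus(verdicts: dict[str, str]) -> str:
    """Return the majority verdict across models; ties yield 'no-consensus'.
    Verdict labels here are illustrative, not Multi-MCP's real format."""
    counts = Counter(verdicts.values())
    top = counts.most_common(2)
    if len(top) > 1 and top[0][1] == top[1][1]:
        return "no-consensus"
    return top[0][0]

# Example: two of three models approve the change
print(consensus({"gpt": "approve", "claude": "approve", "gemini": "request-changes"}))
# → approve
```

The point of querying several models is exactly this kind of cross-check: a finding flagged by only one model can be treated with more skepticism than one all models agree on.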
Multi-MCP acts as an MCP server that Claude Code or OpenCode connects to, providing AI-powered code analysis tools:
Fast Multi-Model Analysis:
Prerequisites: git, uv, and optionally jq (used by the installer to update your Claude Code / OpenCode config automatically).
# Clone and install
git clone https://github.com/religa/multi_mcp.git
cd multi_mcp
# Run the installer (invokes ./scripts/install.sh)
make install
# The installer will:
# 1. Install dependencies (uv sync)
# 2. Generate your .env file
# 3. Automatically add to Claude Code / OpenCode config (requires jq)
# 4. Test the installation
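After the installer finishes, you can sanity-check what it should have produced. The paths below come from this README (`.env` and the uv-managed `.venv`); `check_install` is a hypothetical helper, not part of Multi-MCP:

```python
from pathlib import Path

def check_install(repo: Path) -> dict[str, bool]:
    """Verify the artifacts `make install` should have created:
    a generated .env and the virtualenv interpreter used in the MCP config."""
    return {
        ".env generated": (repo / ".env").is_file(),
        "venv python present": (repo / ".venv" / "bin" / "python").is_file(),
    }

for name, ok in check_install(Path("/path/to/multi_mcp")).items():
    print(("OK " if ok else "MISSING ") + name)
```

Replace `/path/to/multi_mcp` with your actual clone path; both checks should report OK before you point Claude Code or OpenCode at the server.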
If you prefer not to run make install:
# Install dependencies
uv sync
# Copy and configure .env
cp .env.example .env
# Edit .env with your API keys
Add to Claude Code (~/.claude.json) or OpenCode (~/.opencode/opencode.json), replacing /path/to/multi_mcp with your actual clone path:
Claude Code:
{
"mcpServers": {
"multi": {
"type": "stdio",
"command": "/path/to/multi_mcp/.venv/bin/python",
"args": ["-m", "multi_mcp.server"]
}
}
}
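The installer's jq step (step 3 above) amounts to merging this entry into the existing config. A Python sketch of that merge, assuming the entry shape shown above (`with_multi_server` is a hypothetical helper, not part of Multi-MCP):

```python
import json

def with_multi_server(config: dict, clone_path: str) -> dict:
    """Merge the Multi-MCP stdio entry into a Claude Code config dict,
    mirroring what the installer's jq step writes into ~/.claude.json."""
    servers = config.setdefault("mcpServers", {})
    servers["multi"] = {
        "type": "stdio",
        "command": f"{clone_path}/.venv/bin/python",
        "args": ["-m", "multi_mcp.server"],
    }
    return config

print(json.dumps(with_multi_server({}, "/path/to/multi_mcp"), indent=2))
```

Merging into the existing dict (rather than overwriting the file) preserves any other MCP servers you have configured.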
OpenCode:
{
"mcp": {
"multi": {
"type": "local",
"command": ["/path/to/multi_mcp/.venv/bin/python", "-m", "multi_mcp.server"],
"enabled": true
}
}
}
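Note the shape difference between the two configs: OpenCode uses a `command` *list* and an `enabled` flag, while Claude Code uses a `command` string plus `args`. A quick shape check based only on the example above (this is not an official OpenCode schema):

```python
def valid_opencode_entry(entry: dict) -> bool:
    """Check an OpenCode MCP entry matches the example shape above:
    local type, a non-empty command list, and a boolean enabled flag."""
    return (
        entry.get("type") == "local"
        and isinstance(entry.get("command"), list)
        and len(entry["command"]) > 0
        and isinstance(entry.get("enabled"), bool)
    )
```

A common mistake is copying the Claude Code entry (string `command` + `args`) into the OpenCode config; this check catches that.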
Multi-MCP loads settings from .env files in this order (highest priority first):
- .env (current directory or project root)
- ~/.multi_mcp/.env - fallback for pip installs

Edit .env with your API keys:
# API Keys (configure at least one)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GEMINI_API_KEY=...
OPENROUTER_API_KEY=sk-or-...
# Azure OpenAI (optional)
AZURE_API_KEY=...
AZURE_API_BASE=https://your-resource.openai.azure.com/
# AWS Bedrock (optional)
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION_NAME=us-east-1
# Model Configuration
DEFAULT_MODEL=gpt-5-mini
DEFAULT_MODEL_LIST=gpt-5-mini,gemini-3-flash
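A sketch of how a comma-separated `DEFAULT_MODEL_LIST` is typically consumed, falling back to `DEFAULT_MODEL` when the list is unset; this is an assumption about the variables' semantics, not Multi-MCP's actual parsing code:

```python
import os

def model_list(fallback: str = "gpt-5-mini") -> list[str]:
    """Split DEFAULT_MODEL_LIST into model names; if it is empty or unset,
    fall back to DEFAULT_MODEL (or the given default)."""
    raw = os.environ.get("DEFAULT_MODEL_LIST", "")
    models = [m.strip() for m in raw.split(",") if m.strip()]
    return models or [os.environ.get("DEFAULT_MODEL", fallback)]

os.environ["DEFAULT_MODEL_LIST"] = "gpt-5-mini,gemini-3-flash"
print(model_list())  # → ['gpt-5-mini', 'gemini-3-flash']
```

Stripping whitespace around each name keeps `a, b` and `a,b` equivalent, which avoids a subtle failure mode when the list is hand-edited.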