
# Station - AI Agent Orchestration Platform

Build, test, and deploy intelligent agent teams. Self-hosted. Git-backed. Production-ready.
Quick Start | Real Example | Deploy | Documentation
## Why Station?
Build multi-agent systems that coordinate like real teams. Test with realistic scenarios. Deploy on your infrastructure.
Station gives you:
- ✅ Multi-Agent Teams - Coordinate specialist agents under orchestrators
- ✅ Built-in Evaluation - LLM-as-judge tests every agent automatically
- ✅ Git-Backed Workflow - Version control agents like code
- ✅ One-Command Deploy - Push to production with `stn deploy`
- ✅ Full Observability - Jaeger traces for every execution
- ✅ Self-Hosted - Your data, your infrastructure, your control
## Quick Start (2 minutes)
### Prerequisites
- Docker - Required for Jaeger (traces and observability)
- AI Provider - Choose one:
  - CloudShip AI (Recommended) - `STN_CLOUDSHIP_KEY` or `CLOUDSHIPAI_REGISTRATION_KEY`
  - `OPENAI_API_KEY` - OpenAI (gpt-5-mini, gpt-5, etc.)
  - `GEMINI_API_KEY` - Google Gemini
  - `ANTHROPIC_API_KEY` - Anthropic (claude-sonnet-4-20250514, etc.)
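As a quick sanity check before initializing (plain shell, not a Station command), you can list which of the provider keys above are set in your current session:

```shell
# Report which provider keys from the prerequisites are set in this shell.
# Only presence is reported; the key values themselves are never printed.
for var in CLOUDSHIPAI_REGISTRATION_KEY STN_CLOUDSHIP_KEY OPENAI_API_KEY \
           GEMINI_API_KEY ANTHROPIC_API_KEY; do
  if printenv "$var" > /dev/null; then
    echo "$var: set"
  else
    echo "$var: not set"
  fi
done
```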
### 1. Install Station
```shell
curl -fsSL https://raw.githubusercontent.com/cloudshipai/station/main/install.sh | bash
```
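Once the installer finishes, you can confirm the `stn` binary is reachable (plain shell, no Station-specific flags assumed):

```shell
# Confirm the stn binary landed on your PATH after installation.
if command -v stn > /dev/null 2>&1; then
  echo "stn installed at: $(command -v stn)"
else
  echo "stn not found - check the installer output or open a new shell"
fi
```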
### 2. Initialize Station
Choose your AI provider:
#### CloudShip AI (Recommended)
Use CloudShip AI for optimized inference with Llama and Qwen models. This is the default when a registration key is available.
```shell
# Set your CloudShip registration key
export CLOUDSHIPAI_REGISTRATION_KEY="csk-..."
# Or use: export STN_CLOUDSHIP_KEY="csk-..."

stn init --provider cloudshipai --ship  # defaults to cloudship/llama-3.1-70b
```
Available models:

- `cloudship/llama-3.1-70b` (default) - Best balance of performance and cost
- `cloudship/llama-3.1-8b` - Faster, lower cost
- `cloudship/qwen-72b` - Alternative large model
#### Claude Max/Pro Subscription (⚠️ DEPRECATED)

⚠️ DEPRECATED: Anthropic OAuth is currently unavailable. Anthropic has restricted third-party use of OAuth tokens, so this authentication method does not work until further notice.

This option previously let you use an existing Claude Max or Claude Pro subscription with no API billing. Please use one of the following alternatives:

- OpenAI API Key (recommended)
- Google Gemini API Key
- Anthropic API Key (pay-per-token, not subscription-based)

```shell
# ❌ NOT WORKING - Anthropic OAuth disabled
# stn init --provider anthropic --ship
# stn auth anthropic login
```
#### OpenAI (API Key)

```shell
export OPENAI_API_KEY="sk-..."
stn init --provider openai --ship  # defaults to gpt-5-mini
```
#### Google Gemini (API Key)

```shell
export GEMINI_API_KEY="..."
stn init --provider gemini --ship
```
This sets up:

- ✅ Your chosen AI provider
- ✅ Ship CLI for filesystem MCP tools
- ✅ Configuration at `~/.config/station/config.yaml`
### 3. Start Jaeger (Tracing)
Start the Jaeger tracing backend for observability:

```shell
stn jaeger up
```

This starts the Jaeger UI at http://localhost:16686 for viewing agent execution traces. Traces are received on the OTLP HTTP port 4318, which the MCP client configs below point at.
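To confirm the UI came up before running agents, a plain curl probe works (this assumes Jaeger's standard UI port 16686 and is just a reachability check, not a Station command):

```shell
# Probe the Jaeger UI; -f makes curl fail on HTTP errors,
# so the else branch also catches a stopped container.
if curl -fsS -o /dev/null http://localhost:16686; then
  echo "Jaeger UI is up at http://localhost:16686"
else
  echo "Jaeger UI not reachable - did 'stn jaeger up' succeed?"
fi
```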
### 4. Connect Your MCP Client
Choose your editor and add Station:
#### Claude Code CLI

```shell
claude mcp add station -e OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318 --scope user -- stn stdio
```

Verify with `claude mcp list`.
#### OpenCode

Add to `opencode.jsonc`:

```jsonc
{
  "mcp": {
    "station": {
      "enabled": true,
      "type": "local",
      "command": ["stn", "stdio"],
      "environment": {
        "OTEL_EXPORTER_OTLP_ENDPOINT": "http://localhost:4318"
      }
    }
  }
}
```
#### Cursor

Add to `.cursor/mcp.json` in your project (or `~/.cursor/mcp.json` for global):

```json
{
  "mcpServers": {
    "station": {
      "command": "stn",
      "args": ["stdio"],
      "env": {
        "OTEL_EXPORTER_OTLP_ENDPOINT": "http://localhost:4318"
      }
    }
  }
}
```
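A malformed config is a common reason an MCP client silently ignores a server. Before restarting the editor you can sanity-check the file with Python's stdlib JSON tool; the snippet below writes the example config first so it is self-contained (adjust the path to your setup):

```shell
# Write the project-local Cursor config from the example above,
# then validate it; json.tool exits non-zero on malformed JSON.
mkdir -p .cursor
cat > .cursor/mcp.json <<'EOF'
{
  "mcpServers": {
    "station": {
      "command": "stn",
      "args": ["stdio"],
      "env": {
        "OTEL_EXPORTER_OTLP_ENDPOINT": "http://localhost:4318"
      }
    }
  }
}
EOF
python3 -m json.tool .cursor/mcp.json > /dev/null && echo "mcp.json is valid JSON"
```

Note this strict check does not apply to `opencode.jsonc`, which allows comments that plain JSON parsers reject.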
#### Claude Desktop

| OS | Config Path |
|---|---|
| macOS | `~/Library/Application Support/Claude/claude_desktop_config.json` |
| Windows | `%APPADATA%\Claude\claude_desktop_config.json` |
| Linux | `~/.config/Claude/claude_desktop_config.json` |
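Whichever path applies, Claude Desktop's config uses the same `mcpServers` shape as the Cursor example. A sketch of the entry (add it alongside any existing `mcpServers` entries, then restart Claude Desktop):

```json
{
  "mcpServers": {
    "station": {
      "command": "stn",
      "args": ["stdio"],
      "env": {
        "OTEL_EXPORTER_OTLP_ENDPOINT": "http://localhost:4318"
      }
    }
  }
}
```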