By ypollak2
Route AI tasks (code generation, research, analysis, writing) to the cheapest capable LLM across 20+ providers by auto-classifying task type (research, generate, code) and complexity via heuristics or inexpensive classifier APIs. Cut Claude/OpenAI costs with tracking, alerts, and dashboards; decompose complex tasks via an agent; and automate llm-router releases with tests and PyPI/GitHub publishing.
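A minimal sketch of the heuristic classification described above. The keyword lists, length threshold, model names, and price table are illustrative assumptions, not llm-router's actual configuration.

```python
# Illustrative per-model cost in USD per 1M input tokens (assumed values,
# not real pricing).
MODEL_COSTS = {
    "haiku": 0.25,
    "sonnet": 3.00,
    "opus": 15.00,
}

# Hypothetical keyword hints for cheap task-type classification.
CODE_HINTS = ("def ", "class ", "import ", "traceback", "refactor")
RESEARCH_HINTS = ("compare", "survey", "sources", "cite")

def classify(task: str) -> tuple[str, str]:
    """Classify a task's type and complexity with cheap heuristics."""
    lower = task.lower()
    if any(h in lower for h in CODE_HINTS):
        kind = "code"
    elif any(h in lower for h in RESEARCH_HINTS):
        kind = "research"
    else:
        kind = "generate"
    # Longer prompts are treated as more complex (a crude proxy).
    complexity = "high" if len(task) > 500 else "low"
    return kind, complexity

def route(task: str) -> str:
    """Pick the cheapest model deemed capable of the task."""
    kind, complexity = classify(task)
    if complexity == "high" or kind == "research":
        candidates = ["sonnet", "opus"]   # assume a stronger model is needed
    else:
        candidates = ["haiku", "sonnet"]  # assume a cheap model suffices
    return min(candidates, key=MODEL_COSTS.__getitem__)
```

In practice the real router would consult live provider pricing and capability data rather than a static table; the shape of the decision (classify, filter to capable models, take the cheapest) is the point here.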
npx claudepluginhub ypollak2/llm-router --plugin llm-router
Automates the full release pipeline for llm-router. Run this skill whenever
Route a task to the best LLM based on task type and complexity
Route tasks to the cheapest capable model automatically using llm-router MCP tools.
Track and report how much you've saved by routing tasks to cheaper models.
Flagship+ skill pack for OpenRouter - 30 skills for multi-model routing, fallbacks, and LLM gateway mastery
When calling LLM APIs from Python code. When connecting to llamafile or local LLM servers. When switching between OpenAI/Anthropic/local providers. When implementing retry/fallback logic for LLM calls. When code imports litellm or uses completion() patterns.
Intelligent model routing for Claude Code - routes queries to optimal Claude model (Haiku/Sonnet/Opus) based on complexity, with persistent knowledge system, context forking, and multi-turn awareness
Consult multiple AI coding agents (Gemini, OpenAI, Grok, Perplexity, plus codex and gemini CLIs when installed) to get diverse perspectives on coding problems
Ultra-compressed communication mode. Cuts ~75% of tokens while keeping full technical accuracy by speaking like a caveman.
Comprehensive UI/UX design plugin for mobile (iOS, Android, React Native) and web applications with design systems, accessibility, and modern patterns