From ask-llm
Send a topic to multiple LLM providers in parallel while Claude Opus performs its own independent research, then synthesize all findings. Usage: /brainstorm [providers] <topic>. External providers default to gemini,codex. Example: /brainstorm gemini,codex,ollama "review this architecture"
npx claudepluginhub lykhoyda/ask-llm --plugin ask-llm

This skill uses the workspace's default tool permissions.
Consult multiple external LLM providers simultaneously on a topic while Claude Opus performs its own independent research in parallel, then synthesize the findings from all participants.
Spawns parallel agents powered by Claude Opus, OpenAI Codex, and Google Gemini for brainstorming project ideas, features, or directions to uncover blind spots and spark creativity.
Parse the arguments:
If the first argument is a comma-separated provider list (e.g. gemini,codex or gemini,codex,ollama), use those as the external providers; otherwise default to gemini,codex. Supported providers are gemini, codex, and ollama.
Determine the brainstorm topic:
If no explicit topic is given, gather context from git diff and git diff --cached. If no topic is clear, ask the user what they'd like to brainstorm about.
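The argument-parsing steps above can be sketched as follows. This is a minimal illustration, not the plugin's actual implementation; the function and variable names are hypothetical.

```python
DEFAULT_PROVIDERS = ["gemini", "codex"]
SUPPORTED = {"gemini", "codex", "ollama"}

def parse_args(args: str) -> tuple[list[str], str]:
    """Split '/brainstorm [providers] <topic>' arguments into (providers, topic)."""
    first, _, rest = args.strip().partition(" ")
    candidates = [p.strip() for p in first.split(",") if p.strip()]
    # Treat the first token as a provider list only if every entry is supported;
    # otherwise the whole argument string is the topic and defaults apply.
    if candidates and all(p in SUPPORTED for p in candidates):
        return candidates, rest.strip()
    return DEFAULT_PROVIDERS, args.strip()
```

For example, `parse_args("gemini,codex,ollama review this architecture")` yields the three providers plus the topic, while `parse_args("review this architecture")` falls back to the default gemini,codex pair.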
Launch the brainstorm-coordinator agent with the topic, the selected external providers list, and any gathered context. The agent handles:
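The fan-out the coordinator performs can be sketched as below, assuming a hypothetical ask() helper stands in for invoking each external provider's CLI or API:

```python
from concurrent.futures import ThreadPoolExecutor

def ask(provider: str, topic: str) -> str:
    # Hypothetical stand-in for querying one external provider.
    return f"{provider}: findings on {topic}"

def brainstorm(providers: list[str], topic: str) -> dict[str, str]:
    """Query all external providers in parallel and collect their findings."""
    with ThreadPoolExecutor(max_workers=len(providers)) as pool:
        futures = {p: pool.submit(ask, p, topic) for p in providers}
        return {p: f.result() for p, f in futures.items()}
```

Each provider runs concurrently, and the collected per-provider findings are what the coordinator then synthesizes alongside Claude Opus's own research.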