From ask-llm
Sends a topic to all external LLM providers (Gemini, Codex, Ollama) in parallel while Claude Opus performs its own independent research, then synthesizes findings from up to four participants. Shortcut for /brainstorm gemini,codex,ollama <topic>. Requires Ollama to be running locally.
npx claudepluginhub lykhoyda/ask-llm --plugin ask-llm
This skill uses the workspace's default tool permissions.
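Because the skill requires a local Ollama instance, a quick preflight check can be sketched as follows. This probes Ollama's default API port (11434) with curl; it is a minimal sketch, not part of the plugin itself:

```shell
#!/bin/sh
# Probe Ollama's local HTTP API (default port: 11434).
# Prints "running" if the server answers, "not running" otherwise.
check_ollama() {
  if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
    echo "running"
  else
    echo "not running"
  fi
}
check_ollama
```

If the check reports "not running", start the server with `ollama serve` before invoking the skill.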
Consult all available external LLM providers (Gemini, Codex, Ollama) simultaneously while Claude Opus performs its own independent research on the topic, then synthesize perspectives from all four participants.
Spawns parallel agents powered by Claude Opus, OpenAI Codex, and Google Gemini for brainstorming project ideas, features, or directions to uncover blind spots and spark creativity.
Orchestrates parallel analysis of coding problems across AI models (Claude, GPT, Gemini, Grok) via CLI tools or APIs, collects recommendations, and synthesizes optimal solution.
Consults external LLMs (OpenAI Codex, Google Gemini) via CLIs for second opinions on architecture, design decisions, model selection, and approach comparisons.
Determine the brainstorm topic:
Review git diff and git diff --cached for relevant context. If no topic is clear, ask the user what they'd like to brainstorm about.
Launch the brainstorm-coordinator agent with the topic, external providers set to gemini,codex,ollama, and any gathered context. The coordinator will:
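The parallel fan-out step can be sketched as a small shell runner: each provider command is launched in the background against the same topic, output is captured per provider, and `wait` blocks until all of them finish. The `echo` commands below stand in for the real provider CLIs (e.g. invocations like `gemini -p`, `codex exec`, or `ollama run <model>` -- assumed forms, not the plugin's actual commands):

```shell
#!/bin/sh
# Fan a topic out to several provider commands in parallel,
# capturing each provider's output in its own file.
OUTDIR=$(mktemp -d)

consult() {   # consult <name> <command...> -- appends $TOPIC as the last arg
  name="$1"; shift
  "$@" "$TOPIC" >"$OUTDIR/$name.out" 2>&1 &
}

TOPIC="how should we structure the plugin?"
# Stand-ins for real provider CLIs (hypothetical invocations):
consult gemini echo "[gemini would answer]"
consult codex  echo "[codex would answer]"
consult ollama echo "[ollama would answer]"

wait   # block until every backgrounded provider finishes
cat "$OUTDIR"/*.out
```

The coordinator's synthesis step would then read the collected per-provider files and merge their perspectives into one answer.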