/dispatch 
Dispatch gives Claude Code 10x its effective context window. Instead of filling your session with implementation detail, /dispatch turns it into a lightweight orchestrator: work fans out to background agents, each with its own full context window.
Without dispatch: You ask Claude to review code, refactor a module, write tests, and update docs. Each task fills the context window. By task 3, Claude is losing track. By task 5, you're starting a new session.
With dispatch: You describe all 5 tasks. Workers execute in parallel with fresh contexts. Your main session stays lean. Questions surface to you when needed — no polling, no context lost.
/dispatch use sonnet to find better design patterns for the auth module
Why dispatch
Your session stays lean
Dispatch inverts the usual model: the main session becomes a mediator, not the thinker. It writes a checklist and hands it off. The actual implementation — reading code, reasoning about edge cases, writing tests — happens in fresh worker contexts that each get their own full window. Your main session's context is preserved for orchestration.
The dispatcher carries the cognitive load
With claude --background, multiple terminals, or fire-and-forget agent runners — you are the orchestrator. You track what's running, check on progress, notice failures, and context-switch between outputs.
With dispatch, the AI dispatcher tracks all workers, surfaces questions, reports completions, handles errors, and offers recovery. Your job reduces to: (a) describe what you want, (b) answer questions when asked.
Workers ask questions back
When a /dispatch worker gets stuck, it doesn't silently fail or hallucinate. It asks a clarifying question — the dispatcher surfaces it to you, you answer, and the worker continues without losing context. No restart, no re-explaining, no lost work.
Worker is asking: "requirements.txt doesn't exist. What feature should I implement?"
> Add a /health endpoint that returns JSON with uptime and version.
Answer sent. Worker is continuing.
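Conceptually, this relay behaves like two blocking queues between the worker and the dispatcher: the worker parks on its question while its context stays intact, and resumes the moment an answer arrives. A minimal Python sketch of that pattern (an illustration of the idea, not dispatch's actual implementation):

```python
import queue
import threading

questions = queue.Queue()  # worker -> dispatcher
answers = queue.Queue()    # dispatcher -> worker
results = []

def worker():
    # The worker hits a blocker. Instead of failing silently or
    # guessing, it asks and blocks; its state is preserved while it waits.
    questions.put("requirements.txt doesn't exist. What feature should I implement?")
    answer = answers.get()
    results.append(f"implemented: {answer}")

t = threading.Thread(target=worker)
t.start()

# The dispatcher surfaces the question to you and relays your answer;
# the worker then continues exactly where it left off.
question = questions.get()
answers.put("a /health endpoint with uptime and version")
t.join()
```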
Non-blocking — you never wait
The moment a worker is dispatched, your session is immediately free. Dispatch another task. Ask a question. Write code. Workers run in parallel and results arrive as they complete.
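The fan-out shape is the familiar submit-then-collect pattern: dispatch everything up front, stay free, and consume results in completion order. A small Python sketch of the shape (the task names and `run_worker` stand-in are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_worker(task: str) -> str:
    # Stand-in for a background agent executing one task
    # in its own fresh context.
    return f"done: {task}"

tasks = ["review PR for edge cases", "refactor config parser", "write tests"]

# Fan out: the main thread submits all tasks and is immediately free;
# results arrive in completion order, not dispatch order.
with ThreadPoolExecutor(max_workers=len(tasks)) as pool:
    futures = [pool.submit(run_worker, t) for t in tasks]
    results = [f.result() for f in as_completed(futures)]
```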
Any model, one interface
Mix models per task. Claude for deep reasoning, GPT for broad generation, Gemini for speed. Reference any model by name — if it's not in your config, /dispatch auto-discovers and adds it. If multiple models are named in one prompt, dispatch uses the last one mentioned. If no model is specified, dispatch confirms your default before proceeding.
/dispatch use opus to review this PR for edge cases
/dispatch use gemini to refactor the config parser — it's getting unwieldy
Requires Claude Code as your host session. Dispatch is a skill that runs inside Claude Code — the host plans tasks and spawns workers. Other CLIs like Cursor and Codex work as workers only (background agents that execute subtasks).
Install
npx skills add bassimeledath/dispatch -g # user-level (all projects)
npx skills add bassimeledath/dispatch # project-level (team-shared)
How it works
- You run /dispatch with a task description
- A checklist plan is created — the only context your main session needs
- A background worker picks it up in a fresh, full context window and checks off items as it goes
- If the worker has a question, it asks — you answer — it continues (no context lost in either direction)
- You get results when it's done, or ask for status anytime — your main session is still lean
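A plan might look like a simple markdown checklist the worker ticks off as it goes (a hypothetical example; the exact format is up to the skill):

```markdown
## Dispatch plan: refactor config parser (worker: sonnet)
- [x] Read the config parser and map its public API
- [x] Extract validation into its own module
- [ ] Update call sites
- [ ] Run the test suite and fix regressions
```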
Complex example
Example — pre-launch sweep for a fullstack SaaS app:
/dispatch we launch Thursday, need a full pre-launch sweep:
1) security audit the auth flow (JWT, sessions, password reset) and the
stripe payment integration. use opus, do it in a worktree
2) performance - load test the top 5 API endpoints, find N+1 queries,
check db indexes on users/orders/products tables. sonnet in a worktree
3) frontend audit - accessibility (WCAG 2.1 AA), bundle size, flag any
components missing error boundaries. dispatch sonnet
4) generate OpenAPI specs for every endpoint that's missing one. use gemini