From claude-token-reducer
Use this skill whenever the user needs to reduce context bloat, lower token usage, summarize large code or documentation corpora, run hybrid FTS-plus-embeddings retrieval, rerank the top chunks, and produce compact context packets before implementation work. Also trigger when users say the context is too long, ask to reduce tokens, optimize prompt size, retrieve top chunks, or compress context, or request cost-saving prompt workflows.
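The internals of the plugin's pipeline are not shown here, but the hybrid FTS-plus-embeddings idea it describes can be sketched roughly as follows. This is a minimal illustration, not the plugin's actual implementation: the keyword score stands in for a full-text-search match, and the bag-of-words cosine stands in for a real embedding model, both hypothetical simplifications.

```python
import math
from collections import Counter

def keyword_score(query: str, chunk: str) -> float:
    # FTS-style signal: fraction of query terms present in the chunk.
    q = set(query.lower().split())
    c = set(chunk.lower().split())
    return len(q & c) / len(q) if q else 0.0

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding"; a real pipeline would use a model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_retrieve(query: str, chunks: list, top_k: int = 5,
                    alpha: float = 0.5) -> list:
    # Blend the two signals, then keep only the top-k chunks,
    # mirroring the skill's "retrieve, rerank, compact" flow.
    qv = embed(query)
    scored = [
        (alpha * keyword_score(query, c) + (1 - alpha) * cosine(qv, embed(c)), c)
        for c in chunks
    ]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [c for _, c in scored[:top_k]]

if __name__ == "__main__":
    chunks = [
        "def parse_config(path): load YAML settings",
        "token usage dashboard renders cost charts",
        "embedding index built with sentence vectors",
        "reduce token usage by pruning stale context",
    ]
    print(hybrid_retrieve("reduce token usage", chunks, top_k=2))
```

The `--hybrid-mode fallback` flag in the command below suggests the real pipeline can fall back to one signal when the other is unavailable; the sketch above always blends both.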
npx claudepluginhub madhan230205/token-reducer --plugin claude-token-reducer

This skill is limited to using the following tools:
Cut context size without cutting answer quality.
python "${CLAUDE_PLUGIN_ROOT}/scripts/context_pipeline.py" run --inputs . --query "${ARGUMENTS}" --hybrid-mode fallback --top-k 5

python "${CLAUDE_PLUGIN_ROOT}/scripts/context_pipeline.py" self-test

./references/implementation-guide.md
./references/context7-integration.md