LLM Tool Output Tokens Optimizer - buffers large command outputs to files, providing compact summaries to reduce token consumption and prevent wasteful command re-runs
npx claudepluginhub fprochazka/claude-code-plugins --plugin llm-toto
Intelligent prompt optimization using skill-based architecture. Enriches vague prompts with research-based clarifying questions before Claude Code executes them
Easily create hooks to prevent unwanted behaviors by analyzing conversation patterns
AI-native product management for startups. Transform Claude into an expert PM with competitive research, gap analysis using the WINNING filter, PRD generation, and GitHub Issues integration.