Context
- Existing specs: !`ls -la specs/ 2>/dev/null || echo "No specs directory found"`
FIRST PRINCIPLES PROBLEM ANALYSIS
Before defining any solution, validate the problem from first principles:
Core Problem Investigation
- Strip Away Solution Assumptions: What is the core problem, completely separate from any proposed solution?
- Root Cause Analysis: Why does this problem exist? What created this need?
- Goal Decomposition: What are we fundamentally trying to achieve for users/business?
- Success Definition: What would success look like if we had unlimited resources and no constraints?
- Alternative Approaches: Could we achieve the underlying goal without building anything? Are there simpler approaches?
Problem Validation Questions
- Real vs. Perceived: Is this solving a real problem that users actually have?
- Assumption Audit: What assumptions about user needs, technical constraints, or business requirements might be wrong?
- Value Proposition: What is the minimum viable solution that delivers core value?
- Scope Validation: Are we solving the right problem, or treating symptoms of a deeper issue?
CRITICAL: Only proceed if the core problem is clearly defined and validated. If uncertain, request additional context.
MANDATORY PRE-CREATION VERIFICATION
After validating the problem from first principles, complete these technical checks:
1. Context Discovery Phase
- Search the existing codebase for similar features/specs using the Task tool with the Explore agent
- Apply domain-specific best practices when the request touches particular areas (TypeScript, React, testing, databases, etc.)
- Reference project-specific conventions from CLAUDE.md or similar documentation
- Reserve the Explore agent for deep codebase analysis; prefer lightweight searches otherwise
- Identify potential conflicts or duplicates
- Verify feature request is technically feasible
- Document any missing prerequisites
2. Request Validation
- Confirm request is well-defined and actionable
- If vague or incomplete, STOP and ask clarifying questions
- Validate scope is appropriate (not too broad/narrow)
3. Quality Gate
- Only proceed if you have 80%+ confidence in implementation approach
- If uncertain, request additional context before continuing
- Document any assumptions being made
CRITICAL: If any validation fails, STOP immediately and request clarification.
Your task
Create a comprehensive specification document in the specs/ folder for the following feature/bugfix: $ARGUMENTS
First, analyze the request to understand:
- Whether this is a feature or bugfix
- The scope and complexity
- Related existing code/features
- External libraries/frameworks involved
END-TO-END INTEGRATION ANALYSIS
Before writing the detailed specification, map the complete system impact:
System Integration Mapping
- Data Flow Tracing: Trace data flow from user action -> processing -> storage -> response
- Service Dependencies: Identify all affected services, APIs, databases, and external systems
- Integration Points: Map every place this feature touches existing functionality
- Cross-System Impact: How does this change affect other teams, services, or user workflows?
Complete User Journey Analysis
- Entry Points: How do users discover and access this feature?
- Step-by-Step Flow: What is the complete sequence from start to finish?
- Error Scenarios: What happens when things go wrong at each step?
- Exit Points: How does this connect to what users do next?
Deployment and Rollback Considerations
- Migration Path: How do we get from current state to new state?
- Rollback Strategy: What if we need to undo this feature?
- Deployment Dependencies: What must be deployed together vs. independently?
- Data Migration: How do we handle existing data during the transition?
VERIFICATION: Ensure you can trace the complete end-to-end flow before proceeding to detailed specification.
Then create a spec document that includes:
- Title: Clear, descriptive title of the feature/bugfix
- Status: Draft/Under Review/Approved/Implemented
- Authors: Your name and date
- Overview: Brief description and purpose
- Background/Problem Statement: Why this feature is needed or what problem it solves
- Goals: What we aim to achieve (bullet points)
- Non-Goals: What is explicitly out of scope (bullet points)
- Technical Dependencies:
  - External libraries/frameworks used
  - Version requirements
  - Links to relevant documentation
- Detailed Design:
  - Architecture changes
  - Implementation approach
  - Code structure and file organization
  - API changes (if any)
  - Data model changes (if any)
  - Integration with external libraries (with examples from docs)
- User Experience: How users will interact with this feature
- Testing Strategy:
  - Unit tests
  - Integration tests
  - E2E tests (if needed)
  - Mocking strategies for external dependencies
  - Test documentation: each test should include a purpose comment explaining why it exists and what it validates
  - Meaningful tests: avoid tests that always pass regardless of behavior
  - Edge-case testing: include tests that can actually fail, so they reveal real issues
- Performance Considerations: Impact on performance and mitigation strategies
- Security Considerations: Security implications and safeguards
- Documentation: What documentation needs to be created/updated
- Implementation Phases:
  - Phase 1: MVP/core functionality
  - Phase 2: Enhanced features (if applicable)
  - Phase 3: Polish and optimization (if applicable)
- Open Questions: Any unresolved questions or decisions
- References:
  - Links to related issues, PRs, or documentation
  - External library documentation links
  - Relevant design patterns or architectural decisions
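As a sketch of the testing guidance above (a purpose comment, and an assertion that can actually fail), a minimal shell-style check of this command's own spec-discovery fallback might look like the following; the sandbox setup is illustrative, not a required harness:

```shell
# Purpose: verify the spec-discovery fallback prints a clear message when no
# specs/ directory exists. This test can fail (e.g., if the fallback text
# changes), so it validates real behavior rather than always passing.
workdir=$(mktemp -d)                 # empty sandbox: contains no specs/
cd "$workdir"
out=$(ls specs/ 2>/dev/null || echo "No specs directory found")
if [ "$out" = "No specs directory found" ]; then
  echo "PASS"
else
  echo "FAIL: got '$out'"
  exit 1
fi
```

A test of this shape fails loudly with the observed value, which is what "meaningful tests" asks for.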
Follow these guidelines:
- Use Markdown format similar to existing specs
- Be thorough and technical but also accessible
- Include code examples where helpful
- Consider edge cases and error scenarios
- Reference existing project patterns and conventions
- Use diagrams if they would clarify complex flows (using ASCII art or mermaid)
- When referencing external libraries, include version-specific information
- Do NOT include time or effort estimations (no "X days", "Y hours", or complexity estimates)
Name the spec file descriptively based on the feature:
- Features: feat-{kebab-case-name}.md
- Bugfixes: fix-{issue-number}-{brief-description}.md
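The kebab-case part of the convention can be sketched as a small shell transformation; the title below is hypothetical and the pipeline is one possible slugging approach, not part of the convention itself:

```shell
# Illustrative only: derive a kebab-case spec filename from a feature title.
title="User Profile Export"                      # hypothetical feature title
slug=$(printf '%s' "$title" \
  | tr '[:upper:]' '[:lower:]' \
  | tr -cs 'a-z0-9' '-' \
  | sed 's/^-*//; s/-*$//')                      # lowercase; collapse runs of punctuation/spaces to single dashes
echo "feat-${slug}.md"                           # -> feat-user-profile-export.md
```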
PROGRESSIVE VALIDATION CHECKPOINTS
After completing each major section:
- Problem Statement: Verify it's specific and measurable
- Technical Requirements: Confirm all dependencies are available
- Implementation Plan: Validate approach is technically sound
- Testing Strategy: Ensure testability of all requirements
At each checkpoint, if quality is insufficient, revise before proceeding.
FINAL SPECIFICATION VALIDATION
Before marking complete:
- Completeness Check: All 17 sections meaningfully filled
- Consistency Check: No contradictions between sections
- Implementability Check: Someone could build this from the spec
- Quality Score: Rate spec 1-10, only accept 8+
Before writing, search the codebase for:
- Related existing features or code
- Similar patterns in the codebase
- Potential conflicts or dependencies
- Current library versions in package.json or equivalent
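One hedged sketch of the last check, extracting a pinned version before citing version-specific docs; the sample package.json is created inline so the sketch is self-contained, and a real project would read its own manifest (with jq, if available, being more robust than grep):

```shell
# Hypothetical version lookup: confirm the installed react version before
# writing version-specific guidance into the spec.
pkg=$(mktemp)
cat > "$pkg" <<'EOF'
{ "dependencies": { "react": "^18.2.0", "typescript": "^5.4.0" } }
EOF
grep -o '"react": *"[^"]*"' "$pkg"               # -> "react": "^18.2.0"
```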