Review existing plans and collect structured feedback for improvements
Facilitates a structured review of implementation plans and systematically collects actionable feedback.
**Installation**: `/plugin marketplace add nstrayer/claude-planning`, then `/plugin install bootandshoe@bootandshoe-claude-planning`

**Model**: opus

You are tasked with facilitating a structured review of an existing implementation plan. Your goal is to guide the user through a comprehensive review process and collect actionable feedback.
When this command is invoked:
Check if a plan file was provided:
If no plan file was provided, respond with:
I'll help you review an implementation plan and collect structured feedback.
Which plan would you like to review? Please provide the path (e.g., `thoughts/shared/plans/2026-01-07-feature.md`).
Tip: List recent plans with `ls -lt thoughts/shared/plans/ | head`
After reading the plan, present a structured summary:
## Plan Summary: [Title]
**Goal**: [1-sentence summary of what the plan accomplishes]
**Phases**:
1. [Phase 1 name] - [brief description]
2. [Phase 2 name] - [brief description]
...
**Key Technical Decisions**:
- [Decision 1]
- [Decision 2]
**Success Criteria Overview**:
- Automated: [count] checks
- Manual: [count] verification steps
Ready to begin the review? I'll walk through each section and collect your feedback.
Guide the user through reviewing each major section:
## Scope Review
**Currently IN scope**:
- [Item 1]
- [Item 2]
**Currently OUT of scope**:
- [Item 1]
- [Item 2]
Questions:
- Is anything missing from the scope?
- Should anything be removed or deferred?
- Are the boundaries clear?
Provide feedback using <feedback> tags or respond with "looks good" to continue.
## Technical Approach Review
**Proposed approach**: [Summary]
**Key patterns/conventions being followed**:
- [Pattern 1] from [file:line]
- [Pattern 2] from [file:line]
Questions:
- Does this approach align with how you want the system to evolve?
- Are there alternative approaches worth considering?
- Any concerns about maintainability or complexity?
Provide feedback using <feedback> tags or respond with "looks good" to continue.
For each phase:
## Phase [N]: [Name] Review
**Changes**:
- [Change 1]
- [Change 2]
**Dependencies**: [What this phase depends on]
**Success Criteria**:
- Automated: [List]
- Manual: [List]
Questions:
- Is this phase properly scoped?
- Are the success criteria measurable and complete?
- Any missing edge cases?
Provide feedback using <feedback> tags or respond with "looks good" to continue.
## Risk Assessment
Based on my analysis, potential risks include:
- **[Risk 1]**: [Description and mitigation in plan]
- **[Risk 2]**: [Description and mitigation in plan]
Questions:
- Are there risks I haven't identified?
- Are the mitigations adequate?
- Should any risks block implementation?
Provide feedback using <feedback> tags or respond with "looks good" to continue.
After completing the review:
## Review Summary
**Feedback Collected**:
### Scope
- [Feedback item 1]
- [Feedback item 2]
### Technical Approach
- [Feedback item 1]
### Phase-Specific
- Phase 1: [Feedback]
- Phase 2: [Feedback]
### Risks
- [Feedback item 1]
**Recommended Actions**:
1. [Action based on feedback]
2. [Action based on feedback]
Would you like me to:
1. Update the plan with this feedback now (runs /iterate_plan)
2. Save this feedback for later review
3. Discuss specific items in more detail
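Option 2 writes the collected feedback to a dated file under `thoughts/shared/reviews/`. A minimal sketch of how that path is constructed, assuming the date-prefix convention shown in this command (the `review_path` helper name is ours, not part of the command):

```python
from datetime import date
from pathlib import Path

def review_path(plan_name: str, base: str = "thoughts/shared/reviews") -> Path:
    # YYYY-MM-DD prefix matches the plan-file naming convention,
    # e.g. thoughts/shared/reviews/2026-01-07-feature-review.md
    return Path(base) / f"{date.today():%Y-%m-%d}-{plan_name}-review.md"
```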
If the user chooses to update: pass the collected feedback to `/iterate_plan`.

If the user chooses to save: write the review summary to `thoughts/shared/reviews/YYYY-MM-DD-plan-name-review.md`.

**`<feedback>` tag usage**

Throughout the review, watch for `<feedback>` tags:
<feedback>
Phase 2 success criteria need a specific performance benchmark
</feedback>
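When compiling the final Review Summary, the tag bodies can be pulled out of the conversation mechanically. A minimal sketch, assuming the tag format shown above (the helper name and regex approach are ours, not prescribed by the command):

```python
import re

def extract_feedback(transcript: str) -> list[str]:
    # Capture everything between each <feedback>...</feedback> pair,
    # non-greedily, across newlines; strip surrounding whitespace.
    return [m.strip() for m in re.findall(r"<feedback>(.*?)</feedback>", transcript, re.DOTALL)]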