From `sdd`: validate a specification document against the codebase and best practices.

Install: `npx claudepluginhub mkelk/claude-marketplace --plugin sdd` (argument hint: `spec-file-path`)

## Context

- Today's date: !`date +%Y-%m-%d`
- Spec file: $ARGUMENTS

## Your Task

You are validating a specification document to ensure it's well-defined and implementable. Follow these steps:

### Step 1: Locate the Spec File

Specs live in subdirectories of `/docs/projects/` following the pattern:

`/docs/projects/<date>-<project-name>/<date>-<project-name>-spec.md`

If no spec file path was provided (`$ARGUMENTS` is empty):

1. List available spec directories in `/docs/projects/` (folders matching `YYYY-MM-DD-*/`)
2. Look for `*-spec.md` files within those directories
3. Ask the user: "Which specification would you like me to validate?" [list the available specs, or provide a path]
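The spec-discovery logic above can be sketched in shell. `find_specs` is a hypothetical helper name introduced here for illustration; the directory layout is the one described above:

```shell
# find_specs [root] -- list spec files matching the /docs/projects/ layout.
# Hypothetical helper; defaults to docs/projects relative to the repo root.
find_specs() {
  root="${1:-docs/projects}"
  # Spec files sit exactly one directory below the root: <date>-<name>/<...>-spec.md
  find "$root" -mindepth 2 -maxdepth 2 -type f -name '*-spec.md' 2>/dev/null | sort
}
```

Calling `find_specs` with no argument prints one spec path per line, which can be shown to the user when `$ARGUMENTS` is empty. Note that `-mindepth`/`-maxdepth` are GNU/BSD `find` extensions, not strict POSIX.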
### Step 2: Analyze the Spec

Read the specification file thoroughly. Identify the files, APIs, and naming conventions it references, and the technology choices it assumes.

### Step 3: Cross-Reference the Codebase

Explore the relevant parts of the codebase to check for consistency with what the spec describes.
### Step 4: Validate the Testing Strategy

This is critical. Before implementation begins, the spec must define how the changes will be verified. Review the project's existing test infrastructure by checking `/docs/current/test.md`, `package.json` scripts, and existing test files.
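As one way to surface the `package.json` scripts mentioned above without extra tooling, a small helper can filter them by name (the helper name and the name filter are assumptions, not part of the command):

```shell
# list_test_scripts -- print npm scripts whose names look test-related.
# Hypothetical helper; assumes a package.json in the current directory,
# and uses a name heuristic (test/lint/typecheck) rather than anything official.
list_test_scripts() {
  node -e '
    const scripts = require("./package.json").scripts || {};
    for (const [name, cmd] of Object.entries(scripts)) {
      if (/test|lint|typecheck/.test(name)) console.log(name + ": " + cmd);
    }
  '
}
```

Run from the repo root, this prints entries such as `test: vitest`, giving a quick view of the automated checks the spec's testing strategy can lean on.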
Evaluate whether the spec adequately addresses testing. The spec does NOT need to list every individual test case. It DOES need to describe the testing approach for each major feature area, which verification is automated vs. manual, and how edge cases and error scenarios are covered.
If the testing strategy is weak or missing, challenge the user:

> The spec lacks a clear testing strategy. Before implementation, we need to define:
>
> - How will [feature X] be tested?
> - What manual verification is needed for [feature Y]?
> - Are there edge cases that need specific test coverage?

A spec without a testing strategy leads to untested code or ad-hoc testing that misses important scenarios.
### Step 5: Verify Environment Health

Before marking a spec as ready, verify the codebase is in a healthy state: check `/docs/current/` for environment validation commands and run them. This ensures implementation won't be blocked by pre-existing issues.
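A sketch of this pre-flight check as a small runner; the actual commands should come from `/docs/current/`, so the script names shown in the usage note below are placeholders:

```shell
# run_health_checks CMD... -- run each validation command, stop on first failure.
# The commands themselves should be the ones documented in /docs/current/.
run_health_checks() {
  for cmd in "$@"; do
    echo "checking: $cmd"
    if ! sh -c "$cmd"; then
      echo "BLOCKED: '$cmd' failed -- fix before implementing the spec"
      return 1
    fi
  done
  echo "environment healthy"
}
```

Usage might look like `run_health_checks "npm run typecheck" "npm test"` (both script names are assumptions); a non-zero return means the spec should be marked BLOCKED rather than READY.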
### Step 6: Identify Gaps

Identify any gaps that could cause implementation uncertainty, including missing or outdated documentation in `/docs/current/`.

### Step 7: Present the Report

Present a structured validation report:
## Spec Validation Report: <spec-name>
**Validated:** <today's date>
**Status:** [READY | NEEDS REVISION | BLOCKED]
### Summary
[1-2 sentence overall assessment]
### Consistency Check
- [ ] Referenced files exist
- [ ] APIs/interfaces are accurate
- [ ] Naming conventions match codebase
### Technology Alignment
- [ ] Uses existing tech stack appropriately
- [ ] No conflicts with current choices
- [ ] Follows established patterns
### Environment Validation
- [ ] `/docs/current/` contains validation commands
- [ ] Validation commands documented in this report (or flagged as missing)
### Testing Strategy
- [ ] Spec defines testing approach for each major feature area
- [ ] Clear distinction between automated vs. manual testing needs
- [ ] Edge cases and error scenarios addressed
- [ ] Testing tools/frameworks identified (using project's existing infrastructure)
**Testing Assessment:** [ADEQUATE | NEEDS WORK | MISSING]
[If not adequate, list what's missing and challenge the user to define it]
### Issues Found
#### Critical (Must Fix)
[List issues that would block implementation]
#### Warnings (Should Address)
[List issues that could cause problems]
#### Suggestions (Nice to Have)
[List improvements that would strengthen the spec]
### Missing Information
[List any gaps that need clarification before implementation]
### Recommendations
[Specific next steps to make the spec implementation-ready]
After presenting the report, ask:
Would you like me to:
- Update the spec with fixes for the issues found?
- Explore any specific concern in more detail?
- Create follow-up tasks for unresolved items?