Test a story or feature to verify the implementation works correctly
Verifies implementations work correctly through automated and manual testing.
Install:
/plugin marketplace add eLafo/hefesto
/plugin install elafo-hefesto-2@eLafo/hefesto

Usage: dev:test <path> | story <path> | feature <id>

Router command that verifies implementations work correctly.
Key distinction: Testing verifies the software works correctly (Dev concern). Validation verifies it meets requirements (PO concern).
| Input | Strategy |
|---|---|
| story <path> | Story Testing (unit/component) |
| feature <id> | Feature Testing (integration) |
| <path> (auto-detect) | Detect from path pattern |
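For the auto-detect row, the routing decision can be made from the path alone. A minimal TypeScript sketch, assuming feature work lives under .hefesto/features/{id}/ (as used later in this command) and anything else is treated as a story path; the helper name and conventions are illustrative, not part of the plugin:

```typescript
// Hypothetical helper: classify a dev:test input as a story or feature target.
type TestTarget =
  | { kind: "story"; path: string }
  | { kind: "feature"; id: string };

function detectTarget(input: string): TestTarget {
  // Feature paths are assumed to follow .hefesto/features/{id}/...
  const feature = input.match(/\.hefesto\/features\/([^/]+)/);
  if (feature) {
    return { kind: "feature", id: feature[1] };
  }
  // Any other path falls back to story testing.
  return { kind: "story", path: input };
}

// detectTarget(".hefesto/features/042/") → { kind: "feature", id: "042" }
// detectTarget("src/stories/login/")    → { kind: "story", path: "src/stories/login/" }
```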
Trigger: dev:test story <path> or auto-detected story path
Prerequisite: workflow.coded → true

Detected Testing Setup:
- Framework: Jest / Pytest / Go test
- Runner: npm test / pytest / go test
- Coverage: Available
- E2E: Playwright detected
npm test -- --coverage
# or equivalent
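If Jest is the detected framework, the coverage run can also enforce a floor so the suite fails when coverage drops too low. A minimal config sketch; the 80% figures are assumptions, not a requirement of this command:

```typescript
// jest.config.ts: illustrative coverage gate for `npm test -- --coverage`.
import type { Config } from "jest";

const config: Config = {
  collectCoverage: true,
  coverageThreshold: {
    // Assumed thresholds; tune to the project's own policy.
    global: { branches: 80, functions: 80, lines: 80, statements: 80 },
  },
};

export default config;
```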
Analyze: failing or skipped tests, error output, and coverage gaps against the story's scope.
Check quality: tests exercise edge cases and error handling, not only the happy path (see the sketch below).
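As an illustration of what the quality check looks for, a small Jest suite covering the happy path, an edge case, and an error case; the validateEmail module and its behavior are hypothetical:

```typescript
// Illustrative only: the three categories the quality check expects to see.
import { validateEmail } from "./validateEmail"; // hypothetical module under test

describe("validateEmail", () => {
  it("accepts a well-formed address", () => {
    expect(validateEmail("user@example.com")).toBe(true);
  });

  it("rejects an empty string (edge case)", () => {
    expect(validateEmail("")).toBe(false);
  });

  it("throws on non-string input (error handling)", () => {
    expect(() => validateEmail(null as unknown as string)).toThrow();
  });
});
```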
Create {story_path}/test-checklist.md:
# Test Checklist: {Story Title}
## Automated Tests
| Metric | Value |
|--------|-------|
| Total | 24 |
| Passing | 24 |
| Coverage | 87% |
## Manual Test Cases
### TC-001: {Test Name}
**Preconditions**: {setup}
**Steps**:
1. {action}
2. {action}
**Expected**: {result}
**Result**: [ ] Pass / [ ] Fail
### TC-002: {Error Case}
...
## Exploratory Notes
{observations}
## Issues Found
| ID | Description | Severity |
|----|-------------|----------|
| BUG-001 | ... | High |
## Summary
- [ ] All automated tests passing
- [ ] All manual cases executed
- [ ] No critical bugs
- [ ] Ready for dev documentation
If all tests pass:
Update: workflow.tested → true

Create {story_path}/test-report.md:
# Test Report: {Story Title}
## Summary
- **Date**: {timestamp}
- **Result**: PASS / FAIL
## Automated
- Tests: 24/24 passing
- Coverage: 87%
## Manual
- Cases: 5/5 passed
## Issues
None / {list}
## Recommendation
Ready for documentation / Needs fixes
✅ Story Testing Complete
Story: {title}
Automated Tests:
✅ 24/24 passing
✅ Coverage: 87%
Manual Tests:
✅ 5/5 passed
No issues found.
→ Next (DEV): dev:document story {story_path}
Trigger: dev:test feature <id> or auto-detected feature path
Prerequisite: workflow.dev_documented → true

Based on the stories' dev documentation, run the feature's integration suite:
npm run test:integration
# or equivalent
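If the project uses Jest for integration tests as well, `npm run test:integration` is often wired to a separate config that only matches integration specs. A sketch; the file-naming convention and timeout are assumptions:

```typescript
// jest.integration.config.ts (illustrative); package.json would map
// "test:integration": "jest --config jest.integration.config.ts".
import type { Config } from "jest";

const config: Config = {
  // Only pick up specs following an *.integration.test.ts naming convention.
  testMatch: ["**/*.integration.test.ts"],
  // Integration flows cross process/network boundaries, so allow more time.
  testTimeout: 30000,
};

export default config;
```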
Create .hefesto/features/{id}/integration-tests.md:
# Integration Tests: {Feature Title}
## User Flow Tests
### Flow 1: {Name}
**Stories**: 001, 002, 003
**Steps**:
1. [ ] Complete story 001 action
2. [ ] Transition to story 002
3. [ ] Complete story 002 action
**Result**: [ ] Pass / [ ] Fail
### Flow 2: {Error Flow}
...
## Integration Point Tests
### IP-1: {Story A} → {Story B}
**Testing**: Data passing
**Steps**: ...
**Result**: [ ] Pass / [ ] Fail
## Performance Tests
### Perf-1: {Scenario}
**Threshold**: < 500ms
**Actual**: {measured}
**Result**: [ ] Pass / [ ] Fail
## Issues
| ID | Description | Stories | Severity |
|----|-------------|---------|----------|
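Performance checks like Perf-1 can be automated instead of measured by hand. A minimal Playwright sketch; the /api/search endpoint is hypothetical and the 500 ms figure mirrors the threshold in the template above:

```typescript
import { test, expect } from "@playwright/test";

// Illustrative only: measure one request's latency against a 500 ms budget.
test("search responds under 500 ms", async ({ request }) => {
  const start = Date.now();
  const response = await request.get("/api/search?q=example"); // hypothetical route
  const elapsed = Date.now() - start;

  expect(response.ok()).toBe(true);
  expect(elapsed).toBeLessThan(500);
});
```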
Create .hefesto/features/{id}/integration-report.md
Update:
workflow.feature_tested → true

✅ Feature Integration Testing Complete
Feature: {title}
Results:
User Flows: ✅ 4/4 passed
Integration: ✅ 6/6 passed
Performance: ✅ 2/2 passed
Automated: 15/15 passing
No issues found.
→ Next (PO): po:validate feature {id}
| Aspect | Testing (Dev) | Validation (PO) |
|---|---|---|
| Question | Does it work? | Does it meet requirements? |
| Focus | Correctness | Completeness |
| Methods | Automated + manual tests | AC verification |
| Output | Test reports | Validation reports |