# doc-freshness
Detect documentation drift, stale references, and cross-document inconsistencies in any project. Scans for code-doc drift (API/function changes not reflected in docs), cross-doc drift (conflicting information across documents), and stale references (broken links, deleted files, outdated versions). Use when checking "doc freshness", "stale docs", "documentation drift", "broken links", "outdated documentation", "doc accuracy", "docs out of date", "doc audit", "doc health", or "verify documentation".
From roadmap: `npx claudepluginhub joaquimscosta/arkhe-claude-plugins --plugin roadmap`

This skill is limited to using the following bundled files:

- EXAMPLES.md
- TROUBLESHOOTING.md
- WORKFLOW.md
- scripts/link_checker.py
- scripts/scan_freshness.py
- scripts/shared.py
- scripts/version_checker.py

# Documentation Freshness
Detect documentation drift across any project. Reports findings without auto-fixing.
## Context Discovery

### Priority 1: Configuration

Read `.arkhe.yaml` from the project root. Extract the `doc-freshness:` section for custom patterns, exclusions, and doc-code mappings.

Extract the `roadmap:` section for `output_dir` (used by report mode; default: `arkhe/roadmap`).
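A configuration using these sections might look like the following; the key names under `doc-freshness:` are illustrative assumptions, not a documented schema:

```yaml
# .arkhe.yaml — hypothetical example; keys under doc-freshness are assumptions
doc-freshness:
  include: ["docs/**/*.md", "README.md"]   # custom doc patterns (assumed key)
  exclude: ["docs/archive/**"]             # exclusions (assumed key)
  mappings:                                # doc-code mappings (assumed key)
    docs/api.md: src/api/
roadmap:
  output_dir: arkhe/roadmap                # used by report mode (default shown)
```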
### Priority 2: Project Identity

Read CLAUDE.md and README.md to understand project structure, tech stack, and conventions.

### Priority 3: Documentation Inventory
Run the scanner to discover all documentation files and perform mechanical checks:

```sh
python3 ${CLAUDE_SKILL_DIR}/scripts/scan_freshness.py <project-root>
```

Use `--links-only` for fast mode (links and file references only, no git staleness).
## Arguments

Parse from `$ARGUMENTS`:
| Mode | Description |
|---|---|
| `scan` | Full freshness analysis (all three drift types) |
| `check <path>` | Focused analysis on one file or directory |
| `links` | Broken links and stale references only (script-driven, fast) |
| `drift <path>` | Code-doc drift for a specific doc or doc-code pair |
| `cross-doc` | Cross-document consistency check |
| `report` | Persist a structured freshness report to `{output_dir}/freshness/` |
| (none) | Full scan (same as `scan`) |
## Mode Execution

### scan (default)

- Run `scan_freshness.py` for mechanical checks (links, versions, git staleness)
- Present script findings (broken links table, version mismatches, staleness scores)
- For stale/very_stale docs: use Grep/Read to check code-doc alignment on key references
- For docs covering the same topic: cross-check for consistency
- Produce the freshness report
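The git-staleness check can be sketched as age-based bucketing on a doc's last commit date. The 7- and 30-day boundaries follow the severity table below; the 90-day `very_stale` cutoff is an assumption:

```python
from datetime import date

def staleness_bucket(last_modified: date, today: date) -> str:
    """Bucket a doc by days since a git commit last touched it."""
    age = (today - last_modified).days
    if age < 7:
        return "fresh"
    if age <= 30:
        return "aging"       # INFO severity (7-30 days)
    if age <= 90:
        return "stale"       # WARNING severity (>30 days)
    return "very_stale"      # 90-day cutoff is an assumption
```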
### links

- Run `scan_freshness.py --links-only`
- Present the broken links table directly from the JSON output
- Group findings by severity: broken links first, then file_ref warnings
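A minimal version of the broken-link pass: extract markdown link targets and verify that relative file targets exist on disk. External URLs and pure anchors are skipped here; the real `link_checker.py` may do more:

```python
import re
from pathlib import Path

# [text](target) — capture targets that are not bare #anchors
LINK_RE = re.compile(r"\[[^\]]*\]\(([^)#][^)]*)\)")

def broken_links(doc_path: Path) -> list[str]:
    """Return relative link targets in a markdown file that do not exist on disk."""
    missing = []
    for target in LINK_RE.findall(doc_path.read_text()):
        if target.startswith(("http://", "https://", "mailto:")):
            continue  # external links need a network check, out of scope here
        if not (doc_path.parent / target.split("#")[0]).exists():
            missing.append(target)
    return missing
```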
### check <path>

- If the path is a file, run the link checker and version checker on that file only
- If the path is a directory, scan all `.md` files within it
- Use Grep to check file path references and function names against the codebase
- Report findings for the targeted scope
### drift <path>
- Read the specified doc
- Extract: function/method names, API endpoints, file paths, config keys, class names
- Grep/Read the corresponding code to verify each reference still exists and is accurate
- Report mismatches with evidence (doc line vs code location)
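Reference extraction for the drift check can be approximated by pulling backticked identifiers out of the doc and checking each against the code. Treating every backticked token as a checkable reference is a simplifying assumption:

```python
import re

# backticked tokens: function names, file paths, config keys
IDENT_RE = re.compile(r"`([A-Za-z_][\w./]*(?:\(\))?)`")

def extract_references(doc_text: str) -> set[str]:
    """Collect backticked identifiers (functions, paths, config keys) from a doc."""
    return set(IDENT_RE.findall(doc_text))

def missing_in_code(refs: set[str], code_text: str) -> set[str]:
    """Return doc references that no longer appear anywhere in the code."""
    return {r for r in refs if r.rstrip("()") not in code_text}
```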
### cross-doc

- Run `scan_freshness.py` to get the doc inventory with headings
- Identify docs with overlapping topics (shared heading keywords)
- Read overlapping docs and compare factual claims
- Flag contradictions (e.g., different version requirements, conflicting setup steps)
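Topic overlap between two docs can be sketched as keyword overlap on their headings (Jaccard similarity over lowercased heading words); the 0.3 threshold and the short-word filter are arbitrary assumptions:

```python
def heading_words(headings: list[str]) -> set[str]:
    """Lowercased word set from a doc's headings, ignoring short filler words."""
    words: set[str] = set()
    for h in headings:
        words.update(w.lower() for w in h.split() if len(w) > 3)
    return words

def topics_overlap(a: list[str], b: list[str], threshold: float = 0.3) -> bool:
    """True when two docs share enough heading keywords to warrant a cross-check."""
    wa, wb = heading_words(a), heading_words(b)
    if not wa or not wb:
        return False
    jaccard = len(wa & wb) / len(wa | wb)
    return jaccard >= threshold
```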
### report

Same as `scan`, but write output to `{output_dir}/freshness/{YYYY-MM-DD}-freshness.md`.
## Severity Levels
| Severity | Meaning |
|---|---|
| CRITICAL | Broken link to deleted file, doc references removed API/function |
| WARNING | Version mismatch, function signature changed, doc stale >30 days |
| INFO | Minor inconsistency, doc aging (7-30 days), cosmetic drift |
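A finding-to-severity mapping consistent with the table above; the finding type names are illustrative assumptions, not the scanner's actual output schema:

```python
SEVERITY = {
    "broken_link": "CRITICAL",       # link to a deleted file
    "removed_api": "CRITICAL",       # doc references a removed API/function
    "version_mismatch": "WARNING",
    "signature_changed": "WARNING",
    "stale_doc": "WARNING",          # stale >30 days
    "aging_doc": "INFO",             # aging 7-30 days
    "minor_inconsistency": "INFO",
}

def severity_of(finding_type: str) -> str:
    """Look up severity for a finding type; unknown types default to INFO (assumption)."""
    return SEVERITY.get(finding_type, "INFO")
```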
## Output Rules
- Evidence-based: every finding backed by file path and line number
- Tabular: summary table first, detailed findings below
- Actionable: each finding includes what needs updating
- Detection only: NEVER auto-fix documentation
## Lane Discipline
- Do NOT update or rewrite documentation — detection only
- Do NOT produce roadmap status, architecture analysis, or user stories
- Do NOT write source code
## References
- WORKFLOW.md — Detection algorithms and convention tables
- EXAMPLES.md — Usage examples for each mode
- TROUBLESHOOTING.md — Common issues and fixes