Audits documentation completeness by inventorying code surface (CLI commands, env vars, endpoints, config keys) across Python/JS/TS/Rust/Go/Ruby/Java/shell projects and mapping to existing docs for prioritized topic-based gap reports. Use after features, pre-release, or missing doc reports.
Install: `npx claudepluginhub nickcrew/claude-cortex`. This skill uses the workspace's default tool permissions.
Determine whether a documentation set covers everything it should by building an inventory of what needs documenting and comparing it to what exists. The output is a prioritized gap report — not new documentation.
Pair with doc-maintenance (structural) and doc-claim-validator (accuracy) to go wider.

| Resource | Purpose | Load when |
|---|---|---|
| `references/coverage-model.md` | Defines what "complete" means per doc type | Always (Phase 1) |
- **Phase 1: Inventory** → Build the "should exist" list from code and config
- **Phase 2: Map** → Match inventory items to existing documentation
- **Phase 3: Classify** → Score each gap by audience impact
- **Phase 4: Report** → Produce the prioritized gap report
Construct a list of everything that should be documented. Use four sources, checking all of them:
Run the bundled inventory script to extract documentable surface area deterministically:
```bash
python3 skills/doc-completeness-audit/scripts/inventory.py --root . --json > inventory.json

# Or human-readable:
python3 skills/doc-completeness-audit/scripts/inventory.py --root .

# Run specific detectors only:
python3 skills/doc-completeness-audit/scripts/inventory.py --root . --detectors env_vars,cli_commands
```
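Downstream tooling can consume the `--json` output directly. The record shape below (a list of objects with `detector`, `name`, and `path` keys) is an assumption for illustration, not the script's documented schema; adjust the key names to match the real output.

```python
import json
from collections import Counter

# Hypothetical output shape: a flat list of records, each tagged with
# the detector that produced it.
inventory = json.loads("""
[
  {"detector": "env_vars", "name": "API_TOKEN", "path": "src/client.py"},
  {"detector": "env_vars", "name": "LOG_LEVEL", "path": "src/log.py"},
  {"detector": "cli_commands", "name": "sync --dry-run", "path": "src/cli.py"}
]
""")

# Count documentable items per detector to size the audit up front.
by_detector = Counter(item["detector"] for item in inventory)
print(by_detector.most_common())  # [('env_vars', 2), ('cli_commands', 1)]
```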
The script scans source files across Python, JavaScript/TypeScript, Rust, Go, Ruby, Java, and shell, extracting six categories:
| Detector | What it extracts |
|---|---|
| `env_vars` | Environment variable references (`os.environ`, `process.env`, `env::var`, etc.) |
| `cli_commands` | CLI commands and flags (argparse, click, clap, cobra, commander) |
| `config_keys` | Configuration key access in config-related files |
| `http_endpoints` | HTTP route definitions (Flask, FastAPI, Express, Actix, Axum, net/http) |
| `public_exports` | Public module exports (`__init__.py`, `export`, `pub fn`, Go capitalized funcs) |
| `error_types` | Custom error/exception class definitions |
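The detectors work by pattern-matching source text. A minimal sketch of the `env_vars` approach, with simplified patterns for three of the ecosystems the real script covers — the bundled `inventory.py` is more thorough, so treat this as an illustration of the technique, not its implementation:

```python
import re

# Simplified per-ecosystem patterns; each captures the variable name.
ENV_PATTERNS = [
    re.compile(r'os\.environ(?:\.get)?\(\s*["\']([A-Z][A-Z0-9_]*)["\']'),  # Python call form
    re.compile(r'os\.environ\[["\']([A-Z][A-Z0-9_]*)["\']\]'),             # Python subscript
    re.compile(r'process\.env\.([A-Z][A-Z0-9_]*)'),                        # JavaScript/TypeScript
    re.compile(r'env::var\(\s*"([A-Z][A-Z0-9_]*)"'),                       # Rust
]

def find_env_vars(source: str) -> set[str]:
    """Return every environment variable name referenced in `source`."""
    found = set()
    for pattern in ENV_PATTERNS:
        found.update(pattern.findall(source))
    return found

sample = '''
token = os.environ.get("API_TOKEN")
level = os.environ["LOG_LEVEL"]
const host = process.env.DB_HOST;
'''
print(sorted(find_env_vars(sample)))  # ['API_TOKEN', 'DB_HOST', 'LOG_LEVEL']
```

Every hit is a documentable item: if an env var is read anywhere in the code, an operator needs to know it exists.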
The script does not catch everything. For signals it misses — event types, webhooks, and callbacks (every event name and payload shape) — dispatch an Explore agent to scan for them. Provide it with the project's primary language and entry points.
Identify features a user interacts with:
Identify what operators and maintainers need:
Check existing docs for promises of documentation that doesn't exist:
Output: A structured inventory list. Each item has:
- `topic` — what needs documenting
- `source` — where the requirement was discovered (code path, config key, user flow)
- `audience` — who needs this (end user, developer, operator)
- `type` — what kind of doc it needs (reference, tutorial, guide, explanation)

For each inventory item, search existing docs:
Search strategy:
Mark each item with one of four states: **Documented** (adequate coverage exists), **Shallow** (mentioned but not usefully covered), **Missing** (no coverage), or **Misplaced** (covered, but somewhere its audience won't look).
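A minimal sketch of this mapping step, assuming the docs have been read into a `{path: text}` dict. The heuristics are illustrative assumptions — a dedicated heading counts as real coverage, a bare mention counts as shallow, silence counts as missing — and Misplaced is omitted because it needs an expected-location model:

```python
import re

def classify_coverage(topic: str, docs: dict[str, str]) -> str:
    """Mark a topic as Documented, Shallow, or Missing against existing docs."""
    # A markdown heading containing the topic signals dedicated coverage.
    heading = re.compile(r"^#{1,6}\s+.*" + re.escape(topic),
                         re.IGNORECASE | re.MULTILINE)
    for text in docs.values():
        if heading.search(text):
            return "Documented"
    # A bare mention anywhere in the prose is shallow coverage at best.
    for text in docs.values():
        if topic.lower() in text.lower():
            return "Shallow"
    return "Missing"

docs = {
    "docs/config.md": "## The API_TOKEN variable\nSet API_TOKEN before running.",
    "docs/usage.md": "Pass --dry-run to preview changes.",
}
print(classify_coverage("API_TOKEN", docs))  # Documented
print(classify_coverage("--dry-run", docs))  # Shallow
print(classify_coverage("LOG_LEVEL", docs))  # Missing
```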
Not all gaps are equal. Score each gap using audience impact:
| Priority | Criteria | Example |
|---|---|---|
| P0 | User cannot accomplish a core task without this | No installation guide, undocumented required config |
| P1 | User can work around it but wastes significant time | CLI flag exists but undocumented, error message without troubleshooting |
| P2 | Missing docs for secondary features or advanced use cases | Plugin API undocumented, advanced config options missing |
| P3 | Missing docs for edge cases or rarely used features | Obscure env var, deprecated feature migration path |
| P4 | Nice to have — explanatory content, design rationale | Architecture decision records, "why" behind defaults |
Apply a multiplier based on audience:
| Audience | Weight | Rationale |
|---|---|---|
| New users / onboarding | 1.5x | First impressions; high abandonment risk |
| Daily users | 1.0x | Core audience |
| Advanced users / contributors | 0.8x | Can read source when docs fail |
| Internal operators | 0.7x | Can ask the team |
With P0–P4 mapped to severities 4–0, a P2 gap for new users (2 × 1.5 = 3.0) outranks a P1 gap for internal operators (3 × 0.7 = 2.1).
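The scoring rule can be sketched in a few lines. The Pn → severity 4 − n mapping is inferred from the worked example above (it is the only mapping that makes both numbers exact), so treat it as an assumption rather than a fixed formula:

```python
# Audience weights from the table above.
AUDIENCE_WEIGHT = {
    "new users": 1.5,
    "daily users": 1.0,
    "advanced users": 0.8,
    "internal operators": 0.7,
}

def gap_score(priority: int, audience: str) -> float:
    """Weighted ranking score for a gap: higher means fix sooner.
    Priority Pn maps to severity 4 - n, so P0 is most severe."""
    return round((4 - priority) * AUDIENCE_WEIGHT[audience], 2)

# The worked example from the text:
print(gap_score(2, "new users"))           # 3.0
print(gap_score(1, "internal operators"))  # 2.1
```

Sorting gaps by this score descending yields the order of the report's gap tables.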
# Documentation Completeness Audit
**Audit date:** YYYY-MM-DD
**Scope:** [directories or doc sets audited]
**Inventory items:** N total
**Coverage:** N documented / N shallow / N missing / N misplaced
---
## Summary
[2-3 sentences: overall completeness assessment]
Coverage by audience:
| Audience | Documented | Shallow | Missing | Coverage % |
|----------|-----------|---------|---------|------------|
| New users | N | N | N | N% |
| Daily users | N | N | N | N% |
| Contributors | N | N | N | N% |
| Operators | N | N | N | N% |
---
## P0 Gaps — Blocking
| # | Topic | Audience | Source | Current State | What's Needed |
|---|-------|----------|--------|---------------|---------------|
| 1 | [topic] | [who] | [code path] | Missing | [what to write] |
## P1 Gaps — High Impact
| # | Topic | Audience | Source | Current State | What's Needed |
|---|-------|----------|--------|---------------|---------------|
## P2 Gaps — Moderate Impact
| # | Topic | Audience | Source | Current State | What's Needed |
|---|-------|----------|--------|---------------|---------------|
## P3-P4 Gaps — Low Priority
| # | Topic | Audience | Priority | Current State |
|---|-------|----------|----------|---------------|
---
## Shallow Coverage Details
For each Shallow item, explain what's insufficient:
### [Topic]
**Current doc:** [path and section]
**Problem:** [what's missing — examples, edge cases, complete reference, etc.]
**Recommended action:** [specific improvement]
---
## Misplaced Documentation
| Topic | Current Location | Recommended Location | Why |
|-------|-----------------|---------------------|-----|
---
## Well-Documented (No Action Needed)
[List topics with adequate coverage, grouped by audience, so the report
shows the full picture and not just the gaps]
This skill fits into the documentation health pipeline:
- **doc-maintenance** → Structural health (links, orphans, folders)
- **doc-claim-validator** → Semantic accuracy (do claims match code?)
- **doc-completeness-audit** → Topic coverage (is everything documented?)
- **doc-quality-review** → Prose quality (is it well-written?)
- **doc-architecture-review** → Information architecture (is it findable?)
Route gap remediation to the appropriate producer:
- **reference-documentation**
- **tutorial-design**
- **documentation-production**

Skip archived docs (`docs/archive/`) — they are historical.

Bundled resources:

- `scripts/inventory.py` — Extract documentable surface area from any codebase (env vars, CLI commands, config keys, HTTP endpoints, public exports, error types)
- `references/coverage-model.md` — Defines coverage expectations per doc type and audience