Triages documentation sets from Confluence spaces, local folders, Google Docs, markdown files, or Notion exports to identify overlaps, staleness, consolidation opportunities, and backlink candidates.
Install: `npx claudepluginhub markacianfrani/armor --plugin armor`
This skill uses the workspace's default tool permissions.
You are performing a documentation triage — a systematic review of a collection of documents to surface overlaps, staleness, consolidation opportunities, and backlinking potential. The goal is to help the user understand the health of their documentation and give them a clear action plan.
First, figure out what you're working with. The user might point you at a Confluence space, a local folder, Google Docs, loose markdown files, or a Notion export.
If the source isn't clear, ask. Once you know the source, enumerate all the documents. For large sets (20+ docs), tell the user how many you found and confirm before proceeding — scanning everything takes time and tokens.
Spawn subagents to review the documents in parallel. Each subagent should review a batch of documents (aim for 3-5 docs per agent, adjust based on document length). Give each subagent these instructions:
You are analyzing documentation as part of a triage process. For each document, produce a
structured summary in this exact format:
## [Document Title]
- **Source**: [path, URL, or page ID]
- **Primary Topic**: [1-2 sentence description of what this doc is about]
- **Key Concepts**: [comma-separated list of the main ideas, tools, processes, or terms covered]
- **Freshness Signals**: [any dates mentioned, tool versions referenced, deprecated patterns,
or other indicators of how current the content is. Note anything that looks outdated.]
- **Scope & Depth**: [is this a broad overview or a deep-dive? How many subtopics does it cover?]
- **Atomic Ideas**: [list any genuinely independent procedures or concepts bundled into this
doc that could stand alone as their own page AND that other docs might want to link to
individually. Don't list sections of a single narrative — only flag pieces that serve
different audiences or use cases and have enough substance to justify a standalone page.]
- **Natural Link Targets**: [what other topics does this doc reference, depend on, or assume
the reader already knows? These are candidates for backlinks.]
Be thorough but concise. The summaries will be compared across all docs to find patterns.
Save each subagent's output. Wait for all of them to finish before moving on.
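The batching step above can be sketched as follows (the batch size of 4 is an assumption that keeps batches in the suggested 3-5 range — shrink it for very long documents):

```python
def batch(docs, size=4):
    """Split the enumerated doc list into batches, one per subagent.

    size=4 is an assumption that keeps each batch in the suggested
    3-5 document range; adjust based on document length.
    """
    return [docs[i:i + size] for i in range(0, len(docs), size)]
```

Each resulting batch becomes one subagent's assignment, paired with the instruction template above.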
Once all document summaries are collected, analyze them together. You're looking for four things:
Compare the Primary Topic and Key Concepts across all documents. Flag any pair (or group) of docs that cover the same primary topic or share most of their key concepts.
For each overlap found, make a specific recommendation: should these docs be merged? Should one be retired in favor of the other? Should they be restructured into a parent page with children? Explain your reasoning — don't just flag the overlap, explain what to do about it.
Review the Freshness Signals from all summaries. Flag documents that show signs of being outdated — old dates, references to deprecated tool versions, or patterns the content treats as current that have since been superseded.
Rank staleness concerns by severity: "probably outdated and actively misleading" is more urgent than "could use a refresh but still mostly accurate."
This is about documentation health. A well-linked documentation system lets readers navigate naturally between related topics. Look for docs that reference, depend on, or assume knowledge of topics covered elsewhere in the set — the Natural Link Targets from the summaries — but don't currently link to them.
Review the Atomic Ideas from the summaries. Also apply the Diátaxis framework (see the
diataxis skill) to check whether any document is trying to serve multiple purposes — e.g.,
a tutorial that keeps veering into exhaustive reference, or a how-to guide overloaded with
explanation. Docs straddling Diátaxis categories are strong split candidates.
For atomicity specifically, be careful — not every long doc needs splitting. The litmus test is: would another document realistically need to link to just this one piece independently?
A doc that tells a single narrative across multiple sections (like an RFC with background, analysis, and recommendation) should stay as one page — those sections don't make sense in isolation. Splitting them would just fragment a coherent story.
On the other hand, a doc that bundles genuinely independent procedures or concepts — like "Developer Onboarding" covering macOS setup, Windows setup, IDE config, and database setup — is a strong split candidate because other docs (like a troubleshooting guide) might need to link to "macOS setup" without dragging in everything else.
Flag docs where genuinely independent procedures or concepts are bundled together, or where a single doc straddles multiple Diátaxis categories.
Don't recommend splitting docs that are simply long but cohesive. A thorough analysis, a detailed proposal, or an experiment write-up with context → method → results is fine as a single page. Length alone is not a reason to split.
Present your findings directly in the conversation. Organize them by priority — lead with the highest-impact recommendations. For each finding, state what you found, which documents are involved, and the specific action you recommend.
Keep it actionable. The user should be able to take this output and start making changes immediately. Don't pad the findings with caveats or preamble — get straight to the substance.
For small doc sets (under 10 docs), you may not even need subagents — just read them all yourself and do the analysis inline. Use your judgment.
For medium sets (10-30 docs), use the subagent approach described above.
For large sets (30+ docs), consider a two-pass approach: first scan titles and any available metadata to cluster docs by topic area, then do detailed analysis within each cluster. This avoids comparing every doc against every other doc.
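The first pass of that two-pass approach can be sketched like this, assuming titles are the only metadata available (any cheap signal — folder path, labels, space section — works just as well):

```python
from collections import defaultdict

def cluster_titles(titles):
    """First pass: group docs by the leading word of their title so the
    detailed second pass only compares docs within the same cluster.

    Keying on the first title word is an assumption; substitute whatever
    metadata the source actually provides.
    """
    clusters = defaultdict(list)
    for title in titles:
        words = title.split()
        key = words[0].lower() if words else ""
        clusters[key].append(title)
    return dict(clusters)
```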
Confluence: Use searchConfluenceUsingCql or getPagesInConfluenceSpace to enumerate pages.
Use getConfluencePage with contentFormat: "markdown" to read content. Page hierarchy (parent/child)
is useful context — note it in summaries.
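A CQL query for the enumeration step might look like this — a sketch, where `DOCS` is a placeholder space key:

```
type = page AND space = "DOCS" ORDER BY lastmodified ASC
```

Sorting ascending by last-modified surfaces the oldest pages first, which is handy input for the staleness review.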
Local files: Use Glob to find all .md, .txt, .rst, or .html files in the specified
directory. Use Read to get contents. File path structure often implies topic organization.
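Outside of the Glob tool, the same enumeration can be sketched in plain shell (`./docs` is a placeholder path):

```shell
# Sketch of the local-file enumeration described above.
# DOCS_ROOT is a placeholder; point it at the real documentation folder.
DOCS_ROOT="${DOCS_ROOT:-./docs}"
if [ -d "$DOCS_ROOT" ]; then
  find "$DOCS_ROOT" -type f \
    \( -name '*.md' -o -name '*.txt' -o -name '*.rst' -o -name '*.html' \) \
    | sort
fi
```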
Google Drive: Use the Google Drive search tool to find documents. Fetch contents with the document fetch tool.
Mixed sources: Sometimes users have docs spread across multiple systems. Handle each source with its appropriate tools and merge the summaries before synthesis.