theclauu

Use to query, ingest, compile, promote, or lint the data-hive-mind wiki (Artemis-xyz/data-hive-mind). Ask questions against durable team knowledge; file discovered learnings; run health checks.

Install:

npx claudepluginhub artemis-xyz/theclauu --plugin theclauu

This skill uses the workspace's default tool permissions.
Single skill that owns reading + maintaining the team wiki at Artemis-xyz/data-hive-mind.
The wiki is durable team knowledge — architecture, conventions, decisions
(ADRs), postmortems, playbooks, entity pages. It sits above Notion
(session-scope) and Linear (ticket-scope). See wiki/decisions/2026-04-22-four-tier-knowledge-model.md
in the wiki for the full tier design.
Five modes: query, ingest, compile, promote, lint. Pick by intent:
========================================
Mode: query
Status: implemented (Slice 1b, TheClauu v0.3.26+)
Answer a question using durable team knowledge already captured in the wiki.
This is the default mode if the user says /theclauu:dhm <question> with no
leading mode keyword.
The SessionStart hook injects wiki content into the agent's context as part
of <theclauu-hive-mind-context>. Before fetching anything, look in that JSON
for wiki.relevant_pages[*].content. Pages always present:
- INDEX.md (table of contents)
- CLAUDE.md (schema + maintenance rules)
- wiki/entities/<current-repo>.md (if it exists)

If the cache is missing or stale, fall through to step 2.
For pages not in the cache, use gh api:
gh api /repos/Artemis-xyz/data-hive-mind/contents/wiki/<path>.md \
--jq '.content' | base64 -d
Or if the user has a local clone at a known path (e.g.
~/DuneAnalytics/dex_trades/data-hive-mind/), read from there directly — no
network round-trip.
Cite the wiki pages used, e.g. wiki/conventions/dbt-scd2.md. If the knowledge isn't in the wiki yet, suggest: "/theclauu:dhm ingest once Slice 4 lands, or write the ADR by hand in the meantime."

Output format:
- Default: inline markdown answer with citations.
- If the user asks for a different format: write the answer to /tmp/dhm-query-<timestamp>.md and tell the user the path so they can open it in their editor; for slides, write a .md with Marp frontmatter to /tmp/dhm-query-<timestamp>.md.

If the user's question surfaces a durable insight that belongs in the wiki (a pattern, decision, playbook), suggest filing it, but don't do it without confirmation. The wiki is a curated surface, not a dumping ground.
========================================
Mode: ingest
Status: not implemented yet (Slice 4). Do not attempt via this skill. For now, instruct the user to:
- Drop the raw file at data-hive-mind/raw/<type>/YYYY-MM-DD-<slug>.<ext>.
- Commit directly to main (per the feedback_data_hive_mind_direct_to_main.md feedback; the repo is solo-owned for now).

raw/ subfolders:
- postmortems/ — incident transcripts, Slack on-call exports
- linear-exports/ — closed epic transcripts worth preserving
- notion-promoted/ — Notion pages queued for wiki compile
- pr-discussions/ — landmark PR comment exports
- articles/ — external reading (papers, blog posts)
========================================
Mode: compile
Status: not implemented yet (Slice 4).
Planned behavior: read raw/ files that haven't been compiled yet, extract
patterns / entities / decisions, write or update wiki/ pages, maintain
backlinks + INDEX.md.
========================================
Mode: promote
Status: implemented (Slice 4, TheClauu v0.4.2+)
Scan a source — default: the current branch's open PR; fallback: an explicit
PR URL, branch name, or a Notion page URL — for wiki-worthy sections and
promote them to pages in Artemis-xyz/data-hive-mind.
Helper module: plugins/theclauu/bin/promote_to_wiki.py
Default source: current branch's open PR
# Detect current branch
git branch --show-current
# Find open PR for the branch
gh pr view --json number,title,body,url,labels
If gh pr view fails (no PR open, or detached HEAD), fall through to the
explicit-arg path.
Explicit args (in priority order):
- --pr <URL-or-number> — fetch via fetch_pr_content(repo, pr_number) in promote_to_wiki.py.
- --branch <name> — run gh pr list --head <name> --json number, then fetch that PR.
- --notion <URL> — extract the Notion page ID from the URL (last hex segment), fetch full page text via bin/sources/notion.py, use that as the source text.
- Piped input: read sys.stdin directly.

Compose the source text as: PR title + PR body + (if any linked issue bodies are reachable, append them).
The wiki repo is a local clone of Artemis-xyz/data-hive-mind. Ask once and
cache the answer in ~/.theclauu/config.json under "wiki_repo_path".
AskUserQuestion: "Where is your local clone of data-hive-mind?
(Default: ../data-hive-mind/ relative to this repo — press Enter to accept)"
Use the cached value if one exists; otherwise fall back to ../data-hive-mind/ when
the user presses Enter. Expand ~ and relative paths before passing to the helper.
Skip asking if --wiki-path <path> was passed as an argument.
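A minimal sketch of the ask-once-and-cache behavior, assuming only the "wiki_repo_path" key from above (the function name is hypothetical):

```python
import json
from pathlib import Path

CONFIG = Path("~/.theclauu/config.json").expanduser()

def resolve_wiki_repo_path(answer: str = "", config: Path = CONFIG) -> Path:
    """Return the cached wiki repo path, else cache the user's answer.

    An empty answer (the user pressed Enter) falls back to ../data-hive-mind/.
    """
    data = json.loads(config.read_text()) if config.exists() else {}
    if "wiki_repo_path" in data:
        return Path(data["wiki_repo_path"]).expanduser().resolve()
    path = Path(answer or "../data-hive-mind/").expanduser().resolve()
    data["wiki_repo_path"] = str(path)
    config.parent.mkdir(parents=True, exist_ok=True)
    config.write_text(json.dumps(data, indent=2))
    return path
```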
Call scan_text(source_text) from bin/promote_to_wiki.py. This returns a
list[WikiCandidate] where each candidate carries:
| Field | Description |
|---|---|
| category | decisions / playbooks / conventions / incidents / entities / concepts |
| suggested_path | e.g. wiki/decisions/2026-04-22-iceberg.md |
| title | inferred from the first heading in the matched section |
| body_excerpt | first ~500 chars of the matched section |
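As a sketch, the candidate record might be a dataclass like this (the real definition in promote_to_wiki.py may differ):

```python
from dataclasses import dataclass

@dataclass
class WikiCandidate:
    category: str        # decisions / playbooks / conventions / incidents / entities / concepts
    suggested_path: str  # e.g. "wiki/decisions/2026-04-22-iceberg.md"
    title: str           # inferred from the first heading in the matched section
    body_excerpt: str    # first ~500 chars of the matched section
```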
Wiki-worthiness heuristics (in precedence order):
- Decision:, Rationale:, Status: markers → wiki/decisions/YYYY-MM-DD-<slug>.md
- Incident:, Timeline:, Root cause: markers → wiki/incidents/YYYY-MM-DD-<slug>.md
- Numbered steps (1., Step 1:, Step 1.) → wiki/playbooks/<slug>.md
- Imperative rules (Always …, Never …, Must …, Should …, Do not …) and a "Why" or "How to apply" section → wiki/conventions/<slug>.md
- Overview of X, About X, What is X, Introduction to X headings → wiki/entities/<slug>.md
- Concept:, Pattern:, Principle:, Strategy:, Design: heading → wiki/concepts/<slug>.md

If scan_text returns no candidates, tell the user: "No wiki-worthy sections
detected in this source. Common reasons: the PR description is too short,
or markers like 'Decision:' / 'Rationale:' are absent. You can promote
manually by editing wiki/ pages and running git commit in the wiki repo."
Show each candidate to the user as a formatted block:
📄 Proposed wiki page
Category: decisions
Path: wiki/decisions/2026-04-22-iceberg-cold-storage.md
Title: Why we chose Iceberg for cold storage
Excerpt: Decision: Use Iceberg tables for Snowflake cold storage tier.
Rationale: Iceberg provides time-travel…
Then ask:
AskUserQuestion: "Promote [N] candidate(s) to the wiki?
a) Promote all
b) Choose individually
c) Skip / cancel"
If "b" (choose individually): present each candidate and ask yes/no.
If "c": abort — print "Nothing promoted." and exit.
For each confirmed candidate, construct the full page content:
# <title>
> Promoted from PR #<number> (<repo>) on <YYYY-MM-DD>
> Source: <PR URL>
<original source section, verbatim>
For decisions and incidents, include today's date in the filename. For
other categories, the slug alone is enough (no date prefix).
Build WikiPage(path=candidate.suggested_path, title=..., content=...) for
each confirmed candidate.
Call apply_commit(wiki_repo_path, pages, message) from
bin/promote_to_wiki.py.
The helper:
- Fetches origin/main and fast-forward-pulls if the local branch is behind.
- Writes each page at its suggested path (creating parent directories via pathlib).
- Updates INDEX.md: inserts a new entry under the appropriate category header (idempotent: no-op if the path is already listed).
- Stages the pages plus INDEX.md, commits, and pushes to origin/main.

Commit message format: feat(wiki): promote <N> page(s) from PR #<number>
(or feat(wiki): promote <N> page(s) from <source> for non-PR sources).
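The idempotent INDEX.md insert might look like this (a simplification of whatever apply_commit actually does):

```python
def add_to_index(index_text: str, category_header: str, entry: str) -> str:
    """Insert entry under category_header; no-op if the entry is already listed."""
    if entry in index_text:
        return index_text                  # idempotent: already listed
    lines = index_text.splitlines()
    try:
        at = lines.index(category_header) + 1
    except ValueError:
        lines += ["", category_header]     # create the header if missing
        at = len(lines)
    lines.insert(at, entry)
    return "\n".join(lines) + "\n"
```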
After a successful commit, print:
✓ Promoted <N> page(s) to Artemis-xyz/data-hive-mind:
- wiki/decisions/2026-04-22-iceberg-cold-storage.md
…
Committed and pushed to main.
If push is not available (no remote configured — e.g. a local-only test clone), the commit still succeeds and the report says "Committed locally (no remote configured — push manually when ready)."
Default: inline confirmation report (paths + commit SHA).
--dry-run flag: skip steps 5–7, print the candidate list and suggested
paths without writing anything.
| Condition | Behavior |
|---|---|
| Wiki repo has dirty working tree | Abort with message listing dirty files |
| Wiki repo is behind origin/main and fast-forward fails | Abort — manual merge required |
| gh unavailable (no CLI) | Fall back to prompting the user to paste the PR body manually |
| Notion fetch fails | Warn and continue with PR text only |
| Page file already exists | Append a ## Update — YYYY-MM-DD section rather than clobbering |
========================================
Mode: lint
Status: implemented (Slice 5, TheClauu v0.3.29+)
Run structural health checks on the Artemis-xyz/data-hive-mind wiki.
Designed for two contexts:
- An engineer running /theclauu:dhm lint in a session.
- The nightly CI workflow described below.

Usage: /theclauu:dhm lint [--repo <path>]
Default wiki repo path: ../data-hive-mind/ relative to the current
project. If the path is unknown, prompt the engineer once. Store the answer
for the session.
Steps:
- Run python bin/lint_wiki.py --repo <path> --format text (human output).
- Render [HIGH] findings in bold.
- Suggest: "Run /theclauu:dhm lint in CI to file these as Linear tickets automatically."

The nightly workflow (see install instructions in
plugins/theclauu/docs/slice-5-install.md) runs
bin/lint_wiki_ci.py, which:
- Calls lint_wiki.lint(cwd) — pure Python, no network.
- Files a Linear ticket for each high or medium severity finding.
- Applies auto_fixable=True findings in-place.
- Commits the fixes to a branch dhm-lint/<YYYY-MM-DD> and opens a PR. Always branches — never commits direct-to-main — so every auto-fix gets a human eye before merge.

All checks run against the wiki/ subdirectory of the repo root.
Check 1 (dead_links) — severity: high
Any ](wiki/<path>) or ](../<path>) in any page pointing to a
non-existent file.
Auto-fix: replace ](<dead-target>) with
](TODO(dhm-lint): dead link — <target> not found).

Check 2 (orphans) — severity: medium
Any page in wiki/*/ with zero inbound links. Checks both inter-page links
and INDEX.md entries.
Exempt: INDEX.md itself; top-level files like CLAUDE.md.

Check 3 (stale) — severity: medium
Any page in wiki/playbooks/, wiki/entities/, or wiki/conventions/ not
touched (per git log -1 --format=%ct) in >90 days.
Exempt: wiki/decisions/ and wiki/postmortems/ are historical
records — intentionally never flagged as stale.

Check 4 (missing_entities) — severity: low
A system name appears as H1/H2 (# SystemName) in ≥2 distinct pages but
has no wiki/entities/<slug>.md.

Check 5 (index) — two sub-checks
5a. File not in INDEX.md — severity: low, auto-fixable. Adds
- [<stem>](<rel-path>) under an ## Unlisted section at the bottom of
INDEX.md.
5b. INDEX.md references non-existent file — severity: medium, not
auto-fixable. Files a ticket asking why INDEX points to something gone.
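The dead_links check could be approximated as (a sketch; link resolution in the real lint_wiki.py is likely richer):

```python
import re
from pathlib import Path

# Relative markdown links of the form ](wiki/...) or ](../...), ignoring anchors.
LINK = re.compile(r"\]\(((?:wiki/|\.\./)[^)#]+)\)")

def dead_links(repo: Path) -> list[tuple[str, str]]:
    """Return (page, target) pairs whose relative link resolves to no file."""
    findings = []
    for page in (repo / "wiki").rglob("*.md"):
        for target in LINK.findall(page.read_text()):
            # wiki/-prefixed targets are repo-relative; ../ targets are page-relative.
            base = repo if target.startswith("wiki/") else page.parent
            if not (base / target).resolve().exists():
                findings.append((str(page.relative_to(repo)), target))
    return findings
```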
Text (ad-hoc):
dhm-lint: 4 issue(s) found.
[HIGH ] dead_links wiki/playbooks/on-call.md
Dead link: '../entities/pagerduty.md' → resolved path does not exist [AUTO-FIXABLE]
Fix: Replace '](../entities/pagerduty.md)' with '](TODO(dhm-lint): dead link …)'
[MEDIUM] orphans wiki/entities/deprecated-system.md
Orphan page: no other page links here and INDEX.md doesn't list it.
...
JSON (CI): --format json emits a JSON array of Finding objects, one per
issue. Consumed by lint_wiki_ci.py for programmatic processing.
{
"check": str, // dead_links | orphans | stale | missing_entities | index
"path": str, // repo-relative path of affected file
"severity": str, // high | medium | low
"message": str, // human-readable description
"auto_fixable": bool, // safe to apply in CI without engineer review
"suggested_fix": str // what to do (empty if not auto_fixable)
}
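On the CI side, consuming that array might look like this (names assumed; lint_wiki_ci.py's actual split may differ):

```python
import json

def triage(findings_json: str) -> tuple[list[dict], list[dict]]:
    """Split findings into (ticket_worthy, auto_fixes) per the CI rules above."""
    findings = json.loads(findings_json)
    # High/medium findings become Linear tickets; auto_fixable ones get applied.
    ticket_worthy = [f for f in findings if f["severity"] in ("high", "medium")]
    auto_fixes = [f for f in findings if f["auto_fixable"]]
    return ticket_worthy, auto_fixes
```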
Always branch (dhm-lint/<YYYY-MM-DD>) — even for pure INDEX additions.
Reasoning: predictability beats saving a click. Engineers can merge the PR
in seconds; a surprise direct-to-main commit from a bot is confusing.
========================================
Wiki repo: Artemis-xyz/data-hive-mind
Default branch: main
Commit policy: direct to main (single-contributor repo;
feedback_data_hive_mind_direct_to_main.md). If another engineer becomes a
regular contributor, this flips to PR-gated.
Schema + maintenance rules: CLAUDE.md in the wiki.