From thinking-frameworks-skills
Verifies that paper citations in substacker drafts use the 'Author(s), Institution, Year' inline format per the style guide; flags bare hyperlinks, title-only mentions, and missing attributions. Useful for research-heavy drafts.
`npx claudepluginhub lyndonkl/claude --plugin thinking-frameworks-skills`

This skill uses the workspace's default tool permissions.
Related skills: called by the Editor during the voice pass. Reads shared-context/style-guide.md for the citation template.
From style-guide.md:

Papers cited in prose must use 'Author(s), Institution, Year — Title' on first mention. Not a bare hyperlink, not a title alone.
Example of correct: Chen et al., Google, 2024 — "Fine-Tuning or Retrieval?"
Examples to flag: a bare hyperlink ("as shown in this paper"), an institution with no author ("A Google paper found…"), a title alone ("Attention Is All You Need").
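The first-mention format can be approximated with a regular expression. This is a minimal sketch, not the skill's actual implementation: the pattern name and the exact character classes are my assumptions, and real drafts will need fuzzier matching.

```python
import re

# Hypothetical loose pattern for 'Author(s), Institution, Year — Title'.
CITATION_RE = re.compile(
    r"[A-Z][\w.\-]+(?: et al\.)?,\s+"  # Author(s), e.g. 'Chen et al.,'
    r"[A-Z][\w&. \-]+,\s+"             # Institution, e.g. 'Google,'
    r"(?:19|20)\d{2}\s+[—-]\s+"        # Year, then a dash
    r".+"                              # Title
)
```

A title-only sentence such as "Attention Is All You Need is the foundational work." contains no author-comma-institution sequence, so the pattern correctly fails to match it.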
Citation check for draft D:
- [ ] Step 1: Detect paper references (arXiv patterns, italicized titles, hyperlinks to paper sites, "a paper")
- [ ] Step 2: For each reference, parse surrounding sentence for author + institution
- [ ] Step 3: If author OR institution missing, flag tier-2 (unless it's the second+ mention of a paper already correctly cited once)
- [ ] Step 4: Suggest the correct form
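The four steps above can be sketched as a single pass over a draft's sentences. Everything here is illustrative, not the skill's real code: `check_draft`, the simplified detection and full-citation regexes, and the substring test for second mentions are all my assumptions.

```python
import re

# Step 1 detector (simplified) and Step 2 author+institution matcher.
PAPER_REF = re.compile(r'arxiv\.org/abs/\d{4}\.\d{4,5}|\bpaper\b|\*[A-Z][^*]+\*|"[A-Z][^"]{8,}"')
FULL_CITE = re.compile(r"[A-Z]\w+(?: et al\.)?, [A-Z][\w ]+, (?:19|20)\d{2}")

def check_draft(sentences):
    flags, cited = [], set()
    for s in sentences:
        if not PAPER_REF.search(s):           # Step 1: detect a paper reference
            continue
        m = FULL_CITE.search(s)               # Step 2: parse for author + institution
        if m:
            cited.add(m.group(0))             # first correct mention recorded
        elif not any(c.split(",")[0] in s for c in cited):  # Step 3: exempt second+ mentions
            flags.append((s, "tier-2: add 'Author(s), Institution, Year — Title'"))  # Step 4
    return flags
```

Order matters: a sentence is exempt only if a full citation for the same lead author was seen earlier in the draft, matching the first-mention rule.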
Detection patterns:
- arXiv URL: `arxiv.org/abs/\d{4}\.\d{4,5}`
- DOI: `10\.\d+/\S+`
- Italicized title: `\*[A-Z][^*]+\*` of length ≥3 words (filters out short italicized phrases)
- Paper-site hyperlink: `(arxiv|openreview|papers\.nips|proceedings)`

If a paper is correctly cited in full on first mention, subsequent mentions can use a short form ("Chen et al. also noted…"). Only the first mention requires the full Author, Institution, Year.
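The detection patterns can be collected into a small lookup that also applies the ≥3-word filter for italics. This is a sketch under my own naming (`PATTERNS`, `detect`); the grouping and function are assumptions, not the skill's code.

```python
import re

PATTERNS = {
    "arxiv_url": re.compile(r"arxiv\.org/abs/\d{4}\.\d{4,5}"),
    "doi": re.compile(r"10\.\d+/\S+"),
    "italic_title": re.compile(r"\*[A-Z][^*]+\*"),
    "paper_site": re.compile(r"(arxiv|openreview|papers\.nips|proceedings)"),
}

def detect(text):
    hits = []
    for name, rx in PATTERNS.items():
        m = rx.search(text)
        if not m:
            continue
        # Apply the ≥3-word filter so short italicized emphasis is ignored.
        if name == "italic_title" and len(m.group(0).split()) < 3:
            continue
        hits.append(name)
    return hits
```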
Draft:
RAG (as shown in this paper) beats fine-tuning for recall.
A Google paper found that retrieval dominates for rare facts.
Attention Is All You Need is the foundational work.
Chen et al., Google, 2024 — "Fine-Tuning or Retrieval?" — compared both approaches; Chen et al. concluded fine-tuning loses on tail knowledge.
Flags for the draft above:
- "as shown in this paper" — bare hyperlink, no author or institution (tier-2).
- "A Google paper found…" — institution only, author missing (tier-2).
- "Attention Is All You Need" — title alone, no attribution (tier-2).
- "Chen et al., Google, 2024 — …" — correct on first mention; the later "Chen et al. concluded…" short form is fine.

Writers can mark a reference [short] in their draft (the convention for an intentional short form on first mention).