From thinking-frameworks-skills
Drafts one-paragraph suggestions that acknowledge where a simplified-boundary claim breaks and frame that break as a teaching moment, e.g., folding the O(n²) attention intuition into the FlashAttention O(n) reality. Runs only on simplified-boundary claims, after cross-reference-claim.
```
npx claudepluginhub lyndonkl/claude --plugin thinking-frameworks-skills
```
This skill uses the workspace's default tool permissions.
Per simplified-boundary claim:
- [ ] Step 1: Identify the intuition the writer was using (analogy, metaphor, concrete picture)
- [ ] Step 2: Identify precisely where the intuition breaks using primary source
- [ ] Step 3: Draft a one-sentence suggestion that names the break as a feature
- [ ] Step 4: Optionally suggest a follow-up post if the break is too rich for a sentence
- [ ] Step 5: Return {intuition, break_point, fold_suggestion, optional_follow_up}
Claim: "Attention is O(n²) in memory."
Classification: simplified-boundary.
Primary source: Dao et al. 2022 — FlashAttention, arXiv:2205.14135. "FlashAttention uses O(N) memory rather than the O(N²) of standard attention."
Fold suggestion:
"The O(n²) figure is the naive memory cost — modern production attention (FlashAttention, Dao et al. 2022) is actually O(n) in HBM by never materializing the full attention matrix. The quadratic view is still the right intuition for 'why context windows are expensive' circa 2020, but the actual frontier now is 'attention is memory-bandwidth-bound,' which is a richer story — probably a follow-up post."
Optional follow-up: "FlashAttention as memory-bandwidth reframe."
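The gap in the example above is easy to make concrete with back-of-envelope arithmetic. This sketch assumes fp16 activations and a head dimension of 128; both are common but are assumptions here, not figures from the source.

```python
# Rough HBM footprint: naive attention vs a FlashAttention-style O(n) scheme.
# Assumptions (not from the source): fp16 (2 bytes/element), head dim d = 128.

def naive_attn_bytes(n: int, bytes_per_elem: int = 2) -> int:
    """Naive attention materializes the full n x n score matrix."""
    return n * n * bytes_per_elem

def flash_attn_bytes(n: int, d: int = 128, bytes_per_elem: int = 2) -> int:
    """FlashAttention-style: only Q, K, V, O rows live in HBM, so O(n)."""
    return 4 * n * d * bytes_per_elem  # Q, K, V, O

for n in (4_096, 32_768):
    print(f"n={n}: naive {naive_attn_bytes(n) >> 20} MiB, "
          f"flash {flash_attn_bytes(n) >> 20} MiB")
# At n=4096 the score matrix alone is 32 MiB vs 4 MiB for the O(n) scheme;
# at n=32768 it is 2048 MiB vs 32 MiB, which is why the quadratic
# intuition still explains "context is expensive" even after the fold.
```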
Outright wrong claims are out of scope for this skill: those get fixed, not folded.