From learning-loop
Consolidates project memory files across sessions via four-phase cycle (Orient, Gather Signal, Consolidate, Prune Index) using seven operators: MERGE, RESOLVE, ABSTRACT, COMPRESS, PRUNE, LINK, DATE NORMALIZE. Rebuilds MEMORY.md.
npx claudepluginhub robinslange/learning-loop --plugin learning-loop

This skill uses the workspace's default tool permissions.
Seven operators, each defined in `operators/`. This file orchestrates the four-phase cycle. Read operator files only when executing Phase 3.
/dream runs immediately and ignores gates.

Emit events silently via Bash for each operator action:
node "${CLAUDE_PLUGIN_ROOT}/scripts/provenance-emit.js" '{"agent":"dream","skill":"dream","action":"ACTION","target":"FILENAME"}'
Where ACTION is one of: merge, resolve, abstract, compress, prune, link, normalize.
At start, emit {"action":"session-start"}. At end, emit {"action":"session-end","merged":N,"resolved":N,"abstracted":N,"compressed":N,"pruned":N,"linked":N,"normalized":N}, then run node "${CLAUDE_PLUGIN_ROOT}/scripts/provenance-consolidate.mjs".
Detect the project memory directory:
$CLAUDE_PROJECT_DIR if available, else the auto-memory directory for the current project.

Read all .md files (excluding MEMORY.md, _dream_log.md, _archived/).
Parse YAML frontmatter: name, description, type, confidence.
Build inventory: total count, count by type, sorted by modification date, line count per file.
Read MEMORY.md. Check links resolve to actual files. Flag orphaned pointers.
Report:
Dreaming: [project name]
Memory files: N (N feedback, N project, N user, N reference)
Index entries: N (N orphaned)
Group by type in the order feedback, user, project, reference. Within each group, sort oldest first so the newest entries land last (exploits recency bias per Chattaraj & Raj 2026).
Steps 2–8 below mirror the Phase 3 execution order so flagging and consolidation walk the operators in the same sequence.
Flag DATE NORMALIZE candidates: files containing relative temporal references ("yesterday", "last week", etc.).
Flag MERGE candidates. Within each type group, flag pairs where both descriptions reference the same tool/concept, one is a subset of the other, or both contain the same rule. Skip pairs that contradict each other (those go to RESOLVE).
Flag RESOLVE candidates. Within each type group, flag pairs where two memories assert opposite rules or facts about the same subject.
Flag ABSTRACT candidates. Clusters of 4+ memories within the same type group describing variations of the same pattern. For each cluster, note: the memories, the candidate abstraction (one sentence), which would be archived (fully subsumed), which would remain (unique detail). Conservative: only flag clear patterns.
Flag COMPRESS candidates: memory files exceeding 15 lines or whose body exceeds the size limit (feedback/user: 500 chars; project/reference: 1,000 chars).
Flag PRUNE candidates: memories orphaned from the index and stale memories.
Flag LINK candidates: cross-type pairs sharing a keyword or concept. Compare descriptions only. If there are 50+ files, cap at the 30 most recent.
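The two cheapest checks above, DATE NORMALIZE and COMPRESS, can be sketched mechanically. The thresholds come from this document; the relative-date pattern is an illustrative assumption, not the skill's actual detection logic.

```javascript
// Sketch: flag DATE NORMALIZE and COMPRESS candidates for one memory file.
// Pattern covers the examples given ("yesterday", "last week") plus common variants.
const RELATIVE_DATES =
  /\b(yesterday|today|tomorrow|last (week|month|year)|(\d+|a few) (days?|weeks?|months?) ago)\b/i;

// Size limits from this skill: feedback/user 500 chars, project/reference 1,000.
const CHAR_LIMITS = { feedback: 500, user: 500, project: 1000, reference: 1000 };
const LINE_LIMIT = 15;

function flagFile(type, body) {
  const flags = [];
  if (RELATIVE_DATES.test(body)) flags.push('date-normalize');
  const overLines = body.split('\n').length > LINE_LIMIT;
  const overChars = body.length > (CHAR_LIMITS[type] ?? 1000);
  if (overLines || overChars) flags.push('compress');
  return flags;
}
```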
Present signal summary and ask for approval:
Dream signal (operators in execution order):
- DATE NORMALIZE: N candidates
- MERGE: N candidate pairs
- RESOLVE: N contradiction pairs
- ABSTRACT: N clusters (N source memories)
- COMPRESS: N candidates (N over size limit)
- PRUNE: N candidates (N orphaned, N stale)
- LINK: N candidate pairs
Proceed with consolidation? [yes/no]
Note: ABSTRACT has a separate per-cluster gate.
Process in strict order: DATE NORMALIZE, MERGE, RESOLVE, ABSTRACT, COMPRESS, PRUNE, LINK.
Create lock file first using Bash: node -e "require('fs').writeFileSync(require('path').join(require('os').tmpdir(), 'learning-loop-dream-lock'), process.pid.toString())". Abort if lock exists.
For each operator, read its instruction file from operators/ and execute:
| Operator | File | Input |
|---|---|---|
| DATE NORMALIZE | operators/normalize.md | Flagged files with relative dates |
| MERGE | operators/merge.md | Candidate pairs (excluding contradictions) |
| RESOLVE | operators/resolve.md | Contradiction pairs |
| ABSTRACT | operators/abstract.md | Flagged clusters (per-cluster user gate) |
| COMPRESS | operators/compress.md | Files over line/size thresholds |
| PRUNE | operators/prune.md | Orphaned and stale candidates |
| LINK | operators/link.md | Cross-type pairs |
Log every operation to _dream_log.md (append, create if needed).
Remove lock file when done using Bash: node -e "try { require('fs').unlinkSync(require('path').join(require('os').tmpdir(), 'learning-loop-dream-lock')) } catch(e) {}"
Rebuild MEMORY.md from scratch: scan all .md files (excluding MEMORY.md, _dream_log.md, _archived/), format as - [filename.md](filename.md): description, group by topic, under 150 chars per line, drop unmodified-in-90-days if over 200 lines.
Write MEMORY.md (full overwrite). Write timestamp using Bash: node -e "require('fs').writeFileSync(require('path').join(require('os').tmpdir(), 'learning-loop-last-dream'), Math.floor(Date.now()/1000).toString())".
Report:
Dream complete.
Merged: N | Resolved: N | Abstracted: N | Compressed: N | Pruned: N | Linked: N | Normalized: N
Index: N lines (was N)
Unresolved: N contradictions (need user input)
List any unresolved contradictions with the conflicting claims.
Excluded from consolidation:
- {{VAULT}}/ (vault has its own pipeline)
- _archived/
- _dream_log.md