Knowledge Curator agent skill for the RLM Factory. Auto-invoked when tasks involve distilling code summaries, querying the semantic ledger, auditing cache coverage, or maintaining RLM hygiene. Supports both Ollama-based batch distillation and agent-powered direct summarization. V2 enforces Concurrency Safety constraints.
From rlm-factory. Install with:

```shell
npx claudepluginhub richfrem/agent-plugins-skills --plugin rlm-factory
```

This skill is limited to using the following tools:

- acceptance-criteria.md
- assets/diagrams/rlm_late_binding_flow.mmd
- assets/diagrams/rlm_tool_enrichment_flow.mmd
- evals/evals.json
- evals/results.tsv
- fallback-tree.md
- references/BLUEPRINT.md
- references/RLM_ARCHITECTURE.md
- references/gap_analysis_rlm_v1.md
- references/prompt.md
- references/research-summary.md
- requirements.txt
- resources/distiller_manifest.json
- resources/manifest-index.json
- resources/prompts/rlm/rlm_summarize_general.md
- resources/prompts/rlm/rlm_summarize_tool.md
- resources/rlm_manifest.json
- scripts/cleanup_cache.py
- scripts/debug_rlm.py
- scripts/distiller.py
This skill requires Python 3.8+ and standard library only. No external packages needed.
No installation step is required. If the lockfile is ever populated with dependencies, regenerate and install it with:

```shell
pip-compile ./requirements.in
pip install -r ./requirements.txt
```
See ./requirements.txt for the dependency lockfile (currently empty — standard library only).
You are the Knowledge Curator. Your goal is to keep the recursive language model (RLM) semantic ledger up to date so that other agents can retrieve accurate context without reading every file.
| Script | Role | Ollama? |
|---|---|---|
| `distiller.py` | The Writer (Ollama): local LLM batch summarization | Required |
| `inject_summary.py` | The Writer (Agent/Swarm): direct agent-generated injection, no Ollama | None |
| `inventory.py` | The Auditor: coverage reporting | None |
| `cleanup_cache.py` | The Janitor: stale entry removal | None |
| `rlm_config.py` | Shared Config: manifest and profile management | None |
Searching the cache? Use the `rlm-search` skill and its `query_cache.py` script.
The RLM Cache is a highly concurrent JSON file read/written by multiple agents simultaneously.
NEVER manually edit the .agent/learning/rlm_summary_cache.json or .agent/learning/rlm_tool_cache.json using raw bash commands, sed, awk, or native LLM tool block writes.
Doing so bypasses the Python fcntl.flock concurrency lock. If multiple agents perform unguarded writes concurrently, the JSON file will be silently corrupted.
ALWAYS use inject_summary.py or distiller.py to write to the cache. These scripts handle the fcntl.flock locks inherently, guaranteeing data integrity.
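For illustration, the locking pattern those scripts rely on looks roughly like the sketch below. This is not the shipped implementation (the real logic lives in inject_summary.py and distiller.py); the function name and entry schema here are hypothetical.

```python
import fcntl
import json


def locked_cache_write(cache_path: str, key: str, entry: dict) -> None:
    """Update one ledger entry under an exclusive fcntl.flock.

    Sketch only: the function name and entry schema are hypothetical;
    the shipped scripts own the real locking logic.
    """
    with open(cache_path, "r+", encoding="utf-8") as f:
        fcntl.flock(f, fcntl.LOCK_EX)       # block until we hold the file exclusively
        try:
            cache = json.load(f)            # read the current ledger
            cache[key] = entry
            f.seek(0)
            f.truncate()                    # rewrite the file while still locked
            json.dump(cache, f, indent=2)
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)   # always release, even on error
```

This is exactly what a raw `sed`/`awk`/tool-block write skips: without the `LOCK_EX` around the read-modify-write cycle, two concurrent writers interleave and one clobbers the other.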
When executing distiller.py: if the run fails with `Connection refused` (usually pointing at port 11434), the Ollama server is down. Do not retry indefinitely or modify the Python scripts. You MUST IMMEDIATELY refer to ./fallback-tree.md.

To audit cache coverage:

```shell
python3 ./scripts/inventory.py --type legacy
```
Check: Is coverage < 100%? Are there missing files?
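Before launching a batch run, a quick pre-flight check of the Ollama endpoint (port 11434, per the error described above) can avoid hitting the `Connection refused` path at all. A minimal sketch; the helper name is hypothetical:

```python
import socket


def ollama_is_up(host: str = "localhost", port: int = 11434, timeout: float = 2.0) -> bool:
    """Return True if something is accepting connections on the Ollama port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timeout, or unreachable host
        return False
```

If this returns False, go straight to ./fallback-tree.md instead of invoking distiller.py.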
Use the rlm-search skill for all cache queries:
```shell
python3 ./scripts/query_cache.py --profile plugins "search_term"
python3 ./scripts/query_cache.py --profile tools --list
```
Use the Copilot swarm (free, gpt-5-mini) or Gemini swarm (free).
Delegate to the agent-loops:agent-swarm skill, providing:
- Swarm: copilot (free default) or gemini (higher throughput)
- File list: from inventory.py --missing
- Concurrency: 2 for copilot (rate-limit safe), 5 for gemini

To run the Ollama batch distiller directly:

```shell
python3 ./scripts/distiller.py
```
To inject a single agent-generated summary:

```shell
python3 ./scripts/inject_summary.py \
  --profile project \
  --file path/to/file.md \
  --summary "Your dense summary here..."
```

To remove stale entries:

```shell
python3 ./scripts/cleanup_cache.py --type legacy --apply
```
Every summary injected should answer "Why does this file exist?"
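One way to make that rule operational is a small gate before injection. The helper below is purely illustrative (it is not part of the shipped scripts, and the heuristics are assumptions): it rejects summaries that are too short or open with content-free phrasing rather than purpose.

```python
def looks_purpose_first(summary: str, min_words: int = 8) -> bool:
    """Heuristic pre-injection check (hypothetical, not a shipped script):
    a good ledger summary is dense and explains *why* the file exists,
    not merely *what* it contains.
    """
    vague_openers = ("this file", "a file", "contains")
    dense_enough = len(summary.split()) >= min_words
    return dense_enough and not summary.lower().startswith(vague_openers)
```

For example, "Defines the fallback decision tree used when Ollama is unreachable" passes, while "This file contains stuff." does not.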