Distill repository files into the RLM Summary Ledger using agentic intelligence (fast) or Ollama (offline batch)
From `rlm-factory`. To install the plugin:

```shell
npx claudepluginhub richfrem/agent-plugins-skills --plugin rlm-factory
```
Bundled files:

- assets/diagrams/rlm_late_binding_flow.mmd
- assets/diagrams/rlm_tool_enrichment_flow.mmd
- evals/evals.json
- evals/results.tsv
- references/BLUEPRINT.md
- references/RLM_ARCHITECTURE.md
- references/diagrams/distillation_process.mmd
- references/diagrams/logic.mmd
- references/diagrams/rlm-factory-architecture.mmd
- references/diagrams/rlm-factory-architecture.png
- references/diagrams/rlm-factory-dual-path.mmd
- references/diagrams/rlm-factory-dual-path.png
- references/diagrams/rlm-factory-workflow.mmd
- references/diagrams/rlm_late_binding_flow.mmd
- references/diagrams/rlm_mechanism_workflow.mmd
- references/diagrams/rlm_mechanism_workflow.png
- references/diagrams/rlm_tool_enrichment_flow.mmd
- references/diagrams/search_process.mmd
- references/diagrams/unpacking.mmd
- references/diagrams/workflow.mmd
This skill requires Python 3.8+ and standard library only. No external packages needed.
To install this skill's dependencies:

```shell
pip-compile ./requirements.in
pip install -r ./requirements.txt
```
See ./requirements.txt for the dependency lockfile (currently empty — standard library only).
Summarize files into the RLM Summary Ledger. Two paths depending on context:
For the detailed execution protocol, see the `rlm-distill-agent` agent.
The agent reads each file and writes a high-quality summary via `inject_summary.py`. Use this path for 1–50 files; it is faster and produces better summaries than local Ollama.
```shell
python3 ./scripts/inject_summary.py \
  --profile project \
  --file path/to/file.md \
  --summary "Your agent-generated summary here."
```
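The ledger's on-disk format isn't documented here, but assuming it is a JSON map from file path to a summary entry (a hypothetical layout for illustration; the real `inject_summary.py` may differ), the injection step can be sketched as:

```python
import json
from pathlib import Path


def inject_summary(cache_path: str, file_path: str, summary: str) -> None:
    """Upsert one entry into a JSON ledger keyed by file path (assumed layout)."""
    cache = Path(cache_path)
    # Load the existing ledger, or start fresh if none exists yet.
    ledger = json.loads(cache.read_text()) if cache.exists() else {}
    ledger[file_path] = {"summary": summary}
    cache.write_text(json.dumps(ledger, indent=2))


inject_summary("rlm_summary_cache.json", "docs/overview.md",
               "High-level overview of the project architecture.")
```

An upsert keeps the operation idempotent: re-distilling a file simply replaces its entry rather than appending duplicates.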
The batch path requires Ollama running locally (`ollama serve`, model `granite3.2:8b`).
```shell
# All files in profile scope
python3 ./scripts/distiller.py --profile project

# Single file
python3 ./scripts/distiller.py --profile project --file path/to/file.md

# Files changed in the last 2 hours
python3 ./scripts/distiller.py --profile project --since 2
```
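The `--since` flag implies an incremental pass over recently modified files. How `distiller.py` implements it isn't shown here, but a minimal mtime-based filter (standard library only, compatible with Python 3.8) might look like:

```python
import time
from pathlib import Path
from typing import List


def changed_since(root: str, hours: float) -> List[Path]:
    """Return files under `root` whose mtime falls within the last `hours` hours."""
    cutoff = time.time() - hours * 3600
    return [p for p in Path(root).rglob("*")
            if p.is_file() and p.stat().st_mtime >= cutoff]
```

For example, `changed_since(".", 2)` would select candidates for `--since 2`; a real implementation would also respect the profile's scope rules.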
| Profile | Flag | Cache file |
|---|---|---|
| Docs / protocols | `--profile project` | `rlm_summary_cache.json` |
| Plugins / scripts | `--profile tools` | `rlm_tool_cache.json` |