From agentops
Injects relevant `.agents` knowledge from learnings, patterns, and MEMORY.md into the Claude Code session context. Use `/inject [topic]` or the `ao inject` CLI for on-demand retrieval.

```shell
npx claudepluginhub boshu2/agentops --plugin agentops
```

This skill uses the workspace's default tool permissions.
> **DEPRECATED (removal target: v3.0.0)** — Use `ao lookup --query "topic"` for on-demand learnings retrieval, or see `.agents/AGENTS.md` for knowledge navigation. This skill and the `ao inject` CLI command still work but are no longer called from hooks or other skills.
On-demand knowledge retrieval. Not run automatically at startup (since ag-8km).
Inject relevant prior knowledge into the current session.
In the default manual startup mode, MEMORY.md is auto-loaded by Claude Code and no startup injection occurs. Use /inject or ao inject for on-demand retrieval when you need deeper context.
In lean or legacy startup modes (set via `AGENTOPS_STARTUP_CONTEXT_MODE`), the SessionStart hook runs:

```shell
# lean mode (MEMORY.md fresh): 400 tokens
ao inject --apply-decay --format markdown --max-tokens 400 \
  [--bead <bead-id>] [--predecessor <handoff-path>]

# legacy mode: 800 tokens
ao inject --apply-decay --format markdown --max-tokens 800 \
  [--bead <bead-id>] [--predecessor <handoff-path>]
```
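The budget selection between these modes can be sketched as a tiny helper (`startup_budget` is a hypothetical name; the real `session-start.sh` may implement the dispatch differently):

```shell
# Map a startup mode to its injection token budget.
# Manual mode injects nothing at startup, so its budget is 0.
startup_budget() {
  case "${1:-manual}" in
    lean)   echo 400 ;;
    legacy) echo 800 ;;
    *)      echo 0 ;;
  esac
}
```

For example, `startup_budget "$AGENTOPS_STARTUP_CONTEXT_MODE"` yields the `--max-tokens` value the hook would pass.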
This searches for relevant knowledge and injects it into context.
When `--bead` is provided (via the `HOOK_BEAD` env var from Gas Town):
When `--predecessor` is provided (path to a handoff file):
Given `/inject [topic]`:
With the `ao` CLI:

```shell
ao inject --context "<topic>" --format markdown --max-tokens 1000
```
Without the `ao` CLI, search manually:

```shell
# Global operating memory
sed -n '1,120p' ~/.agents/MEMORY.md 2>/dev/null

# Recent learnings
ls -lt .agents/learnings/ | head -5

# Recent patterns
ls -lt .agents/patterns/ | head -5

# Recent research
ls -lt .agents/research/ | head -5

# Global learnings (cross-repo knowledge)
ls -lt ~/.agents/learnings/ 2>/dev/null | head -5

# Global patterns (cross-repo patterns)
ls -lt ~/.agents/patterns/ 2>/dev/null | head -5

# Legacy patterns (read-only fallback, no new writes)
ls -lt ~/.claude/patterns/ 2>/dev/null | head -5
```
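The per-pool listings above can also be folded into a single recency-ranked view (a sketch; `recent_artifacts` is a hypothetical helper, and missing or unreadable pools are silently skipped):

```shell
# Show the 10 most recently modified artifacts across all local and
# global knowledge pools, newest first.
recent_artifacts() {
  ls -dt .agents/learnings/* .agents/patterns/* .agents/research/* \
         ~/.agents/learnings/* ~/.agents/patterns/* 2>/dev/null \
    | head -10
}
```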
Use the Read tool to load the most relevant artifacts based on topic.
Present the injected knowledge:
After presenting injected knowledge, record which files were injected for the feedback loop:
```shell
mkdir -p .agents/ao

# Record each injected learning file as a citation
for injected_file in <list of files that were read and presented>; do
  echo "{\"artifact_path\": \"$injected_file\", \"cited_at\": \"$(date -Iseconds)\", \"session_id\": \"$(date +%Y-%m-%d)\", \"workspace_path\": \"$PWD\"}" >> .agents/ao/citations.jsonl
done
```
Citation tracking enables the feedback loop: learnings that are frequently cited get confidence boosts during /post-mortem, while uncited learnings decay faster.
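Citation counts can be read back out of `citations.jsonl` with standard tools (a sketch; `cited_counts` is a hypothetical helper, and the `sed` extraction assumes the one-line JSON records written above rather than a real JSON parser):

```shell
# Print "count path" per artifact, most-cited first.
cited_counts() {
  sed -n 's/.*"artifact_path": *"\([^"]*\)".*/\1/p' "$1" \
    | sort | uniq -c | sort -rn
}
```

For example, `cited_counts .agents/ao/citations.jsonl` surfaces the learnings most likely to earn a confidence boost at the next /post-mortem.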
| Source | Location | Priority | Weight |
|---|---|---|---|
| Global Memory | ~/.agents/MEMORY.md | Highest | 1.0 |
| Learnings | .agents/learnings/ | High | 1.0 |
| Patterns | .agents/patterns/ | High | 1.0 |
| Global Learnings | ~/.agents/learnings/ | High | 0.8 (configurable) |
| Global Patterns | ~/.agents/patterns/ | High | 0.8 (configurable) |
| Research | .agents/research/ | Medium | — |
| Retros | .agents/learnings/ | Medium | — |
| Legacy Patterns | ~/.claude/patterns/ | Low | 0.6 (read-only, no new writes) |
Knowledge relevance decays over time (~17%/week). More recent learnings are weighted higher.
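Under this model, a rough effective score is the source weight from the table multiplied by time decay, approximately 0.83^weeks for ~17%/week. A sketch (`effective_weight` is a hypothetical helper; `ao inject --apply-decay` may use a different curve internally):

```shell
# effective_weight <source_weight> <age_weeks>
# e.g. a global learning (weight 0.8) that is 1 week old
# scores roughly 0.8 * 0.83.
effective_weight() {
  awk -v s="$1" -v w="$2" 'BEGIN { printf "%.3f\n", s * 0.83 ^ w }'
}
```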
Hook triggers: `session-start.sh` runs at session start with `AGENTOPS_STARTUP_CONTEXT_MODE=lean` or `legacy`
What happens:
1. Runs `ao inject --apply-decay --format markdown --max-tokens 400` (lean) or `--max-tokens 800` (legacy)
2. Searches `.agents/learnings/`, `.agents/patterns/`, and `.agents/research/` for relevant artifacts

Result: Prior learnings, patterns, and research are automatically available at session start without manual lookup.
Note: In the default manual mode, MEMORY.md is auto-loaded by Claude Code and this hook emits only a pointer to on-demand retrieval commands (`ao search`, `ao lookup`).
User says: `/inject authentication` or "recall knowledge about auth"
What happens:

```shell
ao inject --context "authentication" --format markdown --max-tokens 1000
```

Result: Topic-specific knowledge is retrieved and summarized, enabling faster context loading than full artifact reads.
| Problem | Cause | Solution |
|---|---|---|
| No knowledge injected | Empty knowledge pools or ao CLI unavailable | Run /post-mortem to seed pools; verify ao CLI installed |
| Irrelevant knowledge | Topic mismatch or stale artifacts dominate | Use --context "<topic>" to filter; prune stale artifacts |
| Token budget exceeded | Too many high-relevance artifacts | Reduce --max-tokens or increase topic specificity |
| Decay too aggressive | Recent learnings not prioritized | Check artifact modification times; verify --apply-decay flag |
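The first two rows can be checked quickly with a small diagnostic (a sketch; `inject_doctor` is a hypothetical helper):

```shell
# Report whether the ao CLI is on PATH and how many artifacts
# each local knowledge pool currently holds.
inject_doctor() {
  if command -v ao >/dev/null 2>&1; then
    echo "ao: installed"
  else
    echo "ao: missing"
  fi
  for pool in .agents/learnings .agents/patterns .agents/research; do
    printf '%s: %s artifacts\n' "$pool" \
      "$(ls "$pool" 2>/dev/null | wc -l | tr -d ' \t')"
  done
}
```

Empty pools plus a missing CLI explains "No knowledge injected"; run /post-mortem to seed the pools.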