# lossless-claude

Lossless context management (lcm) — DAG-based summarization that preserves every message
Shared memory infrastructure for coding agents
DAG-based summarization, SQLite-backed message persistence, promoted long-term memory, MCP retrieval tools
Website • Runtime Model • Installation • MCP Tools • Development
lossless-claude replaces sliding-window forgetfulness with a persistent memory runtime for both humans and agents.
Humans and agents use the same backend. The integration surface differs by client, but the memory model is shared.
This repo started as a fork of lossless-claw by Martian Engineering, adapted for Claude Code. The LCM model and DAG architecture originate from the Voltropy paper.
```mermaid
flowchart LR
  subgraph Clients["Clients"]
    CC["Claude Code<br/>hooks + MCP"]
  end
  CC --> D["lossless-claude daemon"]
  D --> DB[("project SQLite DAG")]
  D --> PM[("promoted memory FTS5")]
  D --> TOOLS["MCP tools<br/>search / grep / expand / describe / store / stats / doctor"]
```
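The promoted-memory store above is FTS5-backed. As a rough sketch of what a full-text store of promoted insights can look like — a hypothetical schema and hypothetical data, not the project's actual tables — using Python's built-in sqlite3 (assuming your SQLite build includes FTS5, as standard CPython builds do):

```python
import sqlite3

# In-memory database standing in for the promoted-memory store.
db = sqlite3.connect(":memory:")

# Hypothetical schema: an FTS5 virtual table over promoted insights.
db.execute("CREATE VIRTUAL TABLE promoted USING fts5(session, insight)")
db.executemany(
    "INSERT INTO promoted VALUES (?, ?)",
    [
        ("s1", "The build breaks unless NODE_ENV is set before running tests"),
        ("s2", "The daemon stores each project's DAG in its own SQLite file"),
        ("s2", "Retry logic lives in src/net/retry.ts, not in the client"),
    ],
)

# A cross-session lookup, roughly the shape of query a search tool would run.
rows = db.execute(
    "SELECT session, insight FROM promoted WHERE promoted MATCH ? ORDER BY rank",
    ("daemon sqlite",),
).fetchall()
for session, insight in rows:
    print(session, "->", insight)
```

FTS5's `MATCH` treats space-separated terms as an implicit AND, so only insights containing both words come back, ranked by relevance.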
| Path | Restore | Prompt hints | Turn writeback | Automatic compaction | Notes |
|---|---|---|---|---|---|
| Claude Code | Yes | Yes | Yes, via transcript/hooks | Yes | Primary hook-based integration |
| GitHub Copilot (VS Code) | No | Yes, via skill/rules | No | No | A repo-local skill can teach Copilot to call `lcm`, but there is no automatic restore or turn capture yet |
| Codex | No | Yes, via skill/rules | No | No | Repo-local or global skill plus `lcm import --codex`; MCP config in `.codex/config.toml` is still manual, and first-class runtime support is tracked in issue #232 |
| Phase | What happens |
|---|---|
| Persist | Raw messages are stored in SQLite per conversation |
| Summarize | Older messages are grouped into leaf summaries |
| Condense | Summaries roll up into higher-level DAG nodes |
| Promote | Durable insights are copied into cross-session memory |
| Restore | New sessions recover context from summaries and promoted memory |
| Recall | Agents query, expand, and inspect memory on demand |
Nothing is dropped. Raw messages remain in the database. Summaries point back to their sources. Promoted memory remains searchable across sessions.
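The phases above can be sketched as a schema in which raw rows are never deleted and every summary keeps explicit pointers to its sources. This is an illustrative schema in Python's built-in sqlite3, not the project's actual one:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE messages  (id INTEGER PRIMARY KEY, role TEXT, body TEXT);
    CREATE TABLE summaries (id INTEGER PRIMARY KEY, body TEXT);
    -- Edges from a summary down to the raw messages it covers.
    CREATE TABLE summary_sources (summary_id INTEGER, message_id INTEGER);
""")

# Persist: raw messages are stored and never deleted.
db.executemany("INSERT INTO messages VALUES (?, ?, ?)", [
    (1, "user", "why is the cache stale?"),
    (2, "assistant", "the TTL was set to 0; fixed in commit abc123"),
])

# Summarize: a leaf summary replaces the messages in context,
# but records exactly which rows it stands for.
db.execute("INSERT INTO summaries VALUES (1, 'cache staleness: TTL bug, fixed')")
db.executemany("INSERT INTO summary_sources VALUES (1, ?)", [(1,), (2,)])

# Expanding a summary recovers the exact raw messages it replaced.
raw = db.execute("""
    SELECT m.role, m.body FROM messages m
    JOIN summary_sources s ON s.message_id = m.id
    WHERE s.summary_id = 1 ORDER BY m.id
""").fetchall()
print(raw)
```

The link table is what makes the compression lossless: dropping it would turn summarization back into forgetting.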
```mermaid
flowchart TD
  A["conversation / tool output"] --> B["persist raw messages"]
  B --> C["compact into leaf summaries"]
  C --> D["condense into deeper DAG nodes"]
  C --> E["promote durable insights"]
  D --> F["restore future context"]
  E --> F
  F --> G["search / grep / describe / expand / store"]
```
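Because condensed nodes can themselves be condensed, summaries form a DAG rather than a flat list. A minimal sketch of descending from a high-level node back to its leaf summaries with a recursive CTE — an illustrative schema in Python's sqlite3, not the project's actual one:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE nodes (id INTEGER PRIMARY KEY, level INTEGER, body TEXT);
    -- parent -> child edges; a child may have several parents (a DAG).
    CREATE TABLE edges (parent INTEGER, child INTEGER);
""")
# Two leaf summaries condensed into one level-1 node.
db.executemany("INSERT INTO nodes VALUES (?, ?, ?)", [
    (1, 0, "leaf: auth refactor decisions"),
    (2, 0, "leaf: database migration plan"),
    (3, 1, "condensed: Q3 backend work"),
])
db.executemany("INSERT INTO edges VALUES (?, ?)", [(3, 1), (3, 2)])

# Roughly the shape of an expand operation: walk down from node 3 to leaves.
leaves = db.execute("""
    WITH RECURSIVE reach(id) AS (
        SELECT 3
        UNION
        SELECT e.child FROM edges e JOIN reach r ON e.parent = r.id
    )
    SELECT n.body FROM nodes n JOIN reach USING (id)
    WHERE n.level = 0 ORDER BY n.id
""").fetchall()
print([b for (b,) in leaves])
```

Using `UNION` rather than `UNION ALL` deduplicates nodes reachable along multiple paths, which is what makes the walk safe on a DAG.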
### Claude Code

Install the `lcm` binary first:
```bash
npm install -g @lossless-claude/lcm   # provides the `lcm` command
```

Then add the plugin and run the installer:

```bash
claude plugin add github:lossless-claude/lcm
lcm install
```
`lcm install` writes config, registers hooks, installs slash commands, registers MCP, and verifies the daemon.
### GitHub Copilot (VS Code)

Install the `lcm` binary first:
```bash
npm install -g @lossless-claude/lcm
```
Then install the repo-local Copilot connector:
```bash
lcm connectors install github-copilot
lcm connectors doctor github-copilot
```
This creates a workspace skill under `.github/skills/lcm-memory/SKILL.md` so Copilot can search and store memory through the `lcm` CLI.
### Codex

Install the `lcm` binary first:
```bash
npm install -g @lossless-claude/lcm
```
Then install the Codex connector:
```bash
lcm connectors install codex
lcm connectors doctor codex
```
Import older Codex sessions when needed: