Store project decisions, patterns, bug fixes, and context persistently across Claude Code sessions in a knowledge graph and vector DB. Automatically recall relevant memories on session start or debugging, store new insights with tags and importance, monitor health via FalkorDB/Qdrant checks, and capture auto-snapshots after tools.
npx claudepluginhub verygoodplugins/mcp-automem --plugin automem
Graph-vector memory service providing AI assistants with durable, relational memory for context awareness.
Persistent, curated memory for AI coding assistants (Claude, Cursor, Copilot): work is captured across sessions and compactions, and relevant context is recalled later, with cross-tool memory sharing backed by 6-dimensional hybrid search. A write gate prevents bloat — only behavior-changing facts get saved. Tiered architecture: daily logs, structured registers, and auto-loaded working memory.
Bash hook: triggers whenever the Bash tool executes a command.
One command. Infinite memory. Perfect recall across all your AI tools.
npx @verygoodplugins/mcp-automem setup
Your AI assistant now remembers everything. Forever. Across every conversation.
Works with Claude Desktop, Cursor IDE, Claude Code, GitHub Copilot (coding agent), ChatGPT, ElevenLabs, OpenAI Codex - any MCP-compatible AI platform.
Every AI conversation starts from zero. Claude forgets your coding style. Cursor can't learn your patterns. Your assistant doesn't remember yesterday's decisions.
Until now.
AutoMem MCP connects your AI to persistent memory powered by AutoMem - a graph-vector memory service.
| Platform | Support | Setup Time |
|---|---|---|
| Claude Desktop | ✅ Full | 30 seconds |
| Cursor IDE | ✅ Full | 30 seconds |
| Claude Code | ✅ Full | 30 seconds |
| GitHub Copilot | ✅ Full | 2 minutes |
| OpenAI Codex | ✅ Full | 30 seconds |
| Any MCP client | ✅ Full | 30 seconds |
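For Claude Code, registration is typically a single CLI call. A minimal sketch, assuming the server is launched via `npx` as in the setup above (check `claude mcp add --help` for your version's exact syntax):

```shell
# Register the AutoMem MCP server with Claude Code.
# Server name "automem" and the npx launch command are assumptions here;
# the interactive `npx @verygoodplugins/mcp-automem setup` handles this for you.
claude mcp add automem -- npx @verygoodplugins/mcp-automem
```

Other MCP clients take the equivalent of this in their JSON config: a server name plus the command and arguments used to launch it.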
Claude automatically recalls memories at conversation start using custom instructions
Cursor uses automem.mdc rule to automatically recall and store memories
Git commits, builds, and deployments automatically stored to memory
OpenAI Codex uses config.toml to automatically recall and store memories
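The git integration above can be pictured as a post-commit hook that posts each commit to the memory service. This is a sketch only — the `/memory` endpoint and JSON fields are assumptions, and the real wiring is installed by the setup command:

```shell
#!/bin/sh
# .git/hooks/post-commit — hypothetical sketch of storing a commit to AutoMem.
# Endpoint path and payload shape are assumptions, not the documented API.
SHA=$(git rev-parse --short HEAD)
MSG=$(git log -1 --pretty=%s)
curl -fsS -X POST http://localhost:8001/memory \
  -H 'Content-Type: application/json' \
  -d "{\"content\": \"commit $SHA: $MSG\", \"tags\": [\"git\", \"commit\"]}"
```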
// After 1 week, your AI writes EXACTLY like you
// ✅ It knows you prefer early returns
// ✅ It uses your specific variable naming
// ✅ It matches your comment style
// ✅ It follows YOUR patterns, not generic best practices
User: "Should we use Redis for this?"
Without AutoMem:
"Consider RabbitMQ, Kafka, or AWS SQS based on your needs..."
With AutoMem:
"Based on your pattern of preferring boring technology that works,
and your positive experience with Redis in Project X (March 2024),
yes. You specifically value operational simplicity over feature
richness - Redis fits perfectly."
You need a running AutoMem service (the memory backend). Choose one:
Option A: Local Development (fastest, free)
git clone https://github.com/verygoodplugins/automem.git
cd automem
make dev
Service runs at http://localhost:8001 - perfect for single-machine use.
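Once `make dev` is running, you can sanity-check the service from another terminal. A minimal probe, assuming the service exposes a health endpoint on port 8001 (the exact path may differ; see the AutoMem docs):

```shell
# Verify the local AutoMem service is responding.
# The /health path is an assumption — adjust to the actual endpoint.
curl -fsS http://localhost:8001/health
```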
Option B: Railway Cloud (recommended for production)
One-click deploy with $5 free credits. Typical cost: ~$0.50-1/month after trial.
👉 AutoMem Service Installation Guide - Complete setup instructions for local, Railway, Docker, and production deployments.
Alternatively, download the desktop extension and double-click to install AutoMem in Claude Desktop.