# Memora

> "You never truly know the value of a moment until it becomes a memory."

**Give your AI agents persistent memory.**

A lightweight MCP server for semantic memory storage, knowledge graphs, conversational recall, and cross-session context.
Features · Install · Usage · Config · Live Graph · Cloud Graph · Chat · Semantic Search · LLM Dedup · Linking · Neovim
## Features
### Core Storage
- 💾 Persistent Storage - SQLite with optional cloud sync (S3, R2, D1)
- 📂 Hierarchical Organization - Section/subsection structure with auto-hierarchy assignment
- 📦 Export/Import - Backup and restore with merge strategies
### Search & Intelligence
- 🔍 Semantic Search - Vector embeddings (TF-IDF, sentence-transformers, OpenAI)
- 🎯 Advanced Queries - Full-text, date ranges, tag filters (AND/OR/NOT), hybrid search
- 🔀 Cross-references - Auto-linked related memories based on similarity
- 🤖 LLM Deduplication - Find and merge duplicates with AI-powered comparison
- 🔗 Memory Linking - Typed edges, importance boosting, and cluster detection
### Tools & Visualization
- ⚡ Memory Automation - Structured tools for TODOs, issues, and sections
- 🕸️ Knowledge Graph - Interactive visualization with Mermaid rendering and cluster overlays
- 🌐 Live Graph Server - Built-in HTTP server with cloud-hosted option (D1/Pages)
- 💬 Chat with Memories - RAG-powered streaming chat panel with LLM tool calling to search, create, update, and delete memories
- 📡 Event Notifications - Poll-based system for inter-agent communication
- 📊 Statistics & Analytics - Tag usage, trends, and connection insights
- 🧠 Memory Insights - Activity summary, stale detection, consolidation suggestions, and LLM-powered pattern analysis
- 📜 Action History - Track all memory operations (create, update, delete, merge, boost, link) with grouped timeline view
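The TF-IDF embedding option mentioned above can be illustrated with a small sketch of the underlying technique: score pairs of memories by the cosine similarity of their TF-IDF vectors. This is the general idea only, not Memora's actual code; all names and the toy documents are made up for illustration.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors for a list of token lists."""
    n = len(docs)
    df = Counter()  # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    vecs = []
    for doc in docs:
        tf = Counter(doc)
        # term frequency weighted by inverse document frequency
        vecs.append({t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf})
    return vecs

def cosine(a, b):
    """Cosine similarity of two sparse vectors stored as dicts."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = [
    "fix auth token refresh bug".split(),
    "auth token refresh issue in login".split(),
    "update readme badges".split(),
]
v = tfidf_vectors(docs)
# The two auth-related memories score higher than the unrelated one,
# which is what drives semantic search, cross-referencing, and dedup.
print(cosine(v[0], v[1]), cosine(v[0], v[2]))
```

The sentence-transformers and OpenAI backends replace the TF-IDF vectors with dense learned embeddings, but the similarity scoring works the same way.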
## Install

```bash
pip install git+https://github.com/agentic-box/memora.git
```

Cloud storage (S3/R2) and OpenAI embeddings are included out of the box.

```bash
# Optional: local embeddings (offline; ~2 GB for PyTorch)
pip install "memora[local] @ git+https://github.com/agentic-box/memora.git"
```
## Usage

The server runs automatically when configured in Claude Code. Manual invocation:

```bash
# Default (stdio mode for MCP)
memora-server

# With graph visualization server
memora-server --graph-port 8765

# HTTP transport (alternative to stdio)
memora-server --transport streamable-http --host 127.0.0.1 --port 8080
```
## Configuration

### Claude Code

Add to `.mcp.json` in your project root:
**Local DB:**

```json
{
  "mcpServers": {
    "memora": {
      "command": "memora-server",
      "args": [],
      "env": {
        "MEMORA_DB_PATH": "~/.local/share/memora/memories.db",
        "MEMORA_ALLOW_ANY_TAG": "1",
        "MEMORA_GRAPH_PORT": "8765"
      }
    }
  }
}
```
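Note that `MEMORA_DB_PATH` uses `~`, which presumably gets expanded to your home directory when the server resolves the path (an assumption about Memora's internals; the standard Python expansion is shown here for reference):

```python
import os

# MEMORA_DB_PATH as configured above
raw = "~/.local/share/memora/memories.db"

# Standard home-directory expansion (assumed to match what the server does)
resolved = os.path.expanduser(raw)
print(resolved)  # e.g. /home/you/.local/share/memora/memories.db
```

If you prefer not to rely on expansion, use an absolute path in the config instead.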
**Cloud DB (Cloudflare D1) - recommended:**

```json
{
  "mcpServers": {
    "memora": {
      "command": "memora-server",
      "args": ["--no-graph"],
      "env": {
        "MEMORA_STORAGE_URI": "d1://<account-id>/<database-id>",
        "CLOUDFLARE_API_TOKEN": "<your-api-token>",
        "MEMORA_ALLOW_ANY_TAG": "1"
      }
    }
  }
}
```
With D1, use `--no-graph` to disable the local visualization server. Instead, use the hosted graph at your Cloudflare Pages URL (see Cloud Graph).
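If you manage the D1 database with Cloudflare's Wrangler CLI, the IDs needed for `MEMORA_STORAGE_URI` can be recovered roughly as follows (standard Wrangler commands; the database name `memora` is just an example, and output formats may vary between Wrangler versions):

```shell
# Create a D1 database; Wrangler prints its database_id on success
npx wrangler d1 create memora

# List existing D1 databases to look up a database id later
npx wrangler d1 list

# Show the account id for the currently authenticated account
npx wrangler whoami
```

Plug the account id and database id into `d1://<account-id>/<database-id>` and supply an API token with D1 access as `CLOUDFLARE_API_TOKEN`.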