Skill guidance for AgentDB persistent memory patterns.
```
/plugin marketplace add DNYoussef/context-cascade
/plugin install dnyoussef-context-cascade@DNYoussef/context-cascade
```

This skill inherits all available tools. When active, it can use any tool Claude has access to.
Bundled files: PROCESS.md, README.md, SKILL-meta.yaml, process-diagram.gv

Before writing ANY code, you MUST check:
- .claude/library/catalog.json
- .claude/docs/inventories/LIBRARY-PATTERNS-GUIDE.md
- D:\Projects\*

Match quality maps to action as follows (a minimal sketch of the check follows the table):

| Match | Action |
|---|---|
| Library >90% | REUSE directly |
| Library 70-90% | ADAPT minimally |
| Pattern exists | FOLLOW pattern |
| In project | EXTRACT |
| No match | BUILD (add to library after) |
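As a minimal sketch, the decision table above could be encoded as a helper; the 0-1 `matchScore` input is an assumption about how the library catalog reports similarity, not the actual catalog schema:

```typescript
// Hypothetical helper mapping a library-match score to the action in the table above.
function reuseDecision(matchScore: number | null): string {
  if (matchScore === null) return 'BUILD, then add the result to the library';
  if (matchScore > 0.9)    return 'REUSE the library entry directly';
  if (matchScore >= 0.7)   return 'ADAPT the library entry minimally';
  return 'FOLLOW an existing pattern or EXTRACT from the project';
}
```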
Implement persistent memory patterns for AI agents using AgentDB - session memory, long-term storage, pattern learning, and context management for stateful agents, chat systems, and intelligent assistants.
```typescript
import { AgentDB, MemoryManager } from 'agentdb-memory';

// Initialize memory system
const memoryDB = new AgentDB({
  name: 'agent-memory',
  dimensions: 768,
  memory: {
    sessionTTL: 3600,           // seconds; matches the 1-hour session policy
    consolidationInterval: 300, // seconds; 5-minute consolidation cycle
    maxSessionSize: 1000
  }
});

const memoryManager = new MemoryManager({
  database: memoryDB,
  layers: ['episodic', 'semantic', 'procedural']
});

// Store memory
await memoryManager.store({
  type: 'episodic',
  content: 'User preferred dark theme',
  context: { userId: '123', timestamp: Date.now() }
});

// Retrieve memory
const memories = await memoryManager.retrieve({
  query: 'user preferences',
  type: 'episodic',
  limit: 10
});

// Session memory scoped to a single user
const session = await memoryManager.createSession('user-123');
await session.store('conversation', messageHistory);
await session.store('preferences', userPrefs);
const context = await session.getContext();

// Consolidate working memory into long-term storage
await memoryManager.consolidate({
  from: 'working-memory',
  to: 'long-term-memory',
  strategy: 'importance-based'
});

// Learn recurring patterns from episodic memory
const patterns = await memoryManager.learnPatterns({
  memory: 'episodic',
  algorithm: 'clustering',
  minSupport: 0.1
});
```
This skill operates using AgentDB's npm package and API only. No additional MCP servers required.
All AgentDB memory operations are performed through:
- the `agentdb` CLI: `npx agentdb@latest`
- the npm package: `import { AgentDB, MemoryManager } from 'agentdb-memory'`

AgentDB Persistent Memory Patterns operates on 3 fundamental principles:
Principle 1: Memory systems mirror human cognition by organizing information across distinct temporal layers. Short-term memory handles immediate context (current conversation), working memory maintains active task state, and long-term memory consolidates important patterns for future retrieval.
In practice:
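As a minimal sketch, assuming the `MemoryManager` API shown above and a caller-supplied `importance` score (the key names and threshold below are illustrative, not part of the AgentDB API), layer routing might look like this: keep everything in the current session, but promote only notable observations into episodic memory.

```typescript
import { MemoryManager } from 'agentdb-memory';

// Illustrative routing across temporal layers; key names and the importance
// threshold are assumptions, not part of the AgentDB API.
async function remember(manager: MemoryManager, userId: string, content: string, importance: number) {
  // Short-term: always visible in the current conversation
  const session = await manager.createSession(userId);
  await session.store('last-observation', content);

  // Long-term candidate: only observations deemed important become episodic memories
  if (importance > 0.5) {
    await manager.store({
      type: 'episodic',
      content,
      context: { userId, importance, timestamp: Date.now() }
    });
  }
}
```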
Principle 2: Raw episodic memories (specific events) are valuable but incomplete. True intelligence emerges when systems detect patterns across episodes - recurring user preferences, common error scenarios, effective solution strategies - and encode them as semantic knowledge.
In practice:
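In sketch form, assuming `learnPatterns` returns an iterable of pattern objects (the `summary` and `support` fields below are illustrative), consolidation into semantic knowledge could look like:

```typescript
import { MemoryManager } from 'agentdb-memory';

// Sketch: distill recurring episodes into semantic memories.
// The shape of the learned pattern objects is assumed for illustration.
async function promotePatterns(manager: MemoryManager) {
  const patterns = await manager.learnPatterns({
    memory: 'episodic',
    algorithm: 'clustering',
    minSupport: 0.1   // keep patterns present in at least 10% of episodes
  });

  for (const pattern of patterns) {
    await manager.store({
      type: 'semantic',
      content: pattern.summary,                                      // assumed field
      context: { support: pattern.support, learnedAt: Date.now() }   // assumed field
    });
  }
}
```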
Principle 3: Memory systems fail if retrieval is slower than computation. Production AI agents require sub-50ms memory access to maintain real-time responsiveness, necessitating HNSW indexing, quantization, and aggressive caching strategies.
In practice:
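One way to stay inside the latency budget, sketched against the `retrieve` call shown earlier (the in-process cache and the assumption that results come back as an array are illustrative):

```typescript
import { MemoryManager } from 'agentdb-memory';

// Cascading retrieval: hot cache first, indexed semantic layer next,
// episodic cold storage only as a fallback.
const hotCache = new Map<string, unknown[]>();

async function recall(manager: MemoryManager, query: string) {
  const cached = hotCache.get(query);
  if (cached) return cached;   // in-process hit, effectively free

  let results = await manager.retrieve({ query, type: 'semantic', limit: 5 });
  if (results.length === 0) {
    results = await manager.retrieve({ query, type: 'episodic', limit: 5 });
  }

  hotCache.set(query, results);
  return results;
}
```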
| Anti-Pattern | Problem | Solution |
|---|---|---|
| Memory Hoarder - Store Everything Forever | Unbounded storage growth leads to slow retrieval, high costs, and context pollution. Agents retrieve irrelevant memories from 6 months ago. | Implement aggressive TTL policies (1-hour sessions, 30-day working memory, importance-based long-term retention). Use consolidation strategies to compress episodic memories into semantic patterns. |
| Flat Memory - Single Storage Layer | All memories treated equally creates retrieval chaos. No distinction between current conversation context and learned patterns from last year. | Use 3-layer architecture: session (ephemeral), working (task-scoped), long-term (consolidated). Apply different retrieval strategies per layer (recency for session, relevance for semantic). |
| Retrieval Thrashing - Query Every Memory Store on Every Request | Exhaustive searches across all memory layers cause latency spikes (200ms+ retrieval). Agents spend more time remembering than acting. | Use cascading retrieval: session first (fastest), semantic second (indexed), episodic last (cold storage). Implement query routing based on memory type and recency. Cache hot paths. |
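To make the retention policies above concrete, here is an illustrative scheduler that reuses the `consolidate` call from the setup example; the `setInterval` scheduling and the 5-minute cadence are assumptions, not AgentDB behavior:

```typescript
import { MemoryManager } from 'agentdb-memory';

// Sketch: periodically compress working memory into long-term storage so
// episodic growth stays bounded (see the Memory Hoarder anti-pattern above).
function scheduleConsolidation(manager: MemoryManager) {
  setInterval(async () => {
    await manager.consolidate({
      from: 'working-memory',
      to: 'long-term-memory',
      strategy: 'importance-based'
    });
  }, 300_000); // 300 seconds, mirroring the consolidationInterval in the setup example
}
```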
AgentDB Persistent Memory Patterns transforms stateless AI agents into intelligent systems with genuine memory. By implementing layered storage (session, working, long-term), pattern learning algorithms, and performance-optimized retrieval, you enable agents to accumulate knowledge across interactions rather than starting from zero on every request. The 5-phase SOP ensures systematic implementation from architecture design through performance tuning, with success validated through sub-50ms retrieval latency and 95%+ context accuracy.
This skill is essential when building chat systems requiring conversation history, intelligent assistants that learn user preferences over time, or multi-agent systems coordinating through shared memory. The pattern learning capabilities distinguish AgentDB from basic vector databases - instead of merely storing embeddings, it actively extracts reusable knowledge from experience. When agents can remember what worked before, recall user preferences without re-asking, and apply proven patterns to new problems, they transition from tools to true collaborators.
The performance requirements are non-negotiable for production systems. Users abandon agents that "think" for 500ms between responses. By combining HNSW indexing, quantization, and caching strategies, you achieve both intelligent memory and real-time responsiveness - the foundation for AI systems that feel genuinely aware.