From sundial-org-awesome-openclaw-skills-4
Combines OpenClaw vector memory_search with the Graphiti temporal graph for recalling past context, answering temporal questions, and searching memory files. Includes a decision framework for tool selection.
Install:

```bash
npx claudepluginhub joshuarweaver/cascade-ai-ml-agents-misc-2 --plugin sundial-org-awesome-openclaw-skills-4
```

This skill uses the workspace's default tool permissions.
Two memory systems, each with different strengths. Use both.
| Question Type | Tool | Example |
|---|---|---|
| Document content | memory_search | "What's in GOALS.md?" |
| Curated notes | memory_search | "What are our project guidelines?" |
| Temporal facts | Graphiti | "When did we set up Slack?" |
| Conversations | Graphiti | "What did the user say last Tuesday?" |
| Entity tracking | Graphiti | "What projects involve Alice?" |
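The table above can be sketched as a small routing heuristic. The `route` function below is illustrative only (not part of the skill); it keys on temporal markers to pick a tool, which is one plausible way an agent might apply the framework:

```shell
# Hypothetical tool router: temporal-sounding questions go to Graphiti,
# everything else to memory_search. Keyword list is an assumption.
route() {
  q=$(printf '%s' "$1" | tr '[:upper:]' '[:lower:]')
  case "$q" in
    *when*|*"last "*|*changed*|*yesterday*|*" ago"*) echo "graphiti" ;;
    *) echo "memory_search" ;;
  esac
}

route "When did we set up Slack?"   # -> graphiti
route "What's in GOALS.md?"         # -> memory_search
```

In practice the agent makes this call from the instructions in AGENTS.md rather than from code, but the logic is the same.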
## memory_search (OpenClaw)

Semantic search over markdown files (`MEMORY.md`, `memory/**/*.md`):

```
memory_search query="your question"
```

Then use `memory_get` to read specific lines if needed.
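When semantic search is overkill and you just want filenames, a plain `grep` over the same files works as a quick fallback (the `memory/goals.md` file below is created only for the demonstration):

```shell
# Non-semantic fallback over the files memory_search indexes.
mkdir -p memory
printf '# Goals\nShip v2 by Friday\n' > memory/goals.md

# -r: recurse, -i: case-insensitive, -l: print matching filenames only.
grep -ril "friday" memory/
# -> memory/goals.md
```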
## Graphiti

Search for facts with time awareness:

```bash
graphiti-search.sh "your question" GROUP_ID 10
```

Log important facts:

```bash
graphiti-log.sh GROUP_ID user "Name" "Fact to remember"
```
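To see the log-then-recall round trip without a running Graphiti stack, here is a toy stand-in: `log_fact` and `search_facts` are illustrative stubs backed by a flat file, not the real `graphiti-log.sh` / `graphiti-search.sh` scripts (which write to the temporal graph):

```shell
# Stubs mimicking the Graphiti round trip with a TSV file as the "graph".
log_fact()     { printf '%s\t%s\t%s\n' "$(date -u +%F)" "$1" "$2" >> facts.tsv; }
search_facts() { grep -i "$1" facts.tsv | head -n "${2:-10}"; }

log_fact main-agent "Set up the team Slack workspace"
search_facts "slack" 10
```

The real scripts behave analogously: a logged fact becomes searchable, stamped with when it was recorded, which is what makes "when did we…" questions answerable.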
Common group IDs:

- `main-agent` — Primary agent
- `user-personal` — User's personal context

When answering questions about past context, check Graphiti for temporal facts and memory_search for documents.

Add to your AGENTS.md:
### Memory Recall (Hybrid)
**Temporal questions** ("when?", "what changed?", "last Tuesday"):
```bash
graphiti-search.sh "query" main-agent 10
```

**Document questions** ("what's in X?", "find notes about Y"):

```
memory_search query="your query"
```

When answering past context: check Graphiti for temporal facts, memory_search for documents.
## Setup
Full setup guide: https://github.com/clawdbrunner/openclaw-graphiti-memory
**Part 1: OpenClaw Memory** — Configure embedding provider (Gemini recommended)
**Part 2: Graphiti** — Deploy Docker stack, install sync daemons