What is Hindsight?
Hindsight™ is an agent memory system built to create smarter agents that learn over time. Most agent memory systems focus on recalling conversation history; Hindsight focuses on making agents that learn, not just remember.
It addresses the shortcomings of alternative techniques such as RAG and knowledge graphs, and delivers state-of-the-art performance on long-term memory tasks.
Memory Performance & Accuracy
Hindsight is the most accurate agent memory system tested to date, achieving state-of-the-art results on LongMemEval, a benchmark widely used to assess memory systems across a variety of conversational AI scenarios. Reported performance for Hindsight and other agent memory solutions as of January 2026 is shown here:

The benchmark performance data for Hindsight has been independently reproduced by research collaborators at the Virginia Tech Sanghani Center for Artificial Intelligence and Data Analytics and The Washington Post. Other scores are self-reported by software vendors.
Hindsight is being used in production at Fortune 500 enterprises and by a growing number of AI startups.
Adding Hindsight to Your AI Agents
The easiest way to use Hindsight with an existing agent is the LLM Wrapper. Two lines of code swap your current LLM client for the Hindsight wrapper; after that, memories are stored and retrieved automatically as you make LLM calls.
If you need more control over how and when your agent stores and recalls memories, there's also a simple API you can integrate with using the SDKs or directly via HTTP.
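As a rough sketch of what a direct HTTP integration might look like, the snippet below builds a request for storing a memory. The endpoint path and JSON field names here are assumptions for illustration only, not the confirmed Hindsight API; consult the API reference for the real shapes.

```python
import json

BASE_URL = "http://localhost:8888"

def build_retain_request(bank_id: str, content: str) -> tuple[str, str]:
    """Return a (url, json_body) pair for a hypothetical retain call.

    The /retain path and the field names are assumptions for
    illustration -- check the Hindsight API docs for the real API.
    """
    url = f"{BASE_URL}/retain"
    body = json.dumps({"bank_id": bank_id, "content": content})
    return url, body

url, body = build_retain_request(
    "my-bank", "Alice works at Google as a software engineer"
)
```

You would then POST `body` to `url` with a `Content-Type: application/json` header using your HTTP client of choice.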

🤖 Using a coding agent? Install the Hindsight documentation skill for instant access to docs while you code:
npx skills add https://github.com/vectorize-io/hindsight --skill hindsight-docs
Works with Claude Code, Cursor, and other AI coding assistants.
Quick Start
Docker (recommended)
export OPENAI_API_KEY=sk-xxx
docker run --rm -it --pull always -p 8888:8888 -p 9999:9999 \
-e HINDSIGHT_API_LLM_API_KEY=$OPENAI_API_KEY \
-v $HOME/.hindsight-docker:/home/hindsight/.pg0 \
ghcr.io/vectorize-io/hindsight:latest
API: http://localhost:8888
UI: http://localhost:9999
You can modify the LLM provider by setting HINDSIGHT_API_LLM_PROVIDER. Valid options are openai, anthropic, gemini, groq, ollama, lmstudio, and minimax. The documentation provides more details on supported models.
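For example, to run the container against Anthropic instead of OpenAI, set the provider and API key variables shown above (the `ANTHROPIC_API_KEY` environment variable is an assumption about where you keep your key):

```shell
# Same run command as above, but with the provider switched to Anthropic.
docker run --rm -it --pull always -p 8888:8888 -p 9999:9999 \
  -e HINDSIGHT_API_LLM_PROVIDER=anthropic \
  -e HINDSIGHT_API_LLM_API_KEY=$ANTHROPIC_API_KEY \
  -v $HOME/.hindsight-docker:/home/hindsight/.pg0 \
  ghcr.io/vectorize-io/hindsight:latest
```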
Docker (external PostgreSQL)
export OPENAI_API_KEY=sk-xxx
export HINDSIGHT_DB_PASSWORD=choose-a-password
cd docker/docker-compose
docker compose up
API: http://localhost:8888
UI: http://localhost:9999
Client
pip install hindsight-client -U
# or
npm install @vectorize-io/hindsight-client
Python
from hindsight_client import Hindsight
client = Hindsight(base_url="http://localhost:8888")
# Retain: Store information
client.retain(bank_id="my-bank", content="Alice works at Google as a software engineer")
# Recall: Search memories
results = client.recall(bank_id="my-bank", query="What does Alice do?")