Your entire codebase as Claude's context

Note: This is a fork of claude-context by Zilliz, maintained by PleaseAI with additional features and improvements.
Extensions Status: the Chrome and VSCode extensions are TBD and not yet available in this fork.
Context Please is an MCP plugin that adds semantic code search to Claude Code and other AI coding agents, giving them deep context from your entire codebase.
🧠 Your Entire Codebase as Context: Context Please uses semantic search to find all relevant code from millions of lines, with no multi-round discovery needed, and brings the results straight into Claude's context.
💰 Cost-Effective for Large Codebases: Instead of loading entire directories into Claude on every request, which can be very expensive, Context Please stores your codebase in a vector database and pulls only the related code into context, keeping your costs manageable.
🚀 Demo

Model Context Protocol (MCP) allows you to integrate Claude Context with your favorite AI coding assistants, e.g. Claude Code.
Quick Start
Prerequisites
🚀 New: Zero-Config Local Mode with FAISS
You can now use Context Please with no external database required! Simply provide an OpenAI API key, and FAISS will handle local storage automatically. Perfect for getting started quickly or working with small-to-medium codebases.
For production deployments or large codebases, consider using Zilliz Cloud or Qdrant:
Get a free vector database on Zilliz Cloud 👈
Context Please needs a vector database. You can sign up on Zilliz Cloud to get an API key.

Copy your Personal Key to replace your-zilliz-cloud-api-key in the configuration examples.
Get OpenAI API Key for embedding model
You need an OpenAI API key for the embedding model. You can get one by signing up at OpenAI.
Your API key always starts with sk-.
Copy your key and use it in the configuration examples below as your-openai-api-key.
Configure MCP for Claude Code
System Requirements:
- Node.js >= 20.0.0 and < 24.0.0
Context Please is not compatible with Node.js 24 or later; if your Node version is 24 or higher, downgrade it before installing.
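You can confirm the installed version with `node --version` before installing. The helper below is a hypothetical convenience, not part of Context Please; it just checks whether a version string falls in the supported range:

```shell
# Hypothetical helper: returns success if a Node.js version string
# (e.g. "v22.1.0") is in the supported range >= 20.0.0 and < 24.0.0.
node_supported() {
  major=${1#v}           # strip the leading "v"
  major=${major%%.*}     # keep only the major version number
  [ "$major" -ge 20 ] && [ "$major" -lt 24 ]
}

# Example usage:
#   node_supported "$(node --version)" && echo "Node version OK"
```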
Configuration
Option 1: Local Mode with FAISS (Recommended for Getting Started)
The simplest way to get started - no external database required:
claude mcp add context-please \
-e OPENAI_API_KEY=sk-your-openai-api-key \
-- npx @pleaseai/context-please-mcp@latest
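After registering, you can verify the server with Claude Code's built-in MCP management commands (a sketch; run these in your own environment):

```shell
# Confirm the server is registered and connected
claude mcp list

# Inspect the full entry for this server, including its env vars
claude mcp get context-please

# If something went wrong, remove it and re-run the add command
claude mcp remove context-please
```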
Option 2: Cloud Mode with Zilliz (For Production/Large Codebases)
For larger codebases or production deployments:
claude mcp add context-please \
-e OPENAI_API_KEY=sk-your-openai-api-key \
-e MILVUS_TOKEN=your-zilliz-cloud-api-key \
-- npx @pleaseai/context-please-mcp@latest
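If you prefer a checked-in, project-scoped configuration instead of the `claude mcp add` command, Claude Code also reads a `.mcp.json` file at the project root. A minimal sketch, assuming the standard `mcpServers` schema (the API-key values are placeholders to replace with your own):

```shell
# Write a project-scoped .mcp.json at the repo root.
# OPENAI_API_KEY / MILVUS_TOKEN values below are placeholders.
cat > .mcp.json <<'EOF'
{
  "mcpServers": {
    "context-please": {
      "command": "npx",
      "args": ["@pleaseai/context-please-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-your-openai-api-key",
        "MILVUS_TOKEN": "your-zilliz-cloud-api-key"
      }
    }
  }
}
EOF
```

Commit this file to share the server configuration with everyone working in the repository; Claude Code will prompt before using project-scoped servers.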
See the Claude Code MCP documentation for more details about MCP server management.
Other MCP Client Configurations
OpenAI Codex CLI
Codex CLI uses TOML configuration files:
- Create or edit the ~/.codex/config.toml file.
- Add the following configuration:
# IMPORTANT: the top-level key is `mcp_servers` rather than `mcpServers`.
[mcp_servers.context-please]
command = "npx"
args = ["@pleaseai/context-please-mcp@latest"]
env = { "OPENAI_API_KEY" = "your-openai-api-key", "MILVUS_TOKEN" = "your-zilliz-cloud-api-key" }
# Optional: override the default 10s startup timeout
startup_timeout_ms = 20000
- Save the file and restart Codex CLI to apply the changes.
Gemini CLI