From exa
Build RAG pipelines with Exa.ai for real-time web retrieval. Use when building retrieval-augmented generation, integrating Exa with LangChain, LlamaIndex, Vercel AI SDK, or implementing AI agents with web search capabilities. Triggers on: RAG pipeline, retrieval augmented generation, Exa LangChain, Exa LlamaIndex, ExaSearchRetriever, ExaSearchResults, Exa MCP, Exa tool calling, Claude tool use, AI agent web search, grounded generation, citation generation, fact checking, hallucination detection, OpenAI compatibility, chat completions.
Install:

```shell
npx claudepluginhub ejirocodes/agent-skills --plugin exa
```

This skill uses the workspace's default tool permissions.
| Topic | When to Use | Reference |
|---|---|---|
| LangChain | Building RAG chains with LangChain | langchain.md |
| LlamaIndex | Using Exa as a LlamaIndex data source | llamaindex.md |
| Vercel AI SDK | Adding web search to Next.js AI apps | vercel-ai.md |
| MCP & Tools | Claude MCP server, OpenAI tools, function calling | mcp-tools.md |
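For the MCP route, a Claude Desktop configuration typically looks like the sketch below. The `exa-mcp-server` package name and `EXA_API_KEY` variable are assumptions based on Exa's published MCP server; check mcp-tools.md for the exact values.

```json
{
  "mcpServers": {
    "exa": {
      "command": "npx",
      "args": ["-y", "exa-mcp-server"],
      "env": { "EXA_API_KEY": "your-key" }
    }
  }
}
```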
**LangChain**

```python
from langchain_exa import ExaSearchRetriever

retriever = ExaSearchRetriever(
    exa_api_key="your-key",
    k=5,              # number of results to retrieve
    highlights=True,  # return relevant snippets per result
)
docs = retriever.invoke("latest AI research papers")
```
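Retrieved documents carry their source URL in metadata, which makes grounded citations straightforward. A minimal sketch of a prompt-formatting helper; `format_context` is illustrative, not part of `langchain_exa`, and the stand-in objects below only mimic the `page_content`/`metadata` shape of LangChain's `Document`:

```python
from types import SimpleNamespace

def format_context(docs):
    """Number each retrieved document and keep its URL so the LLM can cite [n]."""
    parts = []
    for i, doc in enumerate(docs, start=1):
        url = doc.metadata.get("url", "unknown")
        parts.append(f"[{i}] {doc.page_content.strip()} (source: {url})")
    return "\n".join(parts)

# Stand-in documents for illustration (real ones come from retriever.invoke()).
docs = [
    SimpleNamespace(page_content="Exa is a search API.",
                    metadata={"url": "https://exa.ai"}),
]
print(format_context(docs))
```

The numbered context can then be interpolated into a system prompt that instructs the model to cite sources as `[n]`.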
**LlamaIndex**

```python
from llama_index.readers.web import ExaReader

reader = ExaReader(api_key="your-key")
documents = reader.load_data(
    query="machine learning best practices",
    num_results=10,
)
```
**Vercel AI SDK**

```typescript
import { exa } from "@agentic/exa";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const result = await generateText({
  model: openai("gpt-4"),
  tools: { search: exa.searchAndContents },
  prompt: "Search for the latest TypeScript features",
});
```
**OpenAI-compatible chat completions**

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.exa.ai/v1",  # point the OpenAI client at Exa
    api_key="your-exa-key",
)
response = client.chat.completions.create(
    model="exa",
    messages=[{"role": "user", "content": "What are the latest AI trends?"}],
)
```
| Framework | Best For | Key Feature |
|---|---|---|
| LangChain | Complex chains, agents | ExaSearchRetriever, tool integration |
| LlamaIndex | Document indexing, Q&A | ExaReader, query engines |
| Vercel AI SDK | Next.js apps, streaming | Tool definitions, edge-ready |
| OpenAI Compat | Drop-in replacement | Minimal code changes |
| Claude MCP | Claude Desktop, Claude Code | Native tool calling |
**Tips**

- Set `highlights=True` for relevant snippets.
- Include `result.url` in citations for grounded responses.
- `summary=True` provides concise context without full-page overhead.
- Use `include_domains` to limit results to authoritative sources.
- Set `start_published_date` to avoid stale info.
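These tips map onto keyword arguments of a single search call. A sketch that assembles them; the dict keys follow the Exa Python SDK's `search_and_contents` parameters, the domain and date values are placeholders, and the commented-out call at the bottom requires a real API key:

```python
def build_search_kwargs(domains, since):
    """Collect quality-focused options for exa.search_and_contents()."""
    return {
        "highlights": True,             # relevant snippets for prompting
        "summary": True,                # concise context, no full-page overhead
        "include_domains": domains,     # restrict to authoritative sources
        "start_published_date": since,  # avoid stale results
        "num_results": 5,
    }

# Placeholder values for illustration.
kwargs = build_search_kwargs(["arxiv.org", "nature.com"], "2024-01-01")

# from exa_py import Exa
# exa = Exa(api_key="your-key")
# results = exa.search_and_contents("latest AI research", **kwargs)
```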