Deploy FireCrawl integrations to Vercel, Fly.io, and Cloud Run platforms. Use when deploying FireCrawl-powered applications to production, configuring platform-specific secrets, or setting up deployment pipelines. Trigger with phrases like "deploy firecrawl", "firecrawl Vercel", "firecrawl production deploy", "firecrawl Cloud Run", "firecrawl Fly.io".
From firecrawl-pack. Install with:

```shell
npx claudepluginhub nickloveinvesting/nick-love-plugins --plugin firecrawl-pack
```
Deploy applications using Firecrawl's web scraping API (api.firecrawl.dev) to production. Covers API key management, webhook endpoint deployment for async crawl results, and self-hosted Firecrawl deployment options using Docker.
Requirements: the `FIRECRAWL_API_KEY` environment variable and the `@mendable/firecrawl-js` SDK.

```shell
# Vercel
vercel env add FIRECRAWL_API_KEY production

# Cloud Run
echo -n "your-key" | gcloud secrets create firecrawl-api-key --data-file=-
```
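The skill also targets Fly.io; assuming the standard `flyctl` CLI, the equivalent secret setup is:

```shell
# Fly.io: sets the secret and triggers a new release of the app
fly secrets set FIRECRAWL_API_KEY=your-key
```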
```typescript
// api/scrape.ts
import FirecrawlApp from "@mendable/firecrawl-js";

const firecrawl = new FirecrawlApp({
  apiKey: process.env.FIRECRAWL_API_KEY!,
});

export async function POST(req: Request) {
  const { url, formats } = await req.json();
  const result = await firecrawl.scrapeUrl(url, {
    formats: formats || ["markdown"],
  });
  return Response.json({
    markdown: result.markdown,
    metadata: result.metadata,
  });
}
```
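The scrape route trusts its input as written; a small validation helper can reject bad requests before spending a scrape credit. This is a hypothetical sketch, not part of the `@mendable/firecrawl-js` SDK, and the allowed-format list is an assumption covering common Firecrawl formats:

```typescript
// validate-scrape-request.ts — hypothetical helper, not from the Firecrawl SDK.
// The allowed-format set is an assumption; adjust to the formats your plan supports.
const ALLOWED_FORMATS = new Set(["markdown", "html", "links", "screenshot"]);

export function validateScrapeRequest(body: { url?: string; formats?: string[] }):
  | { ok: true; formats: string[] }
  | { ok: false; error: string } {
  if (!body.url) return { ok: false, error: "url is required" };
  try {
    new URL(body.url); // throws on malformed URLs
  } catch {
    return { ok: false, error: "url is invalid" };
  }
  const formats = body.formats ?? ["markdown"];
  const bad = formats.filter((f) => !ALLOWED_FORMATS.has(f));
  if (bad.length) return { ok: false, error: `unsupported formats: ${bad.join(", ")}` };
  return { ok: true, formats };
}
```

Call it at the top of the `POST` handler and return a 400 response when `ok` is false.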
```yaml
# docker-compose.yml
version: "3.8"
services:
  firecrawl:
    image: mendableai/firecrawl:latest
    ports:
      - "3002:3002" # Firecrawl API port
    environment:
      - REDIS_URL=redis://redis:6379
      - PLAYWRIGHT_BROWSERS_PATH=/browsers
    depends_on:
      - redis
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379" # Redis port
  app:
    build: .
    ports:
      - "3000:3000" # application port
    environment:
      - FIRECRAWL_API_URL=http://firecrawl:3002
    depends_on:
      - firecrawl
```
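To sanity-check the self-hosted stack, bring it up and confirm the Firecrawl container answers on its mapped port (the root path is an assumption; any HTTP response confirms the container is reachable):

```shell
# Start the stack in the background, then probe the Firecrawl container
docker compose up -d
curl -s -o /dev/null http://localhost:3002 && echo "firecrawl reachable"
```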
```typescript
// api/webhooks/firecrawl.ts
export async function POST(req: Request) {
  const { type, id, data } = await req.json();
  if (type === "crawl.completed") {
    // Process the finished crawl's pages asynchronously
    await processScrapedPages(id, data.pages);
  }
  // Acknowledge receipt so Firecrawl does not retry the delivery
  return Response.json({ received: true });
}
```
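Webhook deliveries can arrive more than once, so the handler should be idempotent. A minimal in-memory dedupe sketch follows — a hypothetical helper, only safe for a single-instance deployment (use Redis or a database row once the app scales out):

```typescript
// webhook-dedupe.ts — hypothetical helper; in-memory state resets on restart
// and is not shared across instances, so this is a single-instance sketch only.
const seen = new Set<string>();

export function shouldProcess(eventId: string): boolean {
  if (seen.has(eventId)) return false; // duplicate delivery: skip
  seen.add(eventId);
  return true; // first delivery: process
}
```

In the webhook route, guard the expensive work with `if (shouldProcess(id)) { ... }` and still return `{ received: true }` for duplicates.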
```typescript
// api/health.ts
import FirecrawlApp from "@mendable/firecrawl-js";

const firecrawl = new FirecrawlApp({
  apiKey: process.env.FIRECRAWL_API_KEY!,
});

export async function GET() {
  try {
    const result = await firecrawl.scrapeUrl("https://example.com", {
      formats: ["markdown"],
    });
    return Response.json({ status: result ? "healthy" : "degraded" });
  } catch {
    // HTTP 503 Service Unavailable
    return Response.json({ status: "unhealthy" }, { status: 503 });
  }
}
```
| Issue | Cause | Solution |
|---|---|---|
| Rate limited | Too many scrape requests | Queue requests with delays |
| Scrape blocked | Target site protection | Use waitFor and browser options |
| API key invalid | Key expired | Regenerate at firecrawl.dev dashboard |
| Self-hosted OOM kills | Playwright browser overhead | Increase container memory to 2GB+ |
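For the rate-limit row above, "queue requests with delays" can be as simple as running scrapes sequentially with a pause between them. A minimal sketch (hypothetical helper, not part of the SDK; the delay value depends on your plan's rate limit):

```typescript
// scrape-queue.ts — hypothetical sequential queue with a fixed inter-request delay.
export async function runWithDelay<T>(
  tasks: Array<() => Promise<T>>,
  delayMs: number,
): Promise<T[]> {
  const results: T[] = [];
  for (const task of tasks) {
    results.push(await task()); // run one task at a time
    if (delayMs > 0) await new Promise((r) => setTimeout(r, delayMs)); // pause between requests
  }
  return results;
}
```

Usage: wrap each `firecrawl.scrapeUrl(...)` call in a thunk, e.g. `runWithDelay(urls.map((u) => () => firecrawl.scrapeUrl(u, { formats: ["markdown"] })), 1000)`.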
Basic usage: deploy a single Firecrawl-backed API route (like api/scrape.ts above) to Vercel with FIRECRAWL_API_KEY set as a production environment variable.
Advanced scenario: self-host Firecrawl with Docker Compose, point the app at it via FIRECRAWL_API_URL, and add webhook and health-check endpoints for async crawls and production monitoring.
For webhook handling, see firecrawl-webhooks-events.