From vercel-pack
Implement reliability patterns for Vercel deployments including circuit breakers, retry logic, and graceful degradation. Use when building fault-tolerant serverless functions, implementing retry strategies, or adding resilience to production Vercel services. Trigger with phrases like "vercel reliability", "vercel circuit breaker", "vercel resilience", "vercel fallback", "vercel graceful degradation".
Install:

```shell
npx claudepluginhub flight505/skill-forge --plugin vercel-pack
```
Build fault-tolerant Vercel deployments with circuit breakers, retry logic, graceful degradation, and instant rollback integration. Addresses reliability at two levels: function-level resilience (protecting against dependency failures) and deployment-level resilience (protecting against bad deploys).
```typescript
// lib/circuit-breaker.ts
type CircuitState = 'CLOSED' | 'OPEN' | 'HALF_OPEN';

class CircuitBreaker {
  private state: CircuitState = 'CLOSED';
  private failures = 0;
  private lastFailure = 0;
  private readonly threshold: number;
  private readonly resetTimeMs: number;

  constructor(threshold = 5, resetTimeMs = 30000) {
    this.threshold = threshold;
    this.resetTimeMs = resetTimeMs;
  }

  async call<T>(fn: () => Promise<T>, fallback: () => T): Promise<T> {
    if (this.state === 'OPEN') {
      if (Date.now() - this.lastFailure > this.resetTimeMs) {
        // Reset window elapsed: allow one probe request through
        this.state = 'HALF_OPEN';
      } else {
        console.warn('Circuit OPEN — returning fallback');
        return fallback();
      }
    }
    try {
      const result = await fn();
      this.onSuccess();
      return result;
    } catch (error) {
      this.onFailure();
      console.error('Circuit breaker caught error:', error);
      return fallback();
    }
  }

  private onSuccess(): void {
    this.failures = 0;
    this.state = 'CLOSED';
  }

  private onFailure(): void {
    this.failures++;
    this.lastFailure = Date.now();
    if (this.failures >= this.threshold) {
      this.state = 'OPEN';
      console.warn(`Circuit OPENED after ${this.failures} failures`);
    }
  }
}

// Usage in a serverless function (assumes a `db` client, e.g. Prisma, is in scope):
const dbCircuit = new CircuitBreaker(3, 30000);

export default async function handler(req, res) {
  const users = await dbCircuit.call(
    () => db.user.findMany({ take: 10 }),
    () => [] // Fallback: empty array when DB is down
  );
  res.json({ users, degraded: users.length === 0 });
}
```
Important for serverless: Circuit breaker state lives in a single function instance. Different instances have independent circuits. For global circuit state, use Vercel KV or Edge Config.
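One way to share circuit state across instances is to keep the failure count in an external store. The sketch below is an assumption, not an official Vercel API: `CircuitStore` is a hypothetical interface that in production you might back with `kv.get`/`kv.set` from `@vercel/kv`; it is abstracted here so the logic stands on its own.

```typescript
// Hypothetical shared-state circuit: all function instances read and write the
// same record, so they agree on whether the circuit is open.
interface CircuitStore {
  get(key: string): Promise<{ failures: number; openedAt: number } | null>;
  set(key: string, value: { failures: number; openedAt: number }): Promise<void>;
}

async function callWithSharedCircuit<T>(
  store: CircuitStore,
  key: string,
  fn: () => Promise<T>,
  fallback: () => T,
  threshold = 5,
  resetTimeMs = 30000
): Promise<T> {
  const state = (await store.get(key)) ?? { failures: 0, openedAt: 0 };
  const open =
    state.failures >= threshold && Date.now() - state.openedAt < resetTimeMs;
  if (open) return fallback(); // circuit open for every instance: skip the call
  try {
    const result = await fn();
    if (state.failures > 0) await store.set(key, { failures: 0, openedAt: 0 });
    return result;
  } catch {
    await store.set(key, { failures: state.failures + 1, openedAt: Date.now() });
    return fallback();
  }
}
```

Note the read-modify-write on the store is not atomic, so concurrent instances can undercount failures; an atomic increment (Vercel KV exposes Redis-style `incr`) would make the count exact.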
```typescript
// lib/retry.ts
interface RetryOptions {
  maxRetries?: number;
  baseDelayMs?: number;
  maxDelayMs?: number;
  retryOn?: (error: unknown) => boolean;
}

async function withRetry<T>(
  fn: () => Promise<T>,
  options: RetryOptions = {}
): Promise<T> {
  const { maxRetries = 3, baseDelayMs = 200, maxDelayMs = 5000, retryOn } = options;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (error) {
      if (attempt === maxRetries) throw error;
      if (retryOn && !retryOn(error)) throw error;
      // Exponential backoff with jitter, capped at maxDelayMs
      const delay = Math.min(
        baseDelayMs * Math.pow(2, attempt) + Math.random() * 200,
        maxDelayMs
      );
      await new Promise(r => setTimeout(r, delay));
    }
  }
  throw new Error('Unreachable');
}

// Usage:
const data = await withRetry(
  () => fetch('https://api.example.com/data').then(r => {
    if (!r.ok) throw new Error(`HTTP ${r.status}`);
    return r.json();
  }),
  {
    maxRetries: 3,
    retryOn: (err) => {
      // Only retry on network errors and 5xx, not 4xx
      if (err instanceof TypeError) return true; // fetch throws TypeError on network failure
      return err instanceof Error && /^HTTP 5\d\d$/.test(err.message);
    },
  }
);
```
```typescript
// api/products.ts — serve stale data when the primary source is down
import { kv } from '@vercel/kv';

export default async function handler(req, res) {
  const cacheKey = 'products:latest';
  try {
    // Try the primary data source
    const freshData = await fetchProductsFromDB();
    // Update the cache with fresh data (@vercel/kv serializes JSON automatically)
    await kv.set(cacheKey, freshData, { ex: 3600 });
    res.setHeader('x-data-source', 'live');
    res.json(freshData);
  } catch (error) {
    // Primary failed — serve the stale cache
    const cachedData = await kv.get(cacheKey);
    if (cachedData) {
      console.warn('Serving stale cache — primary source unavailable');
      res.setHeader('x-data-source', 'cache-stale');
      res.json(cachedData);
    } else {
      // No cache available — return a degraded response
      res.setHeader('x-data-source', 'degraded');
      res.status(503).json({
        error: 'Service temporarily unavailable',
        degraded: true,
      });
    }
  }
}
```
```typescript
// api/orders/route.ts — idempotent order creation
import { NextRequest, NextResponse } from 'next/server';
import { db } from '@/lib/db';

export async function POST(request: NextRequest) {
  const idempotencyKey = request.headers.get('idempotency-key');
  if (!idempotencyKey) {
    return NextResponse.json(
      { error: 'idempotency-key header required' },
      { status: 400 }
    );
  }

  // Check whether this request was already processed
  const existing = await db.idempotencyRecord.findUnique({
    where: { key: idempotencyKey },
  });
  if (existing) {
    // Return the cached response — same status and body
    return NextResponse.json(JSON.parse(existing.responseBody), {
      status: existing.responseStatus,
      headers: { 'x-idempotent-replay': 'true' },
    });
  }

  // Process the order
  const body = await request.json();
  const order = await db.order.create({ data: body });

  // Cache the response for idempotency. A unique constraint on `key` guards
  // against two concurrent requests both passing the lookup above.
  const responseBody = JSON.stringify({ order });
  await db.idempotencyRecord.create({
    data: { key: idempotencyKey, responseStatus: 201, responseBody },
  });

  return NextResponse.json({ order }, { status: 201 });
}
```
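On the client side, the idempotency key must be generated once per logical operation and reused on every retry; otherwise each retry looks like a new order. A sketch under that assumption (the helper name, URL, and injectable `send` parameter are illustrative, not part of the route above; `send` exists only to make the helper testable):

```typescript
import { randomUUID } from 'node:crypto';

// Hypothetical client helper: one idempotency key per logical operation,
// reused across every retry attempt so the server deduplicates replays.
type SendFn = (
  url: string,
  init: { method: string; headers: Record<string, string>; body: string }
) => Promise<Response>;

async function postOrderWithRetry(
  url: string,
  body: unknown,
  send: SendFn = (u, init) => fetch(u, init), // injectable for testing
  maxRetries = 3
): Promise<Response> {
  const idempotencyKey = randomUUID(); // generated ONCE, not per attempt
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      const res = await send(url, {
        method: 'POST',
        headers: {
          'content-type': 'application/json',
          'idempotency-key': idempotencyKey,
        },
        body: JSON.stringify(body),
      });
      if (res.status < 500 || attempt === maxRetries) return res; // never retry 4xx
    } catch (err) {
      if (attempt === maxRetries) throw err; // network failure, retries exhausted
    }
    await new Promise(r => setTimeout(r, 200 * 2 ** attempt)); // simple backoff
  }
  throw new Error('Unreachable');
}
```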
```typescript
// api/health/route.ts
import { db } from '@/lib/db';
import { kv } from '@vercel/kv';

export const dynamic = 'force-dynamic';

interface HealthCheck {
  name: string;
  check: () => Promise<boolean>;
}

const checks: HealthCheck[] = [
  {
    name: 'database',
    check: async () => {
      await db.$queryRaw`SELECT 1`;
      return true;
    },
  },
  {
    name: 'cache',
    check: async () => {
      await kv.ping();
      return true;
    },
  },
  {
    name: 'external-api',
    check: async () => {
      const r = await fetch('https://api.example.com/health', {
        signal: AbortSignal.timeout(3000),
      });
      return r.ok;
    },
  },
];

export async function GET() {
  const results: Record<string, 'ok' | 'error'> = {};
  await Promise.all(
    checks.map(async ({ name, check }) => {
      try {
        await check();
        results[name] = 'ok';
      } catch {
        results[name] = 'error';
      }
    })
  );
  const healthy = Object.values(results).every(v => v === 'ok');
  return Response.json(
    { status: healthy ? 'healthy' : 'degraded', checks: results },
    { status: healthy ? 200 : 503 }
  );
}
```
```shell
# Instant rollback on health check failure (CI integration)
DEPLOY_URL=$(vercel --prod)
HEALTH=$(curl -s -o /dev/null -w "%{http_code}" "$DEPLOY_URL/api/health")
if [ "$HEALTH" != "200" ]; then
  echo "Health check failed ($HEALTH) — rolling back"
  vercel rollback
  exit 1
fi
echo "Deployment healthy"
```

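A single failed probe against a flaky health endpoint can roll back a good deployment. One mitigation, sketched here as a CI fragment (retry count, delay, and `DEPLOY_URL` are placeholders), is to probe several times before rolling back:

```shell
# Retry the health probe before declaring the deployment bad.
ATTEMPTS=3
DELAY=5
HEALTHY=0
for i in $(seq 1 "$ATTEMPTS"); do
  CODE=$(curl -s -o /dev/null -w "%{http_code}" "$DEPLOY_URL/api/health")
  if [ "$CODE" = "200" ]; then HEALTHY=1; break; fi
  echo "Attempt $i failed ($CODE), retrying in ${DELAY}s"
  sleep "$DELAY"
done
if [ "$HEALTHY" != "1" ]; then
  echo "Health check failed after $ATTEMPTS attempts, rolling back"
  vercel rollback
  exit 1
fi
```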
| Pattern | Protects Against | Vercel Implementation |
|---|---|---|
| Circuit breaker | Dependency degradation | In-function state or Edge Config |
| Retry + backoff | Transient failures | withRetry wrapper |
| Stale cache | Primary source outage | Vercel KV with TTL |
| Idempotency | Duplicate mutations | Database record per request |
| Health checks | Bad deployments | /api/health + rollback automation |
| Instant rollback | Deployment regression | vercel rollback in CI |

Troubleshooting:

| Error | Cause | Solution |
|---|---|---|
| Circuit opens too aggressively | Threshold too low | Increase failure threshold (e.g., 5 → 10) |
| Retry causes duplicate side effects | No idempotency | Add idempotency-key to mutation endpoints |
| Stale cache expired | TTL too short or never populated | Increase TTL, seed cache on deploy |
| Health check false positive | Timeout too short | Increase AbortSignal timeout to 5s |
| Rollback reverts good deployment | Flaky health check | Add retry to health check before rollback |
For policy guardrails, see vercel-policy-guardrails.