Optimize Vercel API performance with caching, batching, and connection pooling. Use when experiencing slow API responses, implementing caching strategies, or optimizing request throughput for Vercel integrations. Trigger with phrases like "vercel performance", "optimize vercel", "vercel latency", "vercel caching", "vercel slow", "vercel batch".
From vercel-pack: `npx claudepluginhub nickloveinvesting/nick-love-plugins --plugin vercel-pack`
Reference files: `references/caching-strategy.md`, `references/errors.md`, `references/examples.md`
Performance tuning Vercel deployments centers on four levers: reducing JavaScript bundle size (which directly controls Time to Interactive), leveraging Vercel's edge caching for static and dynamic content, minimizing serverless function cold-start time, and moving latency-critical code to Edge Functions, which run close to the user without cold-start penalties. Measure before and after each optimization to confirm the impact and justify the added complexity.
Measure current performance metrics using Vercel's built-in analytics or Lighthouse. Record Core Web Vitals (LCP, CLS, and INP, which replaced FID as a Core Web Vital in 2024), serverless function cold-start latency, and cache hit rates for the most-visited routes. Identify which pages or API routes account for the largest share of poor-performing user experiences.
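Google publishes fixed "good" / "needs improvement" / "poor" thresholds for each Core Web Vital, which makes triage easy to automate. The helper below is an illustrative sketch (the function name and input shape are ours, not part of Vercel's analytics API); the thresholds themselves are the published ones:

```javascript
// Classify Core Web Vitals readings against Google's published thresholds:
// LCP: good <= 2500 ms, poor > 4000 ms
// INP: good <= 200 ms,  poor > 500 ms
// CLS: good <= 0.1,     poor > 0.25
function rateWebVitals({ lcpMs, inpMs, cls }) {
  const rate = (value, good, poor) =>
    value <= good ? "good" : value <= poor ? "needs improvement" : "poor";
  return {
    lcp: rate(lcpMs, 2500, 4000),
    inp: rate(inpMs, 200, 500),
    cls: rate(cls, 0.1, 0.25),
  };
}
```

Feed it values from Vercel's analytics export or a Lighthouse run (e.g. `npx lighthouse <url> --only-categories=performance`) to flag which routes to optimize first.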
Configure Vercel's cache-control headers to maximize edge cache utilization for eligible responses. For dynamic pages generated with Next.js ISR or Vercel's stale-while-revalidate pattern, set revalidation intervals appropriate to how frequently the underlying data changes. Cache hit rate is the single most impactful metric to improve since it eliminates function invocations entirely.
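Concretely, a response opts into edge caching through its Cache-Control header: `s-maxage` sets how long the edge serves the cached copy, and `stale-while-revalidate` lets it keep serving the stale copy while refreshing in the background. The helper and handler below are an illustrative sketch (the function names are ours, not a Vercel API); the header directives are the ones Vercel's edge cache honors:

```javascript
// Build a Cache-Control value for Vercel's edge cache.
function edgeCacheControl(sMaxAge, staleWhileRevalidate) {
  return `public, s-maxage=${sMaxAge}, stale-while-revalidate=${staleWhileRevalidate}`;
}

// Hypothetical Node-style API route showing where the header is set:
// serve from the edge for 60 s, then serve stale for up to 300 s while revalidating.
function handler(req, res) {
  res.setHeader("Cache-Control", edgeCacheControl(60, 300));
  res.end(JSON.stringify({ ok: true }));
}
```

Pick `s-maxage` to match how often the underlying data actually changes; a generous `stale-while-revalidate` window keeps responses fast even when the cached copy has just expired.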
Use DataLoader or similar batching libraries to eliminate N+1 query patterns in serverless function handlers. Batching reduces the number of database round-trips per request, which has a multiplicative effect on function latency since each network hop adds cold connection overhead.
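The core batching idea can be sketched in a few lines. This is a minimal illustration of the DataLoader pattern, not the real `dataloader` package API: calls to `load()` made in the same tick are coalesced into a single `batchFn` call, so N point lookups become one round-trip.

```javascript
// Minimal DataLoader-style batcher (illustrative sketch).
// batchFn receives an array of keys and must resolve to results in the same order.
function createBatchLoader(batchFn) {
  let queue = [];
  return function load(key) {
    return new Promise((resolve, reject) => {
      queue.push({ key, resolve, reject });
      if (queue.length === 1) {
        // Flush once every caller in the current tick has enqueued its key.
        process.nextTick(async () => {
          const batch = queue;
          queue = [];
          try {
            const results = await batchFn(batch.map((item) => item.key));
            batch.forEach((item, i) => item.resolve(results[i]));
          } catch (err) {
            batch.forEach((item) => item.reject(err));
          }
        });
      }
    });
  };
}
```

In a handler, `Promise.all([loadUser(1), loadUser(2), loadUser(3)])` then issues one batched query instead of three. The production `dataloader` library adds per-request caching on top of this coalescing.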
Configure connection pooling and keep-alive for database connections used inside serverless functions. Since functions may be invoked by many concurrent instances, use a connection pooler like PgBouncer to prevent connection exhaustion at the database level.
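A typical setup keeps the pool at module scope so warm invocations reuse sockets, and points the connection string at PgBouncer rather than Postgres directly. This is a configuration sketch assuming the `pg` (node-postgres) package; the connection string and limits are illustrative:

```javascript
const { Pool } = require("pg");

// Module scope: warm function invocations reuse these sockets instead of reconnecting.
const pool = new Pool({
  connectionString: process.env.DATABASE_URL, // point at PgBouncer, not Postgres directly
  max: 1,                   // one connection per function instance; PgBouncer multiplexes
  idleTimeoutMillis: 30000, // release idle sockets before the instance is frozen
  keepAlive: true,          // TCP keep-alive avoids re-handshaking on warm invocations
});

module.exports = async function handler(req, res) {
  const { rows } = await pool.query("SELECT 1 AS ok");
  res.end(JSON.stringify(rows[0]));
};
```

Keeping `max` low per instance matters because concurrency on serverless platforms comes from many instances, not many connections per instance; the pooler absorbs the fan-in.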
See ${CLAUDE_SKILL_DIR}/references/errors.md for comprehensive error handling.
See ${CLAUDE_SKILL_DIR}/references/examples.md for detailed examples.