Implements API throttling middleware with concurrency limits, circuit breakers, priority queues, and adaptive controls to protect backends from overload.
Install: npx claudepluginhub jeremylongshore/claude-code-plugins-plus-skills --plugin api-throttling-manager
Implement API throttling policies that protect backend services from overload by controlling request concurrency, queue depth, and processing rates. Apply backpressure mechanisms including concurrent request limits, priority queues, circuit breakers, and adaptive throttling that adjusts limits based on real-time backend health metrics.
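The core backpressure mechanism described above, a concurrency limit with a bounded wait queue, can be sketched roughly as follows. The Express-style `(req, res, next)` signature and the default limit and queue depth are assumptions for illustration, not the skill's actual defaults:

```javascript
// Minimal concurrency-limit middleware sketch (Express-style signature).
// `limit` and `maxQueueDepth` values here are illustrative.
function createThrottle({ limit = 10, maxQueueDepth = 50 } = {}) {
  let active = 0;
  const queue = [];

  const release = () => {
    active--;
    const next = queue.shift();
    if (next) run(next); // admit the oldest waiting request
  };

  const run = ({ req, res, next }) => {
    active++;
    // Free the slot once the response has been fully sent.
    res.once('finish', release);
    next();
  };

  return function throttle(req, res, next) {
    if (active < limit) return run({ req, res, next });
    if (queue.length >= maxQueueDepth) {
      // Queue overflow: shed load rather than buffer unboundedly.
      res.statusCode = 503;
      res.setHeader('Retry-After', '1'); // rough estimate, tune per endpoint
      return res.end('Service Unavailable');
    }
    queue.push({ req, res, next }); // wait for a slot
  };
}
```

The queue here is plain FIFO; the skill's priority-queue middleware would instead order waiting requests by tier.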
Throttled responses include a Retry-After header while the circuit is in the open state, along with X-Throttle-Limit, X-Throttle-Remaining, and X-Throttle-Reset headers for client-side awareness. Load tests cover throttle engagement, Retry-After values, and recovery behavior when load subsides. See ${CLAUDE_SKILL_DIR}/references/implementation.md for the full implementation guide.
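A small helper for attaching the X-Throttle-* state headers might look like this; the shape of the state object and the seconds-based reset convention are assumptions, not part of the documented API:

```javascript
// Sketch: attach throttle state headers to a response.
// `limit`, `active`, and `resetAt` (epoch ms) are an assumed state shape.
function setThrottleHeaders(res, { limit, active, resetAt }) {
  res.setHeader('X-Throttle-Limit', String(limit));
  res.setHeader('X-Throttle-Remaining', String(Math.max(0, limit - active)));
  // Reset time as a Unix timestamp in seconds (one common convention).
  res.setHeader('X-Throttle-Reset', String(Math.ceil(resetAt / 1000)));
}
```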
- ${CLAUDE_SKILL_DIR}/src/middleware/throttle.js - Concurrency and request rate throttling middleware
- ${CLAUDE_SKILL_DIR}/src/middleware/circuit-breaker.js - Circuit breaker for downstream service protection
- ${CLAUDE_SKILL_DIR}/src/middleware/priority-queue.js - Tier-based request prioritization
- ${CLAUDE_SKILL_DIR}/src/config/throttle-config.js - Per-endpoint throttle policy definitions
- ${CLAUDE_SKILL_DIR}/tests/throttle/ - Load tests validating throttle engagement and recovery

| Error | Cause | Solution |
|---|---|---|
| 503 Service Unavailable | Concurrency limit reached for the endpoint | Return Retry-After header with estimated wait time; include throttle state headers |
| 503 Circuit Open | Circuit breaker tripped due to downstream failures | Return cached response if available; provide circuit reset time in response body |
| Queue overflow | Request buffer exceeded maximum depth | Reject with 503; alert operations team; consider scaling backend capacity |
| Stale throttle state | Redis connection lost; throttle counters become inaccurate | Fall back to in-process counters; reconnect with backoff; log state inconsistency |
| Priority starvation | Low-tier requests never served under sustained high-tier load | Reserve minimum throughput percentage for each tier to prevent complete starvation |
Refer to ${CLAUDE_SKILL_DIR}/references/errors.md for comprehensive error patterns.
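To illustrate the 503 Circuit Open row above, here is a minimal circuit breaker in the spirit of the `circuit-breaker.js` middleware. The thresholds, timings, and class shape are illustrative, not the skill's actual implementation:

```javascript
// Minimal circuit-breaker sketch: opens after `failureThreshold` consecutive
// failures, rejects calls while open, and half-opens after `resetTimeoutMs`.
class CircuitBreaker {
  constructor({ failureThreshold = 5, resetTimeoutMs = 30000, now = Date.now } = {}) {
    this.failureThreshold = failureThreshold;
    this.resetTimeoutMs = resetTimeoutMs;
    this.now = now; // injectable clock, useful in tests
    this.failures = 0;
    this.state = 'closed';
    this.openedAt = 0;
  }

  // Seconds until a half-open probe is allowed (usable as Retry-After).
  retryAfterSeconds() {
    const remaining = this.openedAt + this.resetTimeoutMs - this.now();
    return Math.max(0, Math.ceil(remaining / 1000));
  }

  async call(fn) {
    if (this.state === 'open') {
      if (this.now() - this.openedAt < this.resetTimeoutMs) {
        const err = new Error('circuit open');
        err.retryAfter = this.retryAfterSeconds();
        throw err; // fast-fail instead of hitting the failing backend
      }
      this.state = 'half-open'; // allow a single probe request through
    }
    try {
      const result = await fn();
      this.failures = 0;
      this.state = 'closed';
      return result;
    } catch (err) {
      this.failures++;
      if (this.failures >= this.failureThreshold || this.state === 'half-open') {
        this.state = 'open';
        this.openedAt = this.now();
      }
      throw err;
    }
  }
}
```

The `retryAfter` value carried on the rejection error is what the middleware would surface in the Retry-After header and circuit-reset-time response body.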
Database-heavy endpoint protection: Apply concurrency limit of 10 to a report generation endpoint that runs expensive aggregation queries, queueing additional requests with estimated wait times.
Multi-tier SaaS throttling: Enterprise tier gets 100 concurrent requests, Pro tier gets 25, Free tier gets 5, with priority queue ensuring enterprise requests are served first during contention.
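The tiered policy above can be sketched as a per-tier cap table plus a priority queue drained highest tier first. The class names and structure are hypothetical; only the 100/25/5 split comes from the scenario:

```javascript
// Per-tier concurrency caps from the scenario (enterprise/pro/free).
const TIER_LIMITS = { enterprise: 100, pro: 25, free: 5 };
const TIER_PRIORITY = { enterprise: 0, pro: 1, free: 2 }; // lower = served first

// Priority queue: buckets per priority, FIFO within a bucket.
class TieredQueue {
  constructor() {
    this.buckets = new Map(); // priority -> array of waiting jobs
  }
  enqueue(tier, job) {
    const p = TIER_PRIORITY[tier];
    if (!this.buckets.has(p)) this.buckets.set(p, []);
    this.buckets.get(p).push(job);
  }
  dequeue() {
    // Scan priorities in ascending order so higher tiers win contention.
    for (const p of [...this.buckets.keys()].sort((a, b) => a - b)) {
      const bucket = this.buckets.get(p);
      if (bucket.length) return bucket.shift();
    }
    return undefined;
  }
}
```

As the priority-starvation row in the error table notes, a production dequeue would also reserve a minimum throughput share per tier rather than always draining strictly top-down.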
Adaptive autoscaling trigger: Throttle middleware emits metrics that trigger horizontal pod autoscaling when throttle engagement rate exceeds 20% sustained over 5 minutes.
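The engagement-rate signal could be computed over a rolling window like this; the class and its bookkeeping are illustrative, and wiring the value to an autoscaler (for example a Kubernetes HPA on a custom metric) is left to the metrics pipeline:

```javascript
// Throttle engagement rate: fraction of recent requests that were throttled.
// Window defaults to the 5 minutes mentioned in the scenario.
class EngagementRate {
  constructor(windowMs = 5 * 60 * 1000, now = Date.now) {
    this.windowMs = windowMs;
    this.now = now; // injectable clock for testing
    this.events = []; // { at, throttled }
  }
  record(throttled) {
    this.events.push({ at: this.now(), throttled });
  }
  rate() {
    // Drop events older than the window, then compute the throttled fraction.
    const cutoff = this.now() - this.windowMs;
    this.events = this.events.filter(e => e.at >= cutoff);
    if (this.events.length === 0) return 0;
    const throttled = this.events.filter(e => e.throttled).length;
    return throttled / this.events.length;
  }
}
```

A scaling trigger would then fire when `rate()` stays above 0.20 for the full window.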
See ${CLAUDE_SKILL_DIR}/references/examples.md for additional examples.