From flexport-pack
Optimizes Flexport API performance for logistics data using max pagination, LRU caching, p-queue parallel requests, and webhook invalidation.
Install:

```shell
npx claudepluginhub jeremylongshore/claude-code-plugins-plus-skills --plugin flexport-pack
```
Optimize Flexport API integration performance. The API is rate-limited and serves logistics data that changes infrequently (shipments update hourly, products rarely). Cache aggressively for reads, batch writes, and use maximum page sizes.
```typescript
// Default per=25. Use per=100 (the maximum) to reduce API calls by 4x.
// `flexport(path)` is this project's thin wrapper around the Flexport REST client.
async function fetchAllShipments(): Promise<Shipment[]> {
  const all: Shipment[] = [];
  let page = 1;
  while (true) {
    const res = await flexport(`/shipments?per=100&page=${page}`);
    all.push(...res.data.records);
    if (res.data.records.length < 100) break; // a short page is the last page
    page++;
  }
  // 1000 shipments = 10 API calls instead of 40
  return all;
}
```
```typescript
import { LRUCache } from 'lru-cache';

const cache = new LRUCache<string, any>({
  max: 500,
  ttl: 5 * 60 * 1000, // 5 min for shipment data
});

// Products change rarely, so cache them longer
const productCache = new LRUCache<string, any>({
  max: 1000,
  ttl: 60 * 60 * 1000, // 1 hour
});

async function cachedFlexport(path: string, ttlCache = cache): Promise<any> {
  const cached = ttlCache.get(path);
  // Compare against undefined so falsy cached values still count as hits
  if (cached !== undefined) return cached;
  const data = await flexport(path);
  ttlCache.set(path, data);
  return data;
}
```
```typescript
import PQueue from 'p-queue';

// At most 5 requests in flight and no more than 10 started per second,
// to stay under the API rate limit
const queue = new PQueue({ concurrency: 5, interval: 1000, intervalCap: 10 });

// Fetch details for multiple shipments in parallel
async function enrichShipments(ids: string[]) {
  return Promise.all(
    ids.map((id) => queue.add(() => flexport(`/shipments/${id}`))),
  );
}
```
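The prompt above also calls for batching writes, which none of the snippets cover. Below is a minimal sketch of one way to coalesce individual updates into a single bulk request. The `WriteBatcher` class, its thresholds, and the idea of a Flexport bulk route are assumptions for illustration; check the Flexport API docs for the real bulk endpoint and pass a `flush` callback that calls it.

```typescript
// Hypothetical write batcher: coalesces updates and flushes them together.
// The flush callback would POST to a Flexport bulk endpoint (not shown;
// the exact route is an assumption to verify against the API docs).
type Update = { id: string; fields: Record<string, unknown> };

class WriteBatcher {
  private pending: Update[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private flush: (batch: Update[]) => Promise<void>,
    private maxBatch = 50,   // flush immediately at this size
    private maxWaitMs = 200, // otherwise flush after this delay
  ) {}

  add(update: Update): void {
    this.pending.push(update);
    if (this.pending.length >= this.maxBatch) {
      void this.flushNow();
    } else if (this.timer === null) {
      this.timer = setTimeout(() => void this.flushNow(), this.maxWaitMs);
    }
  }

  async flushNow(): Promise<void> {
    if (this.timer !== null) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    if (this.pending.length === 0) return;
    const batch = this.pending;
    this.pending = [];
    await this.flush(batch); // one API call instead of batch.length calls
  }
}
```

With this in place, N individual status PATCHes from the UI collapse into one bulk request, e.g. `batcher.add({ id, fields: { status: 'delivered' } })`.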
```typescript
// Instead of polling, invalidate cached entries when webhook events arrive
async function handleWebhook(event: any) {
  if (event.type.startsWith('shipment.')) {
    cache.delete(`/shipments/${event.data.shipment_id}`);
    // List responses are cached under their full path including the query
    // string, so purge every /shipments list entry, not just '/shipments'.
    for (const key of [...cache.keys()]) {
      if (key === '/shipments' || key.startsWith('/shipments?')) {
        cache.delete(key);
      }
    }
  }
  if (event.type.startsWith('product.')) {
    productCache.delete(`/products/${event.data.product_id}`);
  }
}
```
| Metric | Target | Strategy |
|---|---|---|
| Shipment list load | < 500ms | Cache with 5min TTL |
| Product lookup | < 100ms | Cache with 1hr TTL |
| Bulk shipment fetch | < 3s for 100 | Parallel with p-queue |
| Dashboard refresh | < 2s | Stale-while-revalidate |
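The stale-while-revalidate row in the table has no example above. One minimal sketch: serve whatever is cached immediately, and refresh in the background once an entry is older than `staleMs`. Here `fetcher` stands in for any Flexport API call, and `makeSWR` and its parameters are illustrative names, not part of the plugin.

```typescript
// Stale-while-revalidate sketch: fast responses from cache, with a
// background refresh once the entry goes stale. `fetcher` is a stand-in
// for an actual API call such as `cachedFlexport`.
type Entry<T> = { value: T; storedAt: number };

function makeSWR<T>(fetcher: (key: string) => Promise<T>, staleMs: number) {
  const store = new Map<string, Entry<T>>();
  const inflight = new Map<string, Promise<T>>(); // dedupe concurrent refreshes

  async function refresh(key: string): Promise<T> {
    let p = inflight.get(key);
    if (!p) {
      p = fetcher(key).then((value) => {
        store.set(key, { value, storedAt: Date.now() });
        inflight.delete(key);
        return value;
      });
      inflight.set(key, p);
    }
    return p;
  }

  return async function get(key: string): Promise<T> {
    const hit = store.get(key);
    if (!hit) return refresh(key); // cold cache: caller waits once
    if (Date.now() - hit.storedAt > staleMs) {
      void refresh(key); // stale: refresh in the background
    }
    return hit.value; // always fast after warm-up
  };
}
```

A dashboard wired through `get('/shipments?per=100&page=1')` then renders from cache within the 2s budget while fresh data loads behind the scenes.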
For cost optimization, see flexport-cost-tuning.