From figma-pack
Optimizes Figma REST API performance with partial fetches, depth limits, and LRU caching for large files and slow responses.
```shell
npx claudepluginhub jeremylongshore/claude-code-plugins-plus-skills --plugin figma-pack
```
Optimize Figma REST API performance. Large Figma files can return multi-megabyte JSON responses. Key strategies: fetch only what you need, cache aggressively, and batch requests.
```typescript
// BAD: fetches the entire file tree (can be 10+ MB for large files)
const file = await fetch(`https://api.figma.com/v1/files/${fileKey}`, {
  headers: { 'X-Figma-Token': token },
}).then(r => r.json());

// GOOD: use the depth parameter to limit tree depth
// depth=1 returns only pages (CANVAS nodes), not their children
const fileMeta = await fetch(
  `https://api.figma.com/v1/files/${fileKey}?depth=1`,
  { headers: { 'X-Figma-Token': token } }
).then(r => r.json());

// GOOD: fetch only the specific nodes you need
const nodes = await fetch(
  `https://api.figma.com/v1/files/${fileKey}/nodes?ids=${nodeIds.join(',')}`,
  { headers: { 'X-Figma-Token': token } }
).then(r => r.json());

// GOOD: request plugin_data or branch_data only when needed
// By default, plugin data and branch data are NOT returned
```
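Node IDs contain colons (e.g. `1:2`), so the joined ID list should be percent-encoded before it goes into the query string. A small helper sketch for building the `/nodes` URL (the function name is ours, not part of any library):

```typescript
// Hypothetical helper: build a /nodes URL with percent-encoded node IDs.
// Figma node IDs contain colons, so encode the joined list.
function buildNodesUrl(fileKey: string, nodeIds: string[]): string {
  const ids = encodeURIComponent(nodeIds.join(','));
  return `https://api.figma.com/v1/files/${fileKey}/nodes?ids=${ids}`;
}
```

Example: `buildNodesUrl('abc123', ['1:2', '3:4'])` yields a URL with `ids=1%3A2%2C3%3A4`.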
```typescript
import { LRUCache } from 'lru-cache';

// File metadata changes rarely -- cache for 5 minutes
const fileCache = new LRUCache<string, any>({
  max: 100,
  ttl: 5 * 60 * 1000, // 5 minutes
});

async function getCachedFile(fileKey: string, token: string) {
  const cached = fileCache.get(fileKey);
  if (cached) return cached;
  const file = await fetch(
    `https://api.figma.com/v1/files/${fileKey}?depth=1`,
    { headers: { 'X-Figma-Token': token } }
  ).then(r => r.json());
  fileCache.set(fileKey, file);
  return file;
}

// Image URLs expire after 30 days -- cache them, but with a shorter TTL
const imageUrlCache = new LRUCache<string, string>({
  max: 1000,
  ttl: 24 * 60 * 60 * 1000, // 1 day (well within the 30-day expiry)
});

async function getCachedImageUrl(
  fileKey: string, nodeId: string, format: string, token: string
): Promise<string | null> {
  const cacheKey = `${fileKey}:${nodeId}:${format}`;
  const cached = imageUrlCache.get(cacheKey);
  if (cached) return cached;
  const data = await fetch(
    `https://api.figma.com/v1/images/${fileKey}?ids=${nodeId}&format=${format}`,
    { headers: { 'X-Figma-Token': token } }
  ).then(r => r.json());
  const url = data.images[nodeId];
  if (url) imageUrlCache.set(cacheKey, url);
  return url;
}
```
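The `/images` endpoint returns `null` in the `images` map for nodes that fail to render, so it helps to filter those out before caching a batch of URLs. A sketch (the helper name is ours):

```typescript
// Hypothetical helper: drop null entries from a /images response map
// so only successfully rendered URLs get cached.
function validImageUrls(
  images: Record<string, string | null>
): Map<string, string> {
  const out = new Map<string, string>();
  for (const [id, url] of Object.entries(images)) {
    if (url) out.set(id, url);
  }
  return out;
}
```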
```typescript
// Instead of polling, use webhooks to know when to re-fetch
// See figma-webhooks-events for full webhook setup
async function handleFileUpdate(fileKey: string) {
  // Invalidate cached data for this file
  fileCache.delete(fileKey);
  // Proactively re-fetch commonly accessed data
  const token = process.env.FIGMA_PAT!;
  await getCachedFile(fileKey, token);
  console.log(`Cache invalidated and refreshed for ${fileKey}`);
}
```
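Figma webhook payloads carry an `event_type` and a `file_key`; the glue between a webhook receiver and the invalidation above can be kept pure and testable. A sketch (the function and interface names are ours):

```typescript
// Hypothetical glue for a webhook receiver: decide which file key to
// invalidate from an incoming event payload. Only FILE_UPDATE events
// should trigger cache invalidation.
interface FigmaWebhookEvent {
  event_type: string;
  file_key?: string;
}

function fileKeyToInvalidate(event: FigmaWebhookEvent): string | null {
  if (event.event_type === 'FILE_UPDATE' && event.file_key) {
    return event.file_key;
  }
  return null;
}
```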
```typescript
// The /nodes endpoint accepts multiple IDs -- batch them
// Max practical batch size: ~50-100 IDs per request
async function batchFetchNodes(
  fileKey: string,
  nodeIds: string[],
  token: string,
  batchSize = 50
): Promise<Map<string, any>> {
  const results = new Map<string, any>();
  for (let i = 0; i < nodeIds.length; i += batchSize) {
    const batch = nodeIds.slice(i, i + batchSize);
    const ids = encodeURIComponent(batch.join(','));
    const res = await fetch(
      `https://api.figma.com/v1/files/${fileKey}/nodes?ids=${ids}`,
      { headers: { 'X-Figma-Token': token } }
    );
    if (!res.ok) throw new Error(`Figma API returned ${res.status} for batch at index ${i}`);
    const data = await res.json();
    for (const [id, node] of Object.entries(data.nodes)) {
      results.set(id, node);
    }
  }
  return results;
}
```
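The slicing logic inside `batchFetchNodes` can be factored into a small generic helper (the name is ours) so it is reusable and easy to unit-test:

```typescript
// Split an array into consecutive batches of at most `size` items.
// The last batch may be smaller.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}
```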
```typescript
import { Agent } from 'undici';

// Reuse HTTP connections to api.figma.com
const figmaAgent = new Agent({
  keepAliveTimeout: 30_000,
  keepAliveMaxTimeout: 60_000,
  connections: 5,
});

// Use with the Node.js 18+ built-in fetch
async function optimizedFetch(path: string, token: string) {
  return fetch(`https://api.figma.com${path}`, {
    headers: { 'X-Figma-Token': token },
    // @ts-ignore -- dispatcher is a Node.js fetch option, not in the TS types
    dispatcher: figmaAgent,
  });
}
```
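Figma throttles heavy clients with HTTP 429 responses, so any shared fetch helper should back off and retry rather than fail immediately. A minimal retry sketch, assuming 429 signals rate limiting and honoring a `Retry-After` header if one is present (both function names are ours):

```typescript
// Exponential backoff, capped at 30s: 1s, 2s, 4s, 8s, ...
// A Retry-After value (seconds) from the server takes precedence.
function backoffMs(attempt: number, retryAfterSec?: number): number {
  if (retryAfterSec && retryAfterSec > 0) return retryAfterSec * 1000;
  return Math.min(1000 * 2 ** attempt, 30_000);
}

async function fetchWithRetry(path: string, token: string, maxRetries = 3) {
  for (let attempt = 0; ; attempt++) {
    const res = await fetch(`https://api.figma.com${path}`, {
      headers: { 'X-Figma-Token': token },
    });
    if (res.status !== 429 || attempt >= maxRetries) return res;
    const retryAfter = Number(res.headers.get('Retry-After')) || undefined;
    await new Promise(r => setTimeout(r, backoffMs(attempt, retryAfter)));
  }
}
```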
| Issue | Cause | Solution |
|---|---|---|
| Stale cache | No invalidation | Use webhooks to invalidate on changes |
| Out of memory | Caching full file JSON | Use depth=1 or nodes endpoint |
| Slow image exports | Large batch, high scale | Reduce scale; batch in groups of 50 |
| Expired image URLs | Cached URL older than 30 days | Keep image cache TTL well under 30 days (e.g. 24 h) |
For cost optimization, see figma-cost-tuning.