From harness-claude
Implements the DataLoader pattern in GraphQL to batch and cache database fetches per request, eliminating N+1 queries in resolvers for lists and shared data.
npx claudepluginhub intense-visions/harness-engineering --plugin harness-claude

This skill uses the workspace's default tool permissions.
> Batch and cache data fetches to eliminate N+1 queries in GraphQL resolvers
DataLoader coalesces all .load(key) calls within a single tick of the event loop into one batch function call. It must be request-scoped to avoid leaking cached data between users.

```typescript
import DataLoader from 'dataloader';

function createLoaders(db: Database) {
  return {
    userById: new DataLoader<string, User>(async (ids) => {
      const users = await db.users.findByIds([...ids]);
      const userMap = new Map(users.map((u) => [u.id, u]));
      return ids.map((id) => userMap.get(id) || new Error(`User ${id} not found`));
    }),
  };
}
```
The batch function must return results in the same order as the input keys. This is the critical contract. If key index 0 is "abc", result index 0 must be the value for "abc" or an Error. Use a Map to reorder database results to match the input key order.
Attach loaders to the GraphQL context. Create fresh loaders for each request in your server's context factory.
```typescript
const server = new ApolloServer({
  typeDefs,
  resolvers,
  context: ({ req }) => ({
    currentUser: authenticate(req),
    loaders: createLoaders(db),
  }),
});
```
Call .load() in field resolvers instead of making direct database calls.

```typescript
const resolvers = {
  Order: {
    customer: (order, _args, { loaders }) => {
      return loaders.userById.load(order.customerId);
    },
  },
  Comment: {
    author: (comment, _args, { loaders }) => {
      return loaders.userById.load(comment.authorId);
    },
  },
};
```
When a query returns 50 orders, the customer field resolver calls .load() 50 times, but DataLoader batches them into a single findByIds([...50 ids]) call.
Use .loadMany() for one-to-many relationships where you have an array of keys.

```typescript
const resolvers = {
  User: {
    favoriteProducts: (user, _args, { loaders }) => {
      return loaders.productById.loadMany(user.favoriteProductIds);
    },
  },
};
```
Create loaders for different access patterns. A userById loader and a userByEmail loader are separate DataLoader instances with separate batch functions and separate caches.
Use .prime() to warm the cache when you already have the data. After a mutation creates a user, prime the loader so subsequent reads in the same request hit the cache.
```typescript
const newUser = await db.users.create(input);
context.loaders.userById.prime(newUser.id, newUser);
```
Pass { cache: false } to the DataLoader constructor if you need batching without memoization. This is rare, but useful for data that changes mid-request.

How batching works: DataLoader collects all .load() calls made synchronously (within the same microtask/tick). At the end of the tick, it calls the batch function once with all collected keys, then distributes results back to each caller. This turns N sequential queries into 1 batched query.
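Those mechanics can be sketched in a few lines (a simplified stand-in, not the real library; it omits caching and error propagation, and assumes only the platform's queueMicrotask):

```typescript
type BatchFn<K, V> = (keys: K[]) => Promise<V[]>;

class MiniLoader<K, V> {
  private queue: { key: K; resolve: (v: V) => void }[] = [];

  constructor(private batchFn: BatchFn<K, V>) {}

  load(key: K): Promise<V> {
    return new Promise((resolve) => {
      // The first load in this tick schedules exactly one flush.
      if (this.queue.length === 0) {
        queueMicrotask(() => this.flush());
      }
      this.queue.push({ key, resolve });
    });
  }

  private async flush() {
    const batch = this.queue;
    this.queue = [];
    // One batch call for every key collected during the tick.
    const values = await this.batchFn(batch.map((item) => item.key));
    batch.forEach((item, i) => item.resolve(values[i]));
  }
}

// Three synchronous loads collapse into a single batch call.
let calls = 0;
const loader = new MiniLoader<number, number>(async (keys) => {
  calls += 1;
  return keys.map((k) => k * 2);
});

Promise.all([loader.load(1), loader.load(2), loader.load(3)]).then((vals) => {
  console.log(calls, vals.join(',')); // → 1 2,4,6
});
```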
Per-request scope is mandatory. If you reuse a DataLoader across requests, User A's data will be served from cache to User B. Always create new DataLoader instances in the context factory.
Batch function constraints:

- Return an array (or a Promise of an array) with the same length as the keys array.
- Return results in the same order as the input keys.
- Return Error instances at the corresponding index for keys that fail or are missing.

Nested batching: DataLoader batches within a single tick, but GraphQL resolves level by level. This means field resolvers at depth 2 are batched separately from depth 3. This is usually fine: each level gets one batch call per loader.
Composite keys: When the lookup requires multiple values (e.g., orgId + userId), serialize the key to a string and provide a cacheKeyFn:
```typescript
new DataLoader<{ orgId: string; userId: string }, Membership>(
  async (keys) => {
    /* batch by org+user */
  },
  { cacheKeyFn: (key) => `${key.orgId}:${key.userId}` }
);
```
Common mistakes:

- Reusing a loader across requests, which leaks one user's cached data to another.
- Returning batch results in a different order than the input keys.
- Returning undefined for missing keys (returning undefined instead of Error breaks the contract).

Reference: https://github.com/graphql/dataloader