From lisa
Performance review methodology: detects N+1 queries, inefficient algorithms, memory leaks, missing indexes, unnecessary re-renders, and bundle-size issues, and backs each finding with evidence-based recommendations.
```
npx claudepluginhub codyswanngt/lisa --plugin lisa
```

This skill uses the workspace's default tool permissions.
Identify bottlenecks, inefficiencies, and scalability risks in code changes.
Structure findings as:
## Performance Analysis
### Critical Issues
Issues that will cause noticeable degradation at scale.
- [issue] -- where in the code, why it matters, estimated impact
### N+1 Query Detection
| Location | Pattern | Fix |
|----------|---------|-----|
| file:line | Description of the N+1 | Eager load / batch / join |
### Algorithmic Complexity
| Location | Current | Suggested | Why |
|----------|---------|-----------|-----|
| file:line | O(n^2) | O(n) | Description |
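A common instance of the O(n^2) → O(n) row above is a nested-loop membership check replaced by a `Set` lookup. A minimal sketch (the function and variable names are hypothetical):

```javascript
// Bad: O(n^2) -- `includes` scans `bannedIds` for every user
function filterBannedQuadratic(users, bannedIds) {
  return users.filter((u) => !bannedIds.includes(u.id));
}

// Good: O(n) -- build a Set once, then O(1) membership checks
function filterBannedLinear(users, bannedIds) {
  const banned = new Set(bannedIds);
  return users.filter((u) => !banned.has(u.id));
}
```

Both return identical results; only the lookup cost changes, which matters once both lists grow.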
### Database Concerns
- Missing indexes, unoptimized queries, excessive round trips
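The "excessive round trips" concern can be sketched as sequential awaits versus one batched call. The `db` object here is a hypothetical stub standing in for a real data layer:

```javascript
// Hypothetical data-layer stub -- stands in for a real client
const db = {
  async get(id) {
    return { id, name: `user-${id}` }; // one simulated round trip
  },
  async getMany(ids) {
    return ids.map((id) => ({ id, name: `user-${id}` })); // one batched query
  },
};

// Bad: one round trip per id, each awaited before the next starts
async function loadSequential(ids) {
  const results = [];
  for (const id of ids) {
    results.push(await db.get(id));
  }
  return results;
}

// Good: a single batched query (one round trip total)
async function loadBatched(ids) {
  return db.getMany(ids);
}
```

With N ids, the sequential version pays N network latencies; the batched version pays one.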
### Memory Concerns
- Unbounded growth, large allocations, retained references
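A minimal sketch of the unbounded-growth and retained-references pattern (names are illustrative): a module-level array that keeps every payload reachable forever, versus one that bounds what it retains so old entries can be garbage-collected.

```javascript
// Bad: retained references -- every payload stays reachable forever
const historyUnbounded = [];
function record(payload) {
  historyUnbounded.push(payload); // never trimmed; heap grows with traffic
}

// Good: bound what you retain (here, the 100 most recent entries)
const MAX_HISTORY = 100;
const historyBounded = [];
function recordBounded(payload) {
  historyBounded.push(payload);
  if (historyBounded.length > MAX_HISTORY) {
    historyBounded.shift(); // drop the oldest reference so it can be GC'd
  }
}
```

The same failure mode hides in caches, event-listener registries, and closures that capture large buffers.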
### Caching Opportunities
- Computations or queries that could benefit from caching
### Recommendations
- [recommendation] -- priority (critical/warning/suggestion), estimated impact
### Example Patterns

```js
// Bad: N+1 -- one query per user inside the loop
const users = await userRepo.find();
const profiles = await Promise.all(
  users.map((u) => profileRepo.findOne({ where: { userId: u.id } }))
);

// Good: single query with a join or batch load
const users = await userRepo.find({ relations: ["profile"] });
```

```js
// Bad: recomputes on every call
const getExpensiveResult = () => heavyComputation(data);

// Good: compute once, reuse the result
const expensiveResult = heavyComputation(data);
```

```js
// Bad: cache grows without limit
const cache = new Map();
const get = (key) => {
  if (!cache.has(key)) cache.set(key, compute(key));
  return cache.get(key);
};

// Good: LRU or otherwise bounded cache
import { LRUCache } from "lru-cache"; // from the `lru-cache` package
const cache = new LRUCache({ max: 1000 });
```