Guides a performance profiling workflow: measure a baseline, identify bottlenecks such as N+1 queries and memory leaks, apply optimizations such as caching and lazy loading, and verify improvements with Node.js, Python, and Go tooling.
npx claudepluginhub martinffx/atelier --plugin python

This skill uses the workspace's default tool permissions.
Performance profiling and optimization workflow.
Never optimize without profiling.
You don't know where the bottleneck is until you measure. Making changes without profiling wastes time and often makes things worse.
1. Measure the current performance.
2. Find the hot paths.
3. Apply the right optimization pattern.
4. Re-profile to confirm the improvement.
See references/common-bottlenecks.md for detailed patterns:
| Pattern | Symptom | Solution |
|---|---|---|
| N+1 | Multiple DB queries | Eager load, batch |
| Memory leak | Growing RSS | Clear caches, weak refs |
| Blocking I/O | Thread blocked | Async, worker pool |
| Unnecessary work | CPU high | Skip redundant calc |
| Large data | Slow serialization | Pagination, streams |
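The first row's N+1 fix can be sketched without a real database; the `Map` and `queryCount` below are stand-ins for a table and database round-trips, and the batched version models a single `SELECT ... WHERE id IN (...)`:

```javascript
// Sketch of fixing an N+1 pattern by batching lookups.
let queryCount = 0; // stands in for round-trips to a real database
const usersTable = new Map([[1, 'ada'], [2, 'lin'], [3, 'grace']]);

// N+1: one lookup per id, so N round-trips.
function fetchUsersOneByOne(ids) {
  return ids.map(id => { queryCount++; return usersTable.get(id); });
}

// Batched: a single round-trip, like SELECT ... WHERE id IN (...).
function fetchUsersBatched(ids) {
  queryCount++;
  return ids.map(id => usersTable.get(id));
}

queryCount = 0;
fetchUsersOneByOne([1, 2, 3]);
console.log('one-by-one queries:', queryCount); // 3

queryCount = 0;
fetchUsersBatched([1, 2, 3]);
console.log('batched queries:', queryCount); // 1
```

With a real ORM the same idea appears as eager loading (`include`/`JOIN`) or a dataloader-style batch layer.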
See references/optimization-patterns.md for patterns like these:
```javascript
// Before: every call hits the database
const user = await db.users.find(id);

// After: cache-aside with a 5-minute TTL
const cacheKey = `user:${id}`;
let user = JSON.parse(await redis.get(cacheKey)); // null on cache miss
if (!user) {
  user = await db.users.find(id);
  await redis.set(cacheKey, JSON.stringify(user), 'EX', 300);
}
```
```javascript
// Before: load everything
const allUsers = await db.users.findMany();

// After: paginate
const users = await db.users.findMany({
  take: 20,
  skip: page * 20,
});
```
```javascript
// Before: new connection each time
const client = new Client();
await client.connect();

// After: shared pool
const pool = new Pool({ max: 20 });
// Use pool.query() throughout
```
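For the blocking-I/O row in the table above, one common fix is overlapping independent requests with `Promise.all` instead of awaiting them one at a time. A sketch, with `setTimeout` standing in for real I/O:

```javascript
// Sketch: overlapping independent async calls with Promise.all.
// fakeFetch simulates an I/O call that takes ~50ms.
const fakeFetch = (id) =>
  new Promise(resolve => setTimeout(() => resolve(`result-${id}`), 50));

async function sequential(ids) {
  const out = [];
  for (const id of ids) out.push(await fakeFetch(id)); // serialized, ~50ms each
  return out;
}

async function concurrent(ids) {
  return Promise.all(ids.map(fakeFetch)); // all in flight at once, ~50ms total
}

(async () => {
  const t0 = Date.now();
  await sequential([1, 2, 3]);
  console.log('sequential:', Date.now() - t0, 'ms'); // ~150ms

  const t1 = Date.now();
  await concurrent([1, 2, 3]);
  console.log('concurrent:', Date.now() - t1, 'ms'); // ~50ms
})();
```

This only applies when the calls are independent; keep sequential awaits where each call depends on the previous result.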
See references/profiling-tools.md for per-language commands:
```shell
# Node.js: CPU profiling
node --prof app.js
node --prof-process isolate-*.log   # summarize the V8 tick log
# Memory: attach Chrome DevTools
node --inspect app.js
# clinic.js
npx clinic doctor -- node app.js
```

```shell
# Python: cProfile
python -m cProfile -o profile.prof app.py
# browse the saved profile interactively
python -m pstats profile.prof
```

```shell
# Go: pprof
go test -cpuprofile=cpu.prof
go tool pprof cpu.prof
# Web interface
go tool pprof -http=:8080 cpu.prof
```