Identifies performance bottlenecks in code, databases, and APIs. Uses profiling, timings, and fixes like indexing, caching, N+1 resolution, and pagination. Verifies with before/after measurements.
Find and fix performance bottlenecks. Measure, optimize, verify. Make it fast.
Never optimize without measuring:
// Measure execution time
console.time('operation');
await slowOperation();
console.timeEnd('operation'); // operation: 2341ms
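If the same measurement is needed in several places, a small wrapper keeps the before/after numbers consistent. This is a sketch; `timed` is a hypothetical helper name, not an existing API:

```javascript
// Hypothetical helper: times any async function, logs the duration,
// and returns both the result and the elapsed milliseconds.
async function timed(label, fn) {
  const start = process.hrtime.bigint();
  const result = await fn();
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`${label}: ${ms.toFixed(1)}ms`);
  return { result, ms };
}
```

Returning the duration (rather than only logging it) lets you record it in a benchmark or assert against a budget in tests.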
What to measure: wall-clock time of suspect operations, number of database queries, response payload size, and memory use.
Use profiling tools to find the slow parts:
Browser:
DevTools → Performance tab → Record → Stop
Look for long tasks (red bars)
Node.js:
node --prof app.js
node --prof-process isolate-*.log > profile.txt
Database:
EXPLAIN ANALYZE SELECT * FROM users WHERE email = 'test@example.com';
Fix the slowest thing first (biggest impact).
Problem: N+1 Queries
// Bad: N+1 queries
const users = await db.users.find();
for (const user of users) {
user.posts = await db.posts.find({ userId: user.id }); // N queries
}
// Good: Let the ORM batch the lookup
const users = await db.users.find()
.populate('posts'); // 2 queries total, not N+1
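When the ORM has no `populate`, the same fix can be done by hand: one `WHERE IN` query plus an in-memory group-by. A sketch, assuming the same hypothetical `db.users.find` / `db.posts.find` methods as above:

```javascript
// Sketch: two queries total, regardless of how many users there are.
async function usersWithPosts(db) {
  const users = await db.users.find();                          // query 1
  const ids = users.map(u => u.id);
  const posts = await db.posts.find({ userId: { $in: ids } });  // query 2
  // Group posts by userId so each user gets its own slice.
  const byUser = new Map();
  for (const post of posts) {
    if (!byUser.has(post.userId)) byUser.set(post.userId, []);
    byUser.get(post.userId).push(post);
  }
  return users.map(u => ({ ...u, posts: byUser.get(u.id) ?? [] }));
}
```

The query count stays constant as the user list grows, which is the property that matters here.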
Problem: Missing Index
-- Check slow query
EXPLAIN SELECT * FROM users WHERE email = 'test@example.com';
-- Shows: Seq Scan (bad)
-- Add index
CREATE INDEX idx_users_email ON users(email);
-- Check again
EXPLAIN SELECT * FROM users WHERE email = 'test@example.com';
-- Shows: Index Scan (good)
Problem: SELECT *
// Bad: Fetches all columns
const users = await db.query('SELECT * FROM users');
// Good: Only needed columns
const users = await db.query('SELECT id, name, email FROM users');
Problem: No Pagination
// Bad: Returns all records
const users = await db.users.find();
// Good: Paginated
const users = await db.users.find()
.limit(20)
.skip((page - 1) * 20);
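Pagination parameters usually arrive via the query string, so it helps to clamp them before they reach the database. A sketch; `pageParams` is a hypothetical helper, not an existing API:

```javascript
// Sketch: derive limit/skip from query params, clamped so a client
// cannot request page 0 or an unbounded page size.
function pageParams(query, { defaultSize = 20, maxSize = 100 } = {}) {
  const page = Math.max(1, parseInt(query.page, 10) || 1);
  const size = Math.min(maxSize, Math.max(1, parseInt(query.size, 10) || defaultSize));
  return { limit: size, skip: (page - 1) * size };
}
```

Without the `maxSize` clamp, `?size=1000000` would reintroduce the unbounded query that pagination was meant to prevent.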
Problem: No Caching
// Bad: Hits database every time
app.get('/api/stats', async (req, res) => {
const stats = await db.stats.calculate(); // Slow
res.json(stats);
});
// Good: Cache for 5 minutes
const cache = new Map();
app.get('/api/stats', async (req, res) => {
const cached = cache.get('stats');
if (cached && Date.now() - cached.time < 300000) {
return res.json(cached.data);
}
const stats = await db.stats.calculate();
cache.set('stats', { data: stats, time: Date.now() });
res.json(stats);
});
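The inline cache above can be generalized into a small wrapper so any expensive call gets the same treatment. A sketch; `withTtlCache` is a hypothetical helper, not an existing API:

```javascript
// Sketch: wrap a zero-argument async function with a time-to-live cache.
// Within ttlMs of the last computation, callers get the cached value.
function withTtlCache(fn, ttlMs) {
  let entry = null;
  return async () => {
    if (entry && Date.now() - entry.time < ttlMs) return entry.data;
    const data = await fn();
    entry = { data, time: Date.now() };
    return data;
  };
}
```

Usage mirrors the route above: `const cachedStats = withTtlCache(() => db.stats.calculate(), 300000);` and the handler just awaits `cachedStats()`.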
Problem: Sequential Operations
// Bad: Sequential (slow)
const user = await getUser(id);
const posts = await getPosts(id);
const comments = await getComments(id);
// Total: 300ms + 200ms + 150ms = 650ms
// Good: Parallel (fast)
const [user, posts, comments] = await Promise.all([
getUser(id),
getPosts(id),
getComments(id)
]);
// Total: max(300ms, 200ms, 150ms) = 300ms
Problem: Large Payloads
// Bad: Returns everything
res.json(users); // 5MB response
// Good: Only needed fields
res.json(users.map(u => ({
id: u.id,
name: u.name,
email: u.email
}))); // 500KB response
Problem: Unnecessary Re-renders
// Bad: Re-renders on every parent update
function UserList({ users }) {
return users.map(user => <UserCard key={user.id} user={user} />);
}
// Good: Memoized
const UserCard = React.memo(({ user }) => {
return <div>{user.name}</div>;
});
Problem: Large Bundle
// Bad: Imports entire library
import _ from 'lodash'; // 70KB
// Good: Import only what you need
import debounce from 'lodash/debounce'; // 2KB
Problem: No Code Splitting
// Bad: Everything in one bundle
import HeavyComponent from './HeavyComponent';
// Good: Lazy load
const HeavyComponent = React.lazy(() => import('./HeavyComponent'));
Problem: Unoptimized Images
<!-- Bad: Large image -->
<img src="photo.jpg" /> <!-- 5MB -->
<!-- Good: Optimized and responsive -->
<img
src="photo-small.webp"
srcset="photo-small.webp 400w, photo-large.webp 800w"
loading="lazy"
width="400"
height="300"
/> <!-- 50KB -->
Problem: Inefficient Algorithm
// Bad: O(n²) - nested loops
function findDuplicates(arr) {
const duplicates = [];
for (let i = 0; i < arr.length; i++) {
for (let j = i + 1; j < arr.length; j++) {
if (arr[i] === arr[j]) duplicates.push(arr[i]);
}
}
return duplicates;
}
// Good: O(n) - single pass with Set
function findDuplicates(arr) {
const seen = new Set();
const duplicates = new Set();
for (const item of arr) {
if (seen.has(item)) duplicates.add(item);
seen.add(item);
}
return Array.from(duplicates);
}
Problem: Repeated Calculations
// Bad: Calculates every time
function getTotal(items) {
return items.reduce((sum, item) => sum + item.price * item.quantity, 0);
}
// Called 100 times in render
// Good: Memoized inside the component (useMemo returns the value, not a function)
const total = useMemo(() => {
return items.reduce((sum, item) => sum + item.price * item.quantity, 0);
}, [items]);
Problem: Memory Leak
// Bad: Event listener not cleaned up
useEffect(() => {
window.addEventListener('scroll', handleScroll);
// Memory leak!
}, []);
// Good: Cleanup
useEffect(() => {
window.addEventListener('scroll', handleScroll);
return () => window.removeEventListener('scroll', handleScroll);
}, []);
Problem: Large Data in Memory
// Bad: Loads entire file into memory
const data = fs.readFileSync('huge-file.txt'); // 1GB
// Good: Stream it
const stream = fs.createReadStream('huge-file.txt');
stream.on('data', chunk => process(chunk));
Always measure before and after:
// Before optimization
console.time('query');
const users = await db.users.find();
console.timeEnd('query');
// query: 2341ms
// After optimization (added index)
console.time('query');
const users = await db.users.find();
console.timeEnd('query');
// query: 23ms
// Improvement: 100x faster!
Set targets:
Page Load: < 2 seconds
API Response: < 200ms
Database Query: < 50ms
Bundle Size: < 200KB
Time to Interactive: < 3 seconds
Browser:
DevTools Performance tab (profiling)
Node.js:
node --prof (profiling)
clinic (diagnostics)
autocannon (load testing)
Database:
EXPLAIN ANALYZE (query plans)
Monitoring:
Easy optimizations with big impact: add missing indexes, cache expensive endpoints, paginate unbounded queries, and parallelize independent calls.
@database-design - Query optimization
@codebase-audit-pre-push - Code review
@bug-hunter - Debugging