High-performance job queue for Bun. Built for AI agents and automation.
Zero external dependencies. MCP-native. TypeScript-first.
Quickstart
bun add bunqueue
import { Bunqueue } from 'bunqueue/client';
const app = new Bunqueue('emails', {
embedded: true,
processor: async (job) => {
console.log(`Sending to ${job.data.to}`);
return { sent: true };
},
});
await app.add('send', { to: 'alice@example.com' });
That's it. Queue + Worker in one object. No Redis, no config, no setup.
Simple Mode
Simple Mode gives you a Queue and a Worker in a single object. Add jobs, process them, add middleware, schedule crons — all from one place.
Use Bunqueue when producer and consumer are in the same process. For distributed systems, use Queue + Worker separately. For AI agent workflows, use the MCP Server instead — agents control queues via natural language without writing code.
Architecture
new Bunqueue('emails', opts)
│
├── this.queue = new Queue('emails', ...)
├── this.worker = new Worker('emails', ...)
│
└── Subsystems (all optional):
├── RetryEngine ── jitter, fibonacci, exponential, custom
├── CircuitBreaker ── pauses worker after N failures
├── BatchAccumulator ── groups N jobs into one call
├── TriggerManager ── "on complete → create job B"
├── TtlChecker ── rejects expired jobs
├── PriorityAger ── boosts old jobs' priority
├── CancellationManager ── AbortController per job
└── DedupDebounceMerger ── deduplication & debounce
Processing pipeline per job:
Job → Circuit Breaker → TTL check → AbortController → Retry → Middleware → Processor
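To make the CircuitBreaker step concrete, here is a minimal, self-contained sketch of the idea — not bunqueue's actual implementation, just the pattern it names: after N consecutive failures the breaker opens and rejects further work until reset.

```typescript
// Minimal circuit-breaker sketch (illustrative only, not bunqueue's internals):
// after `threshold` consecutive failures the breaker opens and rejects work,
// which is how a worker gets "paused" in the pipeline above.
class CircuitBreaker {
  private failures = 0;

  constructor(private threshold: number) {}

  get open(): boolean {
    return this.failures >= this.threshold;
  }

  async run<T>(fn: () => Promise<T>): Promise<T> {
    if (this.open) throw new Error('circuit open: worker paused');
    try {
      const result = await fn();
      this.failures = 0; // any success resets the failure count
      return result;
    } catch (err) {
      this.failures++;
      throw err;
    }
  }
}
```

The key design point is that the breaker sits in front of the processor, so a failing downstream dependency stops consuming jobs instead of burning through retries.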
Routes
Route jobs to different handlers by name:
const app = new Bunqueue('notifications', {
embedded: true,
routes: {
'send-email': async (job) => {
await sendEmail(job.data.to);
return { channel: 'email' };
},
'send-sms': async (job) => {
await sendSMS(job.data.to);
return { channel: 'sms' };
},
},
});
await app.add('send-email', { to: 'alice' });
await app.add('send-sms', { to: 'bob' });
Note: Use one of processor, routes, or batch. Passing multiple or none throws an error.
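Conceptually, routing is a lookup from the job's name to its handler. A minimal sketch of that dispatch step (the types and function here are illustrative, not bunqueue's exports):

```typescript
// Illustrative name-based dispatch: pick the handler matching job.name.
// `Job`, `Handler`, and `dispatch` are assumptions for this sketch,
// not part of bunqueue's public API.
type Job = { name: string; data: any };
type Handler = (job: Job) => Promise<unknown>;

async function dispatch(
  routes: Record<string, Handler>,
  job: Job,
): Promise<unknown> {
  const handler = routes[job.name];
  if (!handler) throw new Error(`No route registered for job "${job.name}"`);
  return handler(job);
}
```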
Middleware
Wraps every job execution. Execution order is onion-style: mw1 → mw2 → processor → mw2 → mw1. With no middleware registered, there is zero overhead.
// Timing middleware
app.use(async (job, next) => {
const start = Date.now();
const result = await next();
console.log(`${job.name}: ${Date.now() - start}ms`);
return result;
});
// Error recovery middleware
app.use(async (job, next) => {
try {
return await next();
} catch (err) {
return { recovered: true, error: err.message };
}
});
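The onion ordering above falls out of how middleware composes: each middleware wraps the next, with the processor innermost. A minimal sketch of that composition (illustrative, not bunqueue's internals):

```typescript
// Onion-style middleware composition sketch: middlewares wrap each other
// right-to-left, so the first registered middleware runs outermost.
type Job = { name: string; data: unknown };
type Next = () => Promise<unknown>;
type Middleware = (job: Job, next: Next) => Promise<unknown>;

function compose(
  middlewares: Middleware[],
  processor: (job: Job) => Promise<unknown>,
) {
  return (job: Job): Promise<unknown> =>
    middlewares.reduceRight<Next>(
      (next, mw) => () => mw(job, next),
      () => processor(job),
    )();
}
```

With `compose([mw1, mw2], processor)`, calling the result runs mw1 → mw2 → processor → mw2 → mw1, matching the order described above.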
Batch Processing
Accumulates N jobs and processes them together. The buffer flushes when it reaches the configured size or when the timeout expires. On close(), any remaining jobs are flushed.
const app = new Bunqueue('db-inserts', {
embedded: true,
batch: {
size: 50,
timeout: 2000,
processor: async (jobs) => {
const rows = jobs.map(j => j.data.row);
await db.insertMany('table', rows);
return jobs.map(() => ({ inserted: true }));
},
},
});
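The flush-on-size-or-timeout behavior can be sketched in a few lines. This is a simplified illustration of the accumulator pattern, not bunqueue's BatchAccumulator itself:

```typescript
// Minimal batch-accumulator sketch (illustrative, not bunqueue's internals):
// buffers jobs, flushes when `size` is reached or `timeoutMs` elapses,
// and flush() is also what close() would call to drain the remainder.
type BatchProcessor<T> = (jobs: T[]) => Promise<void>;

class BatchAccumulator<T> {
  private buffer: T[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private size: number,
    private timeoutMs: number,
    private processor: BatchProcessor<T>,
  ) {}

  async add(job: T): Promise<void> {
    this.buffer.push(job);
    if (this.buffer.length >= this.size) {
      await this.flush();
    } else if (!this.timer) {
      // First job in a fresh buffer arms the timeout.
      this.timer = setTimeout(() => void this.flush(), this.timeoutMs);
    }
  }

  async flush(): Promise<void> {
    if (this.timer) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    if (this.buffer.length === 0) return;
    const jobs = this.buffer;
    this.buffer = [];
    await this.processor(jobs);
  }
}
```

The timeout guarantees bounded latency for a trickle of jobs, while the size threshold caps how much work any single processor call receives.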
Advanced Retry
5 strategies + retry predicate: