Spin up a complete local Kafka or Redpanda development environment with monitoring, using Docker Compose. Includes Kafka (KRaft mode), Schema Registry, Kafka UI, Prometheus, and Grafana for testing event-driven applications locally.
Install the plugin:

/plugin marketplace add anton-abyzov/specweave
/plugin install sw-kafka@specweave

Spin up a complete local Kafka development environment with one command:
docker-compose up -d

Two stacks are available:

Apache Kafka (KRaft)
Use When: Testing Apache Kafka specifically, need Schema Registry
Services: Kafka (KRaft mode), Schema Registry, Kafka UI, Prometheus, Grafana

Redpanda
Use When: Testing a high-performance alternative, need a multi-broker cluster locally
# Start dev environment setup
/sw-kafka:dev-env
# I'll ask:
# 1. Which stack? (Kafka or Redpanda)
# 2. Where to create files? (current directory or specify path)
# 3. Custom ports? (use defaults or customize)
# Then I'll:
# - Generate docker-compose.yml
# - Start all services
# - Wait for health checks
# - Provide connection details
# - Open Kafka UI in browser
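The "wait for health checks" step boils down to polling the broker until it accepts connections. Here is a minimal sketch of that check with kafka-python (the two-minute timeout is an arbitrary assumption, not the plugin's actual logic):

import time
from kafka.admin import KafkaAdminClient
from kafka.errors import NoBrokersAvailable

def wait_for_kafka(bootstrap='localhost:9092', timeout=120):
    """Poll until the broker accepts connections or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            KafkaAdminClient(bootstrap_servers=bootstrap).close()
            return True
        except NoBrokersAvailable:
            time.sleep(2)  # broker still starting; retry
    return False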
Directory Structure:
./kafka-dev/
├── docker-compose.yml       # Main compose file
├── .env                     # Environment variables
├── data/                    # Persistent volumes
│   ├── kafka/
│   ├── prometheus/
│   └── grafana/
└── config/
    ├── prometheus.yml       # Prometheus config
    └── grafana/             # Dashboard provisioning
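The generated .env keeps connection settings in one place, so application code can read them instead of hard-coding ports. A quick way to do that from Python (the KAFKA_BROKER key name is an assumption; check the generated file for the actual variables):

from pathlib import Path

def load_env(path='./kafka-dev/.env'):
    """Parse simple KEY=VALUE lines from the generated .env file."""
    env = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith('#') and '=' in line:
            key, _, value = line.partition('=')
            env[key.strip()] = value.strip()
    return env

broker = load_env().get('KAFKA_BROKER', 'localhost:9092')  # assumed key name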
Services Running:
- Kafka broker at localhost:9092
- Schema Registry
- Kafka UI (opened in your browser)
- Prometheus
- Grafana

After setup, connect with:
// Node.js (kafkajs) producer
const { Kafka } = require('kafkajs');

const kafka = new Kafka({
  clientId: 'my-app',
  brokers: ['localhost:9092']
});

async function main() {
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: 'test-topic',
    messages: [{ value: 'Hello Kafka!' }]
  });
  await producer.disconnect();
}

main().catch(console.error);
# Python (kafka-python) consumer
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    'test-topic',
    bootstrap_servers=['localhost:9092'],
    group_id='my-group',
    auto_offset_reset='earliest'  # read from the beginning on first run
)

for message in consumer:
    print(f"Received: {message.value}")
# Produce message
echo "Hello Kafka" | kcat -P -b localhost:9092 -t test-topic
# Consume messages
kcat -C -b localhost:9092 -t test-topic -o beginning
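If you'd rather create the test topic explicitly than rely on broker auto-creation, a minimal kafka-python sketch (topic name and partition count are illustrative; replication factor 1 matches a single-broker dev setup):

from kafka.admin import KafkaAdminClient, NewTopic
from kafka.errors import TopicAlreadyExistsError

admin = KafkaAdminClient(bootstrap_servers=['localhost:9092'])
try:
    admin.create_topics([NewTopic(name='test-topic', num_partitions=3, replication_factor=1)])
except TopicAlreadyExistsError:
    pass  # safe to re-run
finally:
    admin.close()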
I'll also create sample code templates:
- producer-nodejs.js - Production-ready Node.js producer
- consumer-nodejs.js - Production-ready Node.js consumer
- producer-python.py - Python producer with error handling
- consumer-python.py - Python consumer with DLQ (see the sketch after the command list below)

Once the environment is up, manage it with these commands:
# Start environment
docker-compose up -d
# Stop environment
docker-compose down
# Stop and remove data
docker-compose down -v
# View logs
docker-compose logs -f kafka
# Restart Kafka only
docker-compose restart kafka
# Check health
docker-compose ps
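For reference, the DLQ pattern in consumer-python.py might look roughly like this. This is a sketch, not the shipped template; the dead-letter topic name test-topic.dlq and the process() handler are assumptions:

from kafka import KafkaConsumer, KafkaProducer

DLQ_TOPIC = 'test-topic.dlq'  # assumed naming convention

def process(value):
    """Hypothetical placeholder for your business logic."""
    print(f"Processing: {value}")

consumer = KafkaConsumer(
    'test-topic',
    bootstrap_servers=['localhost:9092'],
    group_id='my-group',
    enable_auto_commit=False  # commit only after the message is handled
)
dlq_producer = KafkaProducer(bootstrap_servers=['localhost:9092'])

for message in consumer:
    try:
        process(message.value)
    except Exception:
        # route the poison message to the dead-letter topic instead of crashing
        dlq_producer.send(DLQ_TOPIC, message.value)
    consumer.commit()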
Skills Activated: kafka-cli-tools
Docker Compose Location: plugins/specweave-kafka/docker/
Sample Code: plugins/specweave-kafka/docker/templates/