
Background Job Processing: Celery vs BullMQ vs Temporal

Yash Pritwani
14 min read

Why Background Jobs Matter

Every web application has tasks that should not block the HTTP request-response cycle: sending emails, processing images, generating reports, syncing data, or running AI inference. These need a reliable background job system.


The choice depends on your language ecosystem and complexity needs.

Celery: The Python Standard

Celery is the dominant task queue for Python applications. It supports Redis and RabbitMQ as message brokers.

# tasks.py
from celery import Celery

app = Celery('myapp', broker='redis://redis:6379/0', backend='redis://redis:6379/1')

# Configure
app.conf.update(
    task_serializer='json',
    result_serializer='json',
    accept_content=['json'],
    timezone='UTC',
    task_acks_late=True,          # Acknowledge after completion (safer)
    worker_prefetch_multiplier=1,  # One task at a time per worker
    task_reject_on_worker_lost=True,
    # Note: retry policy (max_retries, default_retry_delay) is configured
    # per-task on the @app.task decorator below, not as global config keys.
)

@app.task(bind=True, max_retries=3, default_retry_delay=60)
def send_welcome_email(self, user_id: int):
    """Send welcome email to new user."""
    try:
        user = get_user(user_id)
        email_service.send(
            to=user.email,
            subject='Welcome!',
            template='welcome.html',
        )
    except EmailServiceError as exc:
        # Retry with exponential backoff
        raise self.retry(exc=exc, countdown=2 ** self.request.retries * 60)

from celery.exceptions import SoftTimeLimitExceeded

@app.task(bind=True, time_limit=300, soft_time_limit=280)
def generate_report(self, report_id: int):
    """Generate PDF report (max 5 minutes)."""
    report = get_report(report_id)
    update_report_status(report_id, 'processing')

    try:
        pdf = generate_pdf(report.data)
        upload_to_storage(pdf, f'reports/{report_id}.pdf')
        update_report_status(report_id, 'complete')
    except SoftTimeLimitExceeded:
        update_report_status(report_id, 'timeout')
        raise

# Chain tasks (pipeline)
from celery import chain

workflow = chain(
    fetch_data.s(source_id),
    transform_data.s(),
    load_to_warehouse.s(),
    notify_completion.s(),
)
workflow.apply_async()

# Group tasks (parallel)
from celery import group

batch = group([
    process_image.s(image_id) for image_id in image_ids
])
result = batch.apply_async()

# Start Celery worker
celery -A tasks worker --loglevel=info --concurrency=4

# Start Celery beat (scheduler)
celery -A tasks beat --loglevel=info

# Monitor with Flower
celery -A tasks flower --port=5555
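The `countdown=2 ** self.request.retries * 60` pattern above yields delays of 60s, 120s, 240s, and so on. A common refinement, sketched here as plain Python (the cap and jitter values are illustrative, not Celery defaults), caps the delay and adds full jitter so a burst of failures does not retry in lockstep:

```python
import random

def retry_countdown(retries: int, base: int = 60, cap: int = 3600) -> int:
    """Exponential backoff: base * 2^retries, capped, with full jitter."""
    delay = min(cap, base * (2 ** retries))
    return random.randint(0, delay)

# Uncapped growth hits the cap by the 7th attempt:
print([min(3600, 60 * 2 ** n) for n in range(7)])
# → [60, 120, 240, 480, 960, 1920, 3600]
```

In the task above you would pass this as `countdown=retry_countdown(self.request.retries)`.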

BullMQ: The Node.js Standard

BullMQ is the most popular job queue for Node.js/TypeScript. Built on Redis, it provides reliable job processing with a clean API.

// queue.ts
import { Queue, Worker, QueueEvents } from 'bullmq';
import IORedis from 'ioredis';

const connection = new IORedis({
  host: 'redis',
  port: 6379,
  maxRetriesPerRequest: null,
});

// Define queues
const emailQueue = new Queue('email', { connection });
const reportQueue = new Queue('reports', { connection });

// Add jobs
await emailQueue.add('welcome-email', {
  userId: 123,
  email: '[email protected]',
}, {
  attempts: 3,
  backoff: { type: 'exponential', delay: 1000 },
  removeOnComplete: { count: 1000 },
  removeOnFail: { count: 5000 },
});

// Scheduled job (run in 5 minutes)
await reportQueue.add('generate', { reportId: 456 }, {
  delay: 5 * 60 * 1000,
});

// Recurring job (every hour)
await reportQueue.upsertJobScheduler('hourly-sync', {
  every: 60 * 60 * 1000,
}, {
  name: 'data-sync',
  data: { source: 'external-api' },
});

// worker.ts
import { Worker } from 'bullmq';

const emailWorker = new Worker('email', async (job) => {
  switch (job.name) {
    case 'welcome-email':
      await sendWelcomeEmail(job.data.userId);
      break;
    case 'password-reset':
      await sendPasswordReset(job.data.email);
      break;
  }
}, {
  connection,
  concurrency: 5,
  limiter: {
    max: 10,     // Max 10 jobs
    duration: 1000, // Per second
  },
});

emailWorker.on('completed', (job) => {
  console.log(`Job ${job.id} completed`);
});

emailWorker.on('failed', (job, err) => {
  // job can be undefined if it was removed or the failure is worker-level
  console.error(`Job ${job?.id} failed:`, err.message);
});

// Report worker with progress tracking
const reportWorker = new Worker('reports', async (job) => {
  await job.updateProgress(10);
  const data = await fetchData(job.data.reportId);

  await job.updateProgress(50);
  const pdf = await generatePDF(data);

  await job.updateProgress(90);
  await uploadToStorage(pdf);

  await job.updateProgress(100);
  return { url: `/reports/${job.data.reportId}.pdf` };
}, { connection });
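The `limiter` option above caps the email worker at 10 jobs per second. Conceptually this is a fixed-window counter; here is a minimal sketch of the idea in Python for brevity (the class name and API are illustrative, not BullMQ internals):

```python
class FixedWindowLimiter:
    """Allow at most max_jobs per duration-second window."""
    def __init__(self, max_jobs: int, duration: float) -> None:
        self.max_jobs = max_jobs
        self.duration = duration
        self.window_start = float("-inf")
        self.count = 0

    def try_acquire(self, now: float) -> bool:
        if now - self.window_start >= self.duration:
            self.window_start = now   # start a new window
            self.count = 0
        if self.count < self.max_jobs:
            self.count += 1
            return True
        return False                  # over the limit: caller should wait

limiter = FixedWindowLimiter(max_jobs=10, duration=1.0)
allowed = sum(limiter.try_acquire(now=0.0) for _ in range(15))
print(allowed)  # → 10: only 10 of 15 jobs pass in one window
```

BullMQ tracks the window in Redis so the cap holds across all workers on a queue, not just one process.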


Temporal: The Workflow Engine

Temporal is not a simple task queue — it is a workflow orchestration engine. It handles long-running, complex workflows with built-in state management, retries, and failure recovery.

// workflows.ts
import {
  proxyActivities,
  condition,
  defineSignal,
  setHandler,
} from '@temporalio/workflow';
import type * as activities from './activities';

const { sendEmail, generateReport, chargePayment, notifyUser } =
  proxyActivities<typeof activities>({
    startToCloseTimeout: '5m',
    retry: {
      initialInterval: '1s',
      backoffCoefficient: 2,
      maximumAttempts: 5,
    },
  });

// Signal sent by an external system when the order ships
export const shippedSignal = defineSignal('shipped');

// Order processing workflow (can run for days)
export async function orderWorkflow(orderId: string): Promise<void> {
  let shipped = false;
  setHandler(shippedSignal, () => {
    shipped = true;
  });

  // Step 1: Charge payment
  const paymentResult = await chargePayment(orderId);
  if (!paymentResult.success) {
    await notifyUser(orderId, 'Payment failed');
    return;
  }

  // Step 2: Generate invoice
  await generateReport(orderId);

  // Step 3: Send confirmation email
  await sendEmail(orderId, 'order-confirmation');

  // Step 4: Wait for the shipped signal (could take hours/days)
  const shippedInTime = await condition(() => shipped, '7d');
  if (!shippedInTime) {
    await sendEmail(orderId, 'shipping-delay');
  }

  // Step 5: Final notification
  await sendEmail(orderId, 'delivered');
}

// activities.ts
export async function sendEmail(orderId: string, template: string): Promise<void> {
  const order = await db.orders.findById(orderId);
  await emailService.send({
    to: order.userEmail,
    template: template,
    data: { orderId, items: order.items },
  });
}

export async function chargePayment(orderId: string): Promise<{ success: boolean }> {
  const order = await db.orders.findById(orderId);
  return await paymentGateway.charge(order.total, order.paymentMethod);
}

Comparison

| Feature | Celery | BullMQ | Temporal |
|---------|--------|--------|----------|
| Language | Python | Node.js/TypeScript | Any (Go, TS, Python, Java) |
| Broker | Redis, RabbitMQ | Redis | Temporal Server (PostgreSQL) |
| Job types | Tasks, chains, groups | Jobs, flows, schedulers | Workflows, activities |
| State management | Minimal | Job-level | Full workflow state |
| Long-running workflows | Not ideal | Not ideal | Designed for it |
| Priority queues | Yes | Yes | Task queues |
| Rate limiting | Via plugins | Built-in | Via task queue config |
| Delayed jobs | Yes | Yes | Timers built-in |
| Cron/scheduled | Celery Beat | Built-in schedulers | Cron schedules |
| Dashboard | Flower | Bull Board | Temporal UI |
| Memory footprint | ~100MB (worker) | ~50MB (worker) | ~500MB (server) |
| Setup complexity | Low | Low | Medium-High |
| Best for | Python task queues | Node.js task queues | Complex workflows |


Choosing the Right Tool

Choose Celery when: You are building Python applications, need a proven task queue, want RabbitMQ support, or have simple task → retry → complete workflows.

Choose BullMQ when: You are building Node.js/TypeScript applications, want Redis-only dependency, need job priorities and rate limiting, or want built-in job scheduling.

Choose Temporal when: You have complex multi-step workflows that span hours or days, need durable execution (survive server restarts), have polyglot services, or need workflow versioning for long-running processes.
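Whichever tool you pick, retries (and settings like Celery's task_acks_late) mean at-least-once execution: a job can run twice, so handlers should be idempotent. A minimal sketch of a dedup guard (the in-memory set and names are illustrative; production code would use a Redis SET NX or a database unique constraint):

```python
from typing import Callable

processed: set[str] = set()  # stand-in for Redis SET NX / unique constraint

def handle_once(job_id: str, handler: Callable[[], None]) -> bool:
    """Run handler only the first time this job_id is seen."""
    if job_id in processed:
        return False          # duplicate delivery: skip silently
    handler()                 # mark AFTER success, so a crash retries
    processed.add(job_id)
    return True

sent: list[str] = []
handle_once("email-123", lambda: sent.append("welcome"))
handle_once("email-123", lambda: sent.append("welcome"))  # retried duplicate
print(sent)  # → ['welcome']
```

Marking the job as processed only after the handler succeeds keeps the at-least-once guarantee: a crash mid-handler leaves the job eligible for retry.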

At TechSaaS, we use BullMQ for our Node.js services and n8n for workflow automation. For most applications, a simple task queue (Celery or BullMQ) handles 95% of background job needs. Only reach for Temporal when you have genuinely complex, long-running workflows that need durable state management.

#background-jobs #celery #bullmq #temporal #queues #redis

Need help with platform engineering?

TechSaaS provides expert consulting and managed services for cloud infrastructure, DevOps, and AI/ML operations.