
Background Job Processing: Celery vs BullMQ vs Temporal

Compare Celery, BullMQ, and Temporal for background job processing. Task queues, retry strategies, workflow orchestration, and choosing the right tool for...

Yash Pritwani · 14 min read

Why Background Jobs Matter

Every web application has tasks that should not block the HTTP request-response cycle: sending emails, processing images, generating reports, syncing data, or running AI inference. These need a reliable background job system.
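The core pattern is simple: the request handler enqueues a job and returns immediately, and a separate worker drains the queue. A minimal in-process sketch using only the standard library (a real system uses a broker like Redis so jobs survive process restarts; `send_welcome_email` here is a hypothetical stand-in):

```python
import queue
import threading

jobs = queue.Queue()
sent = []  # stand-in for an email service, so we can observe what was sent

def send_welcome_email(user_id):
    sent.append(user_id)  # pretend to send the email

def worker():
    # Drain the queue forever, one job at a time
    while True:
        fn, args = jobs.get()
        try:
            fn(*args)
        finally:
            jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

def handle_signup(user_id):
    # Respond to the HTTP request immediately; the email goes out in the background
    jobs.put((send_welcome_email, (user_id,)))
    return {"status": "ok"}
```

`handle_signup` returns at once while the worker thread does the slow part. Everything here is lost on a crash, which is exactly why production systems reach for Celery, BullMQ, or Temporal.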


The choice depends on your language ecosystem and complexity needs.

Celery: The Python Standard

Celery is the dominant task queue for Python applications. It supports Redis and RabbitMQ as message brokers.

# tasks.py
from celery import Celery
from celery.exceptions import SoftTimeLimitExceeded

app = Celery('myapp', broker='redis://redis:6379/0', backend='redis://redis:6379/1')

# Configure
app.conf.update(
    task_serializer='json',
    result_serializer='json',
    accept_content=['json'],
    timezone='UTC',
    task_acks_late=True,          # Acknowledge after completion (safer)
    worker_prefetch_multiplier=1,  # One task at a time per worker
    task_reject_on_worker_lost=True,
)
# Retry policy (max_retries, default_retry_delay) is set per task on the decorator

@app.task(bind=True, max_retries=3, default_retry_delay=60)
def send_welcome_email(self, user_id: int):
    """Send welcome email to new user."""
    try:
        user = get_user(user_id)
        email_service.send(
            to=user.email,
            subject='Welcome!',
            template='welcome.html',
        )
    except EmailServiceError as exc:
        # Retry with exponential backoff
        raise self.retry(exc=exc, countdown=2 ** self.request.retries * 60)

@app.task(bind=True, time_limit=300, soft_time_limit=280)
def generate_report(self, report_id: int):
    """Generate PDF report (max 5 minutes)."""
    report = get_report(report_id)
    update_report_status(report_id, 'processing')

    try:
        pdf = generate_pdf(report.data)
        upload_to_storage(pdf, f'reports/{report_id}.pdf')
        update_report_status(report_id, 'complete')
    except SoftTimeLimitExceeded:
        update_report_status(report_id, 'timeout')
        raise

# Chain tasks (pipeline)
from celery import chain

workflow = chain(
    fetch_data.s(source_id),
    transform_data.s(),
    load_to_warehouse.s(),
    notify_completion.s(),
)
workflow.apply_async()

# Group tasks (parallel)
from celery import group

batch = group([
    process_image.s(image_id) for image_id in image_ids
])
result = batch.apply_async()


# Start Celery worker
celery -A tasks worker --loglevel=info --concurrency=4

# Start Celery beat (scheduler)
celery -A tasks beat --loglevel=info

# Monitor with Flower
celery -A tasks flower --port=5555
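The retry countdown in `send_welcome_email` above (`2 ** retries * 60`) produces an exponential schedule. A small stand-alone helper (hypothetical, with a cap and optional jitter added, which production retry policies usually want) makes the progression explicit:

```python
import random

def backoff_delays(max_retries, base=60, cap=900, jitter=False):
    """Delays (seconds) for each retry attempt: 2**attempt * base, capped at `cap`.

    With jitter=True each delay is drawn uniformly from [0, delay] to avoid
    thundering-herd retries when many tasks fail at once.
    """
    delays = []
    for attempt in range(max_retries):
        delay = min(2 ** attempt * base, cap)
        if jitter:
            delay = random.uniform(0, delay)
        delays.append(delay)
    return delays

print(backoff_delays(3))  # → [60, 120, 240]
```

Without a cap, attempt 6 would already wait over an hour; capping keeps worst-case retry latency bounded.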

BullMQ: The Node.js Standard

BullMQ is the most popular job queue for Node.js/TypeScript. Built on Redis, it provides reliable job processing with a clean API.

// queue.ts
import { Queue, Worker, QueueEvents } from 'bullmq';
import IORedis from 'ioredis';

const connection = new IORedis({
  host: 'redis',
  port: 6379,
  maxRetriesPerRequest: null,
});

// Define queues
const emailQueue = new Queue('email', { connection });
const reportQueue = new Queue('reports', { connection });

// Add jobs
await emailQueue.add('welcome-email', {
  userId: 123,
  email: '[email protected]',
}, {
  attempts: 3,
  backoff: { type: 'exponential', delay: 1000 },
  removeOnComplete: { count: 1000 },
  removeOnFail: { count: 5000 },
});

// Scheduled job (run in 5 minutes)
await reportQueue.add('generate', { reportId: 456 }, {
  delay: 5 * 60 * 1000,
});

// Recurring job (every hour)
await reportQueue.upsertJobScheduler('hourly-sync', {
  every: 60 * 60 * 1000,
}, {
  name: 'data-sync',
  data: { source: 'external-api' },
});

// worker.ts
import { Worker } from 'bullmq';

const emailWorker = new Worker('email', async (job) => {
  switch (job.name) {
    case 'welcome-email':
      await sendWelcomeEmail(job.data.userId);
      break;
    case 'password-reset':
      await sendPasswordReset(job.data.email);
      break;
  }
}, {
  connection,
  concurrency: 5,
  limiter: {
    max: 10,     // Max 10 jobs
    duration: 1000, // Per second
  },
});

emailWorker.on('completed', (job) => {
  console.log(`Job ${job.id} completed`);
});

emailWorker.on('failed', (job, err) => {
  // job can be undefined if it was removed or the failure is not job-specific
  console.error(`Job ${job?.id} failed:`, err.message);
});

// Report worker with progress tracking
const reportWorker = new Worker('reports', async (job) => {
  await job.updateProgress(10);
  const data = await fetchData(job.data.reportId);

  await job.updateProgress(50);
  const pdf = await generatePDF(data);

  await job.updateProgress(90);
  await uploadToStorage(pdf);

  await job.updateProgress(100);
  return { url: `/reports/${job.data.reportId}.pdf` };
}, { connection });
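The `limiter` config above admits at most `max` jobs per `duration` window across the worker. The underlying fixed-window logic is simple; here is a language-agnostic sketch (Python, hypothetical and much simpler than BullMQ's Redis-backed implementation, which coordinates across multiple workers):

```python
import time

class FixedWindowLimiter:
    """Admit at most `max_jobs` per `duration_s` window (like BullMQ's limiter)."""

    def __init__(self, max_jobs, duration_s):
        self.max_jobs = max_jobs
        self.duration_s = duration_s
        self.window_start = 0.0
        self.count = 0

    def try_acquire(self, now=None):
        if now is None:
            now = time.monotonic()
        # New window: reset the counter
        if now - self.window_start >= self.duration_s:
            self.window_start = now
            self.count = 0
        if self.count < self.max_jobs:
            self.count += 1
            return True
        return False  # over the limit; caller should delay the job
```

BullMQ does the equivalent in Redis so the limit holds across all workers on a queue, and jobs that exceed it are delayed rather than dropped.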

Temporal: The Workflow Engine

Temporal is not a simple task queue — it is a workflow orchestration engine. It handles long-running, complex workflows with built-in state management, retries, and failure recovery.

// workflows.ts
import { proxyActivities, defineSignal, setHandler, condition } from '@temporalio/workflow';
import type * as activities from './activities';

const { sendEmail, generateReport, chargePayment, notifyUser } =
  proxyActivities<typeof activities>({
    startToCloseTimeout: '5m',
    retry: {
      initialInterval: '1s',
      backoffCoefficient: 2,
      maximumAttempts: 5,
    },
  });

// Signal delivered by the shipping service when the order ships
export const shippedSignal = defineSignal('shipped');

// Order processing workflow (can run for days)
export async function orderWorkflow(orderId: string): Promise<void> {
  // Workflow code must be deterministic: external events arrive via
  // signals, never via direct DB reads inside the workflow
  let shipped = false;
  setHandler(shippedSignal, () => { shipped = true; });

  // Step 1: Charge payment
  const paymentResult = await chargePayment(orderId);
  if (!paymentResult.success) {
    await notifyUser(orderId, 'Payment failed');
    return;
  }

  // Step 2: Generate invoice
  await generateReport(orderId);

  // Step 3: Send confirmation email
  await sendEmail(orderId, 'order-confirmation');

  // Step 4: Wait for the shipped signal (could take hours or days)
  const wasShipped = await condition(() => shipped, '7d');
  if (!wasShipped) {
    await sendEmail(orderId, 'shipping-delay');
  }

  // Step 5: Final notification
  await sendEmail(orderId, 'delivered');
}
// activities.ts
export async function sendEmail(orderId: string, template: string): Promise<void> {
  const order = await db.orders.findById(orderId);
  await emailService.send({
    to: order.userEmail,
    template,
    data: { orderId, items: order.items },
  });
}

export async function chargePayment(orderId: string): Promise<{ success: boolean }> {
  const order = await db.orders.findById(orderId);
  return await paymentGateway.charge(order.total, order.paymentMethod);
}
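What makes Temporal durable is event sourcing: each completed activity's result is recorded in the workflow's event history, and after a crash the workflow code is replayed deterministically, substituting recorded results instead of re-running activities. A toy illustration of the replay idea (hypothetical sketch, not Temporal's actual API):

```python
class Replayer:
    """Toy event-history replay: completed activity results are recorded,
    so re-running the workflow after a crash does not repeat side effects."""

    def __init__(self, history):
        self.history = history  # persisted event history, survives crashes
        self.position = 0

    def run_activity(self, fn, *args):
        if self.position < len(self.history):
            result = self.history[self.position]  # replay: reuse recorded result
        else:
            result = fn(*args)                    # live: execute and record
            self.history.append(result)
        self.position += 1
        return result

charges = []
def charge_payment(order_id):
    charges.append(order_id)   # side effect: charges the card
    return {"success": True}

history = []                                                # persisted store
Replayer(history).run_activity(charge_payment, "order-1")   # first run: charges
Replayer(history).run_activity(charge_payment, "order-1")   # replay: no second charge
print(charges)  # → ['order-1']
```

The second run re-executes the same workflow code but reads the recorded result, so the payment is charged exactly once even though the function ran twice. This is why Temporal workflows must be deterministic and all side effects must live in activities.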

Comparison

| Feature | Celery | BullMQ | Temporal |
|---|---|---|---|
| Language | Python | Node.js/TypeScript | Any (Go, TS, Python, Java) |
| Broker | Redis, RabbitMQ | Redis | Temporal Server (PostgreSQL) |
| Job types | Tasks, chains, groups | Jobs, flows, schedulers | Workflows, activities |
| State management | Minimal | Job-level | Full workflow state |
| Long-running workflows | Not ideal | Not ideal | Designed for it |
| Priority queues | Yes | Yes | Task queues |
| Rate limiting | Via plugins | Built-in | Via task queue config |
| Delayed jobs | Yes | Yes | Timers built-in |
| Cron/scheduled | Celery Beat | Built-in schedulers | Cron schedules |
| Dashboard | Flower | Bull Board | Temporal UI |
| Memory footprint | ~100MB (worker) | ~50MB (worker) | ~500MB (server) |
| Setup complexity | Low | Low | Medium-High |
| Best for | Python task queues | Node.js task queues | Complex workflows |


Choosing the Right Tool

Choose Celery when: You are building Python applications, need a proven task queue, want RabbitMQ support, or have simple task → retry → complete workflows.

Choose BullMQ when: You are building Node.js/TypeScript applications, want Redis-only dependency, need job priorities and rate limiting, or want built-in job scheduling.

Choose Temporal when: You have complex multi-step workflows that span hours or days, need durable execution (survive server restarts), have polyglot services, or need workflow versioning for long-running processes.

At TechSaaS, we use BullMQ for our Node.js services and n8n for workflow automation. For most applications, a simple task queue (Celery or BullMQ) handles 95% of background job needs. Only reach for Temporal when you have genuinely complex, long-running workflows that need durable state management.

#background-jobs #celery #bullmq #temporal #queues #redis
