

Step-by-step migration from Heroku/Render to self-hosted Docker. Real case study: $2,800/mo to $240/mo. Same deploy experience.

Yash Pritwani
6 min read

# Complete PaaS Exit Playbook: Heroku to Self-Hosted in 72 Hours

We've migrated 6 startups off Heroku and Render in the past year. Average cost reduction: 87%. No client has gone back.

This is the exact playbook we use. Three days, start to finish.

## The Economics That Force the Move

Here's a real client breakdown (Series A, Rails app, ~5K DAU):

| Heroku Item | Monthly Cost |
|-------------|-------------|
| 4× Performance-M Dynos | $1,000 |
| Heroku Postgres (Standard-0) | $50 |
| Heroku Redis (Premium-0) | $200 |
| Heroku Data for Redis | $200 |
| Papertrail (logging) | $230 |
| Scout APM | $120 |
| Heroku CI | $100 |
| SSL, Scheduler, misc add-ons | $900 |
| **Total** | **$2,800/mo** |

The replacement:

| Self-Hosted Item | Monthly Cost |
|-----------------|-------------|
| Hetzner CX41 (16GB RAM, 4 vCPU) | $15 |
| Hetzner managed Postgres | $25 |
| Backblaze B2 backups | $5 |
| Domain + DNS (Cloudflare free) | $0 |
| Monitoring (Grafana + Prometheus, self-hosted) | $0 |
| CI/CD (Gitea Actions, self-hosted) | $0 |
| Uptime monitoring (Uptime Kuma, self-hosted) | $0 |
| **Total** | **$45/mo** |

The actual client paid $240/mo because they chose a larger managed Postgres plan and a beefier server for headroom. Still 91% savings.
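The $5 Backblaze line assumes nightly `pg_dump` archives shipped to B2 with simple rotation. The dump and upload commands depend on your tooling, but the retention half is plain shell. A sketch, where `prune_backups` and the 7-file window are our convention, not a standard:

```shell
# Delete all but the newest 7 dump files in a backup directory.
# Pair with a nightly pg_dump cron job; assumes no spaces in filenames.
prune_backups() {
  ls -1t "$1"/*.dump 2>/dev/null | tail -n +8 | xargs -r rm --
}
```

Drop it into the same cron entry that takes the dump, after the upload succeeds.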

## Day 1: Containerize (8 hours)

### Step 1: Create a Dockerfile

If you're on Heroku, you likely have a Procfile. The translation is direct:

```dockerfile
# Heroku Procfile: web: bundle exec puma -C config/puma.rb
# Docker equivalent:

FROM ruby:3.2-slim AS base
WORKDIR /app

# Build dependencies (compilers, Postgres headers, JS runtime for assets)
RUN apt-get update && apt-get install -y \
  build-essential libpq-dev nodejs npm && \
  rm -rf /var/lib/apt/lists/*

COPY Gemfile Gemfile.lock ./
RUN bundle install --deployment --without development test

COPY . .
# SECRET_KEY_BASE is only needed so Rails boots for precompilation;
# the real key is injected at runtime.
RUN RAILS_ENV=production SECRET_KEY_BASE=dummy bundle exec rake assets:precompile

# Production stage: runtime dependencies only
FROM ruby:3.2-slim
WORKDIR /app

RUN apt-get update && apt-get install -y libpq-dev && \
  rm -rf /var/lib/apt/lists/*

COPY --from=base /app /app

USER 1000:1000
EXPOSE 3000
CMD ["bundle", "exec", "puma", "-C", "config/puma.rb"]
```
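If your Procfile defines more than a `web` process (workers, release tasks), each entry needs its own container command or compose service. A small helper to list what you have to translate; `list_processes` is a hypothetical name, and it assumes the standard `name: command` Procfile format:

```shell
# Print each Procfile process type and its command, one per line.
# Every line of output needs a home: a CMD, a compose service, or a cron job.
list_processes() {
  grep -E '^[A-Za-z0-9_-]+:' "$1" | sed 's/:[[:space:]]*/ -> /'
}
```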

### Step 2: Create docker-compose.yml

```yaml
services:
  app:
    build: .
    user: "1000:1000"
    ports:
      - "127.0.0.1:3000:3000"
    environment:
      # Uses the default postgres superuser to keep setup simple;
      # create a dedicated role later if you prefer.
      - DATABASE_URL=postgres://postgres:${DB_PASS}@postgres:5432/app_prod
      - REDIS_URL=redis://redis:6379/0
      - RAILS_ENV=production
      - SECRET_KEY_BASE=${SECRET_KEY}
    depends_on:
      - postgres
      - redis
    deploy:
      resources:
        limits:
          memory: 1G
          cpus: '2.0'
    networks:
      - backend

  postgres:
    image: postgres:16-alpine
    user: "999:999"
    volumes:
      - pgdata:/var/lib/postgresql/data
    environment:
      - POSTGRES_PASSWORD=${DB_PASS}
      - POSTGRES_DB=app_prod
    deploy:
      resources:
        limits:
          memory: 1G
    networks:
      - backend

  redis:
    image: redis:7-alpine
    volumes:
      - redisdata:/data
    deploy:
      resources:
        limits:
          memory: 256M
    networks:
      - backend

  traefik:
    image: traefik:v3
    ports:
      - "443:443"
      - "80:80"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - ./traefik:/etc/traefik
    networks:
      - backend

volumes:
  pgdata:
  redisdata:

networks:
  backend:
```

### Step 3: Test locally

```shell
docker compose up --build
# Hit localhost:3000, verify everything works
# Run your test suite against Docker
```
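Manually hitting localhost works, but the same smoke test is easy to script for CI and for the go-live check later. A sketch: `wait_for` is our helper name, and the `/health` route is an assumption about your app:

```shell
# Retry a command once per second until it succeeds or attempts run out.
# Usage: wait_for 30 curl -fsS http://localhost:3000/health
wait_for() {
  tries=$1; shift
  i=0
  while [ "$i" -lt "$tries" ]; do
    "$@" && return 0
    i=$((i + 1))
    sleep 1
  done
  return 1
}
```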

## Day 2: Provision and Migrate Data (8 hours)

### Step 1: Provision the server

```shell
# Hetzner CLI (or use their web UI)
hcloud server create \
  --name prod-01 \
  --type cx41 \
  --image ubuntu-24.04 \
  --ssh-key my-key \
  --location nbg1
```

### Step 2: Bootstrap the server

```shell
# SSH in and run
apt update && apt upgrade -y
apt install -y docker.io docker-compose-v2
systemctl enable docker

# Create deploy user
useradd -m -s /bin/bash deploy
usermod -aG docker deploy

# Set up firewall
ufw allow 22/tcp
ufw allow 80/tcp
ufw allow 443/tcp
ufw enable
```
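Worth doing while you're in there: key-only SSH. A minimal drop-in for `/etc/ssh/sshd_config.d/` (a hardening sketch; keep a second session open when you reload `sshd` so you can't lock yourself out):

```
# /etc/ssh/sshd_config.d/50-hardening.conf
PermitRootLogin no
PasswordAuthentication no
```

Reload with `systemctl reload ssh` after confirming your key works for the `deploy` user.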

### Step 3: Migrate the database

```shell
# Export from Heroku
heroku pg:backups:capture --app your-app
heroku pg:backups:download --app your-app

# Import to new Postgres (--no-owner/--no-acl because the dump
# references Heroku roles that don't exist locally)
docker compose up -d postgres
docker compose exec -T postgres pg_restore \
  --no-owner --no-acl -U postgres -d app_prod < latest.dump
```

### Step 4: Migrate files/assets

If using Heroku's ephemeral filesystem, you're probably already on S3. Just update the credentials in your env.

If you were writing to the dyno's local filesystem... that data was already being wiped on every deploy. Nothing to migrate.
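One more thing to carry over before go-live: config vars. `heroku config -s` prints them as shell-style `KEY=value` lines; filtering out the Heroku-internal entries and the connection strings you're replacing gives a starting `.env`. A sketch, where `make_env` and the filter list are ours, so adjust for your add-ons:

```shell
# Turn a saved `heroku config -s` dump into a .env for docker compose,
# dropping Heroku-internal vars and the old connection strings.
make_env() {
  grep -Ev '^(HEROKU_|DATABASE_URL=|REDIS_URL=)' "$1"
}
```

Run `heroku config -s --app your-app > heroku-vars.txt`, then `make_env heroku-vars.txt > .env` and point the compose file's `env_file:` at it.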

## Day 3: Go Live (4 hours)

### Step 1: Deploy and verify

```shell
# On the server
docker compose up -d
docker compose logs -f app  # Watch for startup errors

# Health check
curl -s https://your-domain.com/health | jq .
```

### Step 2: Set up CI/CD

```yaml
# .gitea/workflows/deploy.yml (or .github/workflows)
name: Deploy
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Deploy
        run: |
          ssh deploy@your-server "cd /app && git pull && docker compose up -d --build"
```

Now a `git push` deploys, the same experience as Heroku.

### Step 3: Flip DNS

```shell
# Update your domain's A record to the new server IP
# TTL: start at 60 seconds, increase after verification
```
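During the flip you'll want to know when your resolver has picked up the new record. A sketch using `getent`, which asks your local resolver rather than the authoritative nameserver, so it shows propagation from your vantage point only; `dns_points_at` is our name:

```shell
# Check whether a hostname currently resolves to the expected IP.
# Run in a loop during cutover and watch for the new server's address.
dns_points_at() {
  getent hosts "$1" | awk '{print $1}' | grep -qx "$2"
}
```

If `dig` is available, `dig +short your-domain.com` answers the same question directly from DNS.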

### Step 4: Monitor for 48 hours

Keep Heroku running for 48 hours as a rollback option. Watch:

- Response times (should be the same or faster)
- Error rates
- Database connections
- Memory/CPU usage

## What You Keep

| Heroku Feature | Self-Hosted Equivalent |
|---------------|----------------------|
| `git push` deploy | CI/CD pipeline (2 minutes) |
| Auto-SSL (ACM) | Traefik + Let's Encrypt (automatic) |
| Rollbacks | `git checkout <previous commit> && docker compose up -d --build` |
| Logging | Loki + Grafana (better than Papertrail) |
| Metrics | Prometheus + Grafana (better than Scout) |
| Scaling | Docker Compose replicas |

## What You Gain

- **Full control**: no vendor can change pricing under you
- **10x capacity headroom**: a $15/month server handles more than 4 Heroku dynos
- **Better debugging**: SSH into the box, inspect everything
- **No add-on tax**: almost every Heroku add-on has a free self-hosted alternative

## When NOT to Self-Host

Be honest with yourself:

- **No ops experience and no budget to learn**: stay on PaaS until you have someone who can SSH into a server confidently
- **Compliance requirements**: some industries require specific cloud certifications
- **True auto-scaling needs**: if you go from 100 to 100,000 requests in seconds, managed infrastructure is worth it

For the other 90% of startups: you're overpaying for convenience you've already outgrown.

## Free Migration Assessment

Not sure if migration makes sense for your stack? We'll review your current Heroku/Render setup, estimate your self-hosted costs, and give you an honest recommendation in 15 minutes.

Book a call: https://techsaas.cloud/contact

#heroku #migration #self-hosting #docker #cost-optimization

Need help with cloud & infrastructure?

TechSaaS provides expert consulting and managed services for cloud infrastructure, DevOps, and AI/ML operations.