Effortless Automated Backups for Your Dockerized n8n (SQLite) to GitLab or GitHub

Shell Script for n8n Docker Backup

How to bulletproof your workflow automations with dead-simple, automated backups—even if you’re not a Linux wizard.


Why You Absolutely Need to Back Up n8n (Especially on Docker + SQLite)

n8n is the backbone of your workflow automation, orchestrating everything from business logic to personal productivity hacks. But if you’re running n8n in Docker (the way 90% of us do), and especially if you’re using the default SQLite database, you are one corrupted file, failed disk, or accidental docker rm away from total workflow extinction.

Why?

  • SQLite is a single file. Great for simplicity, terrible for resilience.
  • Docker containers can be deleted, rebuilt, or migrated at any time.
  • Automated backup = Zero downtime, zero tears.
  • Storing backups offsite (like GitLab or GitHub) = Safety against ransomware and hardware failures.

Bottom line: If your n8n automations matter at all, you should automate your backups now—not after you lose everything.


Prerequisites

You’ll need:

  • n8n running in Docker (hosted anywhere: VPS, home server, cloud).
  • Your n8n instance uses SQLite (the default DB for new installs).
  • Shell access to your Docker host (SSH or web console).
  • A GitLab or GitHub account with a private repository for storing your backups.
  • SSH keys configured for push access to your repo from the server.

Step 1: Verify n8n Is Using SQLite

Before you start, check that your n8n is using SQLite.
SSH into your Docker host and run:

docker exec -it <n8n_container_name> printenv | grep DB_TYPE

If it returns sqlite, or returns nothing at all (SQLite is the default when DB_TYPE is unset), you're good.

Or, check for the file:

docker exec -it <n8n_container_name> ls /home/node/.n8n/database.sqlite

If you see database.sqlite, you’re running SQLite.


Step 2: Set Up Your Backup Repo

1. Create a private repo on GitLab or GitHub (e.g., n8n-backups).
2. Clone it to your server:

git clone git@<git-provider>:<your-username>/<your-repo>.git /opt/n8n-backups
cd /opt/n8n-backups

3. Configure your git user (inside the repo):

git config --local user.name "YourName"
git config --local user.email "your@email.com"

4. Make sure your SSH key is added to your git provider for passwordless pushes.

(a) Add your server's public key to GitHub or GitLab. First, check whether the server already has an SSH key:

ls -l ~/.ssh/id_rsa.pub

If not, generate one:

ssh-keygen -t rsa -b 4096 -C "n8n-backup@yourhost" -N "" -f ~/.ssh/id_rsa

Then print the public key and add it to your GitHub/GitLab account (under your profile's SSH keys settings):

cat ~/.ssh/id_rsa.pub

(b) Once that is done, test the connection:

ssh -T git@github.com
ssh -T git@gitlab.com

Step 3: The Automated Backup Script

Below is a no-nonsense, production-ready shell script to:

  • Copy your live database.sqlite from the n8n container,
  • Save it with a unique timestamp,
  • (Optionally) encrypt it,
  • Commit and push to GitLab/GitHub only if the file changed,
  • Automatically prune backups older than 14 days.

Save this as backup-n8n.sh:

#!/bin/bash
set -euo pipefail

# CONFIGURATION - change these to your environment!
N8N_CONTAINER="your_n8n_container_name"
BACKUP_DIR="/opt/n8n-backups"
DATE=$(date +"%Y-%m-%d-%H%M%S")
BACKUP_FILE="database_${DATE}.sqlite"
ENCRYPT=0        # Set to 1 to enable GPG encryption

echo "Backing up n8n database from container $N8N_CONTAINER to $BACKUP_DIR/$BACKUP_FILE..."

# Copy the SQLite file out of the container.
# Note: this copies the live file, so schedule the script for a quiet time
# (like the 2am cron below) when no workflows are writing to the database.
docker cp "$N8N_CONTAINER:/home/node/.n8n/database.sqlite" "$BACKUP_DIR/$BACKUP_FILE"

# Optional: encrypt the backup file
if [ "$ENCRYPT" -eq 1 ]; then
    echo "Encrypting $BACKUP_FILE with GPG..."
    gpg --symmetric --cipher-algo AES256 --batch --passphrase "YOUR_STRONG_PASSWORD" \
        -o "${BACKUP_DIR}/${BACKUP_FILE}.gpg" "${BACKUP_DIR}/${BACKUP_FILE}"
    rm -f "${BACKUP_DIR}/${BACKUP_FILE}"
    BACKUP_FILE="${BACKUP_FILE}.gpg"
fi

# Change to the backup directory (abort if it doesn't exist)
cd "$BACKUP_DIR" || exit 1

# Set git user (safe to repeat on every run; useful under cron)
git config --local user.name "YourName"
git config --local user.email "your@email.com"

# Stage the new backup file
git add "$BACKUP_FILE"

# Commit and push only if something was actually staged
if ! git diff --cached --quiet; then
    git commit -m "Automated n8n sqlite backup $DATE"
    git push origin main
    echo "Backup committed and pushed to remote!"
else
    echo "No changes detected in the database backup. Skipping commit."
fi

# Prune local backups older than 14 days
# (files already pushed remain in the remote repo's history)
find "$BACKUP_DIR" -name 'database_*.sqlite*' -type f -mtime +14 -delete

echo "Backup script complete."

Make it executable:

chmod +x backup-n8n.sh
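Before trusting the 14-day retention rule with real backups, you can sanity-check it in a scratch directory. This sketch assumes GNU `touch`, which accepts relative dates like `-d "20 days ago"`, and needs no Docker:

```shell
# Simulate the retention rule against fake file ages
tmp=$(mktemp -d)
touch -d "20 days ago" "$tmp/database_old.sqlite"   # past the 14-day cutoff
touch "$tmp/database_new.sqlite"                    # a fresh backup
find "$tmp" -name 'database_*.sqlite*' -type f -mtime +14 -delete
ls "$tmp"   # → database_new.sqlite
rm -rf "$tmp"
```

Only the 20-day-old file is deleted; anything newer than the cutoff survives.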

Step 4: Automate with Cron

Schedule it (for example, daily at 2am):

crontab -e

Add:

0 2 * * * /opt/n8n-backups/backup-n8n.sh >/dev/null 2>&1
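If you'd rather keep a record of each run than discard the output, point the cron line at a log file instead (the log path here is just an example):

```
0 2 * * * /opt/n8n-backups/backup-n8n.sh >> /var/log/n8n-backup.log 2>&1
```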

Step 5: How to Restore a Backup

  1. Stop your n8n Docker container:

    docker stop <n8n_container_name>
    
  2. If the backup is encrypted, decrypt it first:

    gpg --decrypt database_YYYY-MM-DD-HHMMSS.sqlite.gpg > database_YYYY-MM-DD-HHMMSS.sqlite
    
  3. Copy a backup into place:

    docker cp /opt/n8n-backups/database_YYYY-MM-DD-HHMMSS.sqlite <n8n_container_name>:/home/node/.n8n/database.sqlite
    
  4. Start n8n:

    docker start <n8n_container_name>

    If n8n fails to start after the restore, check that the copied file inside the container is owned by the node user; docker cp copies files in as root.
    

That’s it. You’re live again.


Troubleshooting Common Issues

  • Permission denied (publickey):
    Double-check your SSH keys and make sure your backup user has push rights to your repo.

  • Multiple backups with the same timestamp:
    Include seconds in the filename as shown; duplicate names then can't occur unless the script runs twice in the same second.

  • “Nothing to commit” in git:
    The script only commits when something was actually staged, keeping empty commits out of your Git history.

  • Backup file encryption:
    If you lose your GPG passphrase, you lose access to your backup. Use with care.
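The commit-only-if-staged check from the script can be exercised in a throwaway repo, with no remote or Docker involved (paths and names here are just for the demo):

```shell
# Reproduce the script's change-detection logic in a scratch git repo
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git config user.name "test"
git config user.email "test@example.com"
git commit -q --allow-empty -m "init"      # so git diff --cached has a HEAD to compare against
echo data > backup.sqlite
git add backup.sqlite
if ! git diff --cached --quiet; then
    echo "would commit"                     # staged file differs from HEAD
else
    echo "would skip"
fi
# → would commit
cd / && rm -rf "$tmp"
```

`git diff --cached --quiet` exits non-zero when the index differs from HEAD, which is exactly what the backup script keys off.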


Final Thoughts: Paranoia is a Virtue

Automated backups aren’t “paranoid,” they’re smart. SQLite is fast and light, but not invincible.
With this setup, your n8n workflows can survive anything—hardware failures, accidental wipes, or the classic “I don’t know what happened, boss, it just disappeared.”

Make this script your insurance policy.
Set it, forget it, and sleep easy.
