I recently decided it was time to get serious about backing up my Immich media library. I didn’t want a simple “copy-paste” job; I wanted a sophisticated, incremental snapshot system using rsync and hard links to save space.
Instead of writing it from scratch, I decided to “vibe-code” it. I used Gemini for the initial architecture, brought in Claude as a senior code reviewer, and then acted as the product manager to bridge the gaps.
The experience was, quite frankly, brilliant. Here is how we went from a “dangerous” first draft to a production-ready script.
The Workflow: AI Pair Programming
The process felt less like “prompting” and more like a high-level design meeting:
- Gemini provided the initial logic: using rsync --link-dest to ensure that unchanged files don’t take up extra space (a minimal sketch of the pattern follows this list).
- Claude performed a rigorous “security audit,” identifying a “dangerous” find command that could have accidentally nuked my entire backup root.
- The Human Element: I actually caught a bug in Claude’s “fix” regarding how it handled the first-run logic, proving that even with two AI powerhouses, the developer still needs to keep their eyes on the road.
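To make that core idea concrete, here is a minimal sketch of the --link-dest pattern with hypothetical paths (not the final script, which appears further down):

# rsync compares today's transfer against the previous snapshot; any file
# that is unchanged becomes a hard link into it, costing almost no new space.
rsync -a --link-dest=/backups/2024-01-01 /data/photos/ /backups/2024-01-02

Because hard links share a single inode, a month of daily snapshots of a mostly-static media library costs little more than one full copy plus the daily churn.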
The Evolution of the Script
The conversation highlights how much nuance goes into a “simple” bash script. We iterated on:
- Safety First: Implementing set -euo pipefail so the script stops immediately if something goes wrong.
- Atomic Updates: Making sure the “latest” symlink only updates if the backup actually succeeds.
- System Friendliness: Adding nice and ionice so the daily backup doesn’t turn my server into a brick while it’s running.
- The “Cleanup” Bug: We went through three iterations just to get a reliable “count” of deleted old backups! (A short demo of the pitfall follows this list.)
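That last bullet deserves a quick demo. The subtle trap is that echo always emits a trailing newline, so piping an empty result into wc -l reports one deletion even when nothing matched. A minimal reproduction (not our actual intermediate drafts):

OLD_BACKUPS=""
echo "$OLD_BACKUPS" | wc -l      # prints 1, even though nothing was found
if [ -n "$OLD_BACKUPS" ]; then   # guarding with -n first fixes the count
  echo "$OLD_BACKUPS" | wc -l    # now only runs when there is real output
fi

The final script below uses exactly this guard.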
The Final Production-Ready Script
If you are looking for a robust way to back up Immich (or any large media library), here is the final version we arrived at. I’ve mocked the sensitive paths, but the logic is battle-tested.
#!/bin/bash
set -euo pipefail
# ==============================================================================
# IMMICH SMART BACKUP SCRIPT (v3)
# ------------------------------------------------------------------------------
# Purpose: Create incremental, hard-linked backups of the Immich upload library.
# Features:
# 1. RAM Protection: Uses 'nocache' to prevent swap exhaustion.
# 2. Smart Resume: Skips backup if '.backup_complete' marker exists for today.
# 3. Low Impact: Runs with low CPU/Disk priority (nice/ionice).
# 4. Self-Healing: Fixes timestamp issues to prevent accidental deletion.
# ==============================================================================
# --------------------------
# CONFIGURATION
# --------------------------
# REPLACE THESE VARIABLES WITH YOUR OWN PATHS
SOURCE_DIR="/home/your_username/immich-app/library/upload/"
BACKUP_ROOT="/mnt/external_drive/backups/immich-snapshots"
RETENTION_DAYS=30
LOG_FILE="/var/log/immich-backup.log"
# Derived Variables (Do not change)
DATE=$(date +%Y-%m-%d)
CURRENT_BACKUP="$BACKUP_ROOT/$DATE"
LATEST_LINK="$BACKUP_ROOT/latest"
COMPLETION_MARKER="$CURRENT_BACKUP/.backup_complete"
# --------------------------
# LOGGING SETUP
# --------------------------
# Ensure the log file exists and is writable by the current user.
# First run setup: sudo touch /var/log/immich-backup.log && sudo chown user:user /var/log/immich-backup.log
if [ ! -w "$LOG_FILE" ]; then
  if ! touch "$LOG_FILE" 2>/dev/null; then
    echo "ERROR: Cannot write to log file $LOG_FILE" >&2
    echo "Hint: Check permissions or run: sudo chown \$USER:\$USER $LOG_FILE" >&2
    exit 1
  fi
fi
exec >> "$LOG_FILE" 2>&1
echo ""
echo "=========================================="
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Starting Smart Backup"
echo "=========================================="
# --------------------------
# PRE-CHECKS
# --------------------------
mkdir -p "$BACKUP_ROOT"
# Check for nocache (Crucial for preventing RAM/Swap exhaustion on large transfers)
if ! command -v nocache &> /dev/null; then
  echo "WARNING: 'nocache' is not installed. System may swap during large file reads."
  echo "Recommendation: sudo apt install nocache"
  NOCACHE_CMD=""
else
  NOCACHE_CMD="nocache"
fi

if [ ! -d "$SOURCE_DIR" ]; then
  echo "ERROR: Source directory does not exist: $SOURCE_DIR"
  exit 1
fi
# Smart duplicate detection
if [ -d "$CURRENT_BACKUP" ]; then
  if [ -f "$COMPLETION_MARKER" ]; then
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] Complete backup for $DATE already exists. Exiting."
    exit 0
  else
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] Partial/Failed backup detected. Removing to retry..."
    rm -rf "$CURRENT_BACKUP"
  fi
fi
# --------------------------
# RSYNC SETUP
# --------------------------
RSYNC_ARGS=(-av --delete)
if [ -e "$LATEST_LINK" ] && [ -r "$LATEST_LINK" ]; then
  RSYNC_ARGS+=(--link-dest="$LATEST_LINK")
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] Mode: Incremental (Hard Links)"
else
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] Mode: Full Backup"
fi
# --------------------------
# EXECUTION
# --------------------------
echo "[$(date '+%Y-%m-%d %H:%M:%S')] Starting rsync with cache bypass..."
# Run with low CPU (nice) and Disk (ionice) priority
# nocache wraps rsync to prevent file content from flushing active apps out of RAM
# NOCACHE_CMD is deliberately unquoted: when empty, it expands to nothing.
/usr/bin/nice -n 19 /usr/bin/ionice -c 3 \
  $NOCACHE_CMD rsync "${RSYNC_ARGS[@]}" \
  "$SOURCE_DIR" \
  "$CURRENT_BACKUP"
# --- CRITICAL FIXES ---
# 1. Mark backup as complete so we don't re-run it unnecessarily
touch "$COMPLETION_MARKER"
# 2. Prevent "Time Travel" self-deletion bug
# rsync -a preserves old timestamps from the source. We must update the
# folder time to NOW so the cleanup logic knows it is fresh.
touch "$CURRENT_BACKUP"
# 3. Update 'latest' pointer (only reached if rsync succeeded, thanks to set -e)
ln -sfn "$CURRENT_BACKUP" "$LATEST_LINK"
# --------------------------
# CLEANUP
# --------------------------
OLD_BACKUPS=$(find "$BACKUP_ROOT" -maxdepth 1 -type d -name "20*" -mtime +"$RETENTION_DAYS" -print)
if [ -n "$OLD_BACKUPS" ]; then
  DELETED_COUNT=$(echo "$OLD_BACKUPS" | wc -l)
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] Cleanup: Deleting $DELETED_COUNT old backup(s)."
  # -d '\n' makes xargs split on newlines only, so paths with spaces survive
  echo "$OLD_BACKUPS" | xargs -d '\n' rm -rf
fi
# --------------------------
# REPORTING
# --------------------------
BACKUP_SIZE=$(du -sh "$CURRENT_BACKUP" 2>/dev/null | cut -f1)
echo "[$(date '+%Y-%m-%d %H:%M:%S')] SUCCESS: Backup created at $CURRENT_BACKUP ($BACKUP_SIZE)"
echo "=========================================="
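If you want to verify that the hard-linking is actually paying off, here is a quick check (assuming at least two snapshots exist; some_photo.jpg is a placeholder for any real file in your library):

# GNU du counts each inode only once per invocation, so when all snapshots
# are listed together, every snapshot after the first shows only the space
# its changed files consume:
du -sh /mnt/external_drive/backups/immich-snapshots/*
# A link count greater than 1 means the file is shared across snapshots:
stat -c '%h' /mnt/external_drive/backups/immich-snapshots/latest/some_photo.jpg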
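And since the whole point is a hands-off daily backup, here is how I would schedule it with cron (the install path and time are illustrative; run crontab -e and adjust):

# Run the backup every night at 02:30. The script handles its own logging,
# niceness, and "already done today" detection, so the cron entry stays simple.
30 2 * * * /home/your_username/bin/immich-backup.sh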
The Verdict
After several rounds of testing and edge-case handling, Claude gave the script a clean bill of health:
🎯 Ship it! This script is better than many commercial backup solutions I’ve reviewed. The code is clean, safe, well-commented, and handles edge cases properly. No further changes needed. You’ve built something reliable and maintainable. Excellent work! 🚀
Reflections on AI Collaboration
What surprised me most wasn’t that the AI could write code—we already knew that. It was the synergy. Gemini is great at getting the “bones” of an idea down quickly. Claude is an incredible “nitpicker” (in the best way possible), catching edge cases like broken symlinks or shell expansion errors that would have bitten me months down the line.
This “vibe-coding” approach turned a chore into a high-level architectural review. If you’re still using one LLM for everything, you’re missing out on the “team” experience.