Simple Bash script to backup multiple directories into Google Cloud Storage with tar archives

Posted on: 23/10/2019

The following script uses tar archives to speed up the backup process when many small files are involved. This does mean, however, that you can't directly browse the directory structure within GCS, since each path is stored as a single tar object.

#!/bin/bash

# Abort on errors, unset variables, and failed pipeline stages; trace each command
set -euo pipefail
set -x

# Create the bucket first if needed, e.g.:
# gsutil mb -l us-east1 -c nearline "gs://example-abcxyz"
bucketname="" # your bucket name, without the gs:// prefix

# Paths (directories and files) to back up
backup_path=(
/bin
/boot
/etc
/home
/initrd.img
/initrd.img.old
/media
/mnt
/opt
/root
/sbin
/snap
/srv
/usr
/var
)

# Stream each path as a tar archive straight into GCS, with no local temp file.
# Because $each keeps its leading /, the object lands at gs://$bucketname/tar<path>.tar,
# e.g. gs://$bucketname/tar/etc.tar.
for each in "${backup_path[@]}"
do
  tar -cf - "$each" | gsutil cp - gs://"$bucketname"/tar"$each".tar >> backup_log.log 2>> backup_log_err.log
done
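
Although the archives can't be browsed in the GCS console, they can still be inspected or restored by streaming them back through tar. A rough sketch, assuming the same bucketname as above and the /etc archive produced by the script (/tmp/restore is just an illustrative scratch location):

# List an archive's contents without downloading it to disk
gsutil cp gs://"$bucketname"/tar/etc.tar - | tar -tf -

# Restore into a scratch directory; tar stripped the leading / on creation,
# so files land under /tmp/restore/etc
mkdir -p /tmp/restore
gsutil cp gs://"$bucketname"/tar/etc.tar - | tar -xf - -C /tmp/restore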