November 24, 2012 Daily Backups from Amazon EC2

Amazon EC2 Backups

Task: create daily backups of an Amazon EC2 instance (the MySQL databases and the site files) and copy the backups to another PC. All of this can be done with bash scripts.

Create MySQL database backups and archive the site files

We can do it with a bash script; let's call it backup.sh

#!/bin/bash
#configuration settings
DATABASES=(blog dbsite1 dbsite2)
DATABASE_USER=root
DATABASE_PASS=password
DATABASE_HOST=localhost
WWW_PATH=/var/www
SITES[0]=blog.com
SITES[1]=site1.com
SITES[2]=site2.com
BACKUPS_PATH=/backups
####################################
BACKUPS_MONTH=`date +%Y-%m`
DATE=`date +%Y-%m-%d`
for DATABASE in ${DATABASES[*]}
do
        mysqldump -u$DATABASE_USER -h$DATABASE_HOST -p$DATABASE_PASS $DATABASE > /tmp/$DATABASE.sql
        mysql_backup_filename=$DATABASE'_'$DATE'_mysql.tar.gz'
        cd /tmp
        if [ ! -d "$BACKUPS_PATH" ]; then
            mkdir $BACKUPS_PATH
        fi
        if [ ! -d "$BACKUPS_PATH/$BACKUPS_MONTH" ]; then
                mkdir $BACKUPS_PATH/$BACKUPS_MONTH
        fi
        tar -zcf $BACKUPS_PATH/$BACKUPS_MONTH/$mysql_backup_filename $DATABASE.sql
        rm $DATABASE.sql
done
for SITE in ${SITES[*]}
do
        site_backup_filename=$SITE'_'$DATE'_files.tar.gz'
        cd $WWW_PATH
        if [ ! -d "$BACKUPS_PATH" ]; then
            mkdir $BACKUPS_PATH
        fi
        if [ ! -d "$BACKUPS_PATH/$BACKUPS_MONTH" ]; then
                mkdir $BACKUPS_PATH/$BACKUPS_MONTH
        fi
        tar -zcf $BACKUPS_PATH/$BACKUPS_MONTH/$site_backup_filename $SITE
done

Adjust the settings in the script header to match your setup. First, list the MySQL databases you want to back up

DATABASES=(blog dbsite1 dbsite2)

Database connection settings

DATABASE_USER=root
DATABASE_PASS=password
DATABASE_HOST=localhost

Web server's root path

WWW_PATH=/var/www

Directories of the virtual hosts under /var/www that you want to back up

SITES[0]=blog.com
SITES[1]=site1.com
SITES[2]=site2.com

Path where backups will be placed

BACKUPS_PATH=/backups
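As a quick sanity check before the first real run, the filename scheme the script uses can be reproduced on its own. This sketch just repeats the date logic from backup.sh with the example settings above:

```shell
#!/bin/bash
# Standalone sketch of the filename scheme backup.sh uses,
# shown with the example settings from above.
BACKUPS_PATH=/backups
BACKUPS_MONTH=$(date +%Y-%m)   # one subdirectory per month
DATE=$(date +%Y-%m-%d)         # date stamp inside each filename
DATABASE=blog
SITE=blog.com
echo "$BACKUPS_PATH/$BACKUPS_MONTH/${DATABASE}_${DATE}_mysql.tar.gz"
echo "$BACKUPS_PATH/$BACKUPS_MONTH/${SITE}_${DATE}_files.tar.gz"
```

Run on November 24, 2012, for example, this prints /backups/2012-11/blog_2012-11-24_mysql.tar.gz and /backups/2012-11/blog.com_2012-11-24_files.tar.gz.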

Removing old backups

Backups are created every day, so we can remove those older than 10 days. We can do it with a second script, backup_cleaner.sh

#!/bin/bash
EXPIRE_DAYS=10   # delete backups older than this many days
SCAN_DIR=/backups
FILES=`find $SCAN_DIR -type f`
for file in $FILES
   do
         # file's modification date as YYYYMMDD
         timestamp=`date -r $file +%Y%m%d`
         echo "Processing $file file.."
         date1yrs=`date -d "$timestamp" +%Y`
         date1days=`date -d "$timestamp" +%j`
         date2yrs=`date +%Y`
         date2days=`date +%j`
         # age in days; treats every year as 365 days, close enough here
         diffyr=`expr $date2yrs - $date1yrs`
         diffyr2days=`expr $diffyr \* 365`
         diffdays=`expr $date2days - $date1days`
         DAYS=`expr $diffyr2days + $diffdays`
         if [ $DAYS -ge $EXPIRE_DAYS ]
           then
                echo "Deleting $file file..."
                rm $file
         fi
   done

Do you want to store backups for longer than 10 days? Set a different value in the EXPIRE_DAYS variable, for example 30 instead of 10.
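As an aside, GNU find can do the same age calculation itself, which makes the cleaner much shorter. A minimal alternative sketch, assuming the same SCAN_DIR and EXPIRE_DAYS settings as above:

```shell
#!/bin/bash
# Alternative cleaner: find compares each file's modification time
# itself, so the manual year/day arithmetic is not needed.
EXPIRE_DAYS=10
SCAN_DIR=/backups
# -mtime +N matches files last modified more than N days ago;
# -print logs each match, -delete removes it
find "$SCAN_DIR" -type f -mtime +"$EXPIRE_DAYS" -print -delete
```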

Automating the scripts

Add these jobs to the system crontab (/etc/crontab or a file in /etc/cron.d; note the user field after the schedule)

15 3 * * * root /path/to/backup.sh
30 3 * * * root /path/to/backup_cleaner.sh

With these cron entries, the backup script runs every day at 3:15 am and the cleanup script at 3:30 am.

Copy backups from Amazon EC2 Instance

We will copy the backups with rsync. It transfers only files that have changed, so existing backups are not copied again each time.

rsync -e "ssh -i /home/username/.ec2/amazon.pem" --progress -zoguhvr --compress-level=9 ec2-user@myamazoninstance.com:/backups/ /storage/Backup/amazon/
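Two local details are easy to trip over before the first run: ssh refuses a private key that other users can read, and the local destination directory must exist. A short sketch, assuming the same paths as in the command above:

```shell
# ssh rejects a .pem file that is readable by other users
chmod 600 /home/username/.ec2/amazon.pem
# create the local destination for the synchronized backups
mkdir -p /storage/Backup/amazon/
```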

Add this task to cron on the receiving machine. Synchronization will run every day at 9:15 pm

15 21 * * * username /usr/bin/rsync -e "ssh -i /home/username/.ec2/amazon.pem" --progress -zoguhvr --compress-level=9 ec2-user@myamazoninstance.com:/backups/ /storage/Backup/amazon/

P.S. These scripts can back up not only an Amazon EC2 instance but any dedicated server.
