Cyberborean Chronicles

Home IT: Backup

I have to say I am quite serious about not losing any bit of my data. There was a time when I backed up everything, including the web browser history; I dropped this practice only when I realized that each of my tar.gz files had outgrown the standard DVD size and that half of the data I stored wasn't worth keeping at all.

There is, however, a lot of important stuff I back up regularly: projects, documents, Basket notes, emails and so on. Long ago I developed a custom automated backup procedure, which has since been greatly improved with the help of a dedicated server on my home network. The solution is simple, based on standard Linux tools, and works perfectly for me.

On the desktop

I have a special directory in my home space, ~/.toBackup, that contains symbolic links to the places I want to include in the backup. A tiny script is called by cron every hour to loop over this list and send incremental updates to the server using rsync:

#! /bin/sh
TO_BACKUP=$HOME/.toBackup        # directory of symlinks to back up
REMOTE=rsync://server/backup     # rsync daemon target (adjust to your server)
for D in "$TO_BACKUP"/*; do
  rsync -avzL "$D" "$REMOTE"
done
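Adding a new location to the backup set is just a matter of dropping a symlink into the directory. A minimal sketch (the Documents path is only an example, not part of my actual setup):

```shell
#!/bin/sh
# Register a directory for hourly backup by symlinking it into ~/.toBackup.
# "Documents" is an example; any directory can be linked this way.
mkdir -p "$HOME/.toBackup"
ln -sfn "$HOME/Documents" "$HOME/.toBackup/Documents"
```

The -L flag in the rsync call above is what makes rsync follow these symlinks and copy the real contents rather than the links themselves.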

The target directory ("backup") of the server's rsync daemon is shared via NFS, so I always have a mirror of my files that is at most one hour old.
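For illustration, the server side of such an NFS share could look roughly like this; the path, hostname, and network range are assumptions, not my actual configuration:

```
# /etc/exports on the server (illustrative values)
/var/backup   192.168.1.0/24(ro,root_squash)

# on the desktop, mounted manually or via /etc/fstab:
# mount -t nfs server:/var/backup /mnt/backup
```

Exporting it read-only keeps the mirror safe from accidental modification on the desktop side, since all writes go through the rsync daemon anyway.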

On the server

Every night, the server runs a cron job to copy the contents of the backup directory to another hard drive, thus making a daily snapshot of my work. If something horrible were to happen to both the original data and the mirror, there is still yesterday's copy to restore from.

#! /bin/sh
# mirror the rsync target onto the second drive (paths are illustrative)
rsync -a --delete /var/backup/ /mnt/disk2/backup-daily/

And, finally, once a week (every Friday), yet another cron job packages the latest daily backup into tar.gz archives and puts them into a new directory named after the current date:

#! /bin/sh
WEEKLY_BACKUP=/mnt/disk2/backup-weekly/`date +%Y-%m-%d`
mkdir -p "$WEEKLY_BACKUP"
# run from the daily backup directory
for D in ./*; do
   tar czhf "$WEEKLY_BACKUP/$D.tgz" "$D"
done
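For completeness, the three cron jobs described above could be scheduled with crontab entries along these lines; the times and script paths are made up for illustration:

```
# desktop crontab: hourly rsync of ~/.toBackup to the server
0 * * * *   /home/user/bin/backup-sync.sh

# server crontab: nightly snapshot, then Friday weekly archives
30 3 * * *  /usr/local/bin/daily-snapshot.sh
0 4 * * 5   /usr/local/bin/weekly-archive.sh
```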

All I have to do manually is burn DVDs with the latest archives every couple of months and clean up the weekly backups directory (though the latter could be automated too).
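As for automating the cleanup, a sketch along these lines would do; the path and the 90-day retention period are assumptions, not part of my current setup:

```shell
#!/bin/sh
# Prune weekly backup directories older than ~90 days (hypothetical helper).
prune_weekly() {
    # remove date-named subdirectories whose mtime is older than 90 days
    find "$1" -mindepth 1 -maxdepth 1 -type d -mtime +90 -exec rm -rf {} +
}
# the weekly archive location from the scripts above (adjust as needed)
if [ -d /mnt/disk2/backup-weekly ]; then
    prune_weekly /mnt/disk2/backup-weekly
fi
```

Running this from the same weekly cron job would keep the second drive from silently filling up between DVD burns.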
