Help with a simple incremental backup script

Posted by Evan on Ask Ubuntu
Published on 2011-01-07T19:09:12Z

I'd like to run the following incomplete script weekly as a cron job to back up my home directory to an external drive mounted at /mnt/backups:

#!/bin/bash
#
# Weekly incremental backup of /home/myfiles to /mnt/backups,
# hard-linking unchanged files against the previous backup.

TIMEDATE=$(date +%b-%d-%Y-%H:%M)   # %H instead of %k: %k pads the hour with a space

LASTBACKUP=pathToDirWithLastBackup

rsync -av --numeric-ids --link-dest="$LASTBACKUP" /home/myfiles "/mnt/backups/myfiles$TIMEDATE"

My first question is: how do I correctly set LASTBACKUP to the most recently created directory in /mnt/backups?
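One common approach is to sort the backup directories by modification time and take the newest. A minimal sketch (the directory names here are made up for illustration, and it assumes backup names contain no newlines):

```shell
# Stand-in for /mnt/backups: a temp dir with two fake backup directories.
BACKUPROOT=$(mktemp -d)
mkdir -p "$BACKUPROOT/myfilesJan-01-2011" "$BACKUPROOT/myfilesJan-07-2011"
touch -d "2011-01-01" "$BACKUPROOT/myfilesJan-01-2011"   # make this one older

# Most recently modified subdirectory comes first with ls -td:
LASTBACKUP=$(ls -td "$BACKUPROOT"/*/ | head -n 1)
echo "$LASTBACKUP"
```

In the real script you would replace `$BACKUPROOT` with /mnt/backups.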

Secondly, I'm under the impression that using --link-dest means files in previous backups will not be copied again in later backups if they still exist unchanged, but will instead link back to the originally copied files. However, I don't want to retain old files forever. What would be the best way to remove all the backups before a certain date without losing files that may be linked in those backups by current backups? Basically, I'm looking to merge all the backups before a certain date into a single backup, if that makes more sense than the way I initially framed the question :). Does --link-dest create hard links, and if so, would just deleting previous directories leave the linked files intact?
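For what it's worth, --link-dest creates hard links, not symlinks: each backup directory holds its own directory entry pointing at the same inode, so removing an older backup only drops one name and the data survives as long as any newer backup still links to it. A minimal demonstration of that semantics with ln (all paths here are throwaway temp paths):

```shell
WORK=$(mktemp -d)
mkdir "$WORK/old" "$WORK/new"
echo "hello" > "$WORK/old/file.txt"
ln "$WORK/old/file.txt" "$WORK/new/file.txt"   # hard link, as --link-dest would create
stat -c %h "$WORK/new/file.txt"                # link count is now 2
rm -rf "$WORK/old"                             # delete the "old backup"
cat "$WORK/new/file.txt"                       # data survives: hello
```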

Finally, I'd like to add a line to my script that compresses each newly created backup folder (/mnt/backups/myfiles$TIMEDATE). Based on reading this question, I was wondering if I could just use this line

gzip --rsyncable "/mnt/backups/myfiles$TIMEDATE"

after I run rsync, so that subsequent rsync --link-dest runs would find the already copied and compressed files?
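One caveat worth noting: gzip compresses individual files, not directory trees, so compressing a whole backup directory in one step would normally go through tar first (and --rsyncable is a Debian/Ubuntu gzip extension, so it's omitted from the runnable sketch below). The snapshot name here is hypothetical:

```shell
# Stand-in for a freshly created backup directory.
WORK=$(mktemp -d)
mkdir "$WORK/myfiles-snapshot"
echo "data" > "$WORK/myfiles-snapshot/a.txt"

# Archive the directory, then compress the archive
# (add --rsyncable to gzip on Debian/Ubuntu if desired):
tar -C "$WORK" -cf - myfiles-snapshot | gzip > "$WORK/myfiles-snapshot.tar.gz"
```

Also note that a later rsync --link-dest run compares against the previous backup's files as they exist on disk, so a compressed archive would no longer match the uncompressed source files.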

I know that's a lot, so many thanks in advance for your help!!

