The backup procedure written by Peter Hutton-Czapski works nicely. I’ve added a few lines to sync the backup with a removable USB drive that can be taken home.
The files on the USB drive are encrypted and compressed, so if the drive is lost they cannot be read.
To list devices and find out where the USB drive is located:
sudo fdisk -l
You should see an entry such as /dev/sdc1 (the device name may differ on your system).
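If it is not obvious which entry belongs to the USB drive, an optional cross-check (not part of the original procedure) is to list partitions with their labels and UUIDs:
sudo blkid
The drive's NTFS partition will show its LABEL and UUID; the UUID can also be used to mount the drive, as noted below.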
Create a mount point
sudo mkdir /media/usb_external
Mount the drive
sudo mount -t ntfs-3g /dev/sdc1 /media/usb_external -o force
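Device names like /dev/sdc1 can change between reboots or when other drives are plugged in. As a hedge, the same mount can be done by UUID instead; this is a variation on the command above, with the UUID value being a placeholder you would replace with the one reported by blkid:
sudo mount -t ntfs-3g UUID=XXXX-XXXX /media/usb_external -o force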
Edit the backup script.
/usr/local/backups/backup.sh
Add the following lines to the end of the backup script (a combined version with a mount check is sketched after these lines):
cp /usr/local/backups/data/2012-06* /media/usb_external/backup/
sudo rsync --delete -dptgoD -e ssh /usr/local/backups/data/ /media/usb_external/backup
If the backup files are in /home/mysql instead, use:
sudo rsync --delete -dptgoD -e ssh /home/mysql/ /media/usb_external/backup
Unmount the drive when the copy is finished
sudo umount /media/usb_external
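Putting the added lines together, a guarded version might look like the sketch below. It is not part of the original procedure; it assumes the mount point used above and relies on the mountpoint utility (standard on Ubuntu) so that a forgotten or unplugged drive does not break the nightly backup:
# Only copy and unmount if the USB drive is actually mounted
if mountpoint -q /media/usb_external ; then
    sudo rsync --delete -dptgoD -e ssh /usr/local/backups/data/ /media/usb_external/backup
    sudo umount /media/usb_external
else
    echo "USB drive not mounted; offsite copy skipped" >&2
fi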
Some clients have a LOT of scanned documents in Oscar. If a complete set of 30 backups is too large for the existing system, you can replace the lines that delete the month-old backup with the following. This removes any files that end with .gz and are over 7 days old:
find ${DEST} -type f -name "*.gz" -mtime +7 -exec rm {} \;
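Before switching to this line, it is worth doing a dry run that prints the matching files instead of deleting them, to confirm that ${DEST} is set in your script and that only the intended archives are selected (this check is an addition, not part of the original procedure):
find ${DEST} -type f -name "*.gz" -mtime +7 -print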