Linux – Dedicated server automatic backup solution

backup, filesystems, linux, MySQL

I have a dedicated Ubuntu web server in a cloud environment, and I am looking for a nice way to do automated backups.

I would like to back up some directories with web apps, and all my MySQL databases. As for the destination: make snapshots every two hours locally, and every six hours to a remote FTP server. Also delete backup archives older than seven days (locally + FTP), and notify me of any problems by email.
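To make the schedule concrete, this is roughly what I picture in cron (the script names and paths below are made up, I don't actually have these yet):

 # every two hours: local snapshot
 0 */2 * * * /usr/local/bin/backup_local.sh
 # every six hours: push archives to the remote FTP server
 0 */6 * * * /usr/local/bin/backup_ftp.sh
 # once a day: delete local archives older than seven days
 30 3 * * * find /var/backups/web -name '*.tar.gz' -mtime +7 -delete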

Now, to achieve some of this functionality I use cron + a shell script, plus http://www.mysqldumper.net/, but that really doesn't cover my needs. MySQLDumper doesn't pick up new databases automatically, and the shell script does not notify me of problems. It's something I have to check on from time to time, and I don't trust it.

I googled for a while, and it seems like most people solve this with shell scripts. Is this a method you can trust? Are there any web-GUI tools I'm missing? Maybe there is a smarter strategy for doing this? I'm a little bit confused.

Best Answer

Actually I'm using rsnapshot for backups. It can't back up a MySQL database by itself, but it lets you execute scripts before and after a backup. So before the backup, rsnapshot executes:

 /usr/bin/ssh remote_host 'mysql -N -e "SHOW DATABASES;" | while read db; do mysqldump --skip-comments $db | gzip > ~/db/${db}.sql.gz; done'

All MySQL credentials are stored in ~/.my.cnf, and after the backup is done rsnapshot executes a script to remove the dumps on the remote host.
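For reference, the relevant part of the rsnapshot config looks roughly like this; treat it as a sketch, since the paths, retain counts, and the two wrapper script names are placeholders (the wrappers just run the one-liner above and the cleanup, respectively). Note that rsnapshot requires tabs, not spaces, between fields:

 # /etc/rsnapshot.conf (fields must be separated by TABs)
 snapshot_root	/var/cache/rsnapshot/
 retain	hourly	12
 retain	daily	7
 # dump the databases on the remote host before the transfer
 cmd_preexec	/usr/local/bin/mysql_dump_remote.sh
 # clean the dumps up on the remote host afterwards
 cmd_postexec	/usr/local/bin/mysql_dump_cleanup.sh
 # what to pull: the web apps and the dump directory
 backup	remote_host:/var/www/	remote_host/
 backup	remote_host:db/	remote_host/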

Also, rsnapshot uses hard links when it makes a backup, so it saves space while still giving you a full backup for every point in time it ran.
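You can check this yourself: files that didn't change between snapshots share the same inode, so only changed files take extra space. Something like the following (snapshot and file names depend on your own retain settings and layout):

 # unchanged files in consecutive snapshots point at the same inode
 ls -li /var/cache/rsnapshot/hourly.0/remote_host/var/www/index.html \
        /var/cache/rsnapshot/hourly.1/remote_host/var/www/index.html
 # actual disk usage of the snapshots is much smaller than their apparent size
 du -sh /var/cache/rsnapshot/hourly.*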

Rsnapshot uses rsync over SSH to transfer data, so it's also more secure than FTP.
