Linux – What’s the best way to back up a web server with 30 GB of data

Tags: backup, linux

I currently run a Linux server with around 10,000 daily users. My hosting provider offers a backup service, which I'm also using. Although I trust my host, I would like an offsite backup as well, just in case the host goes down for a longer period or goes bankrupt (you never know). My idea was to tar and split the data and copy the archive to my Amazon S3 account, but I'm wondering whether that's the best approach.
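For reference, a minimal sketch of that tar-and-split idea, assuming the data lives under a hypothetical /var/www and uploads go to a placeholder bucket via the AWS CLI:

```bash
# Compress the data, split it into 1 GB chunks, and upload each chunk.
# Paths and bucket name are placeholders.
tar -czf - /var/www | split -b 1G - /tmp/site-backup.tar.gz.part-

for part in /tmp/site-backup.tar.gz.part-*; do
    aws s3 cp "$part" "s3://my-backup-bucket/$(date +%F)/"
done

# To restore: download the parts, reassemble, and extract.
# cat site-backup.tar.gz.part-* | tar -xzf - -C /restore/target
```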

Best Answer

Best way to offsite-backup 30 GB of data?

I'd probably say rsync to your Amazon S3, but keep in mind that it's 30 GB, so bandwidth costs will be high (especially if you're billed at the 95th percentile) and the initial push will take a long time. Once your data is up there, just keep the files synced nightly, weekly, or at whatever interval you prefer.
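Note that rsync itself can't talk to S3 directly; `aws s3 sync` gives you similar incremental behaviour (only changed files are re-uploaded). A sketch, with bucket name and paths as placeholders:

```bash
# Incrementally mirror the data directory to S3; --delete removes
# objects whose source files no longer exist.
aws s3 sync /var/www s3://my-backup-bucket/www --delete

# Crontab entry to run it nightly at 03:00:
# 0 3 * * * /usr/bin/aws s3 sync /var/www s3://my-backup-bucket/www --delete
```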

However, that's ONLY the backup side; don't forget about recovery. I'd purchase a second server in a second datacenter with a similar build, and rsync the changes to it nightly.
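A minimal sketch of that nightly mirror over SSH, assuming a hypothetical standby host and paths:

```bash
# -a preserves permissions/ownership/timestamps, -z compresses in transit,
# --delete keeps the mirror an exact copy of the source.
rsync -az --delete -e ssh /var/www/ backup@standby.example.com:/var/www/

# Crontab entry to run it every night at 02:30:
# 30 2 * * * rsync -az --delete -e ssh /var/www/ backup@standby.example.com:/var/www/
```

With a warm standby like this, recovery is a DNS change rather than a multi-hour restore from S3, which matters far more for 10,000 daily users than the backup itself.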