Linux – Backup solution to back up terabytes and lots of static files on a Linux server

Tags: archive, backup, cdp, linux

Which backup tool or solution would you use to back up terabytes and lots of files on a production Linux server?
Note that the files are all different and almost never modified, and usage is mostly adding new files, so the data volume is currently 3 TB and growing steadily at around 15 GB/day.

Please do not reply "rsync". Basic Unix tools are not enough: rsync does not keep history, and rdiff-backup fails miserably from time to time and corrupts the history. Moreover, these are all file-based backups, which cause a lot of IO wait just to traverse directories and call stat(). But I guess, apart from R1Soft CDP, there is no way around that.

We tried R1Soft CDP backup, which is a block-level backup, and it has proved good and efficient for all our other servers, but it systematically fails on the server with 3 terabytes and gazillions of files. The R1Soft engineers and the datacenter have been tossing the problem back and forth for more than two months now… and we still have no backup except regular rsync.

We never tried the big commercial solutions, except R1Soft CDP, since it was offered as an optional service by the datacenter hosting our servers.

Best Answer

I think the only solution for you is block-level backups.
You could write scripts that use LVM snapshots (or even lower-level dm-snapshots) and transfer them to a storage server, as in the sketch below.
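As a rough illustration of that idea, here is a minimal Python sketch, not a finished tool. The volume group vg0, logical volume data, snapshot size, and the host backuphost are all placeholder assumptions; the transfer is a plain full-image stream over SSH, with retention left to the storage side.

```python
#!/usr/bin/env python3
"""Sketch: create an LVM snapshot and stream it to a storage server.

Assumptions (adjust to your setup): the data lives on /dev/vg0/data,
vg0 has enough free space for the snapshot's copy-on-write area,
and 'backuphost' is reachable over SSH.
"""
import datetime
import subprocess

VG = "vg0"                    # assumed volume group
LV = "data"                   # assumed logical volume holding the files
SNAP = "data_snap"            # temporary snapshot name
REMOTE = "backup@backuphost"  # assumed storage server

stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")

# 1. Create a point-in-time snapshot with 10G of copy-on-write space.
subprocess.run(
    ["lvcreate", "--snapshot", "--size", "10G",
     "--name", SNAP, f"/dev/{VG}/{LV}"],
    check=True,
)

try:
    # 2. Stream the snapshot block device to the storage server,
    #    compressing remotely. Each run produces a full image.
    dd = subprocess.Popen(
        ["dd", f"if=/dev/{VG}/{SNAP}", "bs=4M"], stdout=subprocess.PIPE
    )
    subprocess.run(
        ["ssh", REMOTE, f"gzip > /backups/{LV}-{stamp}.img.gz"],
        stdin=dd.stdout, check=True,
    )
    dd.stdout.close()
    dd.wait()
finally:
    # 3. Always drop the snapshot so its COW space does not fill up.
    subprocess.run(["lvremove", "-f", f"/dev/{VG}/{SNAP}"], check=True)
```

A full image per run obviously does not scale to 3 TB nightly; in practice you would send only changed blocks (which is exactly what ddsnap or ZFS incremental sends, mentioned next, are for).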

You might also take a look at the Zumastor project and its ddsnap utility.

PS. Solaris/FreeBSD servers have ZFS, which can automate this process using incremental snapshots plus ZFS send/receive.
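For reference, a minimal sketch of what that ZFS workflow looks like, driven from Python. The dataset tank/data, the remote dataset backup/data, and the host backuphost are assumptions; only the blocks changed since the last snapshot (roughly your 15 GB/day) cross the wire after the first full send.

```python
#!/usr/bin/env python3
"""Sketch: daily ZFS snapshot replicated incrementally to a remote pool.

Assumed names: local dataset 'tank/data', remote dataset 'backup/data',
storage server 'backuphost'.
"""
import datetime
import subprocess

DATASET = "tank/data"
REMOTE = "backup@backuphost"
REMOTE_DATASET = "backup/data"

new_snap = f"{DATASET}@{datetime.date.today().isoformat()}"

# Take today's snapshot.
subprocess.run(["zfs", "snapshot", new_snap], check=True)

# List existing snapshots, oldest first, to find the previous one.
snaps = subprocess.run(
    ["zfs", "list", "-t", "snapshot", "-H", "-o", "name",
     "-s", "creation", "-r", DATASET],
    capture_output=True, text=True, check=True,
).stdout.split()
prev = [s for s in snaps if s != new_snap]

if prev:
    # Incremental send from the most recent previous snapshot.
    send_cmd = ["zfs", "send", "-i", prev[-1], new_snap]
else:
    # First run: full stream.
    send_cmd = ["zfs", "send", new_snap]

send = subprocess.Popen(send_cmd, stdout=subprocess.PIPE)
subprocess.run(
    ["ssh", REMOTE, f"zfs receive -F {REMOTE_DATASET}"],
    stdin=send.stdout, check=True,
)
send.stdout.close()
send.wait()
```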