Linux – fastest way to back up large data files on a weekly basis

backup, copy, linux, rsync

We have an automated script that backs up 200 GB of data files to a local disk.
The script shuts down the database, tars and compresses the entire directory on the local disk, and then restarts the database:

tar -czvf data.tgz /some/folder
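
For reference, the whole procedure is roughly the following sketch (the service name mydatabase is a placeholder for whatever your database actually uses):

systemctl stop mydatabase     # stop the database so the files are consistent
tar -czvf data.tgz /some/folder
systemctl start mydatabase    # downtime only ends once tar finishes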

This process takes two hours, which is too long a downtime. We want to reduce it.

Consider the following:
– The main goal is to have an identical copy of the files, in the shortest possible time, while the database is down.
– Later on, we can compress, transfer, or perform any other operation on the files.

I was thinking of using rsync to sync the files to the backup target every week; rsync will transfer only the changes, which should take less time.
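
A minimal sketch of that idea, assuming the backup copy lives under /backup/folder (a hypothetical path):

rsync -a --delete /some/folder/ /backup/folder/   # copy only new and changed files

The -a flag preserves permissions, ownership, and timestamps, and --delete removes files from the target that no longer exist in the source, so the result stays an identical copy.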

Will that work, or is there a better approach?

Best Answer

Filesystem snapshots are the right way to go about doing something like this.
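
For example, if the data lives on an LVM logical volume (the volume group vg0, volume name data, mount point /mnt/snap, and service name mydatabase below are all hypothetical), a snapshot-based backup could look like this; the database is down only for the moment it takes to create the snapshot, and the copy then runs against the frozen snapshot:

systemctl stop mydatabase                      # quiesce the database
lvcreate --snapshot --size 20G --name data_snap /dev/vg0/data
systemctl start mydatabase                     # database is back up almost immediately

mount -o ro /dev/vg0/data_snap /mnt/snap       # mount the frozen snapshot read-only
tar -czvf data.tgz -C /mnt/snap .              # compress at leisure, or rsync from /mnt/snap
umount /mnt/snap
lvremove -f /dev/vg0/data_snap                 # discard the snapshot when done

Because LVM snapshots are copy-on-write, the --size value only needs to hold the changes written while the snapshot exists, not the full 200 GB.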