Tar gzip slowing down server

Tags: backup, gzip, tar

I have a backup script that:

  1. compresses some files,
  2. generates an MD5 checksum,
  3. copies the compressed file to another server,
  4. and on the other server compares the MD5 checksums (to detect copy errors).

Here is the core of the script:

```shell
# Compress; tar -v lists each file, which xargs then checksums with md5sum
nice -n 15 tar -czvf $BKP $PATH_BKP/*.* \
| xargs -I '{}' sh -c "test -f '{}' && md5sum '{}'" \
| tee $MD5

# Copy the archive (bandwidth-limited to 80000 Kbit/s) and the checksum file
scp -l 80000 $BKP $SCP_BKP
scp $MD5 $SCP_BKP
```
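Step 4 above (the destination server comparing checksums) can be done with md5sum -c, which reads the `hash  filename` lines that md5sum emits and re-verifies each file. A minimal sketch, using placeholder paths (the directory and file names below are assumptions, not the asker's real variables):

```shell
#!/bin/sh
# Sketch of step 4 on the destination server: verify the copied archive
# against the transferred checksum file. Paths are placeholder assumptions.
DEST=/tmp/md5_demo
mkdir -p "$DEST"
echo "archive payload" > "$DEST/backup.tar.gz"   # stands in for the scp'd archive

# The sending side would have produced this file, e.g.:
#   md5sum backup.tar.gz > backup.md5
( cd "$DEST" && md5sum backup.tar.gz > backup.md5 )

# md5sum -c recomputes each hash and compares; a non-zero exit status
# means the copy was corrupted in transit.
( cd "$DEST" && md5sum -c backup.md5 )
```

Running the check from inside the directory matters, because the checksum file records relative file names.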

This routine pushes the CPU to 90% during the gzip step, slowing down the production server. I tried adding nice -n 15, but the server still hangs.

I've already read [1], but the conversation didn't help me.

What is the best approach to solve this issue?
I am open to new architectures/solutions 🙂

Best Answer

If you use nice, you change the process's CPU scheduling priority, but this has a noticeable effect only when the CPU is close to 100% usage.

In your case the server becomes slow not because of CPU usage but because of the I/O load on the storage. Use ionice to lower the I/O priority, and keep nice for the CPU priority.
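A minimal sketch of combining the two, using placeholder paths (the BKP and SRC values are assumptions for the demo). ionice -c 2 -n 7 puts the job in the best-effort class at its lowest priority; -c 3 ("idle") would be stricter, letting the backup's I/O run only when the disk is otherwise unused:

```shell
#!/bin/sh
# Demo paths (assumptions, not the asker's real variables)
BKP=/tmp/backup_demo.tar.gz
SRC=/tmp/backup_demo_src

mkdir -p "$SRC"
echo "sample data" > "$SRC/file1.txt"

# ionice is part of util-linux; fall back gracefully if it is unavailable
IONICE=""
command -v ionice >/dev/null 2>&1 && IONICE="ionice -c 2 -n 7"

# Low CPU priority (nice) AND low I/O priority (ionice) for the whole pipeline
nice -n 15 $IONICE tar -czf "$BKP" -C "$SRC" .

ls -l "$BKP"
```

Note that ionice's effect depends on the kernel I/O scheduler in use; with some schedulers (e.g. none/noop) the priority classes are not honored.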
