MySQL – Backup Large Database and Avoid Website Slowness and Downtime

backup · maintenance · MySQL

My goal is to be able to back up a large database on a fairly popular Magento site without causing performance problems for the live site. Data integrity is also important to me. Hoping somebody has some pro tips for me.

Details of my current situation

  • Current database size is approx. 5 GB once the import/export tables, log tables, etc. are cleared
  • We are set up with a dedicated box for our MySQL instance, but it is a single instance, not a master/slave setup.
  • We typically take our backup with a command like this: mysqldump -u $USERNAME -h $HOSTNAME -p $DB > $FILENAME. Sometimes we also add the --opt --skip-lock-tables options (spelled out in the sketch just after this list).
  • When the database was smaller two years ago this wasn't a problem, since the dump finished quickly. Now that the database is large and the dump takes quite some time to finish, users experience slowness and downtime during the backup window.
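
Spelled out, the backup step is roughly this (a minimal sketch; the usernames, hostnames, and paths shown are placeholders, not our real values):

    #!/bin/bash
    # Current approach: one full dump of the live database.
    # --opt is enabled by default in modern mysqldump; --skip-lock-tables
    # skips LOCK TABLES but gives no consistency guarantee on its own.
    USERNAME=backup_user          # placeholder
    HOSTNAME=db.example.com       # placeholder
    DB=magento_live               # placeholder
    FILENAME=/backups/$(date +%F)_$DB.sql

    mysqldump -u "$USERNAME" -h "$HOSTNAME" -p --opt --skip-lock-tables "$DB" > "$FILENAME"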

What can I do to make my database backups better, faster, and less impactful on website users, while still maintaining a good, usable backup?

Best Answer

I think this is more of a question for http://dba.stackexchange.com, and it has already been answered there quite often.

Take a look:

For a quick and easy solution that lets you stick with mysqldump, you could also try splitting your dump into per-table dumps to reduce the size of each run. Tables that don't change much or don't contain critical data don't have to be backed up as frequently: https://dba.stackexchange.com/questions/9306/how-do-you-mysqldump-specific-tables
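
As a rough sketch of that per-table approach (assumptions: credentials live in ~/.my.cnf, your tables are InnoDB so --single-transaction can take a consistent snapshot without locking, and the table names below are only examples, not details from your setup):

    #!/bin/bash
    # Sketch: split the backup into per-table dumps so each run stays small.
    # Credentials are assumed to come from ~/.my.cnf; adjust table names
    # to match your own schema.
    DB=magento_live                                      # placeholder
    HOT_TABLES="sales_flat_order sales_flat_quote log_visitor"

    # Frequently changing tables: dump often (e.g. nightly)
    mysqldump --single-transaction "$DB" $HOT_TABLES > hot_tables.sql

    # Everything else: dump on a quieter schedule (e.g. weekly)
    mysqldump --single-transaction \
        $(for t in $HOT_TABLES; do echo "--ignore-table=$DB.$t"; done) \
        "$DB" > rest_of_db.sql

The idea is that the frequently-changing dump stays small enough to finish quickly during the backup window, while the bulk of the data is dumped on a schedule where slowness matters less.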