We have an online system using a MySQL database. We need to merge an older backed-up database (full table files, not a dump file) into the current one without any server downtime. Note that the system is online 24/7. How would you do it, and what issues might arise?
Thank you.
Best Answer
When messing with MySQL's filesystem you have to stop the MySQL server. To avoid downtime on your live machine, use a backup/virtualised machine running the SAME VERSION of MySQL server. While the BACKUP MySQL server is stopped, copy the table files (the .frm, .MYD and .MYI files for MyISAM tables) into the filesystem at /var/lib/mysql/BACKUP_DB (the BACKUP_DB database's corresponding directory).
Start the BACKUP MySQL server and ensure the data has loaded correctly using scripts or CLI. When verified, mysqldump the BACKUP_DB database so it can be loaded into the live server:
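The dump command did not survive in the answer as posted; a sketch of it, assuming root credentials and the BACKUP_DB name used above:

```shell
# On the BACKUP machine: export the verified database to a dump file.
mysqldump -u root -p BACKUP_DB > BACKUP_DB.sql
```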
You have now converted your raw backup data into SQL statements, which can be loaded into MySQL without downtime (unlike the raw files). Move BACKUP_DB.sql to the live machine and import it into your LIVE MySQL instance as a different database. You should now have the backup database loaded into MySQL as BACKUP_DB.
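The transfer and import commands are missing as posted; a sketch, assuming SSH access to the live machine and root MySQL credentials:

```shell
# Copy the dump to the live machine...
scp BACKUP_DB.sql user@live-server:/tmp/

# ...then, on the live machine, load it as a separate database.
mysql -u root -p -e "CREATE DATABASE BACKUP_DB"
mysql -u root -p BACKUP_DB < /tmp/BACKUP_DB.sql
```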
Now, depending on whether you want INSERT IGNORE or REPLACE INTO statements (are you overwriting old data, or "filling the blanks" around your existing unique keys?):
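The merge command itself is missing; a sketch of the "fill the blanks" variant, where existing live rows win any key conflicts. The --no-create-info flag skips the CREATE TABLE statements so the live table definitions are untouched (LIVE_DB is an assumed name for your live database):

```shell
# Existing live rows are kept; backup rows only fill the gaps.
mysqldump --no-create-info --insert-ignore -u root -p BACKUP_DB | mysql -u root -p LIVE_DB
```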
Or, for REPLACE INTO action:
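Again the command is missing; the overwrite variant, where backup rows replace conflicting live rows (same assumed names as above):

```shell
# Backup rows win: conflicting live rows are overwritten.
mysqldump --no-create-info --replace -u root -p BACKUP_DB | mysql -u root -p LIVE_DB
```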
Alternatively, instead of piping output back into MySQL, send to a file and review the SQL statements.
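For example (merge.sql is a hypothetical file name):

```shell
# Write the statements to a file instead of piping them straight into MySQL.
mysqldump --no-create-info --insert-ignore -u root -p BACKUP_DB > merge.sql
less merge.sql   # review every statement before loading it
```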
Finally, to avoid downtime, dump the first database over the second:
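The command here is also missing from the answer as posted; one plausible reading (an assumption on my part) is replaying the live database's current rows over the merged backup copy, so live data wins any remaining conflicts:

```shell
# Assumed interpretation: dump LIVE_DB's rows over BACKUP_DB with REPLACE,
# so current live data overwrites anything stale in the merged copy.
mysqldump --no-create-info --replace -u root -p LIVE_DB | mysql -u root -p BACKUP_DB
```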
You could always lock the tables first to prevent data being written mid-merge, for example to the z table, whose foreign key references the a table (which has already been written over):
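The lock statement itself is missing; a sketch using the hypothetical z and a tables. Note that MySQL table locks are per-session, so the lock must be held by the same session that performs the import (which is why the next step puts the lock/unlock inside the dump file itself):

```shell
# Lock, import, unlock - all in one client session so the locks actually hold.
mysql -u root -p LIVE_DB <<'SQL'
LOCK TABLES a WRITE, z WRITE;
SOURCE merge.sql;
UNLOCK TABLES;
SQL
```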
(perform dump as previous step)
Add the FLUSH command to the start of your dump .sql file and UNLOCK to the end.
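That edit can be scripted; a sketch using GNU sed and a stand-in dump file so the result is visible end-to-end (in practice merge.sql would be the reviewed merge dump from the earlier step):

```shell
# Stand-in for the real dump file, purely for demonstration.
printf 'INSERT IGNORE INTO a VALUES (1);\n' > merge.sql

# Prepend the global read lock and append the unlock (GNU sed syntax).
sed -i '1i FLUSH TABLES WITH READ LOCK;' merge.sql
echo 'UNLOCK TABLES;' >> merge.sql

cat merge.sql
```

Running the whole file in a single mysql session then holds the lock for the duration of the import and releases it at the end.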
Be sure to quadruple-check your DB names in this scenario! Ask a follow-up question about anything you're unsure of, as this is high-risk data mashing. If at all possible, perform (and note in detail) the exact steps on a dev server first, virtualise your test, or create a small-scale test. Just test. And, of course, take enough backups to cover every data-loss eventuality.