Mysql – fast method for exporting a large table from MySQL to PostgreSQL

MySQL, PostgreSQL

I've got a table with around 2.7 million records, 63 fields, and several indexes. I need to be able to get this table from mysql to postgresql quickly (say within an hour). I know mysql can export the data quickly, but postgresql seems to be very slow when importing the data. Any ideas? I've been importing a CSV produced from mysqldump for about 3 hours now.
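For reference, a mysqldump invocation along these lines can produce such a CSV (the database name, table name, and output directory below are placeholders):

    # Hypothetical example: dump table foo from database mydb as CSV into /tmp
    # (mysqldump writes /tmp/foo.txt for the data and /tmp/foo.sql for the schema)
    mysqldump --tab=/tmp --fields-terminated-by=',' \
              --fields-enclosed-by='"' mydb foo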

Best Answer

The fastest way I can think of would be to drop all the indexes and triggers on the table in PostgreSQL (or at least run ALTER TABLE foo DISABLE TRIGGER ALL;), add the \. end-of-input line to the end of your CSV file, and put the file on the server itself (preferably on a drive other than the one holding the database, or somewhere on the network the server can reach via NFS). Then run COPY foo FROM '/some/file.csv' WITH CSV; (this is the 8.x syntax, which is still supported in 9.0; depending on the CSV file format you may need to adjust the options). Afterwards, recreate the indexes and re-enable the triggers.
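As a rough sketch (using the foo table and /some/file.csv path from above; the index name and column are placeholders), the whole sequence on the PostgreSQL side might look like:

    -- Drop indexes (repeat for each one) and disable triggers before the load
    DROP INDEX IF EXISTS foo_some_idx;
    ALTER TABLE foo DISABLE TRIGGER ALL;

    -- Bulk-load the CSV that is already sitting on the database server
    COPY foo FROM '/some/file.csv' WITH CSV;

    -- Recreate the indexes and re-enable the triggers afterwards
    CREATE INDEX foo_some_idx ON foo (some_column);
    ALTER TABLE foo ENABLE TRIGGER ALL;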

Some of the PostgreSQL configuration options can also affect load times. Disabling autovacuum and increasing the interval between checkpoints can help with huge loads, but you'll need to restart PostgreSQL to change the checkpoint_segments or autovacuum settings, and remember to turn autovacuum back on when you're done.
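For example, something along these lines in postgresql.conf (the values are illustrative, not tuned recommendations for your hardware):

    # postgresql.conf - illustrative settings for a one-off bulk load
    autovacuum = off            # remember to turn this back on afterwards
    checkpoint_segments = 32    # spread checkpoints out (the 8.x/9.0 default is 3)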