MySQL – Import a 260GB CSV file into MySQL

csv, files, large-data, MySQL

I have a really big CSV file (~260GB) that I want to import into MySQL. I use the following MySQL script on macOS:

DROP TABLE IF EXISTS tmp_catpath_5;

create table tmp_catpath_5(
a1 BIGINT(20),a2 BIGINT(20),a3 BIGINT(20),a4 BIGINT(20),a5 BIGINT(20),
c1 BIGINT(20),c2 BIGINT(20),c3 BIGINT(20),c4 BIGINT(20),c5 BIGINT(20),
INDEX (a5)
);

load data local infile '/Volumes/Time Machine Backups 1/tmp_catpath_5.csv'
into table tmp_catpath_5
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n' ;

It works for smaller files, e.g. 150GB, but at this size the database disappears immediately after the load command:

ERROR 1049 (42000): Unknown database

Is there a MySQL size limit, or is it something else? I use the MySQL bundled with the MAMP application (a LAMP-style stack for macOS); could it behave differently from the official MySQL distribution?

Best Answer

I don't know whether particular DB engines impose limits, but I've met Department of Defense admins who manage a MySQL system that imports 10TB every day, and they are not experiencing this kind of problem.

Rather than loading the entire file in one statement, I recommend writing a script that parses the input file and generates INSERT statements from it, which can be fed straight to MySQL. That way you are not dealing with the entire file in one hit, just one line at a time. I like to use Perl for such things.
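As a rough illustration of that approach (the answer suggests Perl; this is the same idea sketched in Python), a generator can turn each CSV row into an INSERT statement, and its output can be piped straight into the `mysql` client. The column and table names here are taken from the question; everything else is an assumption.

```python
# Sketch of the line-at-a-time approach: instead of loading the whole
# 260GB file with one LOAD DATA statement, read the CSV row by row and
# emit INSERT statements. Table/column names come from the question.
import csv

COLUMNS = ["a1", "a2", "a3", "a4", "a5", "c1", "c2", "c3", "c4", "c5"]

def row_to_insert(row, table="tmp_catpath_5"):
    """Turn one CSV row (a list of numeric strings) into an INSERT statement."""
    values = ", ".join(str(int(v)) for v in row)  # all columns are BIGINTs
    return f"INSERT INTO {table} ({', '.join(COLUMNS)}) VALUES ({values});"

def csv_to_inserts(path):
    """Yield one INSERT statement per CSV line; never holds the file in memory."""
    with open(path, newline="") as fh:
        for row in csv.reader(fh):
            yield row_to_insert(row)
```

You would then run something like `python gen_inserts.py | mysql -u root mydb` (script name and connection details are hypothetical), so the database only ever sees one small statement at a time.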

You may also want to consider using TRUNCATE instead of dropping the table and recreating it each time.
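With TRUNCATE the table definition and its index stay in place and only the rows are removed, so the CREATE TABLE from the question only needs to run once:

```sql
-- Keep the existing table and index; just empty it before each reload.
TRUNCATE TABLE tmp_catpath_5;
```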
