We have to insert 2 million records across multiple tables. Right now we write them to a CSV file and use db2 import to load them into the database.
We want to change this logic to some kind of JDBC-based approach. While looking into the options, I am torn between the Spring JdbcTemplate and plain JDBC.
Say I want to insert 1 million records into 10 tables, 100 thousand per table, all via simple JDBC statements (not prepared statements, since I don't know which table I am dealing with until runtime).
Whatever system we choose will need to handle inserting up to 15 million records for a peak request.
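For reference, the kind of plain-JDBC insert loop we have in mind would look roughly like the sketch below. One assumption worth flagging: even when the table name is only known at runtime, the values can still go through a PreparedStatement; only the identifier has to be interpolated into the SQL string. Table and column names here are placeholders.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.Collections;
import java.util.List;

public class BatchInsertSketch {

    // Build a parameterized INSERT for a table whose name is only known at
    // runtime: the identifier is interpolated, the values stay as ? markers.
    static String buildInsertSql(String table, String[] columns) {
        String cols = String.join(", ", columns);
        String marks = String.join(", ", Collections.nCopies(columns.length, "?"));
        return "INSERT INTO " + table + " (" + cols + ") VALUES (" + marks + ")";
    }

    // Insert rows in batches of 1,000 so the driver can send them in bulk
    // instead of one network round trip per row.
    static void insertBatch(Connection conn, String table, String[] columns,
                            List<String[]> rows) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(buildInsertSql(table, columns))) {
            int count = 0;
            for (String[] row : rows) {
                for (int i = 0; i < row.length; i++) {
                    ps.setString(i + 1, row[i]);
                }
                ps.addBatch();
                if (++count % 1000 == 0) {
                    ps.executeBatch();  // flush every 1,000 rows
                }
            }
            ps.executeBatch();          // flush the remainder
        }
    }
}
```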
Which framework will be better?
Best Answer
If you want to move a lot of data, then JDBC (or any library built on top of JDBC) may be a bad choice compared to a bulk copy tool like db2 import. JDBC is going to be orders of magnitude slower, because
JDBC is a very chatty protocol, and
bulk copy tools usually relax constraints during the copy process.
The difference in time can be extreme: what takes the bulk copy tool 10 minutes can take hours over JDBC. Build a prototype and do some timings so you are certain about the performance you'll get before you commit to something like this.
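If the goal is to drive the existing bulk path from Java rather than replace it, one option is to shell out to the db2 CLP and time it, then run the same data through the JDBC prototype for comparison. A minimal sketch, where the CSV path and table name are placeholders:

```java
import java.util.List;

public class BulkLoadSketch {

    // Build the db2 CLP invocation for a delimited-file import.
    // "IMPORT FROM ... OF DEL INSERT INTO ..." is standard DB2 CLP syntax;
    // the file and table names here are hypothetical.
    static List<String> importCommand(String csvPath, String table) {
        return List.of("db2",
                "IMPORT FROM " + csvPath + " OF DEL INSERT INTO " + table);
    }

    // Run the command and measure elapsed time, so the bulk path can be
    // compared against a JDBC prototype loading the same data.
    static long runAndTimeMillis(List<String> command) throws Exception {
        long start = System.nanoTime();
        Process p = new ProcessBuilder(command).inheritIO().start();
        if (p.waitFor() != 0) {
            throw new IllegalStateException("import failed");
        }
        return (System.nanoTime() - start) / 1_000_000;
    }
}
```

Running both paths against the same input on representative hardware should settle the question with numbers rather than guesses.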