How to use SQLAlchemy to dump an SQL file from query expressions to bulk-insert into a DBMS?

Posted by Mahmoud Abdelkader on Stack Overflow
Published on 2010-05-21T08:19:27Z

Please bear with me while I explain the problem and how I tried to solve it; my question about how to improve it is at the end.

I have a 100,000-line CSV file from an offline batch job, and I needed to insert its contents into the database as the proper models. Ordinarily, a fairly straightforward load like this can be done trivially by munging the CSV file to fit the schema, but I had to do some external processing that requires querying, and it was just much more convenient to use SQLAlchemy to generate the data I want.

The data I want here is 3 models that represent 3 pre-existing tables in the database, and each subsequent model depends on the previous one. For example:

Model C --> Foreign Key --> Model B --> Foreign Key --> Model A
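In declarative form, that dependency chain might look like the sketch below (ModelA/ModelB/ModelC, the table names, and the columns are hypothetical stand-ins, assuming SQLAlchemy 1.4+ where declarative_base lives in sqlalchemy.orm):

```python
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class ModelA(Base):
    __tablename__ = "model_a"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

class ModelB(Base):
    __tablename__ = "model_b"
    id = Column(Integer, primary_key=True)
    a_id = Column(Integer, ForeignKey("model_a.id"))

class ModelC(Base):
    __tablename__ = "model_c"
    id = Column(Integer, primary_key=True)
    b_id = Column(Integer, ForeignKey("model_b.id"))

# create_all() sorts tables by foreign-key dependency,
# so model_a is created before model_b, which precedes model_c
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
```

SQLAlchemy's metadata already knows this ordering (`Base.metadata.sorted_tables`), which matters because rows must be persisted in the same A, B, C order.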

So, the models must be inserted in the order A, B, and C. I came up with a producer/consumer approach:

 - instantiate a multiprocessing.Process containing a thread pool of 50 persister threads, each with a thread-local connection to the database

 - read a line from the file using the csv DictReader

 - enqueue the dictionary to the process, where each thread creates the appropriate models by querying for the right values and persists the models in the appropriate order
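The pipeline above can be sketched roughly as follows (the CSV columns, worker count, and the persist stub are placeholders; the real version would build and flush the ORM models inside persist()):

```python
import csv
import io
import queue
import threading

NUM_WORKERS = 4  # the original used 50 persister threads

def persist(row, out, lock):
    # Stand-in for: query related values, build Model A/B/C,
    # and persist them in dependency order.
    with lock:
        out.append(row)

def worker(q, out, lock):
    while True:
        row = q.get()
        if row is None:  # sentinel: no more work
            break
        persist(row, out, lock)

# A two-row stand-in for the 100,000-line CSV file
csv_data = io.StringIO("id,name\n1,alpha\n2,beta\n")

q = queue.Queue(maxsize=1000)
out, lock = [], threading.Lock()

threads = [threading.Thread(target=worker, args=(q, out, lock))
           for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

for row in csv.DictReader(csv_data):  # producer: one dict per CSV line
    q.put(row)
for _ in threads:                     # one sentinel per worker thread
    q.put(None)
for t in threads:
    t.join()
```

The bounded queue keeps the reader from racing far ahead of the persister threads; the sentinels let each worker shut down cleanly.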

This was faster than a non-threaded read/persist, but it was still far slower than bulk-loading a file into the database. The job finished persisting after about 45 minutes. For fun, I wrote the equivalent SQL statements by hand, and loading those took 5 minutes.

Writing the SQL statements took me a couple of hours, though. So my question is: could I have used a faster method to insert the rows with SQLAlchemy? As I understand it, SQLAlchemy is not designed for bulk-insert operations, so this is less than ideal.
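One middle ground worth noting: dropping from the ORM to SQLAlchemy Core for the insert step. A single execute() with a list of parameter dictionaries lets the DBAPI use its executemany path, which is usually much faster than flushing ORM objects one at a time. A minimal sketch, with a made-up table and an in-memory SQLite database standing in for the real DBMS:

```python
from sqlalchemy import Column, Integer, MetaData, String, Table, create_engine

engine = create_engine("sqlite://")
metadata = MetaData()
model_a = Table(
    "model_a", metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)
metadata.create_all(engine)

rows = [{"name": "alpha"}, {"name": "beta"}, {"name": "gamma"}]

# Passing a list of dicts triggers the DBAPI's executemany path,
# issuing one batched INSERT instead of one round trip per row.
with engine.begin() as conn:
    conn.execute(model_a.insert(), rows)
    inserted = conn.execute(model_a.select()).fetchall()
```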

This leads to my question: is there a way to generate the SQL statements with SQLAlchemy, write them to a file, and then just bulk-load that file into the database? I know about str(model_object), but it does not show the interpolated values.

I would appreciate any guidance for how to do this faster.

Thanks!

© Stack Overflow or respective owner
