Python+MySQL - Bulk Insert

I'm working with the MySQLdb module in Python to interact with a database. I have a situation where there is a very large list (tens of thousands of elements) which I need to insert as rows into a table.

My solution right now is to generate a large INSERT statement as a string and execute it.
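For illustration, the current approach looks roughly like this (the connection details, table name, and columns are placeholders, not from my real code):

    import MySQLdb

    # Hypothetical connection details, table, and columns.
    conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="test")
    cur = conn.cursor()

    rows = [(1, "alice"), (2, "bob")]  # in reality, tens of thousands of tuples

    # Build one huge INSERT ... VALUES (...), (...), ... statement as a string.
    # (Real code would also have to escape the values properly.)
    values = ", ".join("(%d, '%s')" % (rid, name) for rid, name in rows)
    cur.execute("INSERT INTO mytable (id, name) VALUES " + values)
    conn.commit()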

Is there a smarter way?



There is a smarter way.

The problem with bulk insertion is that autocommit is enabled by default, so each INSERT statement is committed to stable storage before the next insert can begin.

As the manual page notes:

By default, MySQL runs with autocommit mode enabled. This means that as soon as you execute a statement that updates (modifies) a table, MySQL stores the update on disk to make it permanent. To disable autocommit mode, use the following statement:

SET autocommit=0;

After disabling autocommit mode by setting the autocommit variable to zero, changes to transaction-safe tables (such as those for InnoDB, BDB, or NDBCLUSTER) are not made permanent immediately. You must use COMMIT to store your changes to disk or ROLLBACK to ignore the changes.
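In MySQLdb terms, that means making sure autocommit is off, running all the inserts, and committing once at the end. A minimal sketch, again with placeholder connection details and a hypothetical mytable(id, name):

    import MySQLdb

    # Hypothetical connection details, table, and columns.
    conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="test")
    cur = conn.cursor()

    rows = [(1, "alice"), (2, "bob")]  # tens of thousands of tuples in practice

    # Make sure each INSERT is not committed (and flushed to disk) on its own.
    cur.execute("SET autocommit=0")

    for row in rows:
        cur.execute("INSERT INTO mytable (id, name) VALUES (%s, %s)", row)

    # A single COMMIT makes the whole batch permanent at once.
    conn.commit()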

This is a pretty common feature of RDBMSs, which presume that database integrity is paramount. It does make bulk inserts take on the order of 1 s per insert instead of 1 ms. The alternative of building one overlarge INSERT statement is an attempt to achieve that single commit, at the risk of overloading the SQL parser.
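A less risky way to get the batch through in one go, assuming plain MySQLdb here, is to hand the whole list to executemany() and still commit only once; MySQLdb typically rewrites a simple INSERT ... VALUES into a multi-row statement for you, so you keep the single commit without hand-building a giant string:

    import MySQLdb

    # Hypothetical connection details, table, and columns.
    conn = MySQLdb.connect(host="localhost", user="user", passwd="secret", db="test")
    cur = conn.cursor()

    rows = [(1, "alice"), (2, "bob")]  # tens of thousands of tuples in practice

    # executemany() pushes the rows through one parameterized statement;
    # the single commit below writes everything to disk in one go.
    cur.executemany("INSERT INTO mytable (id, name) VALUES (%s, %s)", rows)
    conn.commit()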
