Slow loading of large in-memory tables
Philipp K. Janert, Ph.D.
janert at ieee.org
Tue Sep 7 07:14:50 CEST 2004
I am trying to load a relatively large data set (about 1 million
rows) into an SQLite table, which is kept in memory. The
load process is very slow - on the order of 15 minutes or more.
I am accessing SQLite from Python, using the pysqlite driver.
I am loading all records first using cx.execute( "insert ..." ).
Only once I have run cx.execute() for all records do I commit
all the preceding inserts with conn.commit().
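For what it is worth, the loading code looks roughly like this. This is
a simplified sketch, not my actual code: the table and column names are
made up, the input file name is a stand-in, and the import and
placeholder syntax may differ depending on the pysqlite version.

    import sqlite                      # pysqlite module name (assumed)

    conn = sqlite.connect(":memory:")  # in-memory database
    cx = conn.cursor()
    cx.execute("create table data (a, b, c)")

    for line in open("data.txt"):      # "data.txt" stands in for my input file
        fields = tuple(line.split())
        cx.execute("insert into data (a, b, c) values (%s, %s, %s)", fields)

    conn.commit()                      # single commit, only after all inserts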
I have tried using cx.executemany(), but if anything, this
makes the process slower.
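The executemany() attempt, reusing conn and cx from the sketch above,
was essentially this (again simplified, not verbatim):

    rows = [tuple(line.split()) for line in open("data.txt")]
    cx.executemany("insert into data (a, b, c) values (%s, %s, %s)", rows)
    conn.commit()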
I have not tried mucking around with transactions manually.
I have sufficiently convinced myself that the slow part
is in fact the cx.execute() calls - not reading the data from
the file or anything else.
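The check looked something like the following; the timing harness here
is a reconstruction for illustration, not my exact code.

    import time

    t0 = time.time()
    lines = open("data.txt").readlines()   # reading the file is quick
    t1 = time.time()
    for line in lines:
        cx.execute("insert into data (a, b, c) values (%s, %s, %s)",
                   tuple(line.split()))
    t2 = time.time()
    print "read: %.1f s   insert: %.1f s" % (t1 - t0, t2 - t1)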
My system specs and versions:
1 GB of memory (I am not swapping; that is not the problem).
Are there ways to make this process faster?
Also, I want to keep the DB in memory, since I later run a
very DB-intensive simulation against it. However, this implies
that I need to load the DB from the same Python program which
will later run the simulation - I think.
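My understanding is that a ":memory:" database lives and dies with its
connection, which is why the loading and the simulation have to share
one connection in one process. In outline (load_table and
run_simulation are just hypothetical placeholders for my loading loop
and my simulation code):

    conn = sqlite.connect(":memory:")  # the DB exists only as long as conn does
    load_table(conn)                   # hypothetical: the insert loop from above
    run_simulation(conn)               # hypothetical: the DB-intensive simulation
    conn.close()                       # after this, the in-memory DB is gone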
Any hints appreciated!
(Please cc me when replying to the list in regard to this message.)