sqlite3 db update extremely slow

coldpizza vriolk at gmail.com
Mon Jul 16 21:22:30 CEST 2007


I am trying to fill an sqlite3 database with records, and to this end
I have written a class that creates the database and its table, and
adds rows to that table.

The problem is that the updating process is *extremely* slow, and
occasionally I get the message "database locked".

I tried removing "self.con.commit()" in the add_record method, but
then nothing is saved in the db. I don't know whether this has
anything to do with it, but I have found no option to enable
autocommit.
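For what it's worth, the sqlite3 module does seem to expose autocommit
through the connection's isolation_level argument: passing
isolation_level=None makes every statement commit as soon as it
executes. A minimal sketch, using an in-memory database and
placeholder table/column names for illustration:

```python
import sqlite3

# Open the connection in autocommit mode: with isolation_level=None,
# every statement is committed immediately, so no explicit
# con.commit() call is needed.
con = sqlite3.connect(":memory:", isolation_level=None)
cur = con.cursor()
cur.execute("CREATE TABLE words (hword TEXT PRIMARY KEY, definition TEXT)")
cur.execute("INSERT INTO words VALUES (?, ?)", ("apple", "a fruit"))

# The row is already persisted without any commit() call.
print(cur.execute("SELECT COUNT(*) FROM words").fetchone()[0])  # 1
```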

This is the class that I am using:

import sqlite3

class sqliteDB(object):
    "Wrapper for SQLite methods"
    def __init__(self, db_file="sqlite3.db"):
        'Initialize SQLite database, sqlite_db_init("db_file_name.db")'
        print 'SQLite db init: ', db_file
        self.con = sqlite3.connect(db_file)
        self.cur = self.con.cursor()

    def create_table(self, table):
        "create table (table_name)"

        query = ('CREATE TABLE %s (hword VARCHAR(256) PRIMARY KEY, '
                 'definition TEXT)' % table)
        try:
            self.cur.execute(query)
            self.con.commit()
        except Exception, e:
            print e

    def add_record (self, table, headWord, definition):

        try:
            self.cur.execute('INSERT INTO ' + table +
                             ' (hword, definition) VALUES (?, ?)',
                             (headWord, definition))
            self.con.commit()
        except Exception, e:
            print e

And this is the actual code that I use to write to the db file:

db = sqliteDB()
db.create_table("table_name")

for k, v in myData:
    db.add_record("table_name", k, v)

This runs extremely slowly (~10 KB of data per second) and takes ages
to complete even with small files. Where did I go wrong?
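If the per-row self.con.commit() is the bottleneck, the usual advice
is to wrap the whole batch in a single transaction. A sketch of the
same loop rewritten with executemany() and one commit at the end,
using an in-memory database and made-up data purely for illustration:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE words (hword TEXT PRIMARY KEY, definition TEXT)")

# Made-up stand-in for myData: 1000 (headword, definition) pairs.
myData = [("word%d" % i, "definition %d" % i) for i in range(1000)]

# One transaction for the whole batch: executemany() runs all the
# INSERTs, then a single commit() ends the transaction, instead of
# paying for one commit per row.
cur.executemany("INSERT INTO words (hword, definition) VALUES (?, ?)", myData)
con.commit()

print(cur.execute("SELECT COUNT(*) FROM words").fetchone()[0])  # 1000
```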

Would it be faster (or even possible) to import a text file into
sqlite using something like MySQL's command
LOAD DATA INFILE "myfile.csv"...?
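As far as I know sqlite itself has no LOAD DATA INFILE (though I
believe the sqlite3 command-line shell has a .import command). A
rough pure-Python equivalent would be to feed csv.reader rows
straight into executemany(). A sketch, with inline CSV text standing
in for "myfile.csv" and an in-memory database:

```python
import csv
import io
import sqlite3

# Inline stand-in for the contents of "myfile.csv".
csv_text = "apple,a fruit\nbanana,another fruit\n"

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE words (hword TEXT PRIMARY KEY, definition TEXT)")

# csv.reader yields one (hword, definition) row per CSV line;
# executemany() inserts them all inside a single transaction.
rows = csv.reader(io.StringIO(csv_text))
cur.executemany("INSERT INTO words VALUES (?, ?)", rows)
con.commit()

print(cur.execute("SELECT COUNT(*) FROM words").fetchone()[0])  # 2
```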



