kent3737 at yahoo.com
Fri Dec 3 12:06:11 CET 2004
> Hello All,
> I wanted to thank Roger Binn for his email. He had
> the answer to my issue with writing speed. It
> actually made an incredible change in performance. I
> didn't have to go all the way to implementing
> synchronous mode (for my app). Previously, I was
> inserting one record at a time. The key was to write
> them all at once. I moved up to a 13 MB file and
> wrote it to the db in seconds. Now the issue is the 120
> MB of RAM consumed by PyParse to read in a 13 MB
> file. If anyone has thoughts on that, it would be
> great. Otherwise, I will repost under a more specific
If your data is (or can be) produced by an iterator, you can use this recipe to group the data into
batches of whatever size you choose and write each batch to the db.
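A minimal sketch of that batching approach, using the sqlite3 module (the `batched` helper, table name, and batch size are illustrative, not from the original post):

```python
import itertools
import sqlite3

def batched(iterable, size):
    # Yield lists of up to `size` items from any iterator.
    it = iter(iterable)
    while True:
        batch = list(itertools.islice(it, size))
        if not batch:
            break
        yield batch

conn = sqlite3.connect(":memory:")
conn.execute("create table tri (V1_x real)")  # illustrative schema

data = (float(i) for i in range(10000))  # stand-in for the parsed data
for batch in batched(data, 1000):
    # One executemany call per batch: far fewer round trips than
    # issuing a separate insert for every row.
    conn.executemany("insert into tri (V1_x) values (?)",
                     ((x,) for x in batch))
conn.commit()
```

Because the iterator is consumed lazily, only one batch is held in memory at a time, which may also help with the RAM problem you describe.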
> while i < TriNum:
>     db.execute("""insert into TABLE(V1_x)
>                   values(%f)""" % (data[i],))
>     i = i + 1
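Incidentally, rather than building the SQL with `%f` string formatting, the driver can bind the values itself via `?` placeholders. A small sketch (table and data names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table tri (V1_x real)")  # illustrative schema

data = [1.5, 2.5, 3.5]  # stand-in values
for x in data:
    # `?` placeholders let the driver bind the value; no manual
    # %-formatting or quoting of the SQL string is needed.
    conn.execute("insert into tri (V1_x) values (?)", (x,))
conn.commit()
```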