[Tutor] Updating MySQL Database
wormwood_3 at yahoo.com
Mon Oct 8 20:24:04 CEST 2007
Thanks to everyone who responded for the information and tips.
* That line I had:
> for line in inputlist:
> updatequery = "update resultstable set fqdn = line.split(",") where ip = line.split(",");"
was totally bogus. I was typing and thinking, but not at the same rate :-) The real thing would be something more like:
for line in inputlist:
    fqdn, ip = line.strip().split(",")
    updatequery = "update resultstable set fqdn = '%s' where ip = '%s';" % (fqdn, ip)
* I will try a first version with a single connection and close, looping through executes in the middle. Should have thought of that first...
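For reference, that single-connect/loop/close pattern might look roughly like the sketch below. It uses sqlite3 as a stand-in driver so the example is self-contained; a MySQLdb version has the same shape but connects with MySQLdb.connect(...) and uses %s placeholders instead of ?. The table and column names are the ones from this thread; everything else is assumed.

```python
import sqlite3  # stand-in for MySQLdb, purely for illustration

def update_fqdns(conn, inputlist):
    """Open one cursor, loop the updates, commit once at the end."""
    cur = conn.cursor()
    for line in inputlist:
        fqdn, ip = line.strip().split(",")
        # Parameter binding lets the driver quote the values safely,
        # instead of interpolating them into the SQL string.
        cur.execute("UPDATE resultstable SET fqdn = ? WHERE ip = ?",
                    (fqdn, ip))
    conn.commit()
    cur.close()
```

Binding parameters rather than building the statement with % also sidesteps quoting bugs when an fqdn happens to contain an apostrophe.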
----- Original Message ----
From: Eric Walstad <eric at ericwalstad.com>
To: wormwood_3 <wormwood_3 at yahoo.com>
Cc: Python Tutorlist <tutor at python.org>
Sent: Sunday, October 7, 2007 11:07:12 PM
Subject: Re: [Tutor] Updating MySQL Database
> Hello all,
> I have a script which takes data from a file or MySQL DB, looks up some stuff, then can print results to console or file. I would also like it to be able to update a MySQL database with the results. Does anyone have any ideas on how to do this? I can update records just fine, but what is the best way to do LOTS of updates aside from running an update statement per record?
I recommend not worrying about speed on your first iteration of your
script. I'd first define 'too slow', then write the script the way that
feels intuitive to you. If your script passes your threshold of 'too
slow', then look at optimizing it.
If optimization is really necessary, I'd look into .executemany(). If
that is still too slow for you then I'd consider writing the update SQL
to a file and then calling mysql, passing it the sql file your script
created. I found this last approach the fastest for a data import
script I once wrote for importing millions of records to a PostgreSQL database.
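The .executemany() variant Eric mentions hands the whole batch of parameter tuples to the driver in one call instead of looping over execute(). A minimal sketch, again with sqlite3 standing in for MySQLdb (which would use %s placeholders):

```python
import sqlite3  # stand-in for MySQLdb, purely for illustration

def update_fqdns_batch(conn, inputlist):
    # Build the (fqdn, ip) parameter tuples up front, then let the
    # driver run the same UPDATE once per tuple in a single batch call.
    params = [tuple(line.strip().split(",")) for line in inputlist]
    conn.executemany("UPDATE resultstable SET fqdn = ? WHERE ip = ?",
                     params)
    conn.commit()
```

Whether this is actually faster than a plain loop depends on the driver, so it's worth measuring against the simple version before optimizing further.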