Python, MS SQL, and batch inserts
ericwoodworth at gmail.com
Tue Apr 21 16:01:27 EDT 2009
On Apr 21, 3:36 pm, Scott David Daniels <Scott.Dani... at Acm.Org> wrote:
> Philip Semanchuk wrote:
> > ... If you're doing a mass insert to populate a blank table it also often
> > helps to postpone index creation until after the table is populated....
>
> I forget the name of the SQL Server bulk loader, but for large loads, I
> used to populate a fresh table with the bulk data, then do UPDATEs and
> INSERTs to get the data spread out into the main tables. You (the OP)
> might try a scheme like that.
>
> --Scott David Daniels
> Scott.Dani... at Acm.Org
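For what it's worth, if I tried that staging-table idea I guess it would look roughly like this with pyodbc (the server, table, and column names below are placeholders, not my real schema):

import pyodbc

conn = pyodbc.connect('DRIVER={SQL Server};SERVER=myserver;'
                      'DATABASE=mydb;Trusted_Connection=yes')
cur = conn.cursor()

rows = [(1, 'alpha'), (2, 'beta')]    # stand-in data

# 1. Load everything into a bare staging table first (no indexes).
cur.execute("CREATE TABLE staging_rows (id INT, name VARCHAR(50))")
cur.executemany("INSERT INTO staging_rows (id, name) VALUES (?, ?)", rows)

# 2. Then spread the data out into the real table with one set-based
#    statement instead of row-by-row inserts.
cur.execute("""
    INSERT INTO main_table (id, name)
    SELECT s.id, s.name
    FROM staging_rows s
    WHERE s.id NOT IN (SELECT id FROM main_table)
""")
conn.commit()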
Hmm... I think I'm going to move my question over to a SQL forum,
because this is starting to feel like a SQL issue rather than a Python
issue to me.
Three times now I've let the system "rest" while I go off and do other
things, and when I then run my script it completes in 10 seconds. If I
drop the tables and start fresh immediately after that, it takes 35
seconds. If I drop the tables, wait an hour, and then run the script,
it finishes in 10 seconds again.
That makes me think it's a SQL Server configuration or optimization
issue more than a Python issue.
Oh, and the times I listed above were totals from the start of
execution, so the string.join() was only taking 0.047 seconds to run.
It was taking 9 seconds to get my data from the COM object and format
it, but the join itself was quite fast.
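In case it helps anyone searching the archives later, the join in
question is just stitching the row values into one batch of INSERT
statements before sending it, roughly like this (table and column
names simplified, data is a stand-in):

import time

# rows were already pulled out of the COM object at this point
rows = [(1, 'alpha'), (2, 'beta')]    # stand-in data

start = time.time()
# One string join builds the whole batch instead of looping over
# thousands of separate execute() calls; single quotes are doubled
# to escape them for T-SQL.
batch = ";".join("INSERT INTO my_table (id, name) VALUES (%d, '%s')"
                 % (i, name.replace("'", "''")) for i, name in rows)
print "join took %.3f seconds" % (time.time() - start)

cur.execute(batch)    # cur/conn from a pyodbc connection as above
conn.commit()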