Performance Problems when selecting HUGE amounts of data from MySQL dbs

Hans Nowak wurmy at earthlink.net
Wed Jan 9 11:29:24 EST 2002


Gabriel Ambuehl wrote:
> 
> Hello,
> I need to export HUGE amounts (up to 1 million records, but normally
> some 10,000) of, say, 150-byte fields from a MySQL db, and in the end
> I need them as ONE giant concatenated string. Working with
> mysql-python, the only way I see is
> 
> db.execute("SELECT myfield FROM table")
> data=db.fetchall()
> giantstring=""
> for record in data:
>     giantstring+=record

Yes, string concatenation is slow this way: each += copies the entire
string built so far, so the loop is quadratic in the total data size.
Build a list and join it once at the end instead, which is linear (note
that fetchall() returns a tuple per row, so you want the first field of
each record):

    import string

    giantlist = []
    for record in data:
        # each record is a row tuple; take its single field
        giantlist.append(record[0])
    giantstring = string.join(giantlist, "")

or, if 'data' were already a flat list of strings, you could even do

    giantstring = string.join(data, "")
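
For result sets that really run to a million rows, it can also help to
avoid materializing everything with fetchall() at once. Here is a
minimal sketch, assuming the standard DB-API cursor interface that
mysql-python exposes (the connection parameters and chunk size here are
hypothetical):

    import string
    import MySQLdb

    db = MySQLdb.connect(db="mydb")       # hypothetical connection
    cur = db.cursor()
    cur.execute("SELECT myfield FROM table")

    parts = []
    while 1:
        rows = cur.fetchmany(10000)       # pull rows in chunks
        if not rows:
            break
        for row in rows:
            parts.append(row[0])          # row is a tuple; take the field
    giantstring = string.join(parts, "")

The join still happens only once, so the total cost stays linear; the
chunked fetch just avoids holding the entire result set in memory on
top of the list of strings.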

--Hans (base64.decodestring('d3VybXlAZWFydGhsaW5rLm5ldA=='))
       # decode for email address ;-)
Site: http://www.awaretek.com/nowak/


