Performance Problems when selecting HUGE amounts of data from MySQL dbs

Gabriel Ambuehl gabriel_ambuehl at buz.ch
Wed Jan 9 16:25:59 CET 2002


Hello,
I need to export HUGE amounts (up to 1 million rows, but normally some
10000) of roughly 150-byte fields from a MySQL db, and in the end I
need them as ONE giant concatenated string. Working with mysql-python,
the only way I see is

cursor = db.cursor()
cursor.execute("SELECT myfield FROM table")
data = cursor.fetchall()
giantstring = ""
for record in data:
    giantstring += record[0]   # each record is a 1-tuple

but this is way too slow. Now I'm wondering whether there's some other
approach I haven't been able to find, or some other trick to speed
things up (maybe there's a way to have MySQL concatenate the fields
itself and return only giantstring?).
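For what it's worth, the likely bottleneck in the loop above is the repeated string concatenation, which copies the ever-growing string on every iteration (quadratic cost overall). A minimal sketch of the usual fix, building the result in one pass with str.join (the helper name is made up here; the 1-tuple row shape matches what fetchall() returns):

```python
def join_fields(rows):
    """Concatenate the first column of each row into one string.

    `rows` is expected to look like cursor.fetchall() output:
    a sequence of 1-tuples such as [("abc",), ("def",)].
    """
    # str.join builds the result in a single pass instead of
    # copying a growing string on every += iteration.
    return "".join(record[0] for record in rows)

# Hypothetical usage against mysql-python (MySQLdb):
# cursor = db.cursor()
# cursor.execute("SELECT myfield FROM table")
# giantstring = join_fields(cursor.fetchall())
```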

One optimization might be to select only X rows at a time (using
LIMIT) and do multiple selects, but I suspect the real bottleneck is
the concatenation, not the SELECT itself...
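If fetching everything at once is a memory concern, the chunking idea above can also be done client-side with the DB-API fetchmany() call instead of repeated LIMIT queries, still deferring concatenation to a single join at the end. A sketch under that assumption (FakeCursor below is only a stand-in for a real MySQLdb cursor obtained via db.cursor()):

```python
def fetch_concat(cursor, chunk_size=10000):
    """Fetch rows in chunks and concatenate the first column.

    Accumulates pieces in a list and joins once at the end,
    so the total cost stays linear in the data size.
    """
    pieces = []
    while True:
        rows = cursor.fetchmany(chunk_size)  # DB-API 2.0 call
        if not rows:
            break
        pieces.extend(record[0] for record in rows)
    return "".join(pieces)

# Stand-in cursor for illustration only; a real one would come
# from db.cursor() after cursor.execute("SELECT myfield FROM table").
class FakeCursor:
    def __init__(self, rows):
        self._rows = list(rows)
    def fetchmany(self, size):
        batch, self._rows = self._rows[:size], self._rows[size:]
        return batch
```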


Any comments, thoughts, tricks would be greatly appreciated.


Best regards,
 Gabriel


More information about the Python-list mailing list