Performance Problems when selecting HUGE amounts of data from MySQL dbs

Steve Holden sholden at holdenweb.com
Wed Jan 9 11:32:31 EST 2002


"Gabriel Ambuehl" <gabriel_ambuehl at buz.ch> wrote in message
news:mailman.1010591533.22806.python-list at python.org...
>
> Hello Skip,
>
> 9 Jan 2002, 16:45:45, you wrote:
> >     Gabriel> but this is way too slow.
>
> > How about
>
>     db.execute("SELECT myfield FROM table")
>     data = db.fetchall()
>     giantstring = []
>     for record in data:
>         giantstring.append(record[0])
>     giantstring = "".join(giantstring)
>
>
> How's that different from
>       [...]
>       data=db.fetchall()
>       data="".join(data)?
> which I assume should be way faster, since it doesn't need to copy
> the data into another list first (which is nice from a memory
> point of view as well)?
>
Bzzzt.

>>> "".join(data)
Traceback (innermost last):
  File "<interactive input>", line 1, in ?
TypeError: sequence item 0: expected string, tuple found
>>>
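Each row that fetchall() hands back is a tuple, even when you SELECT a
single column, so you have to pull the field out of each row before
joining. Something along these lines should work (just a sketch against
a DB-API cursor, reusing the db, myfield and table names from the
thread):

    db.execute("SELECT myfield FROM table")
    data = db.fetchall()
    # each row is a one-element tuple such as ('some text',)
    giantstring = "".join([row[0] for row in data])

The single join is linear in the total length of the strings, so with a
huge result set the time goes into fetching the rows, not into building
the string.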

regards
 Steve
--
http://www.holdenweb.com/