urllib2 script slowing and stopping

Dantium danatk at gmail.com
Tue Oct 12 00:29:12 CEST 2010


On Oct 11, 10:07 pm, Ian <ian.g.ke... at gmail.com> wrote:
> On Oct 11, 2:48 pm, Dantium <dan... at gmail.com> wrote:
>
> > I have a small script that reads several CSV files in a directory and
> > puts the data in a DB using Django.
>
> > There are about 1.7 million records in 120 CSV files, I am running the
> > script on a VPS with about 512mb of memory python 2.6.5 on ubuntu
> > 10.04.
>
> > The script gets slow and seems to lock up after about 870,000 records.
> > Running top shows that the memory is all being used up by the Python
> > process. Is there some way I can improve this script?
>
> Probably you have "DEBUG = True" in your Django settings.py file.  In
> debug mode, Django records every query that is executed in
> django.db.connection.queries.  To fix it, either disable debug mode or
> just have your script go in and clear out that list from time to time.
>
> HTH,
> Ian
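Ian's fix can be sketched like this. In DEBUG mode Django appends every executed query to django.db.connection.queries, so a long import loop grows that list without bound; clearing it periodically (the real script would call django.db.reset_queries() or `del connection.queries[:]`) keeps memory flat. The names below (QUERY_LOG, save_record, import_rows, the batch size) are hypothetical stand-ins, not from the original script:

```python
import csv

QUERY_LOG = []   # stand-in for django.db.connection.queries
BATCH = 1000     # clear the log every BATCH records

def save_record(row):
    # stand-in for MyModel.objects.create(...); with DEBUG = True,
    # each such call appends the executed SQL to the query log
    QUERY_LOG.append("INSERT ... %r" % (row,))

def import_rows(lines):
    # lines: any iterable of CSV lines (e.g. an open file object)
    count = 0
    for row in csv.reader(lines):
        save_record(row)
        count += 1
        if count % BATCH == 0:
            # in a real Django script:
            #     from django import db
            #     db.reset_queries()
            del QUERY_LOG[:]
    return count
```

Without the periodic clear, importing 1.7 million records would leave 1.7 million query dicts in memory; with it, the log never holds more than BATCH entries.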

Yeah thanks that helped!

It was still running very low on memory by the end, but all the records
got added.
