[CentralOH] Django Management Command Memory Usage: Chunkifying Worked

jep200404 at columbus.rr.com
Tue Jun 5 04:45:36 CEST 2012


On Mon, 4 Jun 2012 15:34:56 -0400, jep200404 at columbus.rr.com wrote:

> Places.objects.all().iterator() seems to have solved my problem. 

Well, maybe not. The database was not getting updated. 
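
For context, the iterator() attempt was shaped roughly like this 
(a sketch of the idea, not the exact code I ran): 

    def handle(self, *args, **options):
        # Iterate lazily to keep memory down, building the point
        # for each record and saving it.
        for record in Places.objects.all().iterator():
            if record.x is not None and record.y is not None:
                record.point = Point(float(record.x) / 1000.,
                                     float(record.y) / 1000.)
            else:
                record.point = None
            record.save()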

However, the following ugly code works 
and does so without gobbling too much memory; 
memory usage seems to max out at 610 MB. 

    def handle(self, *args, **options):
        chunk = 100000
        # Find the full range of ids, then walk it in fixed-size chunks
        # so only one chunk's worth of Places rows is loaded at a time.
        min_id = Places.objects.all().aggregate(Min('id'))['id__min']
        max_id = Places.objects.all().aggregate(Max('id'))['id__max']
        for j in xrange(min_id, max_id + 1, chunk):
            for record in Places.objects.filter(id__gte=j, id__lt=j + chunk):
                # Build the geometry from the raw x/y columns when both
                # are present; otherwise clear it.
                if record.x is not None and record.y is not None:
                    record.point = Point(float(record.x) / 1000.,
                                         float(record.y) / 1000.)
                else:
                    record.point = None
                record.save()
        django.db.connection.close()

I'll play with Paginator per Nitin Hayaran another time. 
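
If I do, I expect it would look roughly like the following 
(an untested sketch; the page size is a guess, and the order_by('id') 
is there so the pages stay stable while paging): 

    from django.core.paginator import Paginator

    def handle(self, *args, **options):
        pages = Paginator(Places.objects.order_by('id'), 100000)
        for number in pages.page_range:
            # Each page is a sliced queryset, so only one page of
            # records is held in memory at a time.
            for record in pages.page(number).object_list:
                if record.x is not None and record.y is not None:
                    record.point = Point(float(record.x) / 1000.,
                                         float(record.y) / 1000.)
                else:
                    record.point = None
                record.save()
        django.db.connection.close()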


