[pypy-issue] Issue #2911: high memory usage calling json.dumps (pypy/pypy)

Aaron Wise issues-reply at bitbucket.org
Fri Nov 9 22:09:42 EST 2018


New issue 2911: high memory usage calling json.dumps
https://bitbucket.org/pypy/pypy/issues/2911/high-memory-usage-calling-jsondumps

Aaron Wise:

When running a simple script (below), pypy3 uses roughly 20x as much memory as cpython when performing json.dumps().

Seems like this might be related to issue [1124](https://bitbucket.org/pypy/pypy/issues/1124/memory-usage-parsing-json)

Here are the memory-profiler traces:

python3-3.6.5

```
Line #    Mem usage    Increment   Line Contents
================================================
    12    999.1 MiB    999.1 MiB   @profile(stream=fp)
    13                             def my_func(my_map):
    14   1041.2 MiB      0.0 MiB       for i in range(1):
    15    999.1 MiB      0.0 MiB           print(i)
    16                                     #start = time.time()
    17   1041.2 MiB     42.1 MiB           a = json.dumps(my_map)
```

pypy3-6.0.0

```
Line #    Mem usage    Increment   Line Contents
================================================
    12   1359.5 MiB   1359.5 MiB   @profile(stream=fp)
    13                             def my_func(my_map):
    14   2314.0 MiB      0.0 MiB       for i in range(1):
    15   1359.5 MiB      0.0 MiB           print(i)
    16                                     #start = time.time()
    17   2314.0 MiB    954.5 MiB           a = json.dumps(my_map)
```
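
For what it's worth, the peak can also be cross-checked without memory_profiler via resource.getrusage. This is only a rough sketch, not part of the original report (it assumes a POSIX system; ru_maxrss is KiB on Linux but bytes on macOS):

```
#!python

import json
import resource

def peak_rss_mib():
    # ru_maxrss is KiB on Linux, bytes on macOS; this assumes Linux.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0

my_map = {str(i): 0.5 for i in range(1000000)}
before = peak_rss_mib()
a = json.dumps(my_map)
print('peak RSS: %.1f MiB -> %.1f MiB' % (before, peak_rss_mib()))
```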

------

```
#!python

import random
import json
from memory_profiler import profile

fp = open('memory_profiler.log', 'w+')

@profile(stream=fp)
def my_func(my_map):
    for i in range(1):
        print(i)
        # The spike shows up here: json.dumps builds the whole output string.
        a = json.dumps(my_map)

if __name__ == '__main__':
    # Build a large dict: ~10M short random string keys -> float values.
    my_map = {}
    for i in range(10000000):
        my_map[str(random.randint(0, 10000000))] = random.uniform(0, 1)
    my_func(my_map)
```
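
Not a fix for the underlying issue, but a possible workaround sketch (untested on pypy3): streaming the serialization with json.dump or JSONEncoder.iterencode may lower the peak, since the full output string is never built at once:

```
#!python

import json

def dump_streaming(obj, path):
    # json.dump writes the encoder's chunks straight to the file object
    # instead of materializing the complete JSON string in memory.
    with open(path, 'w') as fh:
        json.dump(obj, fh)

def encoded_length(obj):
    # Count output characters chunk by chunk without building the string.
    return sum(len(chunk) for chunk in json.JSONEncoder().iterencode(obj))
```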



