New GitHub issue #121569 from Pitirus:<br>
<hr>
<pre>
# Bug report
### Bug description:
I noticed that the `pickle.Unpickler.load()` method leaves more memory allocated than just the size of the returned object when unpickling GeoJSON dictionaries.
I initially noticed the leak in a Docker container running on x86. Here is a code snippet that reproduces the problem for me on an M1 Mac running macOS Sonoma (Python 3.11.9).
```python
import io
import pickle
import psutil
process = psutil.Process()
to_pickle = {
    "type": "FeatureCollection",
    "crs": {"properties": {"name": "EPSG:4326"}, "type": "name"},
    "features": [
        {
            "type": "Feature",
            "id": f"id{idx}",
            "properties": {
                "str_prop": f"str_var{idx}",
                "int_prop": idx,
                "bool_prop": bool(idx % 2),
                "none_prop": None,
                "float_prop": float(idx),
            },
        }
        for idx in range(9000)
    ],
}
b = pickle.dumps(to_pickle)
del to_pickle
print(
    f"Initial memory usage: {round(int(process.memory_info().rss / 1024) / 1024, 3)}MiB"
)
pickle.Unpickler(io.BytesIO(b)).load()
print(
    f"After 1st unpickle memory usage: {round(int(process.memory_info().rss / 1024) / 1024, 3)}MiB"
)
for i in range(1000):
    pickle.Unpickler(io.BytesIO(b)).load()
print(
    f"Final memory usage: {round(int(process.memory_info().rss / 1024) / 1024, 3)}MiB"
)
```
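One way to narrow this down further (a diagnostic sketch, not something from the measurements above) is to rerun the loop under `tracemalloc` and compare Python-level allocations against RSS; it reuses the `b` payload defined in the snippet above:
```python
# Diagnostic sketch (not part of the original measurements): reuses `b` from
# the snippet above and checks whether the extra RSS after the unpickle loop
# corresponds to live Python objects or to memory retained after those
# objects are freed.
import gc
import io
import pickle
import tracemalloc

import psutil

process = psutil.Process()

tracemalloc.start()
baseline_current, _ = tracemalloc.get_traced_memory()

for _ in range(1000):
    pickle.Unpickler(io.BytesIO(b)).load()

gc.collect()
current, peak = tracemalloc.get_traced_memory()

print(f"Live traced allocations: {(current - baseline_current) / 1024:.1f} KiB")
print(f"Peak traced during loop: {peak / (1024 * 1024):.1f} MiB")
print(f"Process RSS:             {process.memory_info().rss / (1024 * 1024):.1f} MiB")
```
If the traced (Python-level) allocations return close to baseline while RSS stays elevated, that would suggest the unpickled objects themselves are freed and the retained memory is held below the Python object layer, but that is only a hypothesis.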
### CPython versions tested on:
3.11
### Operating systems tested on:
Linux, macOS
</pre>
<hr>
<a href="https://github.com/python/cpython/issues/121569">View on GitHub</a>
<p>Labels: type-bug</p>
<p>Assignee: </p>