Hello! I'm using the sph-viz branch of yt to make a density projection plot from a Gadget-2 snapshot. This works like a charm when I use a snapshot with 428^3 particles in a 15 Mpc box. However, when I try it on a snapshot with 1024^3 particles in a 40 Mpc box, my job keeps getting killed for trying to exceed the available system memory. I'm submitting it to a node with 256 GB of RAM and requesting all of it, so the process must be trying to use more than that. The snapshot itself is ~40 GB.

The memory usage appears to spike when allocating the KDTree; before that, the .ewah file is generated without issue. Does anyone have thoughts on why this large spike in memory might be occurring and how I might go about fixing it? If not, no worries, I can try to investigate it myself, but that's a pain to do with pdb. My Gadget simulation used a maximum of ~173 GB of RAM, so I feel like the KDTree shouldn't need as much memory as it seems to. Thanks, and sorry for the bother!
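For reference, here is roughly what I'm running (the snapshot name and projection axis below are just placeholders, not my exact setup):

import yt

# Load the Gadget-2 snapshot (the path here is a placeholder).
ds = yt.load("snapshot_040")

# Density projection along the z axis; the memory spike happens
# while this builds the SPH KDTree for the particle data.
prj = yt.ProjectionPlot(ds, "z", ("gas", "density"))
prj.save()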
Sincerely,
-Jared