Hi Jared,

What operation are you doing? For many operations on Gadget-2 data we don't actually need to generate the kdtree at all (we do need it to load tipsy data, though). Depending on what you're doing, we may be generating it here unnecessarily.

In general though I think we’re going to need to make it so we can use cykdtree’s MPI-parallelized kdtree for really big datasets like yours.
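For a sense of scale, here's a rough back-of-envelope sketch of why a tree over 1024^3 particles can dwarf the snapshot on disk. The byte sizes and node factor below are pure assumptions for illustration (a float64 copy of the positions, an int64 index array, and a guessed node struct size), not yt's or cykdtree's actual internals:

```python
# Back-of-envelope memory estimate for a kdtree over N particles.
# Assumptions (hypothetical, NOT yt's actual layout): the tree keeps its
# own float64 copy of the 3D positions plus an int64 index array, and
# internal nodes add roughly 2*N node structs of ~48 bytes each.

def kdtree_memory_estimate_gb(n_particles,
                              bytes_per_position=3 * 8,  # 3 x float64
                              bytes_per_index=8,         # int64
                              bytes_per_node=48,         # guessed node struct
                              node_factor=2):
    """Very rough estimate of kdtree memory in GiB under the assumptions above."""
    data = n_particles * (bytes_per_position + bytes_per_index)
    nodes = node_factor * n_particles * bytes_per_node
    return (data + nodes) / 1024**3

# A 428^3-particle snapshot stays comfortably small under these assumptions,
# while 1024^3 particles is ~14x larger and lands near the node's RAM limit.
print(f"428^3:  {kdtree_memory_estimate_gb(428**3):.1f} GiB")
print(f"1024^3: {kdtree_memory_estimate_gb(1024**3):.1f} GiB")
```

Under these made-up (but not crazy) per-particle costs, 1024^3 particles already needs on the order of 100 GiB before any duplication from intermediate arrays, which is why an MPI-distributed tree looks like the right long-term fix.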

Thanks for the feedback! It’s good to hear about scaling issues like this.

Nathan

On Tue, Aug 14, 2018 at 11:44 AM Jared Coughlin <Jared.W.Coughlin.29@nd.edu> wrote:
Hello! I'm using the sph-viz branch of yt and I'm trying to make a density projection plot from a Gadget-2 snapshot. This works like a charm when I use a snapshot with 428^3 particles in a 15 Mpc box. However, when I try it on a snapshot with 1024^3 particles in a 40 Mpc box, my job keeps getting killed for trying to exceed the available system memory. I'm submitting it to a node with 256 GB of RAM and requesting all of it, so the job must be attempting to use more than that. The snapshot itself is ~40 GB. The memory usage appears to spike when allocating the KDTree; before that, it generates the .ewah file without issue. I was just wondering whether anyone had thoughts on why this large memory spike might be occurring and how I might go about fixing it? If not, no worries, I can try to investigate it myself, but it's a pain to do with pdb. My Gadget simulation used a maximum of ~173 GB of RAM, so I feel like the kdtree shouldn't be using as much memory as it's trying to. Thanks, and sorry for the bother!

Sincerely,
-Jared
_______________________________________________
yt-users mailing list -- yt-users@python.org
To unsubscribe send an email to yt-users-leave@python.org