Projection Plot on a Large Simulation
2 Mar 2018, 11:10 a.m.
Hello! I have a large Gadget simulation (1024^3 particles) that I'm trying to make a projection plot of. I thought it would make sense to run it in parallel, so I'm currently running it on a node with 24 processors and 256 GB of RAM. I know for a fact that this is enough RAM to hold the particle data because, well, the Gadget simulation ran. However, the job keeps getting killed because it uses all of the system memory. Does yt give each processor a copy of the whole particle dataset when run in parallel, or does it distribute the data more efficiently? I've tried lowering n_ref, but I can try lowering it more. Thanks! -Jared
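For reference, a minimal sketch of the setup described above, assuming yt 3.x. The snapshot filename ("snapshot_100.hdf5"), the field ("gas", "density"), and the n_ref value are placeholders, not details from the original post.

```python
# Hedged sketch of a parallel projection plot of a Gadget snapshot with yt.
# Wrapped in try/except so the script reports rather than crashes when yt,
# mpi4py, or the (placeholder) snapshot file is unavailable.
try:
    import yt

    # Requires mpi4py; launch with e.g. `mpirun -np 24 python this_script.py`.
    yt.enable_parallelism()

    # n_ref is the maximum number of particles per octree leaf before the
    # index refines; it is a keyword to yt.load for particle datasets.
    ds = yt.load("snapshot_100.hdf5", n_ref=32)  # placeholder filename

    proj = yt.ProjectionPlot(ds, "z", ("gas", "density"))
    if yt.is_root():  # only the root MPI rank writes the image
        proj.save("projection.png")
    status = "projection saved"
except Exception as exc:  # e.g. yt not installed, or placeholder file missing
    status = f"skipped: {exc}"

print(status)
```

This is a sketch of the kind of script being run, not a fix for the memory problem itself; whether yt replicates or decomposes the particle data across ranks is exactly the question being asked.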
1 comment, 2 participants:
- Jared Coughlin
- Nathan Goldbaum