Hi,

I think I found a small bug when calling get_data() in parallel jobs. I was trying to make basic slices of Density, Temperature, and Electron_Fraction. By the time it got to the Electron_Fraction slice, the data was already in memory, so temp_data == {}. I'm not sure why it was already in memory, since I hadn't accessed Electron_Density yet; perhaps it was loaded from the .yt file? The routine then tries to accumulate temp_data with _mpi_catdict(), and this is where things go wrong, because temp_data is empty. Guarding the call with an if-statement avoids the bug:

diff -r c248cd4313a8 yt/lagos/BaseDataTypes.py
--- a/yt/lagos/BaseDataTypes.py	Sun Mar 28 23:31:52 2010 -0700
+++ b/yt/lagos/BaseDataTypes.py	Mon Mar 29 12:47:20 2010 -0400
@@ -593,7 +593,8 @@
             # Now the next field can use this field
             self[field] = temp_data[field]
         # We finalize
-        temp_data = self._mpi_catdict(temp_data)
+        if temp_data != {}:
+            temp_data = self._mpi_catdict(temp_data)
         # And set, for the next group
         for field in temp_data.keys():
             self[field] = temp_data[field]

I just wanted to make sure that this is the right thing to do before committing.

John
participants (2): John Wise, Matthew Turk