I am trying to make projections and slices in the same script and am
iterating over several DD datasets, but after a while I get the error
pasted below; it seems there's some HDF5 error where h5py cannot open
a file. I've made sure the file exists, and the script continues to run if
I change the DD#### to continue from where it failed. I've already modified
the script so that each DD uses a new python instance instead of iterating
through the files with one python call. I tried this both in the queue and
interactively and get the same error, sometimes after many DDs, but sometimes
after only 1 or 2. Is this indicative of a memory issue? If it is, I'm not
getting OOM messages even on an interactive node. Has anyone encountered
this before? I've stuck "import gc" at the top and gc.collect() at the
end of the script, too, but it doesn't help.
From
G.S.
Traceback (most recent call last):
  File "moviesGrey.py", line 35, in <module>
    proj = pf.h.proj(0, field, weight_field=field)
  File "/home/ux455076/dev-yt/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg/yt/data_objects/data_containers.py", line 2001, in __init__
    if self._okay_to_serialize and self.serialize:
        self._serialize(node_name=self._node_name)
  File "/home/ux455076/dev-yt/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg/yt/data_objects/data_containers.py", line 970, in _serialize
    self._store_fields(self._key_fields, node_name, force)
  File "/home/ux455076/dev-yt/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg/yt/data_objects/data_containers.py", line 947, in _store_fields
    passthrough = True)
  File "/home/ux455076/dev-yt/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg/yt/utilities/parallel_tools/parallel_analysis_interface.py", line 241, in in_order
    f2(*args, **kwargs)
  File "/home/ux455076/dev-yt/lib/python2.7/site-packages/yt-2.4dev-py2.7-linux-x86_64.egg/yt/data_objects/hierarchy.py", line 278, in _reload_data_file
    self._data_file = h5py.File(self.__data_filename, self._data_mode)
  File "/home/ux455076/dev-yt/lib/python2.7/site-packages/h5py/_hl/files.py", line 150, in __init__
    fid = make_fid(name, mode, fapl)
  File "/home/ux455076/dev-yt/lib/python2.7/site-packages/h5py/_hl/files.py", line 45, in make_fid
    fid = h5f.open(name, h5f.ACC_RDONLY, fapl=plist)
  File "h5f.pyx", line 70, in h5py.h5f.open (h5py/h5f.c:1618)
IOError: unable to open file (File accessability: Unable to open file)
Exception AttributeError: "'EnzoHierarchy' object has no attribute '_data_file'" in ignored
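For readers following along, the failing pattern can be sketched as a driver that launches a fresh python interpreter per data dump, so no open HDF5 handles or caches carry over between iterations. This is an illustrative sketch only: the command-line interface of moviesGrey.py is an assumption, and the zero-padded DD#### naming follows the usual Enzo convention.

```python
import subprocess
import sys

def dd_name(i):
    """Format an Enzo-style data-dump name, e.g. 3 -> 'DD0003'
    (four-digit zero padding is the usual Enzo convention)."""
    return "DD%04d" % i

def run_per_dump(script, first, last):
    """Launch a fresh python process for each data dump so that no
    state (open HDF5 handles, caches) survives between iterations.
    Passing the DD name as an argument is a hypothetical interface."""
    for i in range(first, last + 1):
        # A failure on one dump leaves nothing stale behind for the next.
        subprocess.call([sys.executable, script, dd_name(i)])
```

The point of this structure, as described in the message above, is that a leak or OOM inside one interpreter cannot accumulate across dumps, which is why a recurring failure is more likely filesystem- than memory-related.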
Hi Geoffrey,
If you're using a script that iterates with an individual python
instance (i.e., calling "python my_script.py") for each data dump,
then I don't think it's an OOM unless somehow multiple instances are
running simultaneously.
I'm not sure what to suggest other than talking to the help desk at
the particular supercomputer center.
-Matt
_______________________________________________
yt-users mailing list
yt-users@lists.spacepope.org
http://lists.spacepope.org/listinfo.cgi/yt-users-spacepope.org
Already submitted the ticket yesterday; just haven't heard back.
Yeah, I didn't think it was OOM, just wanted to confirm it with the experts.
Thanks!
From
G.S.
Hi Geoffrey,
This could also be due simply to a single corrupt HDF5 file. I have seen
these errors before when a file did not get fully written. You might want
to verify that all of the files associated with that dataset are readable.
Britton
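One cheap way to do the readability check Britton suggests, without opening every file through h5py, is to look for the fixed 8-byte HDF5 superblock signature at the start of each file. This is a generic sketch, not anything from the thread; note the signature can also sit at offsets 512, 1024, ... when a file has a user block, so this only covers the common offset-0 case, and a passing check does not prove the file was fully written.

```python
import os

HDF5_MAGIC = b"\x89HDF\r\n\x1a\n"  # 8-byte HDF5 superblock signature

def looks_like_hdf5(path):
    """Return True if the file begins with the HDF5 signature.
    Files with a user block store the signature at offset 512, 1024, ...;
    this quick check only handles the common offset-0 layout."""
    try:
        with open(path, "rb") as f:
            return f.read(8) == HDF5_MAGIC
    except OSError:
        # Missing or unreadable file counts as "not readable".
        return False

def scan_dump(paths):
    """Return the subset of paths that fail the signature check."""
    return [p for p in paths if not looks_like_hdf5(p)]
```

Running scan_dump over all of a dump's grid files flags obviously truncated or non-HDF5 files; anything it reports would then be worth opening with h5py for a full check.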
I've verified that there's nothing wrong with the HDF5 files themselves,
because I can restart the script at the DD where it failed and it continues
without problem, until it dies again with a similar error. I've encountered
corrupted files before, and they produce the same error, except then I
can't restart: it fails at the same spot every time. This isn't one of
those times, though.
From
G.S.
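The "restart it and it works" symptom described above is what transient failures on a parallel filesystem look like. One generic workaround (not something proposed in this thread) is to wrap the flaky open in a short retry loop; the helper below is a minimal sketch where `opener` would be something like `lambda: h5py.File(path, "r")`.

```python
import time

def open_with_retry(opener, retries=3, delay=2.0):
    """Call a flaky zero-argument `opener` up to `retries` times,
    sleeping with a simple linear backoff between attempts.
    Re-raises the last error if every attempt fails."""
    last_err = None
    for attempt in range(retries):
        try:
            return opener()
        except OSError as err:  # IOError is an alias of OSError in Python 3
            last_err = err
            time.sleep(delay * (attempt + 1))
    raise last_err
```

A wrapper like this papers over transient filesystem hiccups but would still fail repeatably on a genuinely corrupt file, which matches the distinction Geoffrey draws above.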
Hi Geoffrey,
Yeah, it seems weird. But without knowing more about how you are
launching your script or what it does, I think we might be out of
ideas. If you hear back and it's definitely funny business with
yt/Python/etc., write back and let us know.
-Matt
Will do.
From
G.S.
participants (3)
- Britton Smith
- Geoffrey So
- Matthew Turk