Hi,

A bigger problem.... The latest version of yt:

The current version of the code is:

---
a519b8754ba8 (yt) tip
---

has problems with pickle. If I pickle a data set and then try to unpickle it, I see:

unpickle core data
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)

/home/tasker/yt/src/yt-hg/scripts/iyt in <module>()
    338
    339 #file = open('cores.pickle','rb')
--> 340 allcloudcores = cPickle.load(file('cores.pickle', 'rb'))
    341 file.close()
    342

/home/tasker/yt/src/yt-hg/yt/data_objects/data_containers.pyc in _reconstruct_object(*args, **kwargs)
   3676     else: new_args.append(arg)
   3677     pfs = ParameterFileStore()
-> 3678     pf = pfs.get_pf_hash(pfid)
   3679     cls = getattr(pf.h, dtype)
   3680     obj = cls(*new_args)

/home/tasker/yt/src/yt-hg/yt/utilities/parameter_file_storage.pyc in get_pf_hash(self, hash)
    106     def get_pf_hash(self, hash):
    107         """ This returns a parameter file based on a hash. """
--> 108         return self._convert_pf(self._records[hash])
    109
    110     def get_pf_ctid(self, ctid):

KeyError: (('283b7c4d88671dbff7acf083098da6ae',), <function _reconstruct_object at 0x272c938>, ('283b7c4d88671dbff7acf083098da6ae', 'region', array([ 16., 16., 16.]), array([ 0., 0., 0.]), array([ 32., 32., 32.]), {'disk_center': array([16, 16, 16]), 'center': array([ 16., 16., 16.]), 'bulk_velocity': array([ 0., 0., 0.]), 'disk_vector': array([0, 0, 1]), 'disk_radius': array([ 0.1 , 0.1358, 0.1716, 0.2074, 0.2432, 0.279 ,

The fact that it mentions the field parameters in the last line might mean it's an error introduced by the corrections to them (that I have been demanding!)?

The script I have works fine (pickles and all) on yt version:

---
16e8d749a806 (yt) tip
---

Elizabeth
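(For reference, a minimal sketch of the round trip being described, assuming yt 2.x. The dataset name DD0100/DD0100 is a placeholder; the region arguments simply mirror the ones visible in the KeyError above.)

    from yt.mods import *
    import cPickle

    pf = load("DD0100/DD0100")   # placeholder output name
    # Build a data container; the KeyError above shows a 'region' with these arguments.
    reg = pf.h.region([16., 16., 16.], [0., 0., 0.], [32., 32., 32.])
    cPickle.dump(reg, open("cores.pickle", "wb"))

    # ... later, in a fresh session, the unpickle step that raises the KeyError:
    allcloudcores = cPickle.load(open("cores.pickle", "rb"))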
Hi Elizabeth,

We recently changed the mechanism of storing parameter files to be off by default, which means that rather than being dumped to a .csv file, they only stay in memory. This functionality can be turned back on (the change is documented in the development documentation). Can you try:

1) Loading your parameter file, *then* loading your pickle. I am optimistic this will fix it.
2) If it doesn't, at the very top of your python script, put this:

from yt.config import ytcfg; ytcfg["yt","storeparameterfiles"] = "True"

-Matt
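(A sketch of the ordering being suggested here, assuming a yt 2.x script; the dataset name is again a placeholder.)

    # Suggestion 2: turn parameter-file storage back on, before any other yt imports.
    from yt.config import ytcfg; ytcfg["yt","storeparameterfiles"] = "True"
    from yt.mods import *
    import cPickle

    # Suggestion 1: load the parameter file that produced the pickle first...
    pf = load("DD0100/DD0100")   # placeholder output name
    # ...and only then unpickle the stored objects.
    allcloudcores = cPickle.load(open("cores.pickle", "rb"))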
Hi Matt,

Sorry, that still doesn't work.... unless I've misunderstood your suggestion? The code is here:

http://paste.yt-project.org/show/1974/

Pickle + unpickle on a pile of properties I calculate from the connected sets is fine, but pickling the connected sets themselves is a problem. Could it be that the later pickle doesn't require the parameter file? Or is it possible there is an unrelated write error happening?

Elizabeth
I also tried switching out pickle for the yt method of storing objects. Unfortunately, it doesn't seem to save the object I want. I tried:

contours = dd.extract_connected_sets("NegEscapeVelocity", 1, 30.0, maxv, log_space=False)
contours.save_object("cloudcores", "storedcores.cpkl")

In [1]: contours.save_object("cloudcores", "storedcores.cpkl")
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)

/home/tasker/yt/src/yt-hg/scripts/iyt in <module>()
----> 1
      2
      3
      4
      5

AttributeError: 'tuple' object has no attribute 'save_object'

The alternative method --of saving it via the hierarchy-- is slightly less good since I'm running multiple yt scripts on the same data set. (At least, when I tried to load the values back in, it produced an empty set, which I presume was because I'd inadvertently overwritten the .yt file?)

If I could get one of these working again, then I'd be super happy. I don't need them all :)

Elizabeth
Thanks, Elizabeth. The move to in-memory storage brought up a funny bug where the initialization process was resetting the list of known parameter hashes. This is fixed in c1a446be3a7a.

Also, save_object won't work except on yt objects; the result of extract_connected_sets is a tuple of values and python objects. Pickle will work, though.
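(A sketch of the pickle route for the connected sets, following the snippet earlier in the thread; the dataset name and maxv value are placeholders, and NegEscapeVelocity is the derived field from that snippet.)

    from yt.mods import *
    import cPickle

    pf = load("DD0100/DD0100")            # placeholder output name
    dd = pf.h.all_data()
    maxv = 300.0                          # placeholder upper contour value
    contours = dd.extract_connected_sets("NegEscapeVelocity", 1, 30.0, maxv, log_space=False)

    # contours is a plain tuple of values and python objects, not a yt data
    # container, so save_object is not available on it; pickle it directly instead.
    f = open("cores.pickle", "wb")
    cPickle.dump(contours, f)
    f.close()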
Sorry, still no luck!

The current version of the code is:

---
c1a446be3a7a (yt) tip
---

unpickle core data
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)

/home/taskere/yt-2/src/yt-hg/scripts/iyt in <module>()
    199
    200 file = open('cores.pickle','rb')
--> 201 allcloudcores = cPickle.load(file)
    202 file.close()
    203

/home/taskere/yt-2/src/yt-hg/yt/data_objects/data_containers.pyc in _reconstruct_object(*args, **kwargs)
   3640     else: new_args.append(arg)
   3641     pfs = ParameterFileStore()
-> 3642     pf = pfs.get_pf_hash(pfid)
   3643     cls = getattr(pf.h, dtype)
   3644     obj = cls(*new_args)

/home/taskere/yt-2/src/yt-hg/yt/utilities/parameter_file_storage.pyc in get_pf_hash(self, hash)
    109     def get_pf_hash(self, hash):
    110         """ This returns a parameter file based on a hash. """
--> 111         return self._convert_pf(self._records[hash])
    112
    113     def get_pf_ctid(self, ctid):

KeyError: (('283b7c4d88671dbff7acf083098da6ae',), <function _reconstruct_object at 0xa8cd848>, ('283b7c4d88671dbff7acf083098da6ae', 'region', array([ 16., 16., 16.]), array([ 0., 0., 0.]), array([ 32., 32., 32.]), {'disk_center': array([16, 16, 16]), 'center': array([ 16., 16., 16.]), 'bulk_velocity': array([ 0., 0., 0.]), 'disk_vector': array([0, 0, 1]), 'disk_radius': array([ 0.1 , 0.1238, 0.1476, 0.1714, 0.1952, 0.219 , 0.2428, 0.2666, 0.2904, 0.3142, 0.33

Unless I need to re-pickle first?

Elizabeth
Hi Elizabeth,

What is the output of pf._hash() ?

-Matt
Tis:

In [1]: pf._hash()
Out[1]: '283b7c4d88671dbff7acf083098da6ae'

E.
Hi Elizabeth,

The hashes don't match, so it thinks you're loading a pickle for a file it doesn't have in memory. So, two possibilities:

1) The pickles were generated from a different static output than the one you are pf = load()ing earlier. If you load() the one that generated them, this will work.

2) The hash for your parameter file changed. The hash is generated by taking the md5sum of the following three items:

   basename
   current_time (for enzo this is InitialTime in the parameter file)
   unique_identifier

The third one is the important one that you should check, and the one that has the most possibility of variation. I assume you're running Enzo. If your simulation has the parameter CurrentTimeIdentifier (which was added to the LCA enzo 1.5, and has persisted since then), this is used. If it does not, then the modification time of the parameter file is used. This can be changed if you edit or copy (without using -p) the file.

-Matt
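(For illustration only, a rough sketch of how a hash built from those three items could look. This is not necessarily yt's exact implementation; it just shows the idea of md5-ing the combined identifiers.)

    import hashlib

    def parameter_file_hash(basename, current_time, unique_identifier):
        # Combine the three identifying pieces and take the md5 digest.
        # yt's real _hash() may join them differently; this only sketches the idea.
        s = "%s%s%s" % (basename, current_time, unique_identifier)
        return hashlib.md5(s).hexdigest()

Because the unique identifier feeds the hash, anything that changes it (for Enzo without CurrentTimeIdentifier, the file's modification time) silently invalidates previously pickled objects.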
Hi Matt,

The odd thing is that if I use exactly the same yt program, on the same machine with the same pickle file, it works fine with the older version of yt.

I will check the unique identifier though. I'm using an old version of Enzo, so it's possible something got changed :/

Elizabeth
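(One way to check, sketched under the assumption that the Enzo parameter values end up in pf.parameters; the dataset name is a placeholder.)

    from yt.mods import *

    pf = load("DD0100/DD0100")   # placeholder output name
    print pf._hash()
    # If the hash is to agree across machines and yt versions, the identifier
    # below should be present and identical for each copy of the data set.
    print pf.parameters.get("CurrentTimeIdentifier")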
Hi Matt,

Just one further update: even though the version of Enzo that made this simulation is older, it does still contain the CurrentTimeIdentifier parameter. Moreover, the result of pf._hash() is the same on two machines with a copy of the same data set, even though the time stamp is different on the parameter file. So it seems safe to assume that the hash is being created from this parameter, which makes it even more odd, given this consistency, that it is a problem.

It's definitely not problem (1): I only have one data set that I'm currently analysing this way, so I cannot have accidentally switched them up.

Elizabeth
Hi Matt,
The odd thing is that if I use exactly the same .yt program, on the same machine with the same pickle file, it works fine with the older version of yt.
I will check the unique identifier though. I'm using an old version of Enzo, so it's possible something got changed :/
Elizabeth
On 2011-12-02, at 12:31 AM, Matthew Turk wrote:
Hi Elizabeth,
The hashes don't match, so it thinks you're loading a pickle for a file it doesn't have in memory. So, two possibilities:
1) The pickles were generated from a different static output than the one you are pf = load()ing earlier. If you load() the one that generated them, this will work.
2) The hash for your parameter file changed. The hash is generated by taking the md5sum of the following three items:
basename current_time (for enzo this is InitialTime in the parameter file) unique_identifier
The third one is the important one that you should check, and the one that has the most possibility of variation. I assume you're running Enzo. If your simulation has the parameter CurrentTimeIdentifier (which was added to the LCA enzo 1.5, and has persisted since then) this is used. If it does not, then the modification time of the parameter file is used. This can be changed if you edit or copy (without using -p) the file.
-Matt
On Thu, Dec 1, 2011 at 10:03 AM, Elizabeth Tasker <tasker@astro1.sci.hokudai.ac.jp> wrote:
Tis:
In [1]: pf._hash() Out[1]: '283b7c4d88671dbff7acf083098da6ae'
E.
On 2011-12-01, at 11:57 PM, Matthew Turk wrote:
Hi Elizabeth,
What is the output of pf._hash() ?
-Matt
On Thu, Dec 1, 2011 at 9:32 AM, Elizabeth Tasker <tasker@astro1.sci.hokudai.ac.jp> wrote:
Sorry, still no luck!
The current version of the code is:
--- c1a446be3a7a (yt) tip ---
unpickle core data
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)

/home/taskere/yt-2/src/yt-hg/scripts/iyt in <module>()
    199
    200 file = open('cores.pickle','rb')
--> 201 allcloudcores = cPickle.load(file)
    202 file.close()
    203

/home/taskere/yt-2/src/yt-hg/yt/data_objects/data_containers.pyc in _reconstruct_object(*args, **kwargs)
   3640     else: new_args.append(arg)
   3641     pfs = ParameterFileStore()
-> 3642     pf = pfs.get_pf_hash(pfid)
   3643     cls = getattr(pf.h, dtype)
   3644     obj = cls(*new_args)

/home/taskere/yt-2/src/yt-hg/yt/utilities/parameter_file_storage.pyc in get_pf_hash(self, hash)
    109     def get_pf_hash(self, hash):
    110         """ This returns a parameter file based on a hash. """
--> 111         return self._convert_pf(self._records[hash])
    112
    113     def get_pf_ctid(self, ctid):

KeyError: (('283b7c4d88671dbff7acf083098da6ae',), <function _reconstruct_object at 0xa8cd848>, ('283b7c4d88671dbff7acf083098da6ae', 'region', array([ 16., 16., 16.]), array([ 0., 0., 0.]), array([ 32., 32., 32.]), {'disk_center': array([16, 16, 16]), 'center': array([ 16., 16., 16.]), 'bulk_velocity': array([ 0., 0., 0.]), 'disk_vector': array([0, 0, 1]), 'disk_radius': array([ 0.1 , 0.1238, 0.1476, 0.1714, 0.1952, 0.219 , 0.2428, 0.2666, 0.2904, 0.3142, 0.33
Unless I need to re-pickle first?
Elizabeth
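[For reference, a minimal sketch of the load-then-unpickle pattern discussed in this thread; the dataset and pickle filenames are placeholders, and the point is that loading the dataset first registers its hash with the parameter file store.]

    from yt.mods import *
    import cPickle

    pf = load("DD0100/DD0100")   # placeholder: the dataset that produced the pickle
    allcloudcores = cPickle.load(open("cores.pickle", "rb"))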
On 2011-12-01, at 8:20 PM, Matthew Turk wrote:
Thanks, Elizabeth. The move to in-memory storage exposed a funny bug where the initialization process was resetting the list of known parameter hashes. This is fixed in c1a446be3a7a.

Also, save_object won't work except on yt objects; the result of extract_connected_sets is a tuple of values and Python objects. Pickle will, though.
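[A minimal sketch of that pickling route, reusing the field and object names from the snippet quoted below; the dataset name, the value of maxv, and the output filename are placeholders.]

    from yt.mods import *
    import cPickle

    pf = load("DD0100/DD0100")   # placeholder dataset name
    dd = pf.h.all_data()
    maxv = 100.0                 # placeholder upper contour value

    # extract_connected_sets returns a plain (values, objects) tuple, not a yt
    # object, so pickle the tuple directly instead of calling save_object on it
    contours = dd.extract_connected_sets("NegEscapeVelocity", 1, 30.0, maxv,
                                         log_space=False)
    cPickle.dump(contours, open("storedcores.cpkl", "wb"), -1)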
On Thu, Dec 1, 2011 at 4:33 AM, Elizabeth Tasker <tasker@astro1.sci.hokudai.ac.jp> wrote:

I also tried switching out pickle for the yt method of storing objects. Unfortunately, it doesn't seem to save the object I want. I tried:

contours = dd.extract_connected_sets("NegEscapeVelocity", 1, 30.0, maxv, log_space=False)
contours.save_object("cloudcores", "storedcores.cpkl")

In [1]: contours.save_object("cloudcores", "storedcores.cpkl")
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)

/home/tasker/yt/src/yt-hg/scripts/iyt in <module>()
----> 1
      2
      3
      4
      5

AttributeError: 'tuple' object has no attribute 'save_object'

The alternative method -- saving it via the hierarchy -- is slightly less good, since I'm running multiple yt scripts on the same data set. (At least, when I tried to load the values back in, it produced an empty set, which I presume was because I'd inadvertently overwritten the .yt file.)

If I could get one of these working again, I'd be super happy. I don't need them all :)

Elizabeth

On 1 December 2011 12:35, Elizabeth Tasker <tasker@astro1.sci.hokudai.ac.jp> wrote:

Hi Matt,

Sorry, that still doesn't work... unless I've misunderstood your suggestion? The code is here:

http://paste.yt-project.org/show/1974/

Pickle + unpickle on a pile of properties I calculate from the connected sets is fine, but pickling the connected sets themselves is a problem. Could it be that the later pickle doesn't require the parameter file? Or is it possible there is an unrelated write error happening?

Elizabeth
Ah wait! The shiny new pickle I created this morning seems to load back in OK - hooray!!

Thanks so much! Sorry for the hassle.

Elizabeth
No, sorry, my bad -- I had told the code not to read in the old data but to create it fresh. This remains an issue.

Elizabeth
Hi all,

Elizabeth signed on to IRC and we worked through this. The issue was that she had also set "serialize" to false, which had the unwanted side effect of disabling the in-memory cache of parameter files. I have disabled this check; "serialize" and "storeparameterfiles" are no longer overlapping configuration options.

Hopefully the rework of the parameter file caching that Stephen and I have been chatting about, and the somewhat more distant yt 3.0 rework of data storage *in general*, will make this kind of thing less tricky!

-Matt
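[For reference, a sketch of setting the two options independently at the top of a script, following the ytcfg pattern quoted earlier in this thread; treat the exact option spellings and their described effects as assumptions to check against the configuration docs.]

    from yt.config import ytcfg
    ytcfg["yt", "serialize"] = "False"            # don't serialize analysis results to .yt files
    ytcfg["yt", "storeparameterfiles"] = "True"   # but do keep the parameter file store
    from yt.mods import *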
participants (2)
- Elizabeth Tasker
- Matthew Turk