And it might be easier if you figured out why it was broken rather than waiting on me. The issue is in the .pyx file, not the .py file: the buffer of potential contour linkages is not long enough. I thought it was, based on the loop structure that follows it. If you expand it to the correct length, it will work.
I've done some work on this, and you're right that the buffer in construct_boundary_relationships needs to be made bigger. In some cases, much bigger. The multiplier was set to 9:
s = (ny*nx + nx*nz + nx*nz - 4) * 9
In my dataset I've found that I need a multiplier of at least 52 for it to work in all cases. Doing some statistics, the mean minimum multiplier required is 18 with a standard deviation of 9, so 52 is roughly 3.8 sigma above the mean; in other words, cases that need it are not that rare. I don't know how globally applicable these statistics are; this is just one dataset.
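For reference, the sigma figure above is just the usual z-score arithmetic on the measured multiplier statistics; a quick sanity check:

```python
# Sanity check of the z-score for the observed worst-case multiplier,
# using the per-dataset statistics quoted above (mean 18, sigma 9, worst 52).
mean, sigma, worst = 18.0, 9.0, 52.0
z = (worst - mean) / sigma
print(z)  # roughly 3.78 standard deviations above the mean
```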
What I'm wondering is what we should do about this. Make the multiplier large enough to capture an N-sigma likelihood? Or wrap the call to construct_boundary_relationships in contour_finder.py in a while loop with a try/except that increases the multiplier on each failure?
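The second option could look something like the sketch below. This is only illustrative: it assumes (hypothetically) that the Cython routine raises IndexError when the joins buffer overflows, and that the multiplier can be threaded through as a keyword argument; neither is necessarily how the real function is structured.

```python
def construct_with_retry(construct, *args, multiplier=9, max_multiplier=128):
    """Call `construct`, doubling the buffer multiplier on each failure.

    `construct` is a stand-in for construct_boundary_relationships; the
    `multiplier` keyword and the IndexError signal are assumptions for
    this sketch, not the actual yt API.
    """
    while multiplier <= max_multiplier:
        try:
            return construct(*args, multiplier=multiplier)
        except IndexError:
            # Buffer was too small; grow it and try again.
            multiplier *= 2
    raise RuntimeError("buffer multiplier exceeded %d" % max_multiplier)
```

Doubling keeps the number of retries logarithmic in the final multiplier, so even a pathological field would only cost a handful of extra passes, while the common case pays nothing.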
Stephen Skory firstname.lastname@example.org http://stephenskory.com/ 510.621.3687 (google voice)