Hello,

I'm running into an unexpected issue in a program I'm writing, and I was hoping someone could clarify it for me. I'm trying to subclass numpy.ndarray (basically, create a class to handle a 3D grid). Instantiating a plain numpy.ndarray works as expected, but when I call numpy.ndarray.__init__ directly from my subclass's __init__, I get a DeprecationWarning saying object.__init__() takes no parameters. Presumably this means that ndarray's __init__ is somehow (for some reason?) just object's __init__...
This is some sample code:

>>> import numpy as np
>>> class derived(np.ndarray):
...     def __init__(self, stuff):
...         np.ndarray.__init__(self, stuff)
... 
>>> l = derived((2,3))
__main__:3: DeprecationWarning: object.__init__() takes no parameters
>>> l
derived([[  8.87744455e+159,   6.42896975e-109,   5.56218818e+180],
         [  1.79996515e+219,   2.41625066e+198,   5.15855295e+307]])
>>> 
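(Poking around in the interpreter seems consistent with that guess: as far as I can tell, ndarray doesn't define an __init__ of its own, so the attribute lookup just falls through to object's. Someone please correct me if I'm misreading this.)

>>> '__init__' in np.ndarray.__dict__   # no __init__ defined on ndarray itself...
False
>>> np.ndarray.__init__ is object.__init__   # ...so it falls through to object's
True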
Am I doing something blatantly stupid? Is there a better way of going about this? I suppose I could create a normal class and just keep the grid points in an ndarray attribute, but I would rather subclass ndarray directly (not sure I have a good reason for it, though). Suggestions on what I should do? My best guess so far is sketched below.
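For what it's worth, the numpy docs on subclassing say the setup belongs in __new__ (plus __array_finalize__) rather than __init__, since the array is allocated in __new__. Here is a minimal sketch of my reading of that pattern; the Grid3D name and the spacing attribute are placeholders I invented for illustration, not anything from numpy itself:

import numpy as np

class Grid3D(np.ndarray):
    def __new__(cls, shape, spacing=1.0):
        # Allocate the array (uninitialized, like np.empty), then
        # attach extra attributes to the new instance.
        obj = np.ndarray.__new__(cls, shape)
        obj.spacing = spacing
        return obj

    def __array_finalize__(self, obj):
        # Called on explicit construction (obj is None), on view
        # casting, and on new-from-template (e.g. slicing), so copy
        # attributes from the source array when there is one.
        if obj is None:
            return
        self.spacing = getattr(obj, 'spacing', 1.0)

>>> g = Grid3D((2, 3, 4), spacing=0.5)
>>> g[0].spacing   # slices go through __array_finalize__ and keep the attribute
0.5

Is that the right direction, or is there a simpler way to get what I want?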
Thanks!
Jason