On Mi, 2016-07-06 at 10:22 -0400, Marten van Kerkwijk wrote:
I'm with Nathaniel here, in that I don't really see the point of these routines in the first place: broadcasting takes care of many of the initial use cases one might think of, and the others are generally not all that well served by them. The examples from scipy do not, to me, really support `atleast_?d`, but rather suggest that little thought has been put into higher-dimensional objects, which should be treated as stacks of row or column vectors. My sense is that we're better off developing the direction started with `matmul`, perhaps adding `matvecmul`, etc.
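[A quick illustration of the broadcasting point, my own and not from the thread, assuming current NumPy semantics:]

```python
import numpy as np

# Broadcasting already aligns a 1-d vector with a 2-d array without
# any atleast_2d call: the vector is treated as a row and broadcast
# across the rows of the matrix.
a = np.arange(3)              # shape (3,)
b = np.ones((4, 3))           # shape (4, 3)
print((a + b).shape)          # -> (4, 3)

# matmul likewise treats leading axes as a stack: a (2, 4, 3) stack
# of matrices times a (3,) vector gives a (2, 4) stack of results,
# with no manual dimension-padding needed.
m = np.ones((2, 4, 3))
v = np.arange(3.0)
print(np.matmul(m, v).shape)  # -> (2, 4)
```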
More to the point of the initial inquiry: what is the advantage of having a general `np.atleast_nd` routine over doing

    np.array(a, copy=False, ndmin=n)

or, for a list of inputs,

    [np.array(a, copy=False, ndmin=n) for a in input_list]

There is another wonky reason for using the `atleast_?d` functions, in that they use reshape to be fully duck-typed ;) (in newer versions at least, probably mostly for sparse arrays, not sure).

Tend to agree though, especially considering the confusing axis order of the 3D case, which I suppose is likely due to some linalg considerations. Of course you could supply something like an insertion order of (1, 0, 2) to denote the current 3D case in an nd version, but frankly it seems to me likely harder to understand how that works than to write your own function to just do it.

Scipy uses the 3D case exactly never (once, in a test). I have my doubts that many would notice if we deprecated the 3D case, but then it is likely more trouble than gain.
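[For concreteness, a quick check of how `ndmin` compares with the `atleast_?d` functions, my own illustration: for plain ndarrays, `atleast_1d`/`atleast_2d` match `ndmin=1`/`ndmin=2`, while `atleast_3d` inserts axes in a different, arguably surprising, order.]

```python
import numpy as np

a = np.arange(3)  # shape (3,)

# atleast_1d and atleast_2d prepend new axes, exactly like ndmin:
assert np.atleast_2d(a).shape == (1, 3)
assert np.array(a, copy=False, ndmin=2).shape == (1, 3)

# atleast_3d is the odd one out: it places the new axes around the
# existing one, (3,) -> (1, 3, 1), whereas ndmin=3 prepends both,
# (3,) -> (1, 1, 3).
assert np.atleast_3d(a).shape == (1, 3, 1)
assert np.array(a, copy=False, ndmin=3).shape == (1, 1, 3)
```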
All the best,
Marten

_______________________________________________
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion