Consider this (on a 64-bit platform):
    >>> import numpy
    >>> numpy.dtype('q') == numpy.dtype('l')
    True
but
    >>> numpy.dtype('q').char == numpy.dtype('l').char
    False
Is that intended? Shouldn't the dtype constructor "normalize" 'l' to 'q' (or to 'i'), so that equal dtypes also carry the same char code?
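For what it's worth, one way to get a canonical char code today is to round-trip a dtype through its kind and itemsize. The helper below is a hypothetical sketch, not part of numpy's API, and it assumes a sized numeric kind ('i', 'u', 'f', 'c') on a 64-bit Linux/macOS platform where both 'q' and 'l' are 8-byte signed integers:

    import numpy

    def normalized(dt):
        """Round-trip a dtype through kind + itemsize so that
        equivalent integer dtypes map to one canonical char code.
        (Hypothetical helper, not part of numpy's API.)"""
        dt = numpy.dtype(dt)
        return numpy.dtype(dt.kind + str(dt.itemsize))

    # On such a platform the two dtypes compare equal...
    assert numpy.dtype('q') == numpy.dtype('l')
    # ...but keep distinct C-level char codes ('q' vs 'l'):
    assert numpy.dtype('q').char != numpy.dtype('l').char
    # After round-tripping, both map to the same char code:
    assert normalized('q').char == normalized('l').char

That only sidesteps the question, though: dtype equality is based on the described binary layout, while .char preserves the originating C type, which is why the constructor doesn't collapse 'l' into 'q' on its own.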