This issue recently came up on Stack Overflow: http://stackoverflow.com/questions/39145795/masking-a-series-with-a-boolean-.... The poster attempted to index an ndarray with a pandas boolean Series object (all False), but the result was as if he had indexed with an array of integer zeros.
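To make this concrete, here is a minimal sketch of my reproduction attempt (the array and variable names are mine, not the poster's):

import numpy as np
import pandas as pd

arr = np.array([10, 20, 30])
mask = pd.Series([False, False, False])  # all-False boolean Series

# With boolean-mask semantics this should select nothing, but here the
# Series appears to be treated like the integer indices [0, 0, 0], so the
# result is arr[0] repeated three times.
print(arr[mask])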
Can someone explain this behavior? I can see two obvious possibilities:
1. ndarray checks whether the input to __getitem__ is of exactly the right type, rather than using isinstance().
2. pandas actually uses a wider dtype than boolean internally, so indexing with the Series is in fact indexing with an integer array.
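Regarding possibility 2, a quick dtype check (again, just my own sketch) seems to show the Series is boolean all the way down:

import numpy as np
import pandas as pd

mask = pd.Series([False, False, False])

# Both the Series and the ndarray it wraps report a boolean dtype,
# which seems to argue against possibility 2.
print(mask.dtype)               # bool
print(mask.values.dtype)        # bool
print(np.asarray(mask).dtype)   # bool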
In my attempt to reproduce the poster's results, I got the following warning:
FutureWarning: in the future, boolean array-likes will be handled as a boolean array index
This indicates that the issue is probably #1 and that a fix is already on the way. Please correct me if I am wrong. Also, where does the code for ndarray.__getitem__ live?
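For what it's worth, converting the Series to a plain ndarray first gives the boolean-mask behavior I expected (an empty selection), which also points at the type check rather than the dtype:

import numpy as np
import pandas as pd

arr = np.array([10, 20, 30])
mask = pd.Series([False, False, False])

# An actual boolean ndarray is handled as a mask, so an all-False
# mask selects nothing.
print(arr[np.asarray(mask)])  # -> []
print(arr[mask.values])       # same result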
Thanks,
-Joe