Set FutureWarnings to error in (dev) tests?

Hi all,

Should we try to set FutureWarnings to errors in dev tests? I am seriously annoyed by FutureWarnings getting lost all over, for two reasons. First, it is hard, if not impossible, to find even our own errors for our own FutureWarning changes. Secondly, we currently would not even see any FutureWarnings from someone else. For numpy that may not be a big issue, but still.

So I started to try it, and it is obviously annoying. The only serious issue, though, actually seems to be MA [1]. Sometimes one would add a filter for a whole test file (for my start I did this for the NaT stuff) [2].

Anyway, should we attempt to do this? I admit that making it work is tedious; even *with* the change, FutureWarnings suddenly pop up when you make the warnings be given less often (I would guess that warning was already issued at import time somewhere).

- Sebastian

[1] And at that a brand new FutureWarning there, which seems too aggressive in any case, though that won't change much.

[2] One annoying thing about this: the filter might never be removed. One could maybe add a canary to error out once the filter is not needed anymore.
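A minimal sketch of what the proposal could look like in a single test file (the warning message text and the trigger_deprecated_behaviour() helper are placeholders, not actual NumPy code): FutureWarning is escalated to an error, one known warning is let through for this file, and a canary test covers the point from [2]:

    import warnings

    # Development default: any FutureWarning that escapes a test is a failure.
    warnings.simplefilter("error", FutureWarning)

    # Per-file exception for one warning these tests deliberately trigger
    # (the message pattern is a placeholder).
    warnings.filterwarnings("always", message="NaT comparison",
                            category=FutureWarning)

    def test_filter_still_needed():
        # Canary: fail once the warning is gone, as a reminder to remove
        # the file-level filter above.
        with warnings.catch_warnings(record=True) as log:
            warnings.simplefilter("always")
            trigger_deprecated_behaviour()  # hypothetical helper
        assert any(issubclass(w.category, FutureWarning) for w in log)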

On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
Hi all,
should we try to set FutureWarnings to errors in dev tests? I am seriously annoyed by FutureWarnings getting lost all over, for two reasons. First, it is hard, if not impossible, to find even our own errors for our own FutureWarning changes. Secondly, we currently would not even see any FutureWarnings from someone else. For numpy that may not be a big issue, but still.
Yeah, I noticed this recently too :-(. Definitely it is the right thing to do, I think. And this is actually more true the more annoying it is, because if we're triggering lots of FutureWarnings then we should fix that :-). -n -- Nathaniel J. Smith -- https://vorpus.org

On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote:
On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
Hi all,
should we try to set FutureWarnings to errors in dev tests? I am seriously annoyed by FutureWarnings getting lost all over, for two reasons. First, it is hard, if not impossible, to find even our own errors for our own FutureWarning changes. Secondly, we currently would not even see any FutureWarnings from someone else. For numpy that may not be a big issue, but still.
Yeah, I noticed this recently too :-(. Definitely it is the right thing to do, I think. And this is actually more true the more annoying it is, because if we're triggering lots of FutureWarnings then we should fix that :-).
Yeah, the problem is that some FutureWarnings are given in the dozens. Injecting the filter at the module level is possible, but not quite correct. Maybe one could do evil things similar to a "module decorator" to add the warning context + filter to every single function in a module starting with "test_" (see the sketch below).

Another method could be to abuse `__warningregistry__`, but there are at least two reasons why this is probably not viable (never mind that it would be ugly as well).

Doesn't nose maybe provide *something*? I mean seriously, testing warnings tends to be all hell breaking loose: change one thing, and suddenly dozens appear from nowhere, and you are never sure you found all the cases, etc.

- Sebastian
-n
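A rough sketch of the "module decorator" idea above, using only the standard library (the function name and the message pattern are made up for illustration): it wraps every test_* function in a module in a warnings context that relaxes one known FutureWarning:

    import functools
    import sys
    import warnings

    def relax_futurewarning_in_module(module_name, message=""):
        """Wrap every test_* function in the module with a warning filter."""
        module = sys.modules[module_name]
        for name, func in list(vars(module).items()):
            if not (name.startswith("test_") and callable(func)):
                continue

            def make_wrapper(func):
                @functools.wraps(func)
                def wrapper(*args, **kwargs):
                    with warnings.catch_warnings():
                        # Let this one known warning through while the rest
                        # of the suite keeps FutureWarning-as-error.
                        warnings.filterwarnings("always", message=message,
                                                category=FutureWarning)
                        return func(*args, **kwargs)
                return wrapper

            setattr(module, name, make_wrapper(func))

    # At the bottom of a test module (message pattern is a placeholder):
    # relax_futurewarning_in_module(__name__, message="NaT comparison")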

On Thu, Jan 21, 2016 at 4:44 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote:
On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
Hi all,
should we try to set FutureWarnings to errors in dev tests? I am seriously annoyed by FutureWarnings getting lost all over, for two reasons. First, it is hard, if not impossible, to find even our own errors for our own FutureWarning changes. Secondly, we currently would not even see any FutureWarnings from someone else. For numpy that may not be a big issue, but still.
Yeah, I noticed this recently too :-(. Definitely it is the right thing to do, I think. And this is actually more true the more annoying it is, because if we're triggering lots of FutureWarnings then we should fix that :-).
Yeah, the problem is that some FutureWarnings are given in the dozens. Injecting the filter at the module level is possible, but not quite correct. Maybe one could do evil things similar to a "module decorator" to add the warning context + filter to every single function in a module starting with "test_".
Can we remove the FutureWarnings by making whatever change they're warning about? :-)
Another method could be to abuse `__warningregistry__`, but there are at least two reasons why this is probably not viable (never mind that it would be ugly as well).
Doesn't nose maybe provide *something*? I mean seriously, testing warnings tends to be all hell breaking loose: change one thing, and suddenly dozens appear from nowhere, and you are never sure you found all the cases, etc.
AFAICT nose doesn't provide much of anything for working with warnings. :-/ -n -- Nathaniel J. Smith -- https://vorpus.org

On Do, 2016-01-21 at 16:51 -0800, Nathaniel Smith wrote:
On Thu, Jan 21, 2016 at 4:44 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote:
On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
Hi all,
should we try to set FutureWarnings to errors in dev tests? I am seriously annoyed by FutureWarnings getting lost all over, for two reasons. First, it is hard, if not impossible, to find even our own errors for our own FutureWarning changes. Secondly, we currently would not even see any FutureWarnings from someone else. For numpy that may not be a big issue, but still.
Yeah, I noticed this recently too :-(. Definitely it is the right thing to do, I think. And this is actually more true the more annoying it is, because if we're triggering lots of FutureWarnings then we should fix that :-).
Yeah, the problem is that some FutureWarnings are given in the dozens. Injecting the filter at the module level is possible, but not quite correct. Maybe one could do evil things similar to a "module decorator" to add the warning context + filter to every single function in a module starting with "test_".
Can we remove the FutureWarnings by making whatever change they're warning about? :-)
That would be equivalent to not issuing the FutureWarning at all ;). Well, you can actually write a context manager and put this stuff into the `if __name__ == '__main__':` block (see the sketch below). It is still annoying, since ideally we want to filter a single warning, which means we need to install our own warning printer.
Another method could be to abuse `__warningregistry__`, but there are at least two reasons why this is probably not viable (never mind that it would be ugly as well).
Doesn't nose maybe provide *something*? I mean seriously, testing warnings tends to be all hell breaking loose: change one thing, and suddenly dozens appear from nowhere, and you are never sure you found all the cases, etc.
AFAICT nose doesn't provide much of anything for working with warnings. :-/
-n
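A minimal sketch of the context-manager idea from Sebastian's reply above, assuming the file is run through numpy.testing.run_module_suite as NumPy test modules were at the time; the manager turns FutureWarnings into errors only while this file's tests execute:

    import warnings

    from numpy.testing import run_module_suite

    class futurewarnings_as_errors(object):
        """Context manager that escalates FutureWarning to an error."""

        def __enter__(self):
            self._ctx = warnings.catch_warnings()
            self._ctx.__enter__()
            warnings.simplefilter("error", FutureWarning)
            return self

        def __exit__(self, *exc_info):
            # Restore whatever filters were active before.
            return self._ctx.__exit__(*exc_info)

    if __name__ == '__main__':
        with futurewarnings_as_errors():
            run_module_suite()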

Warnings filters can be given a regex matching the warning text, I think? (See the sketch below.) On Jan 21, 2016 5:00 PM, "Sebastian Berg" <sebastian@sipsolutions.net> wrote:
On Do, 2016-01-21 at 16:51 -0800, Nathaniel Smith wrote:
On Thu, Jan 21, 2016 at 4:44 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote:
On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
Hi all,
should we try to set FutureWarnings to errors in dev tests? I am seriously annoyed by FutureWarnings getting lost all over, for two reasons. First, it is hard, if not impossible, to find even our own errors for our own FutureWarning changes. Secondly, we currently would not even see any FutureWarnings from someone else. For numpy that may not be a big issue, but still.
Yeah, I noticed this recently too :-(. Definitely it is the right thing to do, I think. And this is actually more true the more annoying it is, because if we're triggering lots of FutureWarnings then we should fix that :-).
Yeah, the problem is that some FutureWarnings are given in the dozens. Injecting the filter at the module level is possible, but not quite correct. Maybe one could do evil things similar to a "module decorator" to add the warning context + filter to every single function in a module starting with "test_".
Can we remove the FutureWarnings by making whatever change they're warning about? :-)
That would be equivalent to not issuing the FutureWarning at all ;). Well, you can actually write a context manager and put this stuff into the
if __name__ == '__main__':
block. It is still annoying, since ideally we want to filter a single warning, which means we need to install our own warning printer.
Another method could be to abuse `__warningregistry__`, but there are at least two reasons why this is probably not viable (never mind that it would be ugly as well).
Doesn't nose maybe provide *something*? I mean seriously, testing warnings tends to be all hell breaking loose: change one thing, and suddenly dozens appear from nowhere, and you are never sure you found all the cases, etc.
AFAICT nose doesn't provide much of anything for working with warnings. :-/
-n
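A minimal sketch of the regex-based matching Nathaniel suggests above (the message pattern is a placeholder); the order matters, since the more specific filter has to end up in front of the blanket "error" one:

    import warnings

    # Blanket rule: any FutureWarning becomes an error.
    warnings.filterwarnings("error", category=FutureWarning)

    # More specific rule, inserted in front of the blanket one, so this
    # particular warning is still reported but does not fail the run.
    # The message argument is a regex matched against the warning text.
    warnings.filterwarnings("always", message=r"NaT comparison",
                            category=FutureWarning)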

On Do, 2016-01-21 at 17:07 -0800, Nathaniel Smith wrote:
Warnings filters can be given a regex matching the warning text, I think?
Doesn't cut it, because you need to set the warning to "always", so if you then don't want to print it, you are stuck....

I wrote a context manager + func decorator + class decorator which can do it (it overrides how the warning gets printed). You still have to decorate every weird test, or at least every test class, or escalate it to the global level. It needs some more work, but that suppression functionality (in whatever exact form it ends up taking) is much more reasonable than our typical warning context, since that drops even the printing (though since we would use errors during development, it should not matter much).

The stuff is basically a safe version of:

    with warnings.catch_warnings():
        warnings.filterwarnings("ignore", ....)  # oh no, not ignore!

which also still prints other warnings.

- Sebastian
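A rough sketch of such a "safe" suppression helper (not Sebastian's actual implementation; names and the message pattern are made up): it records everything, swallows only the one matching warning, and re-issues the rest so they still get printed:

    import re
    import warnings
    from contextlib import contextmanager

    @contextmanager
    def suppress_single_warning(message="", category=Warning):
        """Hide warnings matching message/category; re-emit everything else."""
        pattern = re.compile(message)
        with warnings.catch_warnings(record=True) as log:
            warnings.simplefilter("always")
            yield
        # The original filters are restored here; re-issue whatever was not
        # the warning we meant to suppress, so nothing else is lost.
        for w in log:
            if issubclass(w.category, category) and pattern.match(str(w.message)):
                continue
            warnings.warn_explicit(w.message, w.category, w.filename, w.lineno)

    # Usage (message pattern is a placeholder):
    # with suppress_single_warning("NaT comparison", FutureWarning):
    #     do_something_that_warns()
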
On Jan 21, 2016 5:00 PM, "Sebastian Berg" <sebastian@sipsolutions.net> wrote:
On Do, 2016-01-21 at 16:51 -0800, Nathaniel Smith wrote:
On Thu, Jan 21, 2016 at 4:44 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
On Do, 2016-01-21 at 16:15 -0800, Nathaniel Smith wrote:
On Thu, Jan 21, 2016 at 4:05 PM, Sebastian Berg <sebastian@sipsolutions.net> wrote:
Hi all,
should we try to set FutureWarnings to errors in dev tests? I am seriously annoyed by FutureWarnings getting lost all over, for two reasons. First, it is hard, if not impossible, to find even our own errors for our own FutureWarning changes. Secondly, we currently would not even see any FutureWarnings from someone else. For numpy that may not be a big issue, but still.
Yeah, I noticed this recently too :-(. Definitely it is the right thing to do, I think. And this is actually more true the more annoying it is, because if we're triggering lots of FutureWarnings then we should fix that :-).
Yeah, the problem is that some FutureWarnings are given in the dozens. Injecting the filter at the module level is possible, but not quite correct. Maybe one could do evil things similar to a "module decorator" to add the warning context + filter to every single function in a module starting with "test_".
Can we remove the FutureWarnings by making whatever change they're warning about? :-)
That would be equivalent to not issuing the FutureWarning at all ;). Well, you can actually write a context manager and put this stuff into the
if __name__ == '__main__':
block. It is still annoying, since ideally we want to filter a single warning, which means we need to install our own warning printer.
Another method could be to abuse `__warningregistry__`, but there are at least two reasons why this is probably not viable (never mind that it would be ugly as well).
Doesn't nose maybe provide *something*? I mean seriously, testing warnings tends to be all hell breaking loose: change one thing, and suddenly dozens appear from nowhere, and you are never sure you found all the cases, etc.
AFAICT nose doesn't provide much of anything for working with warnings. :-/
-n