Test Design Question for Stats.Models Code
Hello all,

I am polishing up the generalized linear models right now for the stats.models project, and I have a question about using decorators with my tests. The GLM framework has a central model with shared properties and then several variations on this model, so to test it I have, as a simplified example:

```python
from numpy.testing import *

DECIMAL = 4

class check_glm(object):
    '''
    res2 results will be obtained from R or the RModelwrap
    '''
    def test_params(self):
        assert_almost_equal(self.res1.params, self.res2.params, DECIMAL)

    def test_resids(self):
        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)

class test_glm_gamma(check_glm):
    def __init__(self):
        # Preprocessing to set up results
        self.res1 = ResultsFromGLM
        self.res2 = R_Results

if __name__ == "__main__":
    run_module_suite()
```

My question is whether I can skip, for argument's sake, test_resids depending, for example, on the class of self.res2, or because I defined the test condition as True in the test_<> class. I tried putting in the check_glm class:

```python
@dec.skipif(TestCondition, "Skipping this test because of ...")
def test_resids(self):
    ...
```

TestCondition should be None by default, but how can I get the value of TestCondition to evaluate to True if appropriate? I have tried a few different ways, but I am a little stumped. Does this make sense/is it possible? I'm sure I'm missing something obvious, but any insights would be appreciated.

Cheers,
Skipper
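For context on why the plain-value condition never fires: the decorator's argument is evaluated once, when the class body executes, before any subclass `__init__` runs. A minimal sketch, using a simplified stand-in for `dec.skipif` written just for illustration:

```python
TestCondition = None

def skipif(condition, msg=None):
    # Simplified stand-in for numpy.testing.dec.skipif, for illustration only.
    def decorate(func):
        # `condition` was captured when the class body ran, not at test time.
        print("decorating %s with condition=%r" % (func.__name__, condition))
        return func
    return decorate

class check_glm(object):
    @skipif(TestCondition, "Skipping this test because of ...")
    def test_resids(self):
        pass

# Rebinding the module-level name afterwards has no effect on the
# already-decorated method:
TestCondition = True
```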
On Fri, Jul 17, 2009 at 12:18, Skipper Seabold <jsseabold@gmail.com> wrote:

> My question is whether I can skip, for argument's sake, test_resids depending, for example, on the class of self.res2 [...] how can I get the value of TestCondition to evaluate to True if appropriate?
I don't think you can do that with the decorator. Just do the test in code inside the method and raise nose.SkipTest explicitly:

```python
def test_resids(self):
    if not isinstance(self.res1, GLMResults):
        raise nose.SkipTest("Not a GLM test")
```

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
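A self-contained sketch of this approach; the result classes below are stand-ins invented for illustration, not stats.models code:

```python
import nose
from numpy.testing import assert_almost_equal

DECIMAL = 4

class GLMResults(object):
    # Stand-in for the real GLM results class; values invented for illustration.
    resids = [0.0]

class OtherResults(object):
    resids = [0.0]

class check_glm(object):
    def test_resids(self):
        # The skip decision happens at run time, inside the test body,
        # where self.res1 is available.
        if not isinstance(self.res1, GLMResults):
            raise nose.SkipTest("Not a GLM test")
        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)

class test_glm_gamma(check_glm):
    def __init__(self):
        self.res1 = GLMResults()    # runs the assertion
        self.res2 = GLMResults()

class test_other_model(check_glm):
    def __init__(self):
        self.res1 = OtherResults()  # triggers the skip
        self.res2 = OtherResults()
```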
On 07/17/2009 12:26 PM, Robert Kern wrote:

> Just do the test in code inside the method and raise nose.SkipTest explicitly: [...]
I think you should follow (or do something similar to) the approach given under the section 'Creating many similar tests' in the NumPy guidelines under Tips & Tricks: http://projects.scipy.org/numpy/wiki/TestingGuidelines

This gives the option to group tests from related models together.

Bruce
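A sketch of that generator-based pattern adapted to the GLM setting; the `Results` class and the reference values are stand-ins invented for illustration:

```python
from numpy.testing import assert_almost_equal

DECIMAL = 4

class Results(object):
    # Stand-in for real estimation output.
    def __init__(self, params):
        self.params = params

# One (model results, reference results) pair per family; in the real
# suite these would come from the GLM fit and from R.
CASES = {
    "gamma": (Results([1.0, 2.0]), Results([1.0, 2.0])),
    "gaussian": (Results([0.5]), Results([0.5])),
}

def check_params(res1, res2):
    assert_almost_equal(res1.params, res2.params, DECIMAL)

def test_glm_families():
    # nose runs each yielded (check, args...) tuple as a separate test,
    # so the tests for related models stay grouped in one generator.
    for family in sorted(CASES):
        res1, res2 = CASES[family]
        yield check_params, res1, res2
```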
On Fri, Jul 17, 2009 at 1:42 PM, Bruce Southey <bsouthey@gmail.com> wrote:
On 07/17/2009 12:26 PM, Robert Kern wrote:
> I don't think you can do that with the decorator. Just do the test in code inside the method and raise nose.SkipTest explicitly:
>
> ```python
> def test_resids(self):
>     if not isinstance(self.res1, GLMResults):
>         raise nose.SkipTest("Not a GLM test")
> ```
Thanks for the suggestion. This definitely works for the isinstance check, and I will include it where that's appropriate. Without going into too many mundane details, it's not quite flexible enough for my other needs. I suppose I could have a barrage of if tests within each test in the parent class, but it would be ugly...
> I think you should follow (or do something similar to) the approach given under the section 'Creating many similar tests' in the NumPy guidelines under Tips & Tricks: http://projects.scipy.org/numpy/wiki/TestingGuidelines
>
> This gives the option to group tests from related models together.
Thanks as well. The tests I had were similar to the example. The difference is that the data is defined for each family (test subclass) of the same model (parent test class). I have now moved some of the asserts to the subclasses, like the example, so I can decorate/skip each one as needed (this isn't quite as haphazard as it sounds), but this somewhat defeats the purpose of having the parent class with the tests to reuse in the first place. It's possible that my needs won't allow me to be as lazy as I wanted to be ;). I really wish I could just define the TestConditions "on the fly" the way I was originally thinking, but I'm not sure even this would have worked, given that I couldn't do it simply with the explicit if tests.

Once I finish this round of refactoring, perhaps I will link to the tests so that what's going on is clearer and to see if anyone can/wants to point out a better way.

Thanks,
Skipper
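One way to keep the shared tests in the parent class while still setting the conditions per family would be a class attribute that each subclass may override; a sketch, with a stand-in results class invented for illustration:

```python
import nose
from numpy.testing import assert_almost_equal

DECIMAL = 4

class Results(object):
    # Stand-in results object; values invented for illustration.
    params = [1.0]
    resids = [0.0]

class check_glm(object):
    skip_resids = False              # default: run the residual test

    def test_params(self):
        assert_almost_equal(self.res1.params, self.res2.params, DECIMAL)

    def test_resids(self):
        # The condition is looked up on self at run time, so each
        # subclass can flip it without touching the parent class.
        if self.skip_resids:
            raise nose.SkipTest("resids not checked for this family")
        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)

class test_glm_gamma(check_glm):
    skip_resids = True               # this family opts out of test_resids

    def __init__(self):
        self.res1 = Results()
        self.res2 = Results()
```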
On Fri, Jul 17, 2009 at 12:40 PM, Skipper Seabold <jsseabold@gmail.com> wrote:
> [...] I have moved some of the asserts now to the subclasses like the example, so I can decorate/skip [...]
It is best to explicitly raise AssertionError rather than use assert, because assert disappears in a production release; that is to say, it is for debugging, not production code. If you are using tools from numpy.testing, there is an assert_ function that you can use instead of assert.

Chuck
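For instance, in a hypothetical test (only `assert_` here is the real numpy.testing function Chuck mentions):

```python
from numpy.testing import assert_

def test_positive():
    value = 1.0   # stand-in for a computed result
    # A bare assert statement is stripped out when Python runs with -O,
    # so it offers no protection in an optimized run:
    assert value > 0
    # assert_ is an ordinary function call and is always executed:
    assert_(value > 0, "value should be positive")
```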
On Fri, Jul 17, 2009 at 8:40 PM, Skipper Seabold <jsseabold@gmail.com> wrote:

> I have moved some of the asserts now to the subclasses like the example, so I can decorate/skip each one as needed [...] but this somewhat defeats the purpose of having the parent class with the tests to reuse in the first place. [...]
If you have groups of cases that would require similar skips, you could try a second level in the class hierarchy, where common subgroups of cases inherit from the same intermediate class. But I never tried it with nose, and I'm currently unable to try anything.

Josef
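A sketch of that layout (untried with nose, as Josef says; the class names and the skipped test are invented for illustration):

```python
import nose
from numpy.testing import assert_almost_equal

DECIMAL = 4

class Results(object):
    # Stand-in results object; values invented for illustration.
    params = [1.0]
    resids = [0.0]

class check_glm(object):
    def test_params(self):
        assert_almost_equal(self.res1.params, self.res2.params, DECIMAL)

    def test_resids(self):
        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)

class check_glm_no_resids(check_glm):
    # Intermediate level: every family inheriting from here skips the
    # residual comparison while keeping the other shared tests.
    def test_resids(self):
        raise nose.SkipTest("no reference resids for this subgroup")

class test_glm_gamma(check_glm_no_resids):
    def __init__(self):
        self.res1 = Results()
        self.res2 = Results()
```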
participants (5)

- Bruce Southey
- Charles R Harris
- josef.pktd@gmail.com
- Robert Kern
- Skipper Seabold