On Fri, Jul 17, 2009 at 12:40 PM, Skipper Seabold <jsseabold@gmail.com> wrote:
On Fri, Jul 17, 2009 at 1:42 PM, Bruce Southey <bsouthey@gmail.com> wrote:
On 07/17/2009 12:26 PM, Robert Kern wrote:
On Fri, Jul 17, 2009 at 12:18, Skipper Seabold <jsseabold@gmail.com> wrote:
Hello all,
I am polishing up the generalized linear models right now for the stats.models project, and I have a question about using decorators with my tests. The GLM framework has a central model with shared properties and then several variations on this model, so to test it I have, as a simplified example:
```python
from numpy.testing import *

DECIMAL = 4

class check_glm(object):
    '''
    res2 results will be obtained from R or the RModelwrap
    '''
    def test_params(self):
        assert_almost_equal(self.res1.params, self.res2.params, DECIMAL)

    def test_resids(self):
        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)

class test_glm_gamma(check_glm):
    def __init__(self):
        # Preprocessing to setup results
        self.res1 = ResultsFromGLM
        self.res2 = R_Results

if __name__ == "__main__":
    run_module_suite()
```
My question is whether I can skip, for argument's sake, test_resids depending, for example, on the class of self.res2, or because I defined the test condition as True in the test_<> class. I tried putting the following in the check_glm class:
```python
@dec.skipif(TestCondition, "Skipping this test because of ...")
def test_resids(self):
    ...
```
TestCondition should be None by default, but how can I get the value of TestCondition to evaluate to True if appropriate? I have tried a few different ways, but I am a little stumped. Does this make sense/is it possible? I'm sure I'm missing something obvious, but any insights would be appreciated.
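[A minimal sketch of where this approach gets stuck, assuming numpy.testing's dec.skipif with a plain (non-callable) condition; the names mirror the snippets above:]

```python
from numpy.testing import dec

TestCondition = None  # falsy default at module level

class check_glm(object):
    # dec.skipif reads TestCondition here, when the class body
    # executes -- long before any subclass __init__ runs, so the
    # skip decision is effectively frozen at its default value
    @dec.skipif(TestCondition, "Skipping this test because of ...")
    def test_resids(self):
        pass
```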
I don't think you can do that with the decorator. Just do the test in code inside the method and raise nose.SkipTest explicitly:
```python
import nose

def test_resids(self):
    if not isinstance(self.res1, GLMResults):
        raise nose.SkipTest("Not a GLM test")
```
Thanks for the suggestion. This definitely works for the isinstance check, and I will include it where that's appropriate. Without going into too many mundane details, it's not quite flexible enough for my other needs. I suppose I could have a barrage of if tests with each test in the parent class, but it would be ugly...
I think you should follow (or do something similar to) the approach given under the section 'Creating many similar tests' in the Numpy guidelines under Tips & Tricks: http://projects.scipy.org/numpy/wiki/TestingGuidelines
This gives the option to group tests from related models together.
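[A minimal, self-contained sketch of that generator-based grouping, assuming nose-style test generators; the Results class and the values below are hypothetical placeholders, not from the thread:]

```python
import numpy as np
from numpy.testing import assert_almost_equal

DECIMAL = 4

class Results(object):
    # hypothetical stand-in for fitted-model / R results
    def __init__(self, params):
        self.params = np.asarray(params)

def check_params(res1, res2):
    assert_almost_equal(res1.params, res2.params, DECIMAL)

def test_families():
    # nose runs each yielded (function, args...) tuple as a separate
    # test, so related families stay grouped but pass or skip independently
    cases = [(Results([1.0, 2.0]), Results([1.0, 2.0])),
             (Results([0.5]), Results([0.5]))]
    for res1, res2 in cases:
        yield check_params, res1, res2
```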
Bruce
Thanks as well. The tests I had were similar to the example. The difference is that the data is defined for each family (test subclass) of the same model (parent test class). I have moved some of the asserts now to the subclasses like the example, so I can decorate/skip them.
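[A minimal sketch of that layout, assuming numpy.testing's dec.skipif; HAVE_RESIDS and the Results class are hypothetical placeholders standing in for the real fit and R results:]

```python
import numpy as np
from numpy.testing import dec, assert_almost_equal

DECIMAL = 4
HAVE_RESIDS = False  # hypothetical flag, fixed at import time

class Results(object):
    # hypothetical stand-in for fitted-model / R results
    def __init__(self, params, resids=None):
        self.params = np.asarray(params)
        self.resids = resids

class check_glm(object):
    # assertions shared by every family live in the parent
    def test_params(self):
        assert_almost_equal(self.res1.params, self.res2.params, DECIMAL)

class test_glm_gamma(check_glm):
    def __init__(self):
        self.res1 = Results([1.0, 2.0])
        self.res2 = Results([1.0, 2.0])

    # the family-specific assertion sits in the subclass, so it can be
    # decorated/skipped without touching the shared checks
    @dec.skipif(not HAVE_RESIDS, "resids not checked for this family")
    def test_resids(self):
        assert_almost_equal(self.res1.resids, self.res2.resids, DECIMAL)
```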
It is best to explicitly raise AssertionError rather than use a bare assert, because assert statements disappear in a production release; that is to say, they are for debugging, not production code. If you are using tools from numpy.testing, there is an assert_ function that you can use instead of assert.

Chuck
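[A short illustration of the distinction, assuming numpy.testing's assert_; the check itself is just an example:]

```python
from numpy.testing import assert_

def check_positive(x):
    # a bare assert is stripped when Python runs with -O
    assert x > 0
    # assert_ is an ordinary function call, so it always runs
    assert_(x > 0, "x must be positive")
```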