Help with parametrized fixtures
Hello,

I have a parametrized fixture foo, and some test cases that use it:

    @pytest.fixture(params=(1,2,3,4,5))
    def foo(request):
        # ...
        return foo_f

    def test_1(foo):
        # ...

    def test_2(foo):
        # ...

Now, I'd like to add an additional test, but running it only makes sense for some of the fixture parameters. So far I've solved this by providing the parameter value as a fixture attribute and skipping "unsupported" values:

    @pytest.fixture(params=(1,2,3,4,5))
    def foo(request):
        # ...
        foo_f.param = request.param
        return foo_f

    def test_3(foo):
        if foo.param not in (1,4,5):
            pytest.skip("doesn't make sense")

This works, but I don't like it very much because it means that the test suite can never be executed without skipping some tests. I'd rather reserve skipping for cases where the test could in principle be executed, but for some reason cannot work at the moment (e.g. because there's no internet, or a required utility is not installed).

Is there a way to solve the above scenario without skipping tests? Ideally, I'd be able to do something like this:

    @only_for_param((1,4,5))
    def test_3(foo):
        ...

Thanks!
-Nikolaus

--
GPG encrypted emails preferred. Key id: 0xD113FCAC3C4E599F
Fingerprint: ED31 791B 2C5C 1613 AF38 8B8A D113 FCAC 3C4E 599F

»Time flies like an arrow, fruit flies like a Banana.«
Hi Nikolaus,

On 11/26/2014 01:21 PM, Nikolaus Rath wrote:
I have a parametrized fixture foo, and some test cases that use it:
    @pytest.fixture(params=(1,2,3,4,5))
    def foo(request):
        # ...
        return foo_f

    def test_1(foo):
        # ...

    def test_2(foo):
        # ...
Now, I'd like to add an additional test, but running it only makes sense for some of the fixture parameters. So far I've solved this by providing the parameter value as a fixture attribute and skipping "unsupported" values:
    @pytest.fixture(params=(1,2,3,4,5))
    def foo(request):
        # ...
        foo_f.param = request.param
        return foo_f

    def test_3(foo):
        if foo.param not in (1,4,5):
            pytest.skip("doesn't make sense")
This works, but I don't like it very much because it means that the test suite can never be executed without skipping some tests. I'd rather reserve skipping for cases where the test could in principle be executed, but for some reason cannot work at the moment (e.g. because there's no internet, or a required utility is not installed).
Is there a way to solve the above scenario without skipping tests?
Ideally, I'd be able to do something like this:
    @only_for_param((1,4,5))
    def test_3(foo):
        ...
I think in order to do this you'll need to use the `pytest_generate_tests` hook:

http://pytest.org/latest/parametrize.html#pytest-generate-tests

I might be wrong (I haven't had to use pytest_generate_tests much), but I don't think you'll be able to keep the simple parametrized fixture and layer this on top; I don't think pytest_generate_tests can tweak the fixture params per-test. You'll need to remove the fixture parametrization and do it instead via pytest_generate_tests, using indirect=True.

Carl
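Carl's suggestion can be sketched roughly as follows. This is a minimal illustration, not an established API: the names `only_for_param`, `params_for`, and `foo` are made up for the example, and the fixture's `params=` is moved into `pytest_generate_tests` as Carl describes.

```python
import pytest

DEFAULT_PARAMS = (1, 2, 3, 4, 5)

def only_for_param(params):
    """Record the allowed fixture parameters on the test function
    (hypothetical decorator, mirroring what Nikolaus asked for)."""
    def deco(func):
        func.only_for_param = tuple(params)
        return func
    return deco

def params_for(test_function):
    """Return the restricted parameter set for a test, or the default."""
    return getattr(test_function, 'only_for_param', DEFAULT_PARAMS)

@pytest.fixture
def foo(request):
    # request.param is injected by the indirect parametrization below
    return request.param

def pytest_generate_tests(metafunc):
    # the fixture itself carries no params; parametrize it per-test here
    if 'foo' in metafunc.fixturenames:
        metafunc.parametrize('foo', params_for(metafunc.function),
                             indirect=True)

@only_for_param((1, 4, 5))
def test_3(foo):
    assert foo in (1, 4, 5)
```

Tests without the decorator still get the full default parameter set, so no test instance is ever generated only to be skipped.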
Carl Meyer <carl-faUO1KlGllLR7s880joybQ@public.gmane.org> writes:
Hi Nikolaus,
On 11/26/2014 01:21 PM, Nikolaus Rath wrote:
[...]
I think in order to do this you'll need to use the `pytest_generate_tests` hook: http://pytest.org/latest/parametrize.html#pytest-generate-tests
Thanks for the link! The following seems to work nicely:

    #!/usr/bin/env python3
    import pytest

    @pytest.fixture()
    def bar(request):
        s = 'bar-%d' % request.param
        print('preparing', s)
        return s

    def test_one(bar):
        pass

    def test_two(bar):
        pass
    test_two.test_with = (2,3)

    def pytest_generate_tests(metafunc):
        if 'bar' not in metafunc.fixturenames:
            return
        params = (1,2,3,4)
        test_with = getattr(metafunc.function, 'test_with', None)
        if test_with:
            params = test_with
        metafunc.parametrize("bar", params, indirect=True)

Unfortunately, it seems there are several issues with parametrizing fixtures this way that are probably going to bite me as things get more complex:

https://bitbucket.org/hpk42/pytest/issue/635
https://bitbucket.org/hpk42/pytest/issue/634
https://bitbucket.org/hpk42/pytest/issue/531
https://bitbucket.org/hpk42/pytest/issue/519

(Maybe mentioning them here gives them more attention :-).

Best,
-Nikolaus

--
GPG encrypted emails preferred. Key id: 0xD113FCAC3C4E599F
Fingerprint: ED31 791B 2C5C 1613 AF38 8B8A D113 FCAC 3C4E 599F

»Time flies like an arrow, fruit flies like a Banana.«
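A variation on the script above (my own, not from the thread): instead of a bare function attribute, the restriction can be recorded with `pytest.mark`, which survives decorators and is visible to pytest's introspection. The marker name `bar_params` is made up, and `metafunc.definition` assumes a reasonably recent pytest (it did not exist in the 2014-era pytest this thread discusses).

```python
import pytest

@pytest.fixture
def bar(request):
    # request.param is supplied via indirect parametrization
    return 'bar-%d' % request.param

@pytest.mark.bar_params((2, 3))
def test_two(bar):
    assert bar in ('bar-2', 'bar-3')

def pytest_generate_tests(metafunc):
    if 'bar' not in metafunc.fixturenames:
        return
    # read the restriction from the marker, falling back to the default set
    marker = metafunc.definition.get_closest_marker('bar_params')
    params = marker.args[0] if marker else (1, 2, 3, 4)
    metafunc.parametrize('bar', params, indirect=True)
```

Registering the marker (e.g. `markers = bar_params` in pytest.ini) avoids the unknown-mark warning on newer pytest versions.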
On Thu, Nov 27, 2014 at 14:22 -0800, Nikolaus Rath wrote:
Carl Meyer <carl-faUO1KlGllLR7s880joybQ@public.gmane.org> writes:
Hi Nikolaus,
On 11/26/2014 01:21 PM, Nikolaus Rath wrote:
[...]
I think in order to do this you'll need to use the `pytest_generate_tests` hook: http://pytest.org/latest/parametrize.html#pytest-generate-tests
Thanks for the link! The following seems to work nicely:
    #!/usr/bin/env python3
    import pytest

    @pytest.fixture()
    def bar(request):
        s = 'bar-%d' % request.param
        print('preparing', s)
        return s

    def test_one(bar):
        pass

    def test_two(bar):
        pass
    test_two.test_with = (2,3)

    def pytest_generate_tests(metafunc):
        if 'bar' not in metafunc.fixturenames:
            return
        params = (1,2,3,4)
        test_with = getattr(metafunc.function, 'test_with', None)
        if test_with:
            params = test_with
        metafunc.parametrize("bar", params, indirect=True)
Unfortunately, it seems there are several issues with parametrizing fixtures this way that are probably going to bite me as things get more complex:
https://bitbucket.org/hpk42/pytest/issue/635
https://bitbucket.org/hpk42/pytest/issue/634
https://bitbucket.org/hpk42/pytest/issue/531
https://bitbucket.org/hpk42/pytest/issue/519
(Maybe mentioning them here gives them more attention :-).
Heh, you are touching the edges of what is currently possible and implemented with respect to parametrization, including what are probably real bugs. For my own uses of parametrization those edges usually did not play a role yet, IIRC.

If you need help trying to fix things yourself, let me know. If you don't want to tackle that, that's fine as well; maybe somebody else will get to it. In any case, thanks for creating the detailed issues!

best,
holger
participants (3)
- Carl Meyer
- holger krekel
- Nikolaus Rath