Make pytest fail when tests marked as xfail pass#1280
Conversation
Tested these locally, and in CI logs they seem to pass in multiple jobs across different distributions.
This will allow us to catch accidentally passing tests, and in general gives more visibility into which tests actually pass or fail.
It seems the ifup and ifdown tests don't fail on CentOS 7, since the
```ini
markers =
    bashcomp
    complete
xfail_strict = true
```
If we want this, I think we also want/need to go over all tests that can xfail or xpass and mark them as such, specifically on a per-distro basis (i.e. semantically something like "expected to fail on CentOS 7").
We could also consider changing some xfails to skips, but to me the downside of that would be that we'd get no indication if something that used to fail has started to pass (which xpass gives us). There might be some skips that we'd want to convert to xfails for this reason.
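A per-distro xfail marker along the lines discussed above could look like the sketch below. The `running_on_centos7()` helper is purely illustrative (this repo's actual distro-detection mechanism is not shown here); the point is that the condition and reason make the expectation explicit per distribution, unlike a plain skip.

```python
import pytest

def running_on_centos7():
    """Illustrative placeholder for a real distro check (not from this repo)."""
    return False

# The test still runs everywhere; on CentOS 7 a failure is reported as
# xfail, and with xfail_strict an unexpected pass is reported as FAILED.
@pytest.mark.xfail(running_on_centos7(), reason="expected to fail on CentOS 7")
def test_ifup_completion_example():
    assert True
```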
There are currently six tests that "xpass" (at least in a recent job I checked). All of them pass correctly locally for me as well.
This removes the specific xfail markers for the passing tests and sets
`xfail_strict` to true so CI will fail if any others succeed by accident. More info about xfails vs xpasses: https://docs.pytest.org/en/stable/how-to/skipping.html#xfail-mark-test-functions-as-expected-to-fail
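For any test that is known to pass only intermittently, the global `xfail_strict = true` setting can be overridden per marker, so that an unexpected pass is still reported as XPASS rather than failing CI. A hypothetical example (the test name and reason are illustrative):

```python
import pytest

# strict=False opts this one test out of xfail_strict=true from the ini:
# if it unexpectedly passes, pytest reports XPASS instead of FAILED.
@pytest.mark.xfail(reason="flaky on some distributions", strict=False)
def test_flaky_example():
    assert True
```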
This is split into 3 commits for the different parts of this; the first one just includes some cleanup of test_mkdir.py as well as the xfail removal.