When I run tests with the `-x` pytest option, I see this failure:
```
============================= test session starts ==============================
platform sunos5 -- Python 3.9.19, pytest-8.3.1, pluggy-1.5.0 -- /usr/bin/python3.9
cachedir: .pytest_cache
rootdir: /data/builds/oi-userland/components/python/pytest-system-statistics/build/amd64-3.9
configfile: pytest.ini
testpaths: tests/
plugins: skip-markers-1.5.1, system-statistics-1.0.2
collecting ... collected 7 items

tests/functional/test_syststats.py::test_sys_stats_not_verbose PASSED   [ 14%]
tests/functional/test_syststats.py::test_sys_stats_not_verbose_enough PASSED [ 28%]
tests/functional/test_syststats.py::test_sys_stats_disabled PASSED      [ 42%]
tests/functional/test_syststats.py::test_basic_sys_stats PASSED         [ 57%]
tests/functional/test_syststats.py::test_basic_sys_stats_uss SKIPPED    [ 71%]
tests/functional/test_syststats.py::test_proc_sys_stats FAILED          [ 85%]

=================================== FAILURES ===================================
_____________________________ test_proc_sys_stats ______________________________

pytester = <Pytester PosixPath('/tmp/pytest-of-marcel/pytest-12/test_proc_sys_stats0')>
tmp_path = PosixPath('/tmp/pytest-of-marcel/pytest-12/test_proc_sys_stats1')

    def test_proc_sys_stats(pytester, tmp_path):
        executable = sys.executable
        script1 = tmp_path / "script1.py"
        script1.write_text(
            textwrap.dedent(
                """\
                import time
                import multiprocessing

                if __name__ == '__main__':
                    multiprocessing.freeze_support()
                    try:
                        while True:
                            time.sleep(0.25)
                    except Exception:
                        pass
                """
            )
        )
        script2 = tmp_path / "script2.py"
        script2.write_text(
            textwrap.dedent(
                """\
                import sys
                import subprocess
                import multiprocessing

                if __name__ == '__main__':
                    multiprocessing.freeze_support()
                    proc = subprocess.run([r"{}", r"{}"])
                """.format(
                    executable, script1
                )
            )
        )
        pytester.makepyfile(
            """
            import sys
            import pytest
            import subprocess
            import psutil
            import time

            @pytest.fixture
            def foo_process(stats_processes):
                proc = subprocess.Popen([r"{}", r"{}"])
                try:
                    time.sleep(0.25)
                    assert psutil.Process(proc.pid).children()
                    stats_processes.add("FooProcess", proc.pid)
                    yield proc
                finally:
                    stats_processes.remove("FooProcess")
                    proc.terminate()

            def test_one(foo_process):
                assert True
            """.format(
                executable, script2
            )
        )
        res = pytester.runpytest("-vv", "--sys-stats")
>       res.assert_outcomes(passed=1)
E       AssertionError: assert {'passed': 0, 'skipped': 0, 'failed': 0, 'errors': 1, 'xpassed': 0, 'xfailed': 0} == {'passed': 1, 'skipped': 0, 'failed': 0, 'errors': 0, 'xpassed': 0, 'xfailed': 0}
E
E         Common items:
E         {'failed': 0, 'skipped': 0, 'xfailed': 0, 'xpassed': 0}
E         Differing items:
E         {'passed': 0} != {'passed': 1}
E         {'errors': 1} != {'errors': 0}
E
E         Full diff:
E           {
E         -     'errors': 0,
E         ?               ^
E         +     'errors': 1,
E         ?               ^
E               'failed': 0,
E         -     'passed': 1,
E         ?               ^
E         +     'passed': 0,
E         ?               ^
E               'skipped': 0,
E               'xfailed': 0,
E               'xpassed': 0,
E           }

/data/builds/oi-userland/components/python/pytest-system-statistics/build/amd64-3.9/tests/functional/test_syststats.py:182: AssertionError
----------------------------- Captured stdout call -----------------------------
============================= test session starts ==============================
platform sunos5 -- Python 3.9.19, pytest-8.3.1, pluggy-1.5.0 -- /usr/bin/python3.9
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-marcel/pytest-12/test_proc_sys_stats0
plugins: skip-markers-1.5.1, system-statistics-1.0.2
collecting ... collected 1 item

test_proc_sys_stats.py::test_one ERROR                                   [100%]

==================================== ERRORS ====================================
__________________________ ERROR at setup of test_one __________________________

stats_processes = StatsProcesses(processes=OrderedDict([('Test Suite Run', psutil.Process(pid=2917, name='python3.9', status='running', started='11:59:06'))]))

    @pytest.fixture
    def foo_process(stats_processes):
        proc = subprocess.Popen([r"/usr/bin/python3.9", r"/tmp/pytest-of-marcel/pytest-12/test_proc_sys_stats1/script2.py"])
        try:
            time.sleep(0.25)
>           assert psutil.Process(proc.pid).children()
E           AssertionError: assert []
E            +  where [] = children()
E            +    where children = psutil.Process(pid=2923, name='python3.9', status='running', started='11:59:10').children
E            +      where psutil.Process(pid=2923, name='python3.9', status='running', started='11:59:10') = <class 'psutil.Process'>(2923)
E            +        where <class 'psutil.Process'> = psutil.Process
E            +  and   2923 = <Popen: returncode: None args: ['/usr/bin/python3.9', '/tmp/pytest-of-marcel...>.pid

test_proc_sys_stats.py:12: AssertionError
=========================== short test summary info ============================
ERROR test_proc_sys_stats.py::test_one - AssertionError: assert []
  +  where [] = children()
  +    where children = psutil.Process(pid=2923, name='python3.9', status='running', started='11:59:10').children
  +      where psutil.Process(pid=2923, name='python3.9', status='running', started='11:59:10') = <class 'psutil.Process'>(2923)
  +        where <class 'psutil.Process'> = psutil.Process
  +  and   2923 = <Popen: returncode: None args: ['/usr/bin/python3.9', '/tmp/pytest-of-marcel...>.pid
=============================== 1 error in 0.36s ===============================
=========================== short test summary info ============================
FAILED tests/functional/test_syststats.py::test_proc_sys_stats - AssertionError: assert {'passed': 0, 'skipped': 0, 'failed': 0, 'errors': 1, 'xpassed': 0, 'xfailed': 0} == {'passed': 1, 'skipped': 0, 'failed': 0, 'errors': 0, 'xpassed': 0, 'xfailed': 0}
  Common items:
  {'failed': 0, 'skipped': 0, 'xfailed': 0, 'xpassed': 0}
  Differing items:
  {'passed': 0} != {'passed': 1}
  {'errors': 1} != {'errors': 0}
  Full diff:
    {
  -     'errors': 0,
  ?               ^
  +     'errors': 1,
  ?               ^
        'failed': 0,
  -     'passed': 1,
  ?               ^
  +     'passed': 0,
  ?               ^
        'skipped': 0,
        'xfailed': 0,
        'xpassed': 0,
    }
!!!!!!!!!!!!!!!!!!!!!!!!!! stopping after 1 failures !!!!!!!!!!!!!!!!!!!!!!!!!!!
==================== 1 failed, 4 passed, 1 skipped in 2.50s ====================
```
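Reading the captured output, the error does not appear to be caused by `-x` itself: the `foo_process` fixture sleeps a fixed 0.25 s and then asserts that the freshly started `script2.py` interpreter already has a child process, which is a race that this particular run apparently lost (starting up a second Python interpreter and spawning `script1.py` can take longer than 0.25 s on a slow host). A minimal sketch of a polling alternative, assuming a helper along these lines would be acceptable (the `wait_for_children` name, timeout, and interval are hypothetical, not part of the plugin):

```python
import time

import psutil


def wait_for_children(pid, timeout=5.0, interval=0.05):
    """Poll until the process `pid` has at least one child, rather than
    hoping a fixed sleep was long enough. Returns the child list."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        children = psutil.Process(pid).children()
        if children:
            return children
        time.sleep(interval)
    raise AssertionError(f"process {pid} spawned no children within {timeout}s")
```

In the `foo_process` fixture, a call like `wait_for_children(proc.pid)` could replace the `time.sleep(0.25)` plus bare `assert` pair, so setup would only fail if the child genuinely never appears.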
When I do not use `-x`, all tests pass:
```
============================= test session starts ==============================
platform sunos5 -- Python 3.9.19, pytest-8.3.1, pluggy-1.5.0 -- /usr/bin/python3.9
cachedir: .pytest_cache
rootdir: /data/builds/oi-userland/components/python/pytest-system-statistics/build/amd64-3.9
configfile: pytest.ini
testpaths: tests/
plugins: skip-markers-1.5.1, system-statistics-1.0.2
collecting ... collected 7 items

tests/functional/test_syststats.py::test_sys_stats_not_verbose PASSED   [ 14%]
tests/functional/test_syststats.py::test_sys_stats_not_verbose_enough PASSED [ 28%]
tests/functional/test_syststats.py::test_sys_stats_disabled PASSED      [ 42%]
tests/functional/test_syststats.py::test_basic_sys_stats PASSED         [ 57%]
tests/functional/test_syststats.py::test_basic_sys_stats_uss SKIPPED    [ 71%]
tests/functional/test_syststats.py::test_proc_sys_stats PASSED          [ 85%]
tests/functional/test_syststats.py::test_proc_sys_stats_no_children PASSED [100%]

========================= 6 passed, 1 skipped in 1.76s =========================
```
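Since `-x` only tells pytest to stop after the first failure and should not change how `test_proc_sys_stats` itself executes, the difference between the two runs points at a timing-sensitive test rather than at the option. A hypothetical way to check that from the build directory, by rerunning only the suspect test in a loop via `pytest.main` (the iteration count and the assumption that the same `pytest.ini` is picked up are mine):

```python
import sys

import pytest

if __name__ == "__main__":
    # A sporadic failure across repeated runs of just this one test would
    # confirm the race independently of the -x option.
    for i in range(20):
        rc = pytest.main(
            ["-x", "tests/functional/test_syststats.py::test_proc_sys_stats"]
        )
        if rc != 0:
            sys.exit(f"failed on iteration {i}")
    print("20 consecutive passes")
```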