[Python-Dev] Making "-j0" the default setting for the test suite?
Terry Reedy
tjreedy at udel.edu
Mon Jan 29 22:06:39 EST 2018
On 1/28/2018 11:30 PM, Nick Coghlan wrote:
> On my current system, "make test" runs in around 3 minutes, while
> "./python -m test" runs in around 16 minutes. And that's with "make
> test" actually running more tests (since it enables several of the
> "-u" options).
Did you test with the current 3.7.0a4+ repository, soon to be 3.7.0b1? For
me, recent changes since 3.7.0a4 have greatly reduced the benefit of
parallel runs (-j0, which resolves to -j14 on my system). For 3.6.4
(installed), the serial and parallel times are around 13:30 and 2:20,
which is about the same ratio as you report. The times for 3.7.0a4
(installed) are about the same for parallel, and would have been for
serial if not for a crash. For 3.7.0a4+ recompiled today (debug build),
the times are around 24:00 and 11:10. A debug build alone slows the
suite to nearly half speed.
The additional slowdown, by a factor of more than 2 for the parallel
run, is because parallel tests are now run in lockstep batches (on
Windows). Before, a new test was started in one of the 14 worker
processes as soon as the test in that process finished, so all 12 CPUs
stayed busy until fewer than 12 tests remained. Now, 14 new tests are
started only when the previous 14 have all finished, so CPUs sit idle
while the slowest test in each batch of 14 completes.
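The cost of batching can be illustrated with a small scheduling
simulation (this is a sketch with made-up durations, not regrtest's
actual code). With 14 workers and one 40-second test per batch of
otherwise 1-second tests, greedy dispatch finishes in about the time of
the single slowest test, while lockstep batching pays for the slowest
test in every batch:

```python
def simulate_dynamic(durations, workers):
    """Greedy dispatch: each test goes to whichever worker frees up first."""
    finish = [0] * workers
    for d in durations:
        i = finish.index(min(finish))  # earliest-free worker takes the test
        finish[i] += d
    return max(finish)

def simulate_batched(durations, workers):
    """Lockstep batches: launch `workers` tests, wait for all before the next batch."""
    total = 0
    for i in range(0, len(durations), workers):
        total += max(durations[i:i + workers])  # each batch lasts as long as its slowest test
    return total

# Hypothetical workload: two batches of 14 tests, each containing
# one 40-second test and thirteen 1-second tests.
durations = ([40] + [1] * 13) * 2
print(simulate_dynamic(durations, 14))  # 41 -- slow tests overlap with the rest
print(simulate_batched(durations, 14))  # 80 -- each batch waits on its slow test
```

The gap widens as the spread between the fastest and slowest test in a
batch grows, which matches the 85-second stall visible in the new log
below.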
Beginning of old log:
Run tests in parallel using 14 child processes
0:00:00 [ 1/413] test__opcode passed
0:00:00 [ 2/413] test__locale passed
0:00:00 [ 3/413] test__osx_support passed
0:00:00 [ 4/413] test_abc passed
0:00:01 [ 5/413] test_abstract_numbers passed
0:00:01 [ 6/413] test_aifc passed
0:00:02 [ 7/413] test_array passed
0:00:02 [ 8/413] test_asdl_parser skipped
test_asdl_parser skipped -- test irrelevant for an installed Python
0:00:03 [ 9/413] test_argparse passed
0:00:04 [ 10/413] test_ast passed
0:00:04 [ 11/413] test_asyncgen passed
0:00:05 [ 12/413] test_unittest passed
0:00:06 [ 13/413] test_asynchat passed
0:00:06 [ 14/413] test_atexit passed
0:00:06 [ 15/413] test_audioop passed
0:00:06 [ 16/413] test_augassign passed
0:00:07 [ 17/413] test_asyncore passed
0:00:07 [ 18/413] test_baseexception passed
0:00:07 [ 19/413] test_base64 passed
0:00:07 [ 20/413] test_bigaddrspace passed
0:00:07 [ 21/413] test_bigmem passed
0:00:07 [ 22/413] test_binascii passed
0:00:07 [ 23/413] test_binop passed
0:00:07 [ 24/413] test_binhex passed
0:00:08 [ 25/413] test_bool passed
0:00:08 [ 26/413] test_bisect passed
0:00:10 [ 27/413] test_doctest passed
0:00:10 [ 28/413] test_types passed
0:00:10 [ 29/413] test___future__ passed
0:00:10 [ 30/413] test_dict passed
0:00:10 [ 31/413] test_exceptions passed
0:00:10 [ 32/413] test_support passed
0:00:10 [ 33/413] test_builtin passed
0:00:10 [ 34/413] test_opcodes passed
0:00:10 [ 35/413] test_grammar passed
0:00:10 [ 36/413] test_doctest2 passed
0:00:10 [ 37/413] test___all__ passed
0:00:10 [ 38/413] test_cmath passed
0:00:11 [ 39/413] test_cmd passed
0:00:15 [ 40/413] test_cmd_line passed
0:00:15 [ 41/413] test_buffer passed
0:00:15 [ 42/413] test_code passed
0:00:16 [ 43/413] test_code_module passed
0:00:16 [ 44/413] test_codeccallbacks passed
0:00:16 [ 45/413] test_charmapcodec passed
0:00:16 [ 46/413] test_class passed
0:00:16 [ 47/413] test_capi passed
0:00:16 [ 48/413] test_call passed
0:00:16 [ 49/413] test_bytes passed
0:00:16 [ 50/413] test_cgitb passed
0:00:16 [ 51/413] test_calendar passed
0:00:16 [ 52/413] test_c_locale_coercion passed
0:00:16 [ 53/413] test_bz2 passed
0:00:16 [ 54/413] test_cgi passed
0:00:17 [ 55/413] test_codecencodings_cn passed
0:00:17 [ 56/413] test_codecencodings_iso2022 passed
0:00:17 [ 57/413] test_codeop passed
0:00:17 [ 58/413] test_codecencodings_tw passed
0:00:17 [ 59/413] test_codecencodings_hk passed
0:00:17 [ 60/413] test_codecmaps_kr passed
0:00:17 [ 61/413] test_codecmaps_tw passed
0:00:17 [ 62/413] test_codecmaps_cn passed
0:00:17 [ 63/413] test_codecencodings_kr passed
0:00:17 [ 64/413] test_codecmaps_jp passed
0:00:17 [ 65/413] test_codecmaps_hk passed
0:00:17 [ 66/413] test_codecencodings_jp passed
0:00:19 [ 67/413] test_bufio passed
0:00:19 [ 68/413] test_collections passed
0:00:19 [ 69/413] test_contextlib_async passed
0:00:19 [ 70/413] test_copy passed
0:00:19 [ 71/413] test_copyreg passed
0:00:19 [ 72/413] test_coroutines passed
0:00:19 [ 73/413] test_codecs passed
0:00:20 [ 74/413] test_crashers passed
0:00:20 [ 75/413] test_crypt skipped
test_crypt skipped -- No module named '_crypt'
0:00:20 [ 76/413] test_cprofile passed
0:00:20 [ 77/413] test_curses skipped (resource denied)
test_curses skipped -- Use of the 'curses' resource not enabled
0:00:20 [ 78/413] test_csv passed
0:00:20 [ 79/413] test_dataclasses passed
0:00:21 [ 80/413] test_cmd_line_script passed
0:00:21 [ 81/413] test_dbm passed
0:00:21 [ 82/413] test_dbm_gnu skipped
test_dbm_gnu skipped -- No module named '_gdbm'
0:00:21 [ 83/413] test_ctypes passed
0:00:22 [ 84/413] test_dbm_ndbm skipped
test_dbm_ndbm skipped -- No module named '_dbm'
0:00:22 [ 85/413] test_decorators passed
0:00:22 [ 86/413] test_datetime passed
0:00:22 [ 87/413] test_defaultdict passed
0:00:23 [ 88/413] test_descr passed
0:00:23 [ 89/413] test_deque passed
0:00:23 [ 90/413] test_descrtut passed
0:00:23 [ 91/413] test_devpoll skipped
test_devpoll skipped -- test works only on Solaris OS family
0:00:23 [ 92/413] test_dict_version passed
0:00:24 [ 93/413] test_dictcomps passed
0:00:24 [ 94/413] test_dictviews passed
0:00:25 [ 95/413] test_dbm_dumb passed
0:00:25 [ 96/413] test_difflib passed
0:00:25 [ 97/413] test_dis passed
0:00:25 [ 98/413] test_dtrace passed
0:00:25 [ 99/413] test_dummy_thread passed
0:00:26 [100/413] test_dummy_threading passed
0:00:26 [101/413] test_dynamic passed
0:00:26 [102/413] test_dynamicclassattribute passed
0:00:26 [103/413] test_eintr passed
0:00:28 [104/413] test_decimal passed
0:00:28 [105/413] test_embed passed
0:00:29 [106/413] test_ensurepip passed
0:00:29 [107/413] test_enum passed
0:00:30 [108/413] test_enumerate passed
0:00:30 [109/413] test_eof passed
0:00:30 [110/413] test_epoll skipped
test_epoll skipped -- test works only on Linux 2.6
0:00:30 [111/413] test_errno passed
Beginning of new log:
Run tests in parallel using 14 child processes
running: test_grammar (30 sec), test_opcodes (30 sec), test_dict (30
sec), test_builtin (30 sec), test_exceptions (30 sec), test_types (30
sec), test_unittest (30 sec), test_doctest (30 sec), test_doctest2 (30
sec), test_support (30 sec), test___all__ (30 sec), test___future__ (30
sec), test__locale (30 sec), test__opcode (30 sec)
0:00:41 [ 1/414] test_support passed -- running: test_grammar (41 sec),
test_opcodes (41 sec), test_dict (41 sec), test_builtin (41 sec),
test_exceptions (41 sec), test_types (41 sec), test_doctest (41 sec),
test___all__ (41 sec), test___future__ (41 sec), test__locale (41 sec),
test__opcode (41 sec)
0:00:41 [ 2/414] test_doctest2 passed -- running: test_grammar (41
sec), test___all__ (41 sec), test__locale (41 sec)
0:00:41 [ 3/414] test_unittest passed -- running: test___all__ (41
sec), test__locale (41 sec)
0:00:41 [ 4/414] test__opcode passed
0:00:41 [ 5/414] test_dict passed
0:00:41 [ 6/414] test_types passed
0:00:41 [ 7/414] test___future__ passed
0:00:41 [ 8/414] test_builtin passed
0:00:41 [ 9/414] test_doctest passed
0:00:41 [ 10/414] test_opcodes passed
0:00:41 [ 11/414] test_exceptions passed
0:00:41 [ 12/414] test_grammar passed
0:00:41 [ 13/414] test___all__ passed (40 sec)
0:00:41 [ 14/414] test__locale passed
Slowest tests took 40 sec, rest waited.
[snip list of running tests]
0:01:25 [ 15/414] test_audioop passed
0:01:25 [ 17/414] test_abstract_numbers passed
0:01:25 [ 18/414] test_abc passed
0:01:25 [ 19/414] test_aifc passed
0:01:25 [ 20/414] test_asdl_parser passed
0:01:25 [ 21/414] test_asyncgen passed
0:01:25 [ 22/414] test_atexit passed
0:01:25 [ 23/414] test_asyncio passed (42 sec)
[snip output]
0:01:25 [ 25/414] test_ast passed
0:01:25 [ 26/414] test_asynchat passed
0:01:25 [ 27/414] test_array passed
0:01:25 [ 28/414] test__osx_support passed
28 tests finished in 85 seconds, versus 111 tests in the first 30
seconds of the old log.
I think whatever caused this should be reversed.
--
Terry Jan Reedy