[Python-ideas] Shrink recursion error tracebacks (was: Have REPL print less by default)
Steven D'Aprano
steve at pearwood.info
Fri Apr 22 23:50:10 EDT 2016
On Fri, Apr 22, 2016 at 10:00:12PM -0400, Émanuel Barry wrote:
> I’m also a very solid -1 on the idea of prompting the user to do
> something like full_ex() to get the full stack trace.
I'm not sure who suggested that idea. I certainly didn't.
> My rationale for
> such a change is that we need to be extremely precise in our message,
> to leave absolutely no room for a different interpretation than what
> we mean. Your message, for instance, is ambiguous.
I'm not sure who you are talking to here, or whose suggestion the
following faked stack trace is.
[...]
> Example:
> File "<stdin>", line 1, in f
> File "<stdin>", line 1, in g
> [Mutually recursive calls hidden: f (300), g (360)]
For the record, detecting subsequences of mutually recursive calls is
not a trivial task. (It's not the same as cycle detection.) And I don't
understand how you can get 300 calls to f and 360 calls to g in a
scenario where f calls g and g calls f. Surely there would be an equal
number of calls to each?
> File "<stdin>", line 1, in h
> File "<stdin>", line 1, in f
> File "<stdin>", line 1, in g
> [Mutual-recursive calls hidden: f (103), g (200)]
> RuntimeError: maximum recursion depth exceeded
> [963 calls hidden. Call _full_ex() to print full trace.]
>
> This rule is easily modified to, "have been seen at least three times
> before." For functions that recurse at multiple lines, it can print
> out one message per line, but maybe it should count them all together
> in the "hidden" summary.
I think the above over-complicates the situation. I want to deal with
the low-hanging fruit that gives the biggest benefit for the least work
(and least complexity!), not minimise every conceivable bit of
redundancy in a stack trace. So, for the record, let me explicitly state
my proposal:
Stack traces should collapse blocks of repeated identical, contiguous
lines (or pairs of lines, where the source code is available for
display). For brevity, whenever I refer to a "line" in the stack trace,
I mean either a single line of the form:
File "<stdin>", line 1, in f
when source code is not available, or a pair of lines of the form:
File "path/to/file.py", line 1, in f
line of source code
when it is available.
Whenever a single, contiguous block of lines from the traceback
consists of three or more identical lines (i.e. lines which compare
equal using string equality), it should be collapsed down to a single
instance of that line, followed by a message reporting the number of
repetitions.
For example:
File "<stdin>", line 1, in f
return f(arg)
File "<stdin>", line 1, in f
return f(arg)
File "<stdin>", line 1, in f
return f(arg)
File "<stdin>", line 1, in f
return f(arg)
File "<stdin>", line 1, in f
return f(arg)
File "<stdin>", line 1, in f
return f(arg)
would be collapsed to something like this:
File "<stdin>", line 1, in f
return f(arg)
[...previous call was repeated 5 more times...]
(In practice, you are more likely to see "repeated 1000 more times"
than just five.)
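To make the rule concrete, here is a minimal sketch of how the
collapsing might work, assuming the traceback has already been split
into entries, where each entry is one "line" as defined above (a File
line, joined with its source line when one is available). The names
collapse_repeats and MIN_REPEATS, and the exact summary wording, are
just illustrations, not an actual traceback API:

    from itertools import groupby

    # Minimal sketch only: collapse runs of three or more identical
    # traceback entries into one entry plus a summary message.
    MIN_REPEATS = 3

    def collapse_repeats(entries):
        """Yield entries, collapsing runs of identical ones."""
        for entry, group in groupby(entries):
            count = len(list(group))
            if count < MIN_REPEATS:
                # Short runs are left exactly as they were.
                yield from [entry] * count
            else:
                # Long runs become one instance plus a summary line.
                yield entry
                yield ("  [...previous call was repeated %d more"
                       " times...]" % (count - 1))

    frame = '  File "<stdin>", line 1, in f\n    return f(arg)'
    for entry in collapse_repeats([frame] * 6):
        print(entry)

Run on the six identical frames above, this prints the frame once,
followed by the "repeated 5 more times" summary.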
This would not be collapsed, as the line numbers are not the same:
File "<stdin>", line 1, in f
File "<stdin>", line 2, in f
File "<stdin>", line 3, in f
File "<stdin>", line 4, in f
File "<stdin>", line 5, in f
File "<stdin>", line 6, in f
nor this:
File "<stdin>", line 1, in f
File "<stdin>", line 1, in g
File "<stdin>", line 1, in f
File "<stdin>", line 1, in g
File "<stdin>", line 1, in f
File "<stdin>", line 1, in g
My proposal does not include any provision for collapsing chains of
mutual recursion. If somebody else wants to champion that as a separate
proposal, please do, but I won't be making that proposal.
This will shrink the huge, ugly and harmful stack traces that we get
from many accidental recursion errors, without hiding potentially
useful information.
--
Steve