Debugging: some problems and possible solutions
This thread is about debugging. I suggest we start by collecting problems and possible solutions. And that after about a week of that, we start discussing what we've gathered.

We already have a problem and possible solution, provided by Eric Smith and Larry Hastings.

<begin>
TITLE: f-string "debug" conversion
URL: https://mail.python.org/pipermail/python-ideas/2018-October/053956.html

PROBLEM
Writing
    print('value = ', value)
is tedious, particularly for more complicated expressions, such as
    print('big_array[5:20] =', big_array[5:20])

POSSIBLE SOLUTION
For f-strings, we add a !d conversion operator, which produces the text of an expression followed by its value. Thus, the two previous examples can be written more concisely as
    print(f'{value !d}')
    print(f'{big_array[5:20] !d}')
</end>

I suggest for the moment that we just gather problem-solution pairs, much as above. I think there'll be between 5 and 15 such pairs. I'll post one to the current discussion thread in an hour or so. And that after about a week, we move to discussing what we have.

Finally, many thanks to Eric and Larry for their useful contribution to the important problem of debugging.

-- Jonathan
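For a sense of what the !d conversion is meant to save, here is the manual spelling that works today (the values are illustrative, and the !d form itself is only proposed syntax, not something any released Python accepts):

    value = 42
    big_array = list(range(100))
    print('value = ', value)                      # prints: value =  42
    print('big_array[5:20] =', big_array[5:20])   # prints: big_array[5:20] = [5, 6, 7, ... 19]  (middle elided here)
    # Under the proposal, each line above would shrink to e.g. print(f'{value !d}')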
TITLE: Saving and sharing debug code

PROBLEM:
Sometimes debug-only code should be saved and shared. For example, one person's code written to solve a bug might be needed a second time, by someone else, or perhaps by the same person again. This is particularly likely when the bug is difficult or stubborn.

POSSIBLE SOLUTION
At present, __debug__ is a keyword identifier, which can take only the values True and False. It gets its value on Python startup, and then its value can't be changed. The possible solution is to allow Python startup to give __debug__ an additional value, namely 2. Once this is done, code blocks such as
    if __debug__ >= 2:
        # my debug-only code goes here
will effectively be ignored, except when requested. Python already does this for blocks such as
    if __debug__:
        # ignored if optimised code is being generated

See also: https://docs.python.org/3/library/constants.html#__debug__
    __debug__
    This constant is true if Python was not started with an -O option. See also the assert statement.
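For comparison, a rough approximation that works today is to read a debug level once at import time and test it, much as the proposal would test __debug__. The environment variable name below is made up for illustration, and unlike __debug__ blocks this code is not stripped out by the compiler under -O:

    import os

    DEBUG_LEVEL = int(os.environ.get('MYAPP_DEBUG_LEVEL', '0'))  # illustrative name, not a real Python setting

    if DEBUG_LEVEL >= 2:
        # my debug-only code goes here
        print('extra diagnostics enabled')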
On Thu, Oct 4, 2018 at 12:52 AM Jonathan Fine <jfine2358@gmail.com> wrote:
This thread is about debugging. I suggest we start by collecting problems and possible solutions. And that after about a week of that, we start discussing what we've gathered.
Why not just discuss each proposal on its own merits, independently of any other proposals? Do they interact with each other? ChrisA
Chris Angelico wrote:
Why not just discuss each proposal on its own merits, independently of any other proposals? Do they interact with each other?
Good question. I think they will interact. Not in terms of implementation, but in terms of benefit. To quote the Zen of Python:
There should be one-- and preferably only one --obvious way to do it. Although that way may not be obvious at first unless you're Dutch.
However, we won't know for sure until the proposals are in. -- Jonathan
On Thu, Oct 4, 2018 at 1:30 AM Jonathan Fine <jfine2358@gmail.com> wrote:
Chris Angelico wrote:
Why not just discuss each proposal on its own merits, independently of any other proposals? Do they interact with each other?
Good question. I think they will interact. Not in terms of implementation, but in terms of benefit. To quote the Zen of Python:
There should be one-- and preferably only one --obvious way to do it. Although that way may not be obvious at first unless you're Dutch.
However, we won't know for sure until the proposals are in.
Debugging is a massively broad area, and we commonly have and use multiple completely different tools. Let's not start assuming that proposals will interact or conflict until we actually have some that do. Your proposal for __debug__ to be able to have integer values seems like something that should be discussed in more detail, independently of exactly what the debugging code governed by it ends up doing. If you do "if __debug__>2: print(f'{blah!d}')", it'd use two different proposals, but they're completely orthogonal. ChrisA
TITLE: output debug information easily and quickly

POSSIBLE SOLUTION
Short form of keyword arguments, where
    foo(=a, =1+bar)
is expanded at compile time to
    foo(**{'a': a, '1+bar': 1+bar})
Then we can just create a normal debug function:
    def debug_print(**vars):
        for k, v in vars.items():
            print(f'{k}={v}')
This is of course useful for many other things, as Steven has pointed out.

/ Anders
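For reference, the debug_print helper itself already runs on current Python; only the =expression call syntax above is new. A minimal sketch of the same helper driven by ordinary keyword arguments (the variable names and values are illustrative):

    def debug_print(**vars):
        for k, v in vars.items():
            print(f'{k}={v}')

    a = 3
    bar = 4
    debug_print(a=a, total=1 + bar)
    # prints:
    # a=3
    # total=5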
foo(=a, =1+bar)
Unfortunately, that won't help with Jonathan's initial example expression "big_array[5:20]" as it's not a valid keyword.
I didn't understand that. The example you are referring to is
    print('big_array[5:20] =', big_array[5:20])
Right? Nothing is a keyword in that example or in my example. My suggestion is that we could do:
    my_func(=big_array[5:20])
And it would be compile time transformed into
    my_func(**{'big_array[5:20]': big_array[5:20]})
and then my_func is just a normal function:
    def my_func(**kwargs):
        ...  # whatever
It's a very simple textual transformation.

/ Anders
On Wed, Oct 3, 2018 at 9:29 AM Anders Hovmöller <boxed@killingar.net> wrote:
foo(=a, =1+bar)
Unfortunately, that won't help with Jonathan's initial example expression "big_array[5:20]" as it's not a valid keyword.
I didn't understand that. The example you are referring to is print('big_array[5:20] =', big_array[5:20])
Nothing is a keyword in that example or in my example. My suggestion is that we could do: my_func(=big_array[5:20])
And it would be compile time transformed into my_func(**{'big_array[5:20]': big_array[5:20]})
and then my_func is just a normal function:
def my_func(**kwargs): Whatever
You're right, that case will work. I was thinking of

    In [1]: foo(a+b=1)
      File "<ipython-input-1-61f24dcb4c20>", line 1
        foo(a+b=1)
            ^
    SyntaxError: keyword can't be an expression
foo(=a, =1+bar)
Unfortunately, that won't help with Jonathan's initial example expression "big_array[5:20]" as it's not a valid keyword.
I didn't understand that. The example you are referring to is print('big_array[5:20] =', big_array[5:20])
Nothing is a keyword in that example or in my example. My suggestion is that we could do: my_func(=big_array[5:20])
And it would be compile time transformed into my_func(**{'big_array[5:20]': big_array[5:20]})
and then my_func is just a normal function:
def my_func(**kwargs): Whatever
You're right, that case will work. I was thinking of
    In [1]: foo(a+b=1)
      File "<ipython-input-1-61f24dcb4c20>", line 1
        foo(a+b=1)
            ^
    SyntaxError: keyword can't be an expression
Sure. But no one suggested it either so I don't see how it's relevant. / Anders
On Thu, Oct 4, 2018 at 2:30 AM Anders Hovmöller <boxed@killingar.net> wrote:
Nothing is a keyword in that example or in my example. My suggestion is that we could do:
my_func(=big_array[5:20])
And it would be compile time transformed into
my_func(**{'big_array[5:20]': big_array[5:20]})
and then my_func is just a normal function:
def my_func(**kwargs): Whatever
It's a very simple textual transformation.
That is not guaranteed to work. In another thread it was pointed out that this is merely a CPython implementation detail, NOT a language feature. ChrisA
Anders Hovmöller suggested
Short form of keyword arguments, where
    foo(=a, =1+bar)
is expanded at compile time to
    foo(**{'a': a, '1+bar': 1+bar})
Chris Angelico wrote:
That is not guaranteed to work. In another thread it was pointed out that this is merely a CPython implementation detail, NOT a language feature.
Here's a variant of Anders' suggestion. First, here's a dict literal
    {'a': 1, 'b': 2, 'c': 3}
and here's another way to write an equivalent dict
    dict(a=1, b=2, c=3)
So how about extending Python so that, for example,
    {=(1 + bar), }
is equivalent to
    {'1 + bar': 1 + bar, }
The basic idea is Anders's, recast to avoid Chris's problem.

Anders: Are you willing to accept this change, if need be?
Chris: Please speak up, if you think this may depend on CPython.

Off topic: https://data.grammarbook.com/blog/apostrophes/apostrophes-with-names-ending-...
To show singular possession of a name ending in s or z, some writers add just an apostrophe. Others also add another s.

-- Jonathan
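To connect this back to the debugging use case: the expanded form is already an ordinary dict today, so the proposal is purely syntactic sugar for building it. A minimal sketch of what the expansion would produce (the value of bar is illustrative):

    bar = 41
    info = {'1 + bar': 1 + bar}   # what {=(1 + bar), } would expand to
    for k, v in info.items():
        print(f'{k} = {v}')       # prints: 1 + bar = 42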
On Thu, Oct 4, 2018 at 4:12 AM Jonathan Fine <jfine2358@gmail.com> wrote:
Anders Hovmöller suggested
Short form of keyword arguments, where
    foo(=a, =1+bar)
is expanded at compile time to
    foo(**{'a': a, '1+bar': 1+bar})
Chris Angelico wrote:
That is not guaranteed to work. In another thread it was pointed out that this is merely a CPython implementation detail, NOT a language feature.
Here's a variant of Anders' suggestion. First, here's a dict literal
    {'a': 1, 'b': 2, 'c': 3}
and here's another way to write an equivalent dict
    dict(a=1, b=2, c=3)
So how about extending Python so that, for example, {=(1 + bar), } is equivalent to {'1 + bar': 1 + bar, }
The basic idea is Anders's, recast to avoid Chris's problem.
Anders: Are you willing to accept this change, if need be? Chris: Please speak up, if you think this may depend on CPython.
And then you just pass the dictionary as-is? That would be plausible, but I'm not a big fan of the syntax. Feels very clunky and forced. Spun off as a new thread because this isn't really specific to debugging. ChrisA
It's a very simple textual transformation.
That is not guaranteed to work. In another thread it was pointed out that this is merely a CPython implementation detail, NOT a language feature.
Pedantry. Ok. Let's be pedantic: it's a simple textual transformation AND a promotion of an implementation detail to a spec requirement. Happy? / Anders
On Thu, Oct 4, 2018 at 4:36 AM Anders Hovmöller <boxed@killingar.net> wrote:
It's a very simple textual transformation.
That is not guaranteed to work. In another thread it was pointed out that this is merely a CPython implementation detail, NOT a language feature.
Pedantry. Ok. Let's be pedantic: it's a simple textual transformation AND a promotion of an implementation detail to a spec requirement. Happy?
Then I'm strongly -1 on it. Happy? :) ChrisA
On Thu, Oct 4, 2018 at 4:38 AM Chris Angelico <rosuav@gmail.com> wrote:
On Thu, Oct 4, 2018 at 4:36 AM Anders Hovmöller <boxed@killingar.net> wrote:
It's a very simple textual transformation.
That is not guaranteed to work. In another thread it was pointed out that this is merely a CPython implementation detail, NOT a language feature.
Pedantry. Ok. Let's be pedantic: it's a simple textual transformation AND a promotion of an implementation detail to a spec requirement. Happy?
Then I'm strongly -1 on it. Happy? :)
And in case it's not clear why I said that, btw: It's not mere pedantry. By restating your proposal in those terms, you make it far broader than a simple textual transformation. Imagine if you'd said it like this:

"Okay, let's be pedantic. As well as my other proposals, I'm also requiring that you be able to use 'a+b' as a variable name."

That is most definitely not a simple proposal. And that means it should be discussed as a much much broader change: disconnecting keyword arguments from variable names. That should NOT just slide through as part of a separate change. ChrisA
So how about extending Python so that, for example, {=(1 + bar), } is equivalent to {'1 + bar': 1 + bar, }
The basic idea is Anders's, recast to avoid Chris's problem.
Chris' problem isn't an actual problem though. It's just a few sentences in a PEP. It might be a problem for other Python implementations, but I'm gonna put, say, 100 dollars on it not actually being so. PyPy, Jython, IronPython, who else? Without looking, I'm betting they have the same implementation detail. Any takers on such a bet? One is not allowed to look before taking the bet, but I can't check that, so scout's honor applies.
Anders: Are you willing to accept this change, if need be?
I would but I think it's worse in every possible way :P It looks like a set, not a dict for example. / Anders
Then I'm strongly -1 on it. Happy? :)
And In case it's not clear why I said that, btw: It's not mere pedantry.
Good to see you understood yourself why that mail wasn't so good.
By restating your proposal in those terms, you make it far broader than a simple textual transformation. Imagine if you'd said it like this:
"Okay, let's be pedantic. As well as my other proposals, I'm also requiring that you be able to use 'a+b' as a variable name."
That is most definitely not a simple proposal. And that means it should be discussed as a much much broader change: disconnecting keyword arguments from variable names. That should NOT just slide through as part of a separate change.
Imagine if I said something else, totally irrelevant, and that indeed would be a bigger change. But I didn't. I suggested not a change of CPython or PyPy or IronPython but a few sentences in a PEP. I also didn't suggest that it be snuck into the same PEP as my proposed syntax changes. I agree that would be bad. It should obviously be a separate PEP.

You could try first discussing the idea before requiring that I state the legalese minutiae in exactly the right way before you even discuss the idea itself. We're pretty far away from a PEP at this stage anyway, so hold your horses. This is python-ideas@ after all, not pep-lawyering@.

I noticed you wouldn't take the bet either.

/ Anders
On Thu, Oct 4, 2018 at 4:54 AM Anders Hovmöller <boxed@killingar.net> wrote:
Then I'm strongly -1 on it. Happy? :)
And In case it's not clear why I said that, btw: It's not mere pedantry.
Good to see you understood yourself why that mail wasn't so good.
By restating your proposal in those terms, you make it far broader than a simple textual transformation. Imagine if you'd said it like this:
"Okay, let's be pedantic. As well as my other proposals, I'm also requiring that you be able to use 'a+b' as a variable name."
That is most definitely not a simple proposal. And that means it should be discussed as a much much broader change: disconnecting keyword arguments from variable names. That should NOT just slide through as part of a separate change.
Imagine if I said something else, totally irrelevant, and that indeed would be a bigger change. But I didn't. I suggested not a change of CPython or PyPy or IronPython but a few sentences in a PEP. I also didn't suggest that it be snuck into the same PEP as my proposed syntax changes. I agree that would be bad. It should obviously be a separate PEP.
I'm not sure what you're calling irrelevant here. But sure. If you want to propose that, propose it. Start a new thread in which you propose that, as a language feature, kwargs are allowed to be invalid variable names.
I noticed you wouldn't take the bet too.
No, because the bet isn't the point. You're asking if any existing Python implementation assumes that kwargs are valid identifiers. I could easily create one and win the bet, and that still wouldn't be the point. You're proposing a change to the language specification, and that's not something to just gloss over.

When PEP 572 started proposing changes to other semantics than just assignment expressions, there was a lot of pushback because that was seen as an independent proposal (even though it was fairly tightly bound to the assignment expressions themselves, in that it'd be extremely hard to observe the difference otherwise). What you're proposing here is, similarly, a completely independent proposal, and not all that tightly bound. ChrisA
Hi Wolfram

You tried
    >>> def f(a):
    ...     print(a)
    ...
    >>> f(**{"a":2})
    2
    >>> f(**{"a+1":2})
    Traceback (most recent call last):
      File "python", line 1, in <module>
    TypeError: f() got an unexpected keyword argument 'a+1'
This is exactly what I would have expected. Please consider the following:
    >>> def f(a): pass
    >>> f(**dict(b=1))
    TypeError: f() got an unexpected keyword argument 'b'
Does CPython count as "other python implementation"?
Yes and No. Both are half correct. CPython is the reference implementation. Please also consider
    >>> def f(a, **kwargs): pass
    >>> f(a=1, **{'a+1': 2})
    >>> f(a=1, **{(0, 1): 2})
    TypeError: f() keywords must be strings
So far as I know, that a keyword be a string is the only constraint, at least in CPython. For example
    >>> def f(a, **kwargs): pass
    >>> f(a=1, **{'': 2})
    >>> f(a=1, **{'def': 2})
So I think Anders' proposal works in CPython. I think you forgot the **kwargs in the parameters to f.

best regards

Jonathan
On 03.10.2018 at 21:52, Jonathan Fine wrote:
    >>> def f(a, **kwargs): pass
    >>> f(a=1, **{'': 2})
    >>> f(a=1, **{'def': 2})
So I think Anders' proposal works in CPython. I think you forgot the **kwargs in the parameters to f.
Ah, yes. Thank you. So it works in CPython 2.7. But I'm curious, does it work in very old versions? I'm not saying that this is important, because language changes always are for new versions. However, Anders' claim that this is not a language change seemed too broad to me. It may be that this change has very little cost, but it should not be dismissed. Wolfram
Imagine if I said something other totally irrelevant and that is bigger change indeed. But I didn't. I suggested not a change of CPython or PyPy or IronPython but a few sentences in a PEP. I also didn't suggest that it be snuck into the same PEP as my proposed syntax changes. I agree that would be bad. It should obviously be a separate PEP.
I'm not sure what you're calling irrelevant here. But sure. If you want to propose that, propose it. Start a new thread in which you propose that, as a language feature, kwargs are allowed to be invalid variable names.
But wouldn't it make sense to have a motivating example? Like the one we're discussing? Not just suggest it out of the blue?
You're proposing a change to the language specification, and that's not something to just gloss over.
Many people are suggesting language spec changes in this thread and quite a few others. This is the forum for it.
When PEP 572 started proposing changes to other semantics than just assignment expressions, there was a lot of pushback because that was seen as an independent proposal (even though it was fairly tightly bound to the assignment expressions themselves, in that it'd be extremely hard to observe the difference else). What you're proposing here is, similarly, a completely independent proposal, and not all that tightly bound.
Sure. But I'm only proposing it in the "what if?" way. It's a discussion to see what other solutions exists for the problem that the thread started discussing. A You and me keep derailing. It's quite depressing. I don't want to be in a constant shouting game with you. I want to discuss ideas. / Anders
Ah, yes. Thank you. So it works in CPython 2.7. But I'm curious, does it work in very old versions?
My bet is still on. I take PayPal. I will not accept Python 1, let's say. It's just easier that way. If **kwargs exists, it takes strings.
I'm not saying that this is important, because language changes always are for new versions. However, Anders' claim that this is not a language change seemed too broad to me.
Not what I meant. I meant it's not a change to any python implementation. It might be a change to promote an implementation detail to the spec as pointed out by Chris.
It may be that this change has very little cost, but it should not be dismissed.
I wasn't dismissing it. I just didn't think it was yet time to bring it up. We were still discussing larger themes. But if we do have to bring it up: it has zero technical cost right now, but it might not have zero cost in the future. It's the same argument against the guaranteed order of dicts in 3.7: it might have a cost in the future.

/ Anders
On Thu, Oct 04, 2018 at 04:17:41AM +1000, Chris Angelico wrote:
{=(1 + bar), } is equivalent to {'1 + bar': 1 + bar, }
What is so special about dicts that this only works there? If we're going to have syntax to capture the source (and AST) of an expression, it ought to work everywhere. And it ought to be callable, without having to call eval() on the source, which is slow. -- Steve
{=(1 + bar), } is equivalent to {'1 + bar': 1 + bar, }
What is so special about dicts that this only works there?
If we're going to have syntax to capture the source (and AST) of an expression, it ought to work everywhere. And it ought to be callable, without having to call eval() on the source, which is slow.
So... what are you thinking? We store the AST in a way that can be inspected by the inspect module? It seems a bit vague how the API would look. Would you look at the stack and traverse it, with the frames containing detailed information on where a function call originated in the AST? That's how I interpret the "works everywhere" idea. It seems pretty complex! I hope you have a simpler idea.

On a related note: any access to the AST will be annoying to use if the standard library doesn't have an unparse feature, and if it does, it'll be slightly annoying that all formatting and comments are thrown away, as the current AST does.

/ Anders
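For context, the standard library can already parse and dump the AST of an expression given as text; what no current Python does is attach that AST to a call site automatically. A minimal sketch of the pieces that exist today (the expression and the array contents are illustrative):

    import ast

    source = 'big_array[5:20]'
    tree = ast.parse(source, mode='eval')      # capture the AST of the expression
    print(ast.dump(tree.body))                 # Subscript(value=Name(id='big_array', ...), ...)

    code = compile(tree, '<debug>', 'eval')    # compile once; no repeated eval() of the text
    big_array = list(range(100))
    print(eval(code))                          # the slice [5, 6, ..., 19]

ast.unparse (added in Python 3.9, after this thread) goes the other way, though as noted above it drops the original formatting and comments.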
On Wed, Oct 3, 2018 at 10:48 AM, Chris Angelico <rosuav@gmail.com> wrote:
On Thu, Oct 4, 2018 at 2:30 AM Anders Hovmöller <boxed@killingar.net> wrote:
Nothing is a keyword in that example or in my example. My suggestion is that we could do:
my_func(=big_array[5:20])
And it would be compile time transformed into
my_func(**{'big_array[5:20]': big_array[5:20]})
and then my_func is just a normal function:
def my_func(**kwargs): Whatever
It's a very simple textual transformation.
That is not guaranteed to work. In another thread it was pointed out that this is merely a CPython implementation detail, NOT a language feature.
I'm curious where this is written down. Can you point to the relevant part of the language spec or pronouncement or whatever it was? -n -- Nathaniel J. Smith -- https://vorpus.org
That is not guaranteed to work. In another thread it was pointed out that this is merely a CPython implementation detail, NOT a language feature.
I'm curious where this is written down. Can you point to the relevant part of the language spec or pronouncement or whatever it was?
I'm going to be charitable here and suggest that Chris is referring to something implicit. The spec clearly implies that valid identifiers must work as keys, because otherwise normal keyword arguments wouldn't work. But that's it. So in Chris' view you're probably asking him to prove a negative. The burden of proof would lie on those who say otherwise to show how the spec allows arbitrary strings. But no one has made such a claim, so it's a bit moot.

/ Anders
Good morning,

I read about a "time machine" debugger a long time ago. The debugger would collect all the information about all the calls, and the programmer could just execute the code without breakpoints. Later, the programmer could follow the evolution of a variable until it reaches an erroneous value.

I've never worked with that, and it sounds really memory intensive, but the examples were quite interesting. When you are in a loop, you want to break at a certain iteration. When you are in a recursive function, you want to stop at the right point. Etc.

Cheers,
Hans

On 03/10/18 16:51, Jonathan Fine wrote:
This thread is about debugging. I suggest we start by collecting problems and possible solutions. And that after about a week of that, we start discussing what we've gathered.
We already have a problem and possible solution, provided by Eric Smith and Larry Hastings.
<begin>
TITLE: f-string "debug" conversion
URL: https://mail.python.org/pipermail/python-ideas/2018-October/053956.html

PROBLEM
Writing
    print('value = ', value)
is tedious, particularly for more complicated expressions, such as
    print('big_array[5:20] =', big_array[5:20])
POSSIBLE SOLUTION
For f-strings, we add a !d conversion operator, which produces the text of an expression followed by its value. Thus, the two previous examples can be written more concisely as
    print(f'{value !d}')
    print(f'{big_array[5:20] !d}')
</end>
I suggest for the moment that we just gather problem-solution pairs, much as above. I think there'll be between 5 and 15 such pairs. I'll post one to the current discussion thread in an hour or so.
And that after about a week, we move to discussing what we have. Finally, many thanks to Eric and Larry for their useful contribution to the important problem of debugging.
Ah, yes. Thank you. So it works in CPython 2.7. But I'm curious, does it work in very old versions? I'm not saying that this is important, because language changes always are for new versions. However, Anders' claim that this not a language change seemed too broad to me. It may be that this change has very little cost, but it should not be dismissed.
It works in:

- Python 1
- Python 2
- Python 3
- PyPy 6
- IronPython
- Jython
- micropython

Are there more I should try?

I highly recommend https://tio.run/ for trying this out. It's pretty cool! It doesn't have micropython but it has the others.

/ Anders
On Thu, Oct 4, 2018 at 6:33 PM Anders Hovmöller <boxed@killingar.net> wrote:
Ah, yes. Thank you. So it works in CPython 2.7. But I'm curious, does it work in very old versions? I'm not saying that this is important, because language changes always are for new versions. However, Anders' claim that this not a language change seemed too broad to me. It may be that this change has very little cost, but it should not be dismissed.
It works in:
Python 1 Python 2 Python 3 PyPy 6 IronPython Jython micropython
Are there more I should try?
I've no idea what you actually tried or what actually worked, since you haven't shown your code. However, it doesn't matter. This IS a language change, and it makes no difference how many implementations are lax enough to permit it currently. Citation: https://mail.python.org/pipermail/python-dev/2018-October/155437.html It's just like a program assuming that there will always be a user with UID 0, or assuming that every human being has a name that consists of a given name and a family name, or assuming that data sent across the internet will arrive unchanged. You can show millions, billions, trillions of examples that support your assumption, but that doesn't make any difference - the assumption is false as soon as there is a single counter-example, or as soon as the specification is shown to *permit* a counter-example. ChrisA
Ah, yes. Thank you. So it works in CPython 2.7. But I'm curious, does it work in very old versions? I'm not saying that this is important, because language changes always are for new versions. However, Anders' claim that this not a language change seemed too broad to me. It may be that this change has very little cost, but it should not be dismissed.
It works in:
Python 1 Python 2 Python 3 PyPy 6 IronPython Jython micropython
Are there more I should try?
I've no idea what you actually tried or what actually worked, since you haven't shown your code.
Please take a deep breath. The question was not directed at you, and I just answered a clear and simply stated question. All the examples cited, including the one you link to below, work. No need to get angry about this. If you are upset when I discuss implementation details, don't reply. This is all I'm doing at this point.
However, it doesn't matter.
Of course it matters. It's the difference between changing the spec and changing the spec AND some implementation. There is a difference between those two things. You might not care but that's another topic.
This IS a language change
Yes I agree. I have said so many times.
, and it makes no difference how many implementations are lax enough to permit it currently.
Sure it does. See above.
You can show millions, billions, trillions of examples that support your assumption, but that doesn't make any difference - the assumption is false as soon as there is a single counter-example,
...which there isn't. But that's irrelevant. We both agree it's irrelevant.
or as soon as the specification is shown to *permit* a counter-example.
Maybe. But I haven't argued that this implementation detail is already in the spec, have I? I have just argued that it's easy to IMPLEMENT, because it is in fact already implemented in all existing Pythons. I don't see why this is such a hard concept for you to grasp. Yes, I know it would be a change to the spec. I have conceded this point MANY TIMES. You don't need to argue that point; you've already won it. By walkover, even, because I never argued against it. Can we drop this now? Your point has been made.

/ Anders
On 04/10/18 13:27, Anders Hovmöller wrote: [I think >> = ChrisA]
>> However, it doesn't matter.
>
> Of course it matters. It's the difference between changing the spec and changing the spec AND some implementation. There is a difference between those two things. You might not care but that's another topic.
In terms of defining specs, no, there isn't a difference between those things. Changing the language spec is not a thing to do lightly, regardless of how many implementations happen to do what you already want. While gathering evidence that the chaos your change will cause is minimal is a good thing, it doesn't actually lower the bar for getting a language spec change through.

Think of it this way: you are proposing to break a promise. Even if it's only a little thing and you're fairly sure everyone will forgive you, that's still not something you should ever want to get comfortable doing.

-- Rhodri James *-* Kynesim Ltd
[I think >> = ChrisA]
>> However, it doesn't matter.
>
> Of course it matters. It's the difference between changing the spec and changing the spec AND some implementation. There is a difference between those two things. You might not care but that's another topic.
In terms of defining specs, no, there isn't a difference between those things.
So in terms of X there is no difference in Y. No kidding. Which is why I said so. Repeatedly.
Changing the language spec is not a thing to do lightly regardless of how many implementations happen to do what you already want.
Where do you think I'm taking it lightly? I'm taking changing all Python implementations seriously. That has non-zero weight to me. Do you and Chris think that changing the implementations of all Python implementations, and the complexity of that change, is totally irrelevant to all discussions? I sure hope not!

Changing the spec AND all implementations is pretty much by definition a bigger change, and should be taken more seriously than only the spec. I don't take changing the spec lightly, because I think changing the spec AND all implementations is MORE. I can't believe I'm arguing that 1 + 1 > 1, but here we are.

I would hope you could drop this idea that I'm taking things "lightly", whatever that means. If I were taking it lightly I would have already submitted a PEP.
While gathering evidence that the chaos your change will cause is minimal is a good thing, it doesn't actually lower the bar for getting a language spec change through.
Why do you repeat that? No one is saying the opposite. I'm saying the facts lower the bar for IMPLEMENTATION. Because it's literally zero. That is totally orthogonal to the bar for getting a spec change through, of course. Which I have already conceded like ten times now. How many times do I have to concede this point before you stop saying I need to concede it? Do I need to write it with my own blood on parchment or something?
Think of it this way: you are proposing to break a promise. Even if it's only a little thing and you're fairly sure everyone will forgive you, that's still not something you should ever want to get comfortable doing.
Every new thread on this mailing list does this. This is the entire point of this list. And I've conceded this point so you're just beating a dead horse. / Anders
Hello all

I think we've established enough basic facts for now, about using a non-identifier string as a keyword in a kwargs dictionary. I'd be most grateful if discussion of this particular topic could be suspended for now, to be resumed when it again becomes relevant. If you are keen to discuss this particular topic right now, please start a new discussion thread for that purpose.

Later today, I'll be posting another problem and proposed solution to this thread, and would like your help on that.

Many thanks

Jonathan
TITLE:
PROBLEM: Debug print() statements cause doctests to fail

Adding debug print(...) statements to code can cause doctests to fail. This is because both use sys.stdout as the output stream.

POSSIBLE SOLUTION: Provide and use a special stream for debug output. In other words, something like
    import sys
    sys.stddebug = sys.stderr
    debug = lambda *argv, **kwargs: print(*argv, file=sys.stddebug, flush=True, **kwargs)
Note: Need to use current value of sys.stddebug, so can't use functools.partial.
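To spell out the note: functools.partial would bind whatever sys.stddebug is at definition time, while the lambda looks it up on every call, so later redirection still works. A minimal sketch of the difference (sys.stddebug is the attribute proposed above, set by hand here; it is not an existing part of the sys module):

    import functools
    import io
    import sys

    sys.stddebug = sys.stderr       # proposed attribute, assigned manually for this sketch
    debug = lambda *args, **kw: print(*args, file=sys.stddebug, flush=True, **kw)
    frozen = functools.partial(print, file=sys.stddebug, flush=True)

    sys.stddebug = io.StringIO()    # later, redirect debug output
    debug('follows the new stream')              # goes to the StringIO
    frozen('still goes to the original stderr')  # bound to the old value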
On 04/10/18 18:41, Jonathan Fine wrote:
TITLE: PROBLEM: Debug print() statements cause doctests to fail Adding debug print(...) statements to code can cause doctests to fail. This is because both use sys.stdout as the output stream.
POSSIBLE SOLUTION: Provide and use a special stream for debug output. In other words, something like
    import sys
    sys.stddebug = sys.stderr
    debug = lambda *argv, **kwargs: print(*argv, file=sys.stddebug, flush=True, **kwargs)
Note: Need to use current value of sys.stddebug, so can't use functools.partial.
Or write your debug output to stderr? -- Rhodri James *-* Kynesim Ltd
In response to my problem-solution pair (fixing a typo)
TITLE: Debug print() statements cause doctests to fail
Rhodri James wrote:
Or write your debug output to stderr?
Perhaps I've been too concise. If so, I apologise. My proposal is that the system be set up so that
    debug(a, b, c)
sends output to the correct stream, whatever it should be.

Rhodri: Thank you for your contribution. Are you saying that because the developer can write
    print(a, b, c, file=sys.stderr)
there's not a problem to solve here?

-- Jonathan
I wrote:
Perhaps I've been too concise. If so, I apologise. My proposal is that the system be set up so that debug(a, b, c) sends output to the correct stream, whatever it should be.
See also: https://docs.python.org/3/whatsnew/3.7.html#pep-553-built-in-breakpoint
Python 3.7 includes the new built-in breakpoint() function as an easy and consistent way to enter the Python debugger.
So I'm suggesting a new built-in debug() function as an easy and consistent way to write out debugging information. -- Jonathan
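For comparison, breakpoint() achieves its "easy and consistent" behaviour by delegating to sys.breakpointhook, which users and IDEs may replace. A debug() built-in could plausibly follow the same pattern; the sketch below is hypothetical (there is no sys.debughook today) and only mirrors that existing design:

    import sys

    def _default_debughook(*args, **kwargs):
        # fallback behaviour: print to stderr, flushed
        print(*args, file=sys.stderr, flush=True, **kwargs)

    def debug(*args, **kwargs):
        # dispatch to a replaceable hook, as breakpoint() does via sys.breakpointhook;
        # 'debughook' is a hypothetical attribute name
        hook = getattr(sys, 'debughook', _default_debughook)
        return hook(*args, **kwargs)

    # an IDE or test runner could then install its own hook:
    # sys.debughook = my_ide_output_window.write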
So I'm suggesting a new built-in debug() function as an easy and consistent way to write out debugging information.
Python definitely needs a dedicated debug print command. Since "debug" is taken in 3.7, perhaps "debug.print" would work?

I've built python devtools, which has such a command: https://github.com/samuelcolvin/python-devtools
(I'm not trying to promote devtools, just demonstrate what is possible.)

The package needs a little work, but it's already extremely useful; I use it all day every day. A few things devtools' debug does:

* pretty printing of objects (like pprint but IMHO clearer)
* coloured output of objects
* extra info about objects, e.g. length, type etc.
* line numbers and function names where debug() was called
* names of the variable (or expression) which is being printed

It would be difficult to fully integrate a package like this into the standard lib since it relies on pygments for colouring output, but not impossible.

Is this the kind of thing you were thinking about?

Samuel Colvin
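For the curious, a minimal usage sketch of the package mentioned above, based on its README at the linked repository (pip install devtools is assumed, and the data is illustrative):

    from devtools import debug

    whatever = {'foo': 1, 'bar': [1, 2, 3]}
    debug(whatever)
    # pretty-prints 'whatever' together with the call location and the variable name,
    # per the feature list above; exact formatting is described in the devtools docs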
Samuel Colvin wrote:
Python definitely needs a dedicated debug print command.
I've built python devtools, which has such a command: https://github.com/samuelcolvin/python-devtools
Is this the kind of thing you were thinking about?
Thank you for this comment, Samuel. And also very much for your work on devtools.

To answer your question: Yes, and No. I'm thinking of providing a builtin debug() command that 'does the right thing' according to the context. And the context would include the user's preferences. Which might be to use the print command in your devtools package. But the user might be in an IDE, which provides a separate window for the debug output.

I suggest the builtin debug() would determine the API, and provide a reference implementation. And that many users (perhaps through their IDE) would choose a different implementation.

-- Jonathan
Jonathan Fine writes:
I'm thinking of providing a builtin debug() command
"Command" doesn't make much sense in this context. I presume you mean "function".
that 'does the right thing' according to the context. And the context would include the user's preferences. Which might be to use the print command in your devtools package.
"Do what I mean" is inappropriate for a built-in because you have no idea who might be calling it. I can see an API (say, that of class Pdb ;-) and various implementations (starting with Pdb and finally evolving as Jonathan Fine's Magic Mind-Reading Debugging Environment). Wait, wut???!!! Look here:
I suggest the builtin debug() would determine the API, and provide a reference implementation. And that many users (perhaps through their IDE) would choose a different implementation.
The necessary builtin for this is already available. Its name is "breakpoint".
Samuel Colvin writes:
Python definitely needs a dedicated debug print command.
For *commands* (ie, a REPL), you have Pdb already. It's unlikely a *statement* would be accepted, given that print was demoted from a statement to a function. And it's not obvious why a "debug.print()" function can't just as well be imported from a 3rd party library you get from PyPI. (There's already pdb.Pdb.prettyprint in the stdlib, though it may not be terribly easy to use outside of the Pdb REPL.)
(I'm not trying to promote devtools, just demonstrate what is possible.)
It seems to me that promoting devtools (unless you know a solution that's better for some reason) is exactly what you should do. Not that you should pretend it's already optimal. But it would be a concrete proposal that would serve to focus suggestions, and evaluate whether there is a single module that is sufficiently useful to enough people to deserve inclusion in the stdlib, or whether this is yet another case where a number of modules with varied feature sets available from PyPI is a better solution.
It would be difficult to fully integrate a package like this into the standard lib since it relies on pygments for colouring output, but not impossible.
I don't see why that's a problem, as long as the coloring feature is optional:

    have_color = None
    try:
        import pygments
        have_color = 'pygments'
    except ImportError:
        pass
    # or try other color libraries here

and you don't even try to colorize if have_color is None. See the thread on xz and other optional external libraries that don't exist on some platforms and don't give nice errors when the import fails for the wrapper Python library.

A bigger problem is if devtools does stuff on which there is no consensus that it's wanted in the stdlib. That "stuff" would need to be split out, which would possibly be a pain for you and any other existing users.
On 04/10/18 19:10, Jonathan Fine wrote:
In response to my problem-solution pair (fixing a typo)
TITLE: Debug print() statements cause doctests to fail
Rhodri James wrote:
Or write your debug output to stderr?
Perhaps I've been too concise. If so, I apologise. My proposal is that the system be set up so that debug(a, b, c) sends output to the correct stream, whatever it should be.
Rhodri: Thank you for your contribution. Are you saying that because the developer can write print(a, b, c, file=sys.stderr) there's not a problem to solve here?
Exactly so. If you want a quick drop of debug information, print() will do that just fine. If you want detailed or tunable information, that's what the logging module is for. I'm not sure where on the line between the two your debug() sits and what it's supposed to offer that is better than either of the alternatives. -- Rhodri James *-* Kynesim Ltd
participants (11)
- Anders Hovmöller
- Chris Angelico
- Hans Polak
- Jonathan Fine
- Michael Selik
- Nathaniel Smith
- Rhodri James
- Samuel Colvin
- Stephen J. Turnbull
- Steven D'Aprano
- Wolfram Hinderer