re: None potato, one potato, two potato, more..
JC writes -
But it's also much harder than it ought to be, and often, initially, for some pretty basic counter-intuitive reasons. And if not counter-intuitive, at least contra the status quo.
So that we can focus on the point at hand, let's (and I will never argue with it as a toggle option):
from __future__ import division
We know "**" is the power operator and we want to raise to a fractional power. First try:
>>> 16**3/4
1024.0
Which is clearly not the answer I am expecting. Second try:
>>> 16**(3/4)
8.0
Which is. And I am sure long debates could ensue about which is more intuitive and whether the language could be improved by adjusting the parser to recognize what I meant in the first try, i.e. by changing the operator precedence order. But my second try is good enough for me, and I now know the rules for my 3rd to 3 millionth try. The burden now is where it rightly belongs, with me. And if I choose to give up programming in disgust at the fact that it took me two tries - well, that's up to me. But Guido can relax and enjoy his New Year, I hope.

Art
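A small side sketch (mine, not part of the original exchange), assuming a current Python 3 where / is already true division: asking the standard-library ast module for the parse tree makes the grouping behind the two tries explicit.

import ast

# ** binds tighter than /, so the first try is parsed as (16**3)/4.
print(ast.dump(ast.parse("16**3/4", mode="eval").body))
# BinOp(left=BinOp(..., op=Pow(), ...), op=Div(), right=Constant(value=4))
print(16**3/4)      # 1024.0

# Parenthesizing the exponent forces the intended grouping.
print(ast.dump(ast.parse("16**(3/4)", mode="eval").body))
# BinOp(left=Constant(value=16), op=Pow(), right=BinOp(..., op=Div(), ...))
print(16**(3/4))    # 8.0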
So that we can focus on the point at hand, let's (and I will never argue with it as a toggle option):
Hi Art,

No argument with your example, but it misses completely the [fresh] point at hand I was hoping to discuss: not where the burden lies, but rather the implications and influence of computational literacy upon everyday language and culture. No blame to programmers or language developers intended; we've been over that ground too many times already, I think. But from a media-culture/language perspective, yes, programming expects learned behaviors which could be construed as [almost] in denial of contemporary speech and use of language.

The bait in my post was the use of the word 'human'. Python of course is 100% a human language, as much as English is. But since we all learn to speak [and count] before we can program, there are implicit precedents set there/then. These lead to some of the 'counter-intuitive' allegations about programming.

The axis of my happy-new-year question is: what happens if/when programming is learned early enough that it is a fundamental part of literacy and language? How might everyday language be changed? And, reverse-engineering the thought, how differently might programming languages be learnt, taught and improved...?

- Jason
>>> 16**3/4
1024.0
Which is clearly not the answer I am expecting.
Fits my expectations. 16**3 is "one thing" and /4 comes after. So I see that as (16*16*16)/4. I guess I must parse the way Python does.
Second try:
>>> 16**(3/4)
8.0
Which is.
And I am sure long debates could ensue about which is more intuitive and whether the language could be improved by adjusting the parser to recognize what I meant in the first try, i.e. by changing the operator precedence order.
Exponentiation has higher precedence in all languages that I know about, where precedence is used at all (it's not used in J). It has higher precedence in ordinary math notation too, but the superscript notation resolves a lot of the ambiguity even without parentheses. More likely to trip people up (as it did me) is:
>>> -1**2
-1
As a unary operator, - has lower precedence than **.
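A minimal check of that point (again my sketch, not from the thread), contrasting the two possible groupings:

# Unary minus binds less tightly than **, so -1**2 is read as -(1**2).
print(-1**2)      # -1
print(-(1**2))    # -1, the same expression spelled out
print((-1)**2)    # 1, needs explicit parentheses around -1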
But my second try is good enough for me, and I now know the rules for my 3rd to 3 millionth try. The burden now is where it rightly belongs, with me.
We agree here. You encounter a language, you learn its quirks (or accept them as perfectly normal as the case may be) and move on. Everyday math notation has its own quirks and weirdnesses. Nothing is above criticism.
And if I choose to give up programming in disgust at the fact that it took me two tries - well, that's up to me.
We agree again.
But Guido can relax and enjoy his New Year, I hope.
Art
Maybe you say "but" because you believe Guido is afraid of ever losing a beginner and thinks his job is to bend over backwards to accommodate them. Just guessing; I don't really know why you say "But".

Kirby
participants (3)
- Arthur
- Jason Cunliffe
- Kirby Urner