At 12:24 PM 7/26/2001 -0700, Kirby Urner wrote:
> languages. This is especially troublesome since Python is so often used in a hybrid C or Java environment.
To argue the other side, C and Java both require you to explicitly declare the type of your variables, and the rules of coercion tell you, in the code itself, when a change is happening. But in Python I have much less to go on when trying to track types, especially when user input is involved. So there's something to be said for forcing a type where you might be in doubt, given a lack of knowledge about the args.
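A minimal sketch of that ambiguity (the function name `scale` is my own hypothetical example; Python 3's // stands in for the truncating int/int division under discussion):

```python
# Nothing in this definition declares the types of a or b, so a reader
# can't tell from the code alone what division will do here.
def scale(a, b):
    return a // b   # floor division: behavior depends on runtime types

print(scale(7, 2))    # 3   -- int inputs give an int result
print(scale(7.0, 2))  # 3.0 -- one float input silently changes the result type
```

In C or Java, the declared parameter types would settle the question at a glance.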
Last point, then I'll shut up re this thread: you might counter that, by my argument, a + b is as potentially ambiguous as a/b, since we don't necessarily know the types of a and b in either case going in. That's true, but the core point is that where ints and floats are concerned, a + b doesn't lose info (longs are another story). The problem with / is that it might lose info without telling us: b*(a/b) might not give us something close to a, whereas b+(a-b) always returns a (if a and b are ints and/or floats).

So the question is: when pushing ambiguous types through a potentially lossy operator, do we want an ambiguous or an unambiguous result? The proposed design change opts for the latter. It's a reality check to compensate for the dynamic typing of variables, the difficulty of tracking, lexically, the types of the args involved.

I think Arthur places too much emphasis on someone writing their own programs and following the unambiguous rules in the tutorial, versus someone inheriting code written by others and trying to puzzle out what's going on. The pro programmer (paid) is as likely to be in the latter situation as the former. It's the readability of code that we're after, long after the original coders are out of the picture, not just how easy it is to dash off something and have it do what you want (that's programming for yourself as sole reader, and is the hallmark of non-professional coding).

Kirby
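P.S. Here's the lossiness point in runnable form (a = 7, b = 2 are just illustrative values; Python 3's // reproduces the truncating int/int behavior at issue):

```python
a, b = 7, 2   # any int pair with a nonzero remainder would do

# Addition/subtraction round-trips exactly for ints and floats:
assert b + (a - b) == a        # 2 + (7 - 2) == 7

# Truncating division silently drops the remainder:
assert b * (a // b) != a       # 2 * (7 // 2) == 6, not 7

# True division (the proposed change) keeps the information
# (exact for these values; floats can still round in general):
assert b * (a / b) == a        # 2 * 3.5 == 7.0
```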