Python handles globals badly.

Steven D'Aprano steve at pearwood.info
Wed Sep 9 19:22:41 CEST 2015


On Thu, 10 Sep 2015 02:09 am, Christian Gollwitzer wrote:

> Am 09.09.15 um 05:23 schrieb Steven D'Aprano:
>> And yes, the fellow Joe who completely missed the point of the blog post,
>> and made the comment "You don’t think you’re wrong and that’s part of a
>> much larger problem, but you’re still wrong" completely deserved to be
>> called out on his lack of reading comprehension and smugness.
> 
> This sentence is a very arrogant one, yes, but it is quoted from the
> article. Overall I have the feeling that the point he wants to make is a
> very subtle one.

Subtle enough that his quoting the author's words back at him escaped me,
but I suspect not subtle enough to escape the author's notice. Hence the
reply "Thanks for leaving your comment. I’m sure it’s made you feel very
clever."

I think my favourite is the guy who claims that the reason natural languages
all count from 1 is that the Romans failed to invent zero. (What about
languages that didn't derive from Latin, say, Chinese?) And that now that
we have zero, counting from it should be more natural. So if you give
somebody two apples, but no banana, and ask them to count their apples,
they would count "Zero, one... therefore I have two apples". And if you
then ask them to count their bananas, they would do what?

+++ Divide By Cucumber Error. Please Reinstall Universe And Reboot +++
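Python itself keeps the two ideas separate, which is a neat illustration of
the distinction the "count from zero" argument conflates: counting (a
cardinal, naturally 1..n, or simply zero for none) versus indexing (an
offset, starting at 0). A minimal sketch, using made-up apple/banana lists
for illustration:

```python
# Counting answers "how many?" -- a cardinal number.
# Indexing answers "which position?" -- a zero-based offset in Python.
apples = ["apple", "apple"]
bananas = []

# Counting: len() reports the cardinal count directly; an empty
# collection just counts as zero, with no "counting sequence" needed.
print(len(apples))   # 2
print(len(bananas))  # 0

# Indexing: positions run 0 .. len-1, so the last of two apples
# sits at index 1, not index 2.
for index, fruit in enumerate(apples):
    print(index, fruit)
```

So a count of two apples coexists happily with indices 0 and 1, and "no
bananas" is just a count of zero rather than a failed attempt to start
counting.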



-- 
Steven
