On Wed, May 20, 2009 at 4:02 PM, Steven D'Aprano email@example.com:
On Thu, 21 May 2009 03:44:15 am Raymond Hettinger wrote:
As pointed out elsewhere in this thread, very often the superior solution is not to re-format the same statement to fit, but to re-factor the code so it's less deeply indented or does less in each statement.
Yes, "refactor" is a much better word than "reformat" for what I was thinking.
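To make the distinction concrete, here is a small hypothetical illustration (the `is_active` helper and the `users` data are invented for the example, not taken from any real code base): the same over-long boolean condition, first merely reformatted to fit 80 columns, then refactored into a named helper so the short lines fall out naturally.

```python
def is_active(user):
    """Refactored version: the intent of the condition now has a name."""
    return (user["enabled"]
            and not user["suspended"]
            and user["last_login_days"] < 30)

users = [
    {"name": "ada", "enabled": True, "suspended": False, "last_login_days": 5},
    {"name": "bob", "enabled": True, "suspended": True, "last_login_days": 2},
]

# Reformatted only: the long expression is wrapped, but still inline.
active_wrapped = [u for u in users
                  if u["enabled"] and not u["suspended"]
                  and u["last_login_days"] < 30]

# Refactored: the extracted function keeps every line comfortably short.
active_refactored = [u for u in users if is_active(u)]

print([u["name"] for u in active_wrapped])
print([u["name"] for u in active_refactored])
```

Both lists are the same; the difference is that the refactored version no longer needs wrapping tricks at each call site, which is the point being made above.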
So, it seems that support for an 80 character limit is rooted in a desire to have other people program differently than they do, somehow making their programs better just because they are adhering to an arbitrary limit on horizontal text width. That seems somewhat magical.
Can I remind you that the 80 character limit is for the standard library, not your own personal code? Use whatever limit you like in your own personal libraries, or no limit at all. If your productivity is enhanced by writing 100-character lines, or 300 for that matter, then go right ahead.
I see part of the problem now. People perceive PEP 8 as the way they should write all code, not just the standard library. It gets passed around as the be-all and end-all, when in fact it may only represent what is good for the standard library, and only the standard library. Commercial code bases, and other kinds of code bases, should perhaps not subscribe to it blindly. The 80-character width limit seems like a candidate for flexibility, depending on some critical factors. Having *some* line-width limit seems uniformly good, though; the specific column at which to break lines may depend greatly (or entirely) on the tools you (or your company) use.
As time marches forward and email editors stop wrapping (mine doesn't wrap), printers are used less (which is already happening), and so on, standards for core libraries will probably change as well. Forward thinking is important, and so is backwards compatibility. Writing code at 80 characters takes more time, but it ensures that any future standard will be compatible with it (i.e. a future 100-character standard won't be offended by lines already wrapped at 80). So at some point I would predict 100 will become the more prudent limit for Python's standard library, and then (assuming the language is still thriving after many, many years) perhaps 120, and so on. But it will just take time.
Either way, the style for coding within a non-standard library might want to be revisited much sooner. In other words, programming to a standard which is common to all people who use it means you probably must accommodate the lowest common denominator. This would not be true for anything but the standard library.
Some people find their productivity is enhanced with an 80 character limit. There's nothing "magical" about this -- we've given some reasons. 80 character lines fit comfortably in email and printed pages without wrapping, they can be viewed even on small screens without horizontal scrolling, and, yes, some people find that this constraint enhances our productivity and so we use the same limit in our own code, not just the stdlib. We're not forcing you to do the same.
The 80 character limit is a lowest common denominator. Having a 36" high-resolution monitor, or the visual acuity to read 7pt text, should not be a prerequisite for reading the stdlib. For this common code base, forcing everyone to limit line length is less of an imposition than forcing everyone to deal with long lines.
Maybe the limit should be 40 characters; then everyone will have to refactor, and wrap lines, and use smaller indents (like Google does), and use more abbreviated variable names.
No, 40 characters would be just foolish. The cost-benefit of constraints is not linear: there is a sweet spot where the cost is at its minimum and the benefit at its maximum, and I believe that is around 80 characters, or possibly even a little shorter. I use a soft limit of 70 characters in my own code, but I don't force that on anyone else.
Maybe the origin of the 80 char limit isn't an anachronism. Maybe it had nothing to do with teletypes and low resolution CRTs. Perhaps, the programming gods of old were reaching into the future with certain knowledge that they were forcing everyone to do the right thing.
What possible relevance is the origin? Standards can change their justification over time. Commons used to exist so that the peasants would have somewhere to graze their goats. Not a lot of people have goats any more, but we still have commons, only now they're called "parks".
If the character limit didn't remain relevant, we'd remove it, but it is relevant: people do print out code onto paper, people do still have poor eyesight requiring larger font sizes, or small monitors, or large monitors with four side-by-side editor windows.
This is a minor point, but I would argue that the origin is relevant. It helps to reveal what assumptions were being made about the code base, assumptions which may or may not still hold. In the case of a "park", the origin clearly was questioned at some point, since parks needn't have grass any more (where grass = the 80-character limit).
-- Steven D'Aprano
Python-ideas mailing list
Pythonfirstname.lastname@example.org
http://mail.python.org/mailman/listinfo/python-ideas