On Thu, 21 May 2009 03:44:15 am Raymond Hettinger wrote:
> > As pointed out elsewhere in this thread, very often the superior solution is not to re-format the same statement to fit, but to re-factor the code so it's less deeply indented or does less in each statement.
Yes, "refactor" is a much better word than "reformat" for what I was thinking.
> So, it seems that support for an 80 character limit is rooted in a desire to have other people program differently than they do, somehow making their programs better just because they are adhering to an arbitrary limit on horizontal text width. That seems somewhat magical.
Can I remind you that the 80 character limit is for the standard library, not your own personal code? Use whatever limit you like in your own personal libraries, or no limit at all. If your productivity is enhanced by writing 100-character lines, or 300 for that matter, then go right ahead.
Some people find their productivity is enhanced by an 80 character limit. There's nothing "magical" about this -- we've given reasons. 80 character lines fit comfortably in email and on printed pages without wrapping; they can be viewed even on small screens without horizontal scrolling; and, yes, some of us find that the constraint enhances our productivity, and so we use the same limit in our own code, not just the stdlib. We're not forcing you to do the same.
The 80 character limit is a lowest common denominator. Having a 36" high-resolution monitor, or the visual acuity to read 7pt text, should not be a prerequisite for reading the stdlib. For this common code base, asking everyone to limit line length is less of an imposition than forcing everyone to deal with long lines.
> Maybe the limit should be 40 chars, then everyone will have to refactor, and line wrap, and use smaller indents (like Google does), and use more abbreviated variable names.
No, 40 characters would be just foolish. The cost-benefit of a constraint is not linear: there is a sweet spot where the cost is lowest and the benefit greatest, and I believe it lies around 80 characters, or possibly even a little shorter. I use a soft limit of 70 characters in my own code, but I don't force that on anyone else.
> Maybe the origin of the 80 char limit isn't an anachronism. Maybe it had nothing to do with teletypes and low resolution CRTs. Perhaps, the programming gods of old were reaching into the future with certain knowledge that they were forcing everyone to do the right thing.
What possible relevance is the origin? Standards can change their justification over time. Commons used to exist so that the peasants would have somewhere to graze their goats. Not a lot of people have goats any more, but we still have commons, only now they're called "parks".
If the character limit were no longer relevant, we'd remove it; but it is still relevant: people still print code onto paper, and people still have poor eyesight requiring large font sizes, or small monitors, or large monitors split into four side-by-side editor windows.
-- Steven D'Aprano