Pythonic Y2K
Avi Gross
avigross at verizon.net
Fri Jan 18 14:45:48 EST 2019
Michael,
I certainly agree with you on most of what you said. My internal humor makes
me see patterns that are not at all applicable, such as the common "2" in
Python 2.X and Y2K, which influences my choice of phrases. And, perhaps even
more oddly, I see the Y and K in both Y2K and the slightly odd spelling of
pYthoniK, and even 2/to. That is just the way my mind works as it searches
for patterns, knowing that most of them do not really exist, and then tries
to winnow them down to those that do. I can't expect anyone to follow that,
and I would be afraid of anyone who can.
As stated, nothing horrible happens immediately when Python 2.X is
officially declared no longer supported by the main Python organizations. I
am sure it can go to Netflix or FOX and be revived as a series for a while
longer, like Longmire and Last Man Standing after their original networks
abandoned them.
But it can take something noisy to get the attention of some people. For
Y2K, there was a very real threat even if it only impacted one program in a
hundred, or even in ten thousand. That one could be quite serious if it was
in a mission-critical component. What if it decided to delete lots of data,
or declare people dead, or send a missile the wrong way (as if there is a
right way!)? The problem was not just the two digits used for the year, but
that many languages might let a longer value write into other parts of
memory, leading to further corruption. Or a program changed to handle wider
data might encounter old data and do odd things, or now be too big to fit
into memory, or even develop some weird sporadic race condition now that it
took a tad longer to run, and so on. And fixing something quickly (or even
slowly) might introduce brand-new errors. There was plenty to be scared
about. Change is scary, but sometimes what is even scarier is no change.
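To make that failure mode concrete, here is a minimal sketch (in Python,
purely for illustration; the legacy code in question was mostly not Python)
of how arithmetic on two-digit "YY" fields goes wrong at the rollover:

    # Naive interval arithmetic on two-digit year fields.
    def years_elapsed(start_yy, end_yy):
        return end_yy - start_yy

    # Born in 1970 ("70"), measured in 1999 ("99"): correct.
    print(years_elapsed(70, 99))   # 29

    # Same person measured in 2000 ("00"): suddenly -70 years elapsed.
    print(years_elapsed(70, 0))    # -70

Any age check, interest calculation, or expiry test built on that arithmetic
silently inverts at midnight.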
In the end not much happened, BUT I wonder what could have happened if
nobody had gone through most of the code and adjusted it where needed. Some
vaccines have worked so well that people say we don't need them because
hardly anyone gets that infection any more. Really? Does that mean it was a
waste to use them in the first place, or is there perhaps some cause and
effect there?
I am often asked to proofread something before it is sent out. I find an
amazing number of things, including many that are quite relevant and
obvious. What is scary is that I keep seeing errors in things that do get
out there. Most are harmless and go unnoticed, such as a repeated "of", but
in a computer language something like that can simply break things. I was
amused recently by a book that used the word "score" when talking about a
Python module that does statistical work (more like machine learning) and
noted it was a mistake that should have said "core = -1", because it was an
option allowing the algorithm to run in parallel on as many cores (i.e.,
CPUs) as the machine had available. Again, the spelling did matter, but I
suspect few would notice. That is why Y2K often required teams of people
with different backgrounds to slog through code reviews, so more problems
would be found than any one person might notice.
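An aside: assuming the library in question was something like scikit-learn,
the genuine spelling of that option is n_jobs = -1, meaning use every
available core; and, amusingly, "score" really is a method there, which may
explain the slip. A sketch under that assumption:

    # Hedged reconstruction, assuming scikit-learn was the library meant.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=1000, random_state=0)
    clf = RandomForestClassifier(n_jobs=-1, random_state=0)  # -1 = all cores
    clf.fit(X, y)
    print(clf.score(X, y))  # and here "score" is legitimate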
Anyone who has been involved in projects where errors get reported and
recorded as modification requests may relate to this. The requests are often
routed around to be validated and checked for duplicates, then assigned
priorities, and some make it into future versions. What bothered me is that
we would see reports coming from unit testing, then more from system tests,
and then from the field after deployment, and they just kept coming. Some
problems will only be seen if you take some convoluted pathway.
Y2K was a problem people should have seen decades in advance, and some did.
I don't make fun of COBOL. It was a tool largely designed for a purpose: it
made people who work on business problems happy, or at least able to do more
than they could before. FORTRAN and LISP and other such tools made other
groups happy. Yes, modern languages may make many early ones look like toys
NOW, as we stand on the shoulders of Lilliputians.
But my view of the world sees marketing and politics and, frankly, radicals
using scare tactics as major drivers of events, and they tend to overdo it
to the point where everything becomes a crisis and therefore nothing can be
taken seriously, even when it turns out to BE a crisis. The software
industry is no different.
So I suggest a sober discussion of the benefits that may occur. I note that
some improvements are NOT. My machine keeps rebooting, with other strange
behavior, because it is trying to load an improvement/UPDATE that, oddly,
was already installed last month. It turns out I am not alone, and the fix
involves uninstalling the old one the hard way. Microsoft does things like
this frequently: an update messes things up and another update is needed, to
the point where many people simply decide to wait a while.
That brings me to the point. One horrible thing about programs in some
languages, such as C, was that they could fail drastically at run time. Blue
Screen of Death type of fail. You had to save your work regularly. Languages
like Python let you catch exceptions and deal intelligently with them, to at
least close down gracefully or even recover. Heck, many programs depend on
this and, instead of cluttering their code with lots of tests, wait for the
error to happen and adjust in the rare case it does.
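A minimal sketch of that style, which Python folks call EAFP ("easier to ask
forgiveness than permission"); the file name here is made up:

    # Attempt the operation; handle the rare failure instead of
    # testing every precondition up front.
    def read_config(path):
        try:
            with open(path) as f:
                return f.read()
        except FileNotFoundError:
            return ""  # fall back to defaults rather than crash

    print(read_config("settings.cfg"))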
So how reasonable is it to still have lots of legacy software written in
languages, and by programmers, with no easy way to make the programs more
robust? How would such software react if it received information in, say,
Unicode?
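For contrast, here is a hedged sketch of what robust handling can look like
in Python 3 when bytes arrive in an uncertain encoding:

    # Catch the decode error instead of letting it kill the program.
    def decode_field(raw):
        try:
            return raw.decode("utf-8")
        except UnicodeDecodeError:
            # Degrade gracefully: keep the data, mark the damage.
            return raw.decode("utf-8", errors="replace")

    print(decode_field(b"caf\xc3\xa9"))  # 'café'
    print(decode_field(b"caf\xe9"))      # 'caf<U+FFFD>' -- survives, no crash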
I predict that many people and companies have ignored warnings that the 2.X
train would someday be diverted to a dead-end track. It was always far
enough off in the future. But when a LAST DATE for updates is announced,
some may sit up and take notice. It may literally take something like
insurance companies (or the VC types) refusing to continue supporting them
unless they change to make them take the hint. And, over time, many
companies do go under or are bought by others, and often that causes old
projects to go away or morph.
But there is nothing fundamentally wrong with using 2.X. As I said jokingly,
if anyone wants to keep it and support it as a DIFFERENT language from the
more modern Python, fine.
-----Original Message-----
From: Python-list <python-list-bounces+avigross=verizon.net at python.org> On
Behalf Of Michael Torrie
Sent: Friday, January 18, 2019 10:36 AM
To: python-list at python.org
Subject: Re: Pythonic Y2K
On 01/16/2019 12:02 PM, Avi Gross wrote:
> I recall the days before the year 2000 with the Y2K scare when people
> worried that legacy software might stop working or do horrible things
> once the clock turned. It may even have been scary enough for some
> companies to rewrite key applications and even switch from languages like
> COBOL.
Of course it wasn't just a scare. The date rollover problem was very real.
It's interesting that now we call it the Y2K "scare" and since most things
came through that okay we often suppose that the people who were warning
about this impending problem were simply being alarmist and prophets of
doom. We often deride them. But the fact is, people did take these
prophets of doom seriously, and there was a massive, even heroic, effort to
fix a lot of these critical backend systems so that disaster was avoided
(just barely). I'm not talking about PCs rolling over to 00. I'm talking
about banking software, mission critical control software. It certainly was
scary enough for a lot of companies to spend a lot of money rewriting key
software. The problem wasn't with COBOL necessarily.
In the end disaster was averted (rather narrowly) thanks to the hard work of
a lot of people, and thanks to the few people who were vocal in warning of
the impending issue.
That said, I'm not sure Python 2.7's impending EOL is comparable to the Y2K
crisis.