[Python-Dev] Pymalloc and backward compatibility

Moore, Paul Paul.Moore@atosorigin.com
Thu, 4 Apr 2002 14:08:16 +0100

I've been watching the pymalloc discussions going on recently, and I'm
concerned that there may be some fairly major compatibility changes coming.
I'd like to check that I understand the intention correctly.

To provide some background - I wrote the Python interface for the Vim
editor, a long time ago. At the time, Python 1.4 was the current version. I
was *very* new to Python at the time, and this was my first real C extension
(and is still my biggest - I haven't had much of a need to write my own C
extensions since...) I didn't particularly understand the details of the
API, so I basically just copied code from existing samples, and was pleased
at how easily it all worked.

The code has survived essentially unchanged through to Python 2.2, and still
works fine. However, I'd be surprised if it doesn't take some fairly extreme
liberties (mixing API "families", relying on all Del/Free calls mapping to
the same underlying free(), etc). I can't even risk "fixing" it as I don't
have any way of checking which API calls appeared in which version (I recall
a message in one of the pymalloc threads to the effect that 1.4-compatible
code was bound to include some of these bugs, because certain APIs weren't
available in 1.4).

OK, so maybe it's time to give up on Python 1.4 compatibility. But ideally,
I'd prefer to go for an approach which leaves the code used under Python 2.2
and earlier completely unchanged, and only adds the minimum necessary for
newer versions.

Longer term, it might be worth considering upgrading the code to use some of
the newer features in the Python API, but in all honesty that's not a great
priority for me (and unlikely to be even remotely on the cards until I can
realistically desupport everything before 2.0...)

It seems to me that there are two options open for me:

1. The Python API from 1.4 is still guaranteed to work unchanged (even given
the liberties mentioned above). In that case, do nothing.
2. There are changes required. In that case, make them, protected by Python
API version checks (annoying ifdef fun).

While (1) is superficially nicer from my point of view, it just defers the
problem - at *some* point, the backward compatibility load becomes too much.
So the 1.4-compatible code is deprecated, at best. In that case, I'd rather
change things now, while the issues are fresh.

The problem with (2) is that my immediate reaction is to do something like
(pseudocode because I'm too lazy to look up the real API names and calls)

    #if PY_VERSION_HEX >= 0x02030000  /* 2.3 or later */
        PyObject *myObj = Py_WhizzyNewAllocator();
    #else
        PyObject *myObj = Py_CruftyOldAllocator();
    #endif

That works fine - and looks OK from a maintenance standpoint, as long as the
new API really is new. If the new API merely blesses existing calls as the
"official right thing", the above code is going to look odd to a future
maintainer (Why is the call to Py_WhizzyNewAllocator only enabled for 2.3
onwards? That API call was available from 1.5.2...)

So I think what I'm saying is that my preference is for a completely new set
of allocation APIs for Python 2.3, with all old APIs deprecated (and
scheduled for removal at some time in the future). This should be clearly
documented in the manual, including a list of all deprecated APIs to allow
extension writers to grep for code that needs changing. Sample
ifdef-protected type code like the above, for people like me who tend to
treat all this as black magic anyway, would probably be a nice bonus.

If this is what was planned anyway, just ignore me. I'll go back to sleep.