2.6, 3.0, and truly independent interpreters

Andy O'Meara andy55 at gmail.com
Thu Oct 30 14:12:19 CET 2008

> Okay, here's the bottom line:
> * This is not about the GIL.  This is about *completely* isolated
> interpreters; most of the time when we want to remove the GIL we want
> a single interpreter with lots of shared data.
> * Your use case, although not common, is not extraordinarily rare
> either.  It'd be nice to support.
> * If CPython had supported it all along we would continue to maintain
> it.
> * However, since it's not supported today, it's not worth the time
> invested, API incompatibility, and general breakage it would imply.
> * Although it's far more work than just solving your problem, if I
> were to remove the GIL I'd go all the way and allow shared objects.

Great recap (although saying "it's not about the GIL" may cause some
people to lose track of the root issues here, but your following comment
on GIL removal shows that we're on the same page).

> So there's really only two options here:
> * get a short-term bodge that works, like hacking the 3rd party
> library to use your shared-memory allocator.  Should be far less work
> than hacking all of CPython.

The problem there is that we're not talking about a single 3rd-party
API/allocator--there are many, including the OS, which has its own
internal allocators.  My video encoding example is meant to illustrate
a point, but the real-world use case is one where there are allocators
all over the place from all kinds of APIs, and where you want your C
module to reenter the interpreter often to execute python helper code.

> * invest yourself in solving the *entire* problem (GIL removal with
> shared python objects).

Well, as I mentioned, I do represent a company willing and able to
expend real resources here.  However, as you pointed out, there's some
serious work at hand here (sadly--it didn't have to be this way) and
there seem to be some really polarized people here who don't seem as
interested as I am in making python more attractive for app developers
shopping for an interpreter to embed.

From our point of view, there are two other options which,
unfortunately, seem more and more like the only way out the deeper we
dig into this:

3) Start a new python implementation, let's call it "CPythonES", that
specifically targets performance apps and uses an explicit object/
context concept to permit the free threading under discussion here.
The idea would be to implement just the core language, feature set,
and a handful of modules.  I refer you to the list I made earlier of
"essential" modules.

4) Drop python, switch to Lua.

The interesting thing about (3) is that it'd be in the same spirit as
how OpenGL ES came to be (except that in place of the need for free
threading, it was the fact that the standard OpenGL API was too
overgrown and painful for the embedded scale).
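To make the "explicit object/context concept" concrete, here is a purely hypothetical sketch of what such an API could look like--every name here (`CpesContext`, `cpes_NewContext`, etc.) is invented for illustration and exists nowhere in CPython:

```c
/* Hypothetical "CPythonES" API: every call takes an explicit context,
 * so two threads with two contexts never contend on any global lock. */
typedef struct CpesContext CpesContext;   /* opaque interpreter state */

CpesContext *cpes_NewContext(void);       /* independent heap + state */
int          cpes_RunString(CpesContext *ctx, const char *src);
void         cpes_FreeContext(CpesContext *ctx);

/* usage: one fully isolated interpreter per thread */
void worker(void)
{
    CpesContext *ctx = cpes_NewContext();
    cpes_RunString(ctx, "result = sum(range(10))");
    cpes_FreeContext(ctx);
}
```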

We're currently working on our own in-house version of (3), but we
unfortunately have other priorities at the moment that slow this
down.  Given the direction of many-core machines these days, option
(3) or (4), for us, isn't a question of *if*, it's a question of
*when*.  So that's basically where we're at right now.

As to my earlier point about representing a company ready to spend
real resources, please email me off-list if anyone here would have an
interest in an open "CPythonES" project (with full compensation).
I can say for sure that we'd be able to lead with API framework design
work--that's my personal strength and we have a lot of real world
experience there.


More information about the Python-list mailing list