[capi-sig] Embedded Python in C application

Swapnil Talekar swapnil.st at gmail.com
Sat Sep 27 07:24:10 CEST 2008


On Sat, Sep 27, 2008 at 3:02 AM, Adam Olsen <rhamph at gmail.com> wrote:

> On Fri, Sep 26, 2008 at 9:16 AM, Eljay Love-Jensen <eljay at adobe.com>
> wrote:
> > Hi everyone,
> >
> > First, my apologies if I'm in the wrong forum for my "embedding Python
> > in a C application" questions.  Please redirect me if I've wandered
> > into the wrong place.
> >
> > I have two needs for using Python in my application that I hope have
> > an easy answer without rewriting Python's internals.
> >
> > I need to use Python* in a multi-threaded application, where separate
> > threads may be working on very long-lasting Python scripts, and other
> > threads may be involved in short Python scripts.  None of the Python
> > scripts running concurrently have any shared state with any of the
> > other Python scripts running concurrently.  The number of threads is
> > in the 100-1000 range.
> >
> > I need to manage Python's use of the heap by providing a memory pool
> > for Python to use, rather than allowing Python to use malloc/free.
> > This is to prevent memory fragmentation, and to allow easy disposal of
> > a memory pool used for a closed Python interpreter instance.
> >
> > A quick look at Py_Initialize() indicates that Python does not return
> > some sort of "Py_State" pointer which represents the entire state of a
> > Python interpreter.  (Nor some sort of Py_Alloc().)  Nor does it
> > accept custom malloc/free function pointers.  Hmmm.
> >
> > Does anyone have experience with using Python in this fashion?
>
> Don't use multiple interpreters.  They're not really separate, they're
> buggy, they offer *NO* advantage to you over just using multiple
> threads.
>

They're buggy? Sure. They're not really separate? Well, if you want to have
multiple threads running scripts, I don't see how you can get away without
having multiple interpreters (in the same process), and they REALLY have to
be separate. That's not an easy task, though. As I said, the separation has
to be more than just separate PyInterpreterStates.
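
Just to make that concrete, here is roughly what the one-process,
one-subinterpreter-per-thread approach looks like with the C API of this
era (Py_NewInterpreter/Py_EndInterpreter plus the old PyEval_AcquireLock/
PyEval_ReleaseLock calls). It's only an untested sketch using Python 2-era
API and script syntax, and it also shows why "not really separate" is a
fair complaint: every subinterpreter still shares the single GIL and parts
of the runtime, such as extension module state.

#include <Python.h>
#include <pthread.h>

static void *worker(void *arg)
{
    const char *script = (const char *)arg;

    PyEval_AcquireLock();                     /* take the shared GIL */
    PyThreadState *sub = Py_NewInterpreter(); /* own modules, own sys.path */
    if (sub != NULL) {
        PyRun_SimpleString(script);           /* runs inside the subinterpreter */
        Py_EndInterpreter(sub);               /* current thread state is now NULL */
    }
    PyEval_ReleaseLock();                     /* hand the GIL back */
    return NULL;
}

int main(void)
{
    Py_Initialize();
    PyEval_InitThreads();                     /* create the GIL; we hold it now */
    PyThreadState *main_state = PyEval_SaveThread();  /* release it for workers */

    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, "print 'long job'");
    pthread_create(&t2, NULL, worker, "print 'short job'");
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    PyEval_RestoreThread(main_state);         /* reacquire before finalizing */
    Py_Finalize();
    return 0;
}

Even with 100-1000 threads set up like this, only one of them executes
Python bytecode at any given moment, which is the core of the objection.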

>
> Likewise, you can't force memory to be freed, as it'd still be used by
> Python.
>
> The only way to force cleanup is to spawn a subprocess.  This'd also
> let you use multiple cores.  You can probably mitigate the startup
> cost by having a given subprocess run several short scripts or one
> long script.
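
(For reference, the subprocess idea could look something like the POSIX
sketch below. It's untested and assumes a "python" executable on PATH.
When the child exits, the operating system reclaims every byte the
interpreter touched, which is the forced cleanup being described, and
several children can run on separate cores.)

#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>
#include <stdio.h>

/* Run one script in a throwaway interpreter process. */
static int run_script_in_subprocess(const char *path)
{
    pid_t pid = fork();
    if (pid < 0) {
        perror("fork");
        return -1;
    }
    if (pid == 0) {
        /* child: hand the script to a separate python process */
        execlp("python", "python", path, (char *)NULL);
        _exit(127);                   /* only reached if exec failed */
    }
    int status = 0;
    waitpid(pid, &status, 0);         /* parent: wait for the script to finish */
    return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
}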


Well, if you have your own memory manager, i.e. one other than Python's,
and you are embedding the interpreter in your application, I don't see any
reason why you should not be able to clean up at any point you think
appropriate. Is Python still using the memory? Sure it is. But not after
it's done with the script.
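
For what it's worth, at the time of this thread CPython has no public hook
for swapping its allocator out, but later releases (3.4+, with the
PyMemAllocatorEx form in 3.5) added PyMem_SetAllocator, which is roughly
the facility Eljay is asking about. The sketch below is hedged
accordingly; the pool_* functions and the my_pool handle are placeholders
for whatever the application's own memory manager provides, not real API.

#include <Python.h>

/* Placeholders for the embedding application's own memory manager. */
extern void *my_pool;
extern void *pool_malloc(void *pool, size_t size);
extern void *pool_calloc(void *pool, size_t nelem, size_t elsize);
extern void *pool_realloc(void *pool, void *ptr, size_t new_size);
extern void  pool_free(void *pool, void *ptr);

static void install_pool_allocator(void)
{
    PyMemAllocatorEx alloc = {
        .ctx     = my_pool,            /* handed back as the first argument */
        .malloc  = pool_malloc,
        .calloc  = pool_calloc,
        .realloc = pool_realloc,
        .free    = pool_free,
    };

    /* Route all three allocator domains through the pool.
       Must be called before Py_Initialize(). */
    PyMem_SetAllocator(PYMEM_DOMAIN_RAW, &alloc);
    PyMem_SetAllocator(PYMEM_DOMAIN_MEM, &alloc);
    PyMem_SetAllocator(PYMEM_DOMAIN_OBJ, &alloc);
}

Two caveats: the allocator is installed process-wide, so this does not
give you one pool per interpreter instance, and the PYMEM_DOMAIN_RAW
functions can be called without the GIL held, so they must be thread-safe.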



-- Swapnil Talekar

