[Numpy-discussion] numpy array sharing between processes?
Andrew Straw
strawman at astraw.com
Sat May 12 17:45:15 EDT 2007
Ray Schumacher wrote:
>
> After Googling for examples on this, in the Cookbook
> http://www.scipy.org/Cookbook/Multithreading
> MPI, and POSH (dead?), I don't think I know the answer...
> We have a data collection app running on dual core processors; I start
> one thread collecting/writing new data directly into a numpy circular
> buffer, while another thread does correlation on the newest data and
> occasional FFTs; together they now use 50% CPU, total.
> The threads never need to access the same buffer slices.
> I'd prefer to have two processes, forking the FFT process off and
> utilizing the second core. Besides the numpy array itself, the
> processes would only need to share two variables (the buffer insert
> position and a short-integer result from the FFT process; each process
> would only read or only write each one).
>
> Should I pass the numpy array's address to the second process and just
> create an identical array there, as in
> http://projects.scipy.org/pipermail/numpy-discussion/2006-October/023647.html
> ?
>
> Use a file-like object to share the other variables? mmap?
> http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/413807
>
> I also thought ctypes
> ctypes.string_at(address[, size])
> might do both easily enough, although it would mean a copy. We already
> use ctypes for the collection thread.
> Does anyone have a lightweight solution to this relatively simple sort
> of problem?
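[A minimal sketch of the mmap approach mentioned above: both processes map the same file, with a small header holding the shared insert position ahead of the circular buffer itself. The file name, buffer length, and header layout here are all illustrative assumptions, not anything from the thread.]

```python
import mmap
import os
import struct
import tempfile

import numpy as np

BUF_LEN = 4096   # samples in the circular buffer (assumed size)
HEADER = 8       # 4 bytes: insert position; 4 bytes: short result (padded)

# Create the backing file once, zero-filled (float64 samples).
path = os.path.join(tempfile.gettempdir(), "ringbuf.dat")
with open(path, "wb") as f:
    f.write(b"\x00" * (HEADER + BUF_LEN * 8))

# Each process opens the same file and maps it.
f = open(path, "r+b")
m = mmap.mmap(f.fileno(), HEADER + BUF_LEN * 8)

# View the buffer region as a numpy array -- no copy is made.
buf = np.frombuffer(m, dtype=np.float64, offset=HEADER, count=BUF_LEN)

# Writer side: store a sample, then publish the new insert position.
pos = struct.unpack_from("<i", m, 0)[0]
buf[pos] = 42.0
struct.pack_into("<i", m, 0, (pos + 1) % BUF_LEN)

# Reader side: read the position, then the newest sample.
pos = struct.unpack_from("<i", m, 0)[0]
newest = buf[(pos - 1) % BUF_LEN]
```

[Writing the data before updating the position gives the reader a consistent view for this one-writer/one-reader pattern, since each side only writes its own variable.]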
I'll pitch in a few donuts (and my eternal gratitude) for an example of
shared-memory use with numpy arrays that is cross-platform, or at least
works on Linux, Mac, and Windows.
-Andrew
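[For today's readers: the standard library's multiprocessing.shared_memory module (added in Python 3.8, long after this thread) covers exactly this request on Linux, Mac, and Windows. A minimal sketch; the array shape and dtype are illustrative:]

```python
from multiprocessing import shared_memory

import numpy as np

# "Parent" process: create a named shared-memory block and view it
# as a numpy array.
shm = shared_memory.SharedMemory(create=True, size=1024 * 8)
a = np.ndarray((1024,), dtype=np.float64, buffer=shm.buf)
a[:] = 0.0
a[10] = 3.14

# Another process attaches by name -- same physical memory, no copy.
shm2 = shared_memory.SharedMemory(name=shm.name)
b = np.ndarray((1024,), dtype=np.float64, buffer=shm2.buf)
value = float(b[10])   # sees the 3.14 written above

# Clean up: drop the array views, close in every process, unlink once.
del a, b
shm2.close()
shm.close()
shm.unlink()
```

[The insert position and short-integer result from the original question could live in a second, tiny SharedMemory block, or in a few reserved bytes at the front of this one.]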