[Cython] [cython-users] Cython .pxd introspection: listing defined constants
robertwb at math.washington.edu
Thu Feb 17 22:25:10 CET 2011
On Thu, Feb 17, 2011 at 5:29 AM, W. Trevor King <wking at drexel.edu> wrote:
> This thread is coming over to cython-dev (and the new cython-devel)
> from cython-users because it turns out it will probably require
> changing the Cython code. To get everyone who hasn't been following on
> cython-users up to speed, here's a summary of what I'm trying to do:
> That's what I was trying to give with this:
> On Wed, Feb 09, 2011 at 12:23:25PM -0500, W. Trevor King wrote:
>> I'm wrapping an external C library with Cython, so I have `mylib.pxd`:
>>     cdef extern from 'mylib.h':
>>         enum: CONST_A
>>         enum: CONST_B
>> where I declare each constant macro from the library's header `mylib.h`:
>>     #define CONST_A 1
>>     #define CONST_B 2
>> Now I want to expose those constants in Python, so I have `expose.pyx`:
>>     cimport mylib
>>     CONST_A = mylib.CONST_A
>>     CONST_B = mylib.CONST_B
>> But the last part seems pretty silly. I'd like to do something like
>>     cimport mylib
>>     import sys
>>     for name in dir(mylib):
>>         setattr(sys.modules[__name__], name, getattr(mylib, name))
>> which compiles fine, but fails to import with...
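For reference, the pattern that loop is reaching for works fine between ordinary Python modules; here is a pure-Python sketch, using the stdlib `math` module as a stand-in for the cimported `mylib` (the `expose` module object is likewise a stand-in for `sys.modules[__name__]`):

```python
import math
import types

# Stand-in for the expose module; in real code this would be
# sys.modules[__name__].
expose = types.ModuleType("expose")

# Copy every public name from math (playing the role of mylib) into it.
for name in dir(math):
    if not name.startswith("_"):
        setattr(expose, name, getattr(math, name))

print(expose.pi)  # same object as math.pi
```

The sticking point in the thread is precisely that a bare `.pxd` has no runtime module object to play the role `math` plays here.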
> Looking into the Cython internals, everything defined in mylib.pxd is
> stored as `Entry`s in a `ModuleScope`, and...
> On Wed, Feb 16, 2011 at 03:55:19PM -0800, Robert Bradshaw wrote:
>> On Wed, Feb 16, 2011 at 8:17 AM, W. Trevor King <wking at drexel.edu> wrote:
>> > What I'm missing is a way to bind the ModuleScope namespace to a name
>> > in expose.pyx so that commands like `dir(mylib)` and `getattr(mylib,
>> > name)` will work in expose.pyx.
>> You have also hit into the thorny issue that .pxd files are used for
>> many things. They may be pure C library declarations with no Python
>> module backing, they may be declarations of (externally implemented)
>> Python modules (such as numpy.pxd), or they may be declarations for
>> Cython-implemented modules.
>> > It seems like it would be easier to generate some kind of wrapper
>> > class (PxdModule?) for mylib when it is cimported (at compile time),
>> > and then further interactions would take care of themselves (at run
>> > time).
>> Would such an object be created anew for every module that cimports
>> the declaration file?
> Hmm, that doesn't sound very nice, does it? However, .pxd files
> declaring C libraries have no Python-space presence, so that was my
> initial idea.
>> I have toyed with the idea of subclassing the module object itself for
>> better support of C-level attributes from the Python (and Cython)
> Sorry, I don't understand "better support of C-level attributes". Can
> you give an example?
The extern cpdef declarations are an example of this.
>> Here's another idea: what if extern blocks could contain cpdef
>> declarations, which would automatically generate Python-level
>> wrappers for the declared members (if possible, otherwise an error)?
> Ah, this sounds good! Of the three .pxd roles you list above,
> external Python modules (e.g. numpy) and Cython-implemented modules
> (e.g. matched .pxd/.pyx) both already have a presence in Python-space.
> What's missing is a way to give (where possible) declarations of
> external C libraries a Python presence. cpdef fills this hole nicely,
> since its whole purpose is to expose Python interfaces to
> C-based elements.
In the case of external Python modules, I'm not so sure we want to
monkey-patch our stuff in (and where would we do it--on the first
import of a cimporting module?)
> A side effect of this cpdef change would be that now even bare .pxd
> files (no matching .pyx) would have a Python presence,
Where would it live? Would we just create this module (in essence,
acting as if there was an empty .pyx file sitting there as well)? On
this note, it may be worth pursuing the idea of a "cython helper"
module where common code and objects could live.
> so you could do
> something like
>     cimport mylib as mylib_c
>     import mylib as mylib_py
>     import sys
>     # Access through Python
>     for name in dir(mylib_py):
>         setattr(sys.modules[__name__], name, getattr(mylib_py, name))
I think this smells worse than "import *"
>     # Direct C access
>     cdef get_a():
>         return mylib_c.CONST_A
> Where the Python access would be the new feature, and list all
> cpdef-ed stuff.
> However, from Parsing.py:2369:
>     error(pos, "C struct/union/enum cannot be declared cpdef")
> From pyrex_differences.rst:
>     If a function is declared :keyword:`cpdef` it can be called from
>     and overridden by both extension and normal python subclasses.
> I believe the reason that cpdef-ed enums and similar are currently
> illegal is confusion between "can be called from Python" and "can be
> overridden from Python".
The reason that error statement is there is because it had no meaning,
so an error was better than just ignoring it.
> I think these should be just like methods
> already are, in that you can "override" a method by subclassing its
> class, but not by rebinding the name on the base class:
>     >>> import pyximport; pyximport.install()
>     >>> import rectangle as R
>     >>> r = R.Rectangle(1, 2, 3, 4)
>     >>> r.area = lambda self: r.x1
>     Traceback (most recent call last):
>       File "<string>", line 1, in <module>
>     AttributeError: 'rectangle.Rectangle' object attribute 'area' is read-only
> Where rectangle.pyx is a lightly patched version of the last example
> from early_binding_for_speed.rst and `area` is a cpdef-ed method.
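For comparison, the read-only behavior in that traceback can be mimicked in pure Python with a setter-less property. This Rectangle is a sketch standing in for the real rectangle.pyx, not the actual code:

```python
class Rectangle:
    def __init__(self, x0, y0, x1, y1):
        self.x0, self.y0, self.x1, self.y1 = x0, y0, x1, y1

    @property
    def area(self):  # no setter defined, so instance rebinding fails
        return (self.x1 - self.x0) * (self.y1 - self.y0)

r = Rectangle(1, 2, 3, 4)
print(r.area)  # 4

try:
    r.area = lambda: 0  # same failure mode as the cpdef method above
except AttributeError as e:
    print("read-only:", e)
```

Subclassing Rectangle and overriding `area` still works, which is exactly the method/attribute distinction being drawn here.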
> Why can't enums share this handling, with the enum taking the place of
> the method and the enum's module taking the place of the class? After
> all, enums have a Python-side type (int or long). Unions don't really
> have a Python parallel,
They can be a cdef class wrapping the union type.
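As a rough illustration of what "a cdef class wrapping the union type" buys at the Python level, here is a stand-in using ctypes rather than Cython; the union type and its field names are invented for the example:

```python
import ctypes

# A C union exposed through a Python class (illustrative only).
class MyUnion(ctypes.Union):
    _fields_ = [("as_int", ctypes.c_int),
                ("as_float", ctypes.c_float)]

u = MyUnion()
u.as_int = 1
print(u.as_int)      # 1
u.as_float = 1.0     # both fields share the same 4 bytes of storage
print(hex(u.as_int)) # the IEEE 754 bit pattern of 1.0
```

A Cython `cdef class` wrapper could do the same job with hand-written properties over the union's members.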
> but structs do, so long as you can select
> which attributes should have (writable) Python interfaces. If we
> change the struct declaration syntax to be closer to the `cdef class`
> declaration syntax:
>     cpdef struct Foo:
>         cpdef public int intA
>         cpdef readonly int intB
>         cdef void *ptr
> We would both declare the important members of the C struct (as we can
> already do in Cython) and also have Cython automatically generate a
> Python class wrapping the struct (because of `cpdef struct`). The
> Python class would have:
> * A Cython-generated getter/setter for intA (because of `cpdef
>   public`), using the standard Python<->int coercion.
> * A similar Cython-generated getter for intB (because of `cpdef
>   readonly`).
> * No Python access to ptr (standard C access would still be possible
>   from Cython).
> Doing something crazy like `cdef public void *ptr` would raise a
> compile-time error.
Yes, all of the above was exactly what I was proposing.
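A pure-Python sketch of the wrapper class such a `cpdef struct` might generate, with names taken from the Foo example above; this is purely illustrative, since the real generated code would read and write the underlying C struct fields:

```python
class Foo:
    """Hand-written stand-in for the wrapper Cython might generate."""

    def __init__(self, intA=0, intB=0):
        self._intA = int(intA)  # cpdef public: readable and writable
        self._intB = int(intB)  # cpdef readonly: readable only
        # the `void *ptr` member has no Python-level counterpart at all

    @property
    def intA(self):
        return self._intA

    @intA.setter
    def intA(self, value):
        self._intA = int(value)  # standard Python <-> int coercion

    @property
    def intB(self):  # no setter: read-only from Python
        return self._intB

f = Foo(intA=1, intB=2)
f.intA = 10          # allowed: public
print(f.intA, f.intB)  # 10 2
```

Attempting `f.intB = 5` raises AttributeError, matching the `readonly` declaration.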
> I'm definitely willing to help out with this (if someone will point me
> in the right direction),
That would be great.
> as the enum stuff would fix my original
> problem, and the struct stuff would allow me to rip out a bunch of
> boilerplate like
>     cdef class Foo (object):
>         cdef mylib.Foo _Foo
>         def _intA_get(self):
>             return self._Foo.intA
>         def _intA_set(self, value):
>             self._Foo.intA = value
>         intA = property(fget=_intA_get, fset=_intA_set)
>         def _intB_get(self):
>             return self._Foo.intB
>         intB = property(fget=_intB_get)
> from my wrapper code.
> While testing my method-overriding example, I found a small typo
> in cython-docs' early_binding_for_speed.rst. Patch attached.