[Cython] Wacky idea: proper macros

mark florisson markflorisson88 at gmail.com
Sun Apr 29 11:56:23 CEST 2012


On 29 April 2012 08:42, Nathaniel Smith <njs at pobox.com> wrote:
> On Sat, Apr 28, 2012 at 10:25 PM, mark florisson
> <markflorisson88 at gmail.com> wrote:
>> On 28 April 2012 22:04, Nathaniel Smith <njs at pobox.com> wrote:
>>> Was chatting with Wes today about the usual problem many of us have
>>> encountered with needing to use some sort of templating system to
>>> generate code handling multiple types, operations, etc., and a wacky
>>> idea occurred to me. So I thought I'd throw it out here.
>>>
>>> What if we added a simple macro facility to Cython, that worked at the
>>> AST level? (I.e. I'm talking lisp-style macros, *not* C-style macros.)
>>> Basically some way to write arbitrary Python code into a .pyx file
>>> that gets executed at compile time and can transform the AST, plus
>>> some nice convenience APIs for simple transformations.
>>>
>>> E.g., if we steal the illegal token sequence @@ as our marker, we
>>> could have something like:
>>>
>>> @@ # alone on a line, starts a block of Python code
>>> from Cython.MacroUtil import replace_ctype
>>> def expand_types(placeholder, typelist):
>>>  def my_decorator(function_name, ast):
>>>    functions = {}
>>>    for typename in typelist:
>>>      new_name = "%s_%s" % (function_name, typename)
>>>      functions[new_name] = replace_ctype(ast, placeholder, typename)
>>>    return functions
>>>  return my_decorator
>>> @@ # this token sequence cannot occur in Python, so it's a safe end-marker
>>>
>>> # Compile-time function decorator
>>> # Results in two cdef functions named sum_double and sum_int
>>> @@expand_types("T", ["double", "int"])
>>> cdef T sum(np.ndarray[T] arr):
>>>  cdef T start = 0;
>>>  for i in range(arr.size):
>>>    start += arr[i]
>>>  return start
>>>
>>> I don't know if this is a good idea, but it seems like it'd be very
>>> easy to do on the Cython side, fairly clean, and be dramatically less
>>> horrible than all the ad-hoc templating stuff people do now.
>>> Presumably there'd be strict limits on how much backwards
>>> compatibility we'd be willing to guarantee for code that went poking
>>> around in the AST by hand, but a small handful of functions like my
>>> notional "replace_ctype" would go a long way, and wouldn't impose much
>>> of a compatibility burden.
>>>
>>> -- Nathaniel
>>
>> Have you looked at http://wiki.cython.org/enhancements/metaprogramming ?
>>
>> In general I would like better meta-programming support, maybe even
>> allow defining new operators (although I'm not sure any of it is very
>> pythonic), but for templates I think fused types should be used, or
>> improved when they fall short. Maybe a plugin system could also help
>> people.
>
> I hadn't seen that, no -- thanks for the link.
>
> I have to say that the examples in that link, though, give me the
> impression of a cool solution looking for a problem. I've never wished
> I could symbolically differentiate Python expressions at compile time,
> or create a mutant Python+SQL hybrid language. Actually I guess I've
> only missed define-syntax once in maybe 10 years of hacking in
> Python-the-language: it's neat how if you do 'plot(x, log(y))' in R it
> will peek at the caller's syntax tree to automagically label the axes
> as "x" and "log(y)", and that can't be done in Python. But that's not
> exactly a convincing argument for a macro system.
>
> But generating optimized code is Cython's whole selling point, and
> people really are doing klugey tricks with string-based preprocessors
> just to generate multiple copies of loops in Cython and C.
>
> Also, fused types are great, but: (1) IIUC you can't actually do
> ndarray[fused_type] yet, which speaks to the feature's complexity, and

What? Yes you can do that.
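
For example, something like this compiles and specializes today (a
quick sketch from memory, not taken verbatim from the docs or test
suite):

cimport numpy as np

ctypedef fused number_t:
    np.int32_t
    np.float64_t

def total(np.ndarray[number_t] arr):
    # One specialization is generated per member of number_t; the def
    # wrapper dispatches on the dtype of the array passed in.
    cdef number_t acc = 0
    cdef Py_ssize_t i
    for i in range(arr.shape[0]):
        acc += arr[i]
    return acc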

> (2) to handle Wes's original example on his blog (duplicating a bunch
> of code between a "sum" path and a "product" path), you'd actually
> need something like "fused operators", which aren't even on the
> horizon. So it seems unlikely that fused types will grow to cover all
> these cases in the near future.

Although it doesn't handle contiguity or dimensionality differences,
the closest thing we currently have to an efficient "fused operator"
is a function pointer. Wouldn't passing in a
float64_t (*reducer)(float64_t, float64_t) work in this case? (If
multiple types are involved, the function pointer's parameters can be
fused types as well.)
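
For Wes's sum/product example I have something like this in mind (just
a sketch, all the names here are made up):

cimport numpy as np

ctypedef double (*reducer_t)(double, double)

cdef double add(double a, double b):
    return a + b

cdef double mul(double a, double b):
    return a * b

cdef double reduce_1d(np.ndarray[double] arr, reducer_t reduce,
                      double initial):
    # The operator stays a C-level call through a function pointer,
    # so the loop body is written only once.
    cdef double acc = initial
    cdef Py_ssize_t i
    for i in range(arr.shape[0]):
        acc = reduce(acc, arr[i])
    return acc

reduce_1d(arr, add, 0.0) then gives the sum path and
reduce_1d(arr, mul, 1.0) the product path, without duplicating the
loop.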

I agree with Dag that Julia has nice metaprogramming support; maybe
functions could take arbitrary compile-time expressions as extra
arguments.

> Of course some experimentation would be needed to find the right
> syntax and convenience functions for this feature too, so maybe I'm
> just being over-optimistic and it would also turn out to be very
> complicated :-). But it seems like some simple AST search/replace
> functions would get you a long way.
>
> - N
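
For what it's worth, the search/replace functions you mention are easy
to picture. A pure-Python analogue using the stdlib ast module (only to
illustrate the shape of such an API; the real thing would have to work
on Cython's own parse tree) could look like:

import ast

class ReplaceName(ast.NodeTransformer):
    # Rewrite every occurrence of one identifier into another.
    def __init__(self, old, new):
        self.old = old
        self.new = new

    def visit_Name(self, node):
        if node.id == self.old:
            return ast.copy_location(ast.Name(id=self.new, ctx=node.ctx), node)
        return node

tree = ast.parse("start = T(0)")
tree = ReplaceName("T", "double").visit(tree)
ast.fix_missing_locations(tree)

A replace_ctype helper would essentially be such a transformer applied
to the type nodes of Cython's tree instead.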

