"Phillip J. Eby" wrote:

> Me, I could probably live with:
>
>     def p_expr_term(self, args) [
>         spark.rule("""
>             expr ::= expr + term
>             term ::= term * factor
>         """)
>     ]:
>         return AST(...)
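[Editorial sketch: the bracketed `spark.rule(...)` annotation above maps closely onto the decorator syntax Python eventually adopted in PEP 318. Here `rule` is a hypothetical stand-in, not the real SPARK API; it just attaches the grammar string to the function as an attribute.]

```python
def rule(grammar):
    """Hypothetical decorator: attach a grammar string to the function."""
    def deco(func):
        func.rule = grammar
        return func
    return deco

class Parser:
    @rule("""expr ::= expr + term
             term ::= term * factor""")
    def p_expr_term(self, args):
        # Placeholder body standing in for the AST(...) construction above.
        return ("AST", args)
```

A framework could then walk the class and collect each method's `rule` attribute into a grammar table.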
Come to think of it, I tried to do something rather similar while attempting to write an expect clone in Python. I would suggest the syntax

    def p_expr_term(self, args) {
        rule: """
            expr ::= expr + term
            term ::= term * factor
        """
    }:
        return AST(...)

i.e. allow a dictionary constructor expression between the argument list and the colon; this would initialize the property dictionary of the code object. One could stick a call to spark.rule() in there, or have a metaclass do it automatically.

This allows the square bracket notation to be reserved for [classmethod] and the like. I suppose there's nothing stopping square brackets from being used for both, but I like having consistency with other dictionary constructor expressions.

zw
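[Editorial sketch: the "have a metaclass do it automatically" option can be illustrated in ordinary Python. Since the proposed `{rule: ...}` syntax never existed, the grammar string is attached here as a plain function attribute; the metaclass then sweeps those attributes into a class-level grammar table. Names like `RuleMeta` and `grammar` are illustrative, not part of any real SPARK API.]

```python
class RuleMeta(type):
    """Collect methods carrying a 'rule' attribute into a grammar table."""
    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, namespace)
        cls.grammar = {
            attr: func.rule
            for attr, func in namespace.items()
            if callable(func) and hasattr(func, "rule")
        }
        return cls

class Parser(metaclass=RuleMeta):
    def p_expr_term(self, args):
        # Placeholder body standing in for the AST(...) construction.
        return ("AST", args)
    # Today's workaround for the proposed {rule: "..."} property syntax:
    p_expr_term.rule = """expr ::= expr + term
                          term ::= term * factor"""
```

With this in place, `Parser.grammar` maps method names to their grammar strings, so no per-method call to `spark.rule()` is needed.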