[Python-ideas] Hooking between lexer and parser

Andrew Barnert abarnert at yahoo.com
Mon Jun 8 06:31:21 CEST 2015


On Jun 7, 2015, at 21:23, Neil Girdhar <mistersheik at gmail.com> wrote:
> 
> Yes, but in this case the near term "problem" was as far as I can tell just parsing floats as decimals, which is easily done with a somewhat noisy function call.  I don't see why it's important.

This isn't the only use case anyone's ever wanted. The tokenize module has been there since at least 1.5, and presumably it wasn't added for no good reason, or made to work with 3.x just for fun. And it has an example use in the docs.
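For reference, a sketch along the lines of that docs example, which rewrites float literals in the token stream into Decimal(...) calls (the function name decistmt and the exact token handling here are illustrative, not something from this thread):

    from io import BytesIO
    from tokenize import tokenize, untokenize, NUMBER, STRING, NAME, OP

    def decistmt(s):
        """Substitute Decimal literals for float literals in a string of statements."""
        result = []
        # tokenize() wants a readline callable over bytes
        for toknum, tokval, _, _, _ in tokenize(BytesIO(s.encode('utf-8')).readline):
            if toknum == NUMBER and '.' in tokval:
                # re-emit the float literal wrapped in a Decimal(...) call
                result.extend([(NAME, 'Decimal'), (OP, '('),
                               (STRING, repr(tokval)), (OP, ')')])
            else:
                result.append((toknum, tokval))
        return untokenize(result).decode('utf-8')

    # decistmt("print(+21.3e-5 * -.1234 / 81.7)") returns source in which each
    # float literal has been replaced by Decimal('...') with the same digits.

Working at this level means juggling readline callables, 5-tuples, and untokenize's quirks, which is the clumsiness being complained about below.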

The only thing that's changed is that postprocessing the AST has become a lot easier and less hacky, thanks to the ast module and the succession of changes to the import process, so the fact that tokenize is still clumsy and hacky is more noticeable.
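By contrast, the equivalent AST-level hook is now just a NodeTransformer. A minimal sketch of the same float-to-Decimal rewrite done after parsing (not taken from this thread, and written against the modern ast.Constant / ast.unparse API rather than what existed in 2015):

    import ast

    class FloatToDecimal(ast.NodeTransformer):
        def visit_Constant(self, node):
            if isinstance(node.value, float):
                # replace the float constant with a Decimal('...') call
                return ast.copy_location(
                    ast.Call(func=ast.Name(id='Decimal', ctx=ast.Load()),
                             args=[ast.Constant(value=repr(node.value))],
                             keywords=[]),
                    node)
            return node

    tree = FloatToDecimal().visit(ast.parse("x = 1.1 + 2.2"))
    ast.fix_missing_locations(tree)
    print(ast.unparse(tree))   # x = Decimal('1.1') + Decimal('2.2')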

