lexical analysis of python
andrew cooke
andrew at acooke.org
Tue Mar 10 22:35:23 EDT 2009
robert.muller2 at gmail.com wrote:
> I understand the method, but when you say you "count one DEDENT for
> each level": let's say you counted 3 of them. Do you have a way to
> interject 3 consecutive DEDENT tokens into the token stream so that
> the parser receives them before it receives the next real token?
i don't know the details of your lexer, but perhaps you could emit a
single token that carries the count, for example (DEDENTS, 3). then it
would only take a very simple wrapper around the lexer to expand that
into three DEDENT tokens.
that way you stay with a machine-generated lexer and parser, with just
a simple hand-written shim between them.
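a minimal sketch of such a shim, assuming the lexer yields (type, value)
pairs as a stream; the token names and tuple shape here are illustrative,
not tied to any particular lexer generator:

```python
def expand_dedents(tokens):
    """Wrap a token stream, expanding each (DEDENTS, n) token
    into n consecutive (DEDENT, None) tokens, so the parser
    only ever sees plain DEDENT tokens."""
    for type_, value in tokens:
        if type_ == "DEDENTS":
            # replay the single counted token as n separate DEDENTs
            for _ in range(value):
                yield ("DEDENT", None)
        else:
            yield (type_, value)
```

the parser then consumes expand_dedents(lexer) instead of the lexer
directly, so neither generated component needs to change.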
andrew
More information about the Python-list mailing list