puccio_13 at yahoo.it
Wed Apr 23 11:45:46 CEST 2003
Alessio Pace wrote:
> Hi, I was wondering which is a good lexer written in Python; I heard about
> SPARK and PLY. I only need to tokenize texts, not to build trees from them.
> string.split() is not enough, I guess, because I can't use regular
> expressions as token separators. How could I do it otherwise?
Ehm... it seems I solved it myself:

    import re
    # "reg_exp" is the separator pattern; string_text is the text to tokenize
    tokens = re.split("reg_exp", string_text)
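A minimal, self-contained sketch of the same idea (the pattern and sample text below are illustrative, not from the original post): re.split cuts the string wherever the regular expression matches, so a pattern matching the separators yields the tokens.

```python
import re

# Split on any run of commas, semicolons, or whitespace used as separators.
text = "foo, bar;  baz"
tokens = re.split(r"[,;\s]+", text)
print(tokens)  # → ['foo', 'bar', 'baz']
```

Note that if the pattern contains capturing groups, re.split also returns the matched separators, which may or may not be what you want.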